The Life of a Flag

YouTube allows people anywhere to share their stories with the world so that everyone can access information, gain deeper understanding, break down barriers and come together around shared passions. When you use YouTube, you join a community. That's why we have Community Guidelines: they help us keep YouTube fun and safe for everyone. When people feel content violates our Community Guidelines, they can report it to YouTube. We call this "flagging." When you flag a video, you have an opportunity to indicate which policy you believe the content violates.

YouTube also receives flags from subject-matter experts: individuals, NGOs, government organizations and academics all over the world who help us flag content related to the most sensitive subjects, like child safety, violent extremism and hate speech.

But did you know the vast majority of YouTube's flags don't come from people? They come from technology. We've developed powerful machine learning that detects content that may violate our policies and sends it for human review. In some cases, that same machine learning automatically takes an action, like removing spam videos.

When human review is needed, one of the reviewers on YouTube's Policy and Enforcement Team takes a look. Reviewers evaluate flagged videos against
all of our Community Guidelines and policies. This includes evaluating metadata like the title, description and tags. Reviewers also assess whether the content has educational, documentary, scientific, or artistic intent. If the intent of the video fits into one of these categories, we generally leave it up.

After carefully consulting the guidelines, the reviewer takes action. For a flagged video, a reviewer can take several actions, including restricting it, removing it, or keeping it live. An example of a restriction is age-gating, which means the video stays on YouTube, but users must be over 18 and logged in to view it. Removing a video takes it off YouTube entirely. This is done when the content violates our
policies. We send the uploader an email explaining which of our Community Guidelines the video violated.

When a video is found to be in violation of our guidelines, the uploader receives a strike. Accumulated strikes may result in a channel's termination: three strikes and you're out! If a violation is particularly egregious, or a whole channel is found in violation of YouTube's Community Guidelines, we may remove the channel and its videos immediately. Uploaders are notified and may have the opportunity to appeal the reviewer's decision.

If a YouTube reviewer encounters content that poses an urgent threat, they can escalate the flag, and YouTube may notify appropriate external parties like local law enforcement.

Our processes and policies were developed to protect our community and ensure that YouTube continues to be a place where creators, advertisers, and viewers can thrive. Thanks for helping us keep the YouTube community safe.
