Recently, YouTube has drawn frequent criticism for leading viewers to content promoting disinformation, conspiracy theories, and radical thought through the site’s ‘Recommended for you’ and ‘Up next’ features, which suggest videos for users to watch. As The Wall Street Journal reported in February 2018, YouTube’s recommendation algorithm drives more than 70% of videos viewed, making it the main determinant of what people end up watching on the video streaming site [1]. A recent BuzzFeed News report also found that watching a nonpartisan video could lead a user to a video promoting extremist perspectives in as few as six recommendations, which play automatically once the searched video ends [2].
As with most products from Google (YouTube’s parent company), the algorithm that determines recommendations is intentionally opaque, making it unclear why radical videos always seem to be within view. During a test of the platform, one journalist found that searching for ‘vegetarian’ soon surfaced videos promoting veganism, while flu shot searches quickly led to recommendations making anti-vaccination arguments [3]. These, of course, are tame examples; in many instances, viewers have been led to videos promoting extremist political ideologies and conspiracy theories about major historical events (such as 9/11 or the Holocaust).
Regardless of your personal politics, YouTube seems to present the incendiary view as something you may be interested in. For many, YouTube functions as a source of news, entertainment, and information. Is it not the platform’s social responsibility to ensure that disinformation, conspiracies, and extremist propaganda are not pushed on unknowing viewers? This is particularly important given the platform’s use by younger demographics [4]. Shouldn’t YouTube’s recommendations for these users be moderated?
In response to continued criticism, YouTube announced last week that it would modify its algorithm to make conspiracy videos and other ‘borderline’ content more difficult to find. While YouTube says it regularly updates its recommendations system, this update is said to focus on content that comes close to violating the Community Guidelines without technically crossing the line. Videos promoting miracle cures for serious illnesses, claiming the earth is flat, or making false claims about events like the 9/11 attacks will no longer be recommended, though they will still be accessible through the site. On its official blog, YouTube states, “[w]e think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users” [5].
As The New York Times reports, YouTube did not clarify what specifically qualifies a video as ‘borderline,’ but maintains that the change will rely on both human intervention and machine learning [6]. No date has been announced for the update; instead, the changes will be rolled out gradually over time.
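For readers curious about what “human intervention plus machine learning” could look like in practice, here is a minimal, purely hypothetical sketch: human reviewers label a few example videos as borderline or not, a simple text classifier learns from those labels, and candidate recommendations that score above a threshold are dropped from the ‘Up next’ list. This is not YouTube’s actual system, which has not been made public; the library choice (scikit-learn), the toy titles, and the threshold are all assumptions made for illustration.

```python
# Hypothetical illustration only -- NOT YouTube's actual recommendation system.
# Idea: human reviewers label example videos as "borderline" (1) or not (0),
# a simple text classifier learns from those labels, and recommendation
# candidates that score above a threshold are filtered out of "Up next".

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labels a human review team might produce (invented examples).
reviewed_titles = [
    "Miracle cure reverses serious illness overnight",  # borderline
    "Proof the earth is flat",                          # borderline
    "What really happened on 9/11 (the hidden truth)",  # borderline
    "How to make a sourdough starter at home",          # acceptable
    "Beginner yoga routine for flexibility",            # acceptable
    "Reviewing this year's best budget laptops",        # acceptable
]
human_labels = [1, 1, 1, 0, 0, 0]  # 1 = borderline, 0 = acceptable

# Train a basic text classifier on the human-reviewed examples.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(reviewed_titles, human_labels)

def filter_recommendations(candidates, threshold=0.5):
    """Keep only candidates the model scores below the borderline threshold."""
    scores = classifier.predict_proba(candidates)[:, 1]  # P(borderline)
    return [title for title, score in zip(candidates, scores) if score < threshold]

# Score a small batch of candidate 'Up next' videos.
candidates = [
    "Doctors hate this miracle cure",
    "Top 10 hiking trails near the city",
]
print(filter_recommendations(candidates))
```

In a real deployment, the interesting design questions would be exactly the ones YouTube has left unanswered: what counts as ‘borderline,’ how the threshold is chosen, and how human judgments are fed back into the model over time.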
What do you think?
- What should YouTube’s role be in moderating videos?
- Has the platform ever suggested an unexpected video to you?
- Do you think ‘borderline’ content should still be accessible?
- What are your expectations of videos (and products) recommended for you?
Be sure to leave a comment in the discussion box below!
For more information, check out this article:
YouTube says it will recommend fewer videos about conspiracy theories | The Verge