Don’t Fall Down the Rabbit Hole:
The Importance of YouTube Privacy
Guest Post by Ying Chen
In 2019, Digital Tattoo covered a news story on how YouTube updated its algorithm to make extremist content harder to find, after growing criticism that the platform was steering users towards misleading content, conspiracy theories and radical thought [1]. Though it may seem that YouTube is improving its algorithm and reining in extremist content, its recommendation system still carries privacy implications.
A Brief Guide to YouTube’s Recommendation System
YouTube started out as a site for sharing user-generated videos but quickly turned into a platform for their commercial monetization [2]. Its recommendation system is accordingly designed to retrieve, rank and recommend videos based on signals such as a video’s number of likes, watch time or channel subscribers, to facilitate its growing business [3]. Ultimately, YouTube profits from user engagement – the higher the engagement rates, the more the platform earns in ad revenue.
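To make the retrieve-and-rank mechanics concrete, here is a minimal sketch of an engagement-driven ranker in Python. The `Video` fields mirror the signals named above (likes, watch time, subscribers), but the weights and the scoring rule itself are invented for illustration; they are not YouTube’s actual model:

```python
from dataclasses import dataclass

@dataclass
class Video:
    """A candidate video with the engagement signals named above [3]."""
    title: str
    likes: int
    avg_watch_time_sec: float   # how long viewers typically stay
    channel_subscribers: int

def engagement_score(v: Video) -> float:
    """Hypothetical scoring rule: watch time is weighted most heavily,
    since longer sessions mean more ad impressions. The weights are
    made up for this sketch."""
    return (0.6 * v.avg_watch_time_sec
            + 0.3 * v.likes
            + 0.1 * v.channel_subscribers / 1000)

def recommend(candidates: list[Video], k: int = 3) -> list[Video]:
    """Retrieve-and-rank step: return the top-k candidates by score."""
    return sorted(candidates, key=engagement_score, reverse=True)[:k]
```

Notice that nothing in such a scorer asks whether a video is accurate or safe; it optimizes only for the signals that correlate with revenue, which is exactly the dynamic the studies below criticize.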
Many studies have shown YouTube promoting videos of “controversy and dissent” through its recommendation system to boost views [3]. According to YouTube’s Chief Product Officer, the recommendation system is responsible for over 70 percent of the time users spend on the platform [4]. This also leads to “YouTube’s Rabbit Hole,” where watching a 3-minute video turns into a 3-hour binge on the platform, and possibly into darker corners of the web that contain misinformation [5].
Privacy Implications of YouTube’s Recommendation System
YouTube has been collecting users’ data – including ratings, comments, watch and search history, page views, location and device information – to improve its recommendation system [6, 7]. This data lets the algorithm keep users on the platform with highly personalized video recommendations, but it also raises several privacy issues (illustrated in the sketch after this list):
- Users are not fully aware of all the data being collected [6, 7]
- Users underestimate the power of the data collected and the system’s learning patterns [7]
- Personalization data may come from unwanted data collection or even data shared from another recommendation system [8]
- Users may learn about another’s personal information if they share the same account or device, thus contributing to the invasion of privacy themselves [8]
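As a rough illustration of the first and last points, here is a hypothetical sketch of the profile such a system could assemble. The field names follow the data types listed above [6, 7], but the structure, and the shared-account scenario at the end, are assumptions made for illustration; YouTube’s real schema is not public:

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    """Illustrative profile built from the data types listed above [6, 7].
    Every field here is an assumption for the sake of the example."""
    watch_history: list[str] = field(default_factory=list)   # video IDs
    search_history: list[str] = field(default_factory=list)  # query strings
    ratings: dict[str, bool] = field(default_factory=dict)   # video ID -> liked?
    location: str = ""                                       # coarse region
    device: str = ""                                         # device/browser info

# Shared-account scenario from the last bullet: two people's viewing
# blends into one profile, so recommendations shown to one person can
# reveal what the other watched.
shared = UserProfile()
shared.watch_history += ["guitar_lesson_01", "guitar_lesson_02"]  # person A
shared.watch_history += ["anxiety_support_group_talk"]            # person B
print(shared.watch_history)  # person A's recommendations now reflect B's history
```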
Bloomberg revealed in 2019 that YouTube’s management was “so focused on maximizing usage statistics that they looked the other way when employees raised concerns about the company’s recommendation system” [9]. Furthermore, in 2020, a crowd-sourced study called “YouTube Regrets” showed that YouTube’s recommendation system generally directs users to videos with a higher potential of going viral, even if those videos contain potentially harmful content [10]. In extreme cases, some users are even “pulled into a far-right universe, watching thousands of videos filled with conspiracy theories, misogyny and racism” [5].
What Can Users Do?
Technology websites such as Beebom [11] suggest practical ways of supporting healthy YouTube engagement, including:
- Using web browser extensions such as “YouTube Rabbit Hole” to limit your interactions with the platform’s content
- Setting break and bedtime reminders
- Using app timers to limit your time on YouTube
- Eliminating triggers by re-evaluating your channel subscriptions and whether a channel’s video deserves your immediate attention
Points to Consider
Evaluating YouTube’s treatment of your data is also crucial. As the Organization for Economic Co-operation and Development (OECD) [12] declared back in 1980, the Fair Information Practices (FIPS) ensure the safe treatment of personal data online by establishing 8 categories for data protection:
- Collection Limitation
- Data Quality
- Purpose Specification
- Use Limitation
- Security Safeguards
- Openness
- Individual Participation
- Accountability
Discussion
How would you rate YouTube against the 8 FIPS categories? Does the platform violate any of them? What protocols do you think YouTube should implement to tackle this issue? And what are your own intentions when sharing personal data on the platform? Feel free to discuss below!
References
[1] Hill, E. (2019, February 5). In the News: YouTube updates algorithm to make extremist content harder to find. Digital Tattoo. https://digitaltattoo.ubc.ca/2019/02/05/in-the-news-youtube-updates-algorithm-to-make-extremist-content-harder-to-find/
[2] Cooper, P. (2021, June 21). How Does the YouTube Algorithm Work in 2021? The Complete Guide. Hootsuite. Retrieved November 1, 2021, from https://blog.hootsuite.com/how-the-youtube-algorithm-works/
[3] Rieder, B., Matamoros-Fernández, A., & Coromina, Ò. (2018). From ranking algorithms to ‘ranking cultures’: Investigating the modulation of visibility in YouTube search results. Convergence, 24(1), 50–68. https://doi.org/10.1177/1354856517736982
[4] Solsman, J. E. (n.d.). Ever get caught in an unexpected hourlong YouTube binge? Thank YouTube AI for that. CNET. Retrieved November 30, 2021, from https://www.cnet.com/news/youtube-ces-2018-neal-mohan/
[5] Roose, K. (2019, June 8). The Making of a YouTube Radical. The New York Times. https://www.nytimes.com/interactive/2019/06/08/technology/youtube-radical.html
[6] Haselton, T. (2017, December 6). How to find out what Google knows about you and limit the data it collects. CNBC. https://www.cnbc.com/2017/11/20/what-does-google-know-about-me.html
[7] Zhang, B., Wang, N., & Jin, H. (2014). Privacy Concerns in Online Recommender Systems: Influences of Control and User Data Input. In Symposium On Usable Privacy and Security (SOUPS 2014) (pp. 159–173). Menlo Park, CA, United States. https://www.usenix.org/conference/soups2014/proceedings/presentation/zhang
[8] Friedman, A., Knijnenburg, B. P., Vanhecke, K., Martens, L., & Berkovsky, S. (2015). Privacy Aspects of Recommender Systems. In F. Ricci, L. Rokach, & B. Shapira (Eds.), Recommender Systems Handbook (pp. 649–688). Springer US. https://doi.org/10.1007/978-1-4899-7637-6_19 [Note: This source is not open access]
[9] Bergen, M. (2019, April 2). YouTube Executives Ignored Warnings, Letting Toxic Videos Run Rampant. Bloomberg.com. https://www.bloomberg.com/news/features/2019-04-02/youtube-executives-ignored-warnings-letting-toxic-videos-run-rampant
[10] Nguyen, J. (2021, July 25). YouTube is more likely to serve problematic videos than useful ones, study (and common sense) finds. Mashable. https://mashable.com/article/youtube-algorithm-problematic-content-mozilla-study
[11] B, S. (2021, January 19). How to Stop Wasting Time on YouTube. Beebom. https://beebom.com/stop-wasting-time-on-youtube/
[12] Organization for Economic Co-operation and Development. (n.d.). OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. Retrieved December 28, 2021, from https://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm#part5
Written By: Ying Chen, UBC, School of Information
Edited By: Brittanny Dzioba & Alex Kuskowski
Image Credit: Creative Commons photo by Esther Vargas under CC BY-SA 2.0 (from Flickr)