Personalized Personal Lives: Students vs. Filter Bubbles
by Joe Wright
We’ve all heard about how Google’s now seamless integration into our lives can blur the lines of our privacy, but another, less visible tactic of Google (and many other platforms) is ‘personalizing’ the information we’re shown to appeal to our preferences.
Whether companies do this to help you find information or simply to show you things you’re more likely to click on, we should consider how personalization affects the information we access.
For instance, what if I told you that your age, gender, sexual/political orientation, race, and occupation are among 14 attributes that can be successfully predicted from your digital footprint [1]? Or that more than 1 in 10 of your Google Search results could be put there just for you, based on these attributes [2]?
This personalization of content brings with it the risk of ‘filter bubbles’: spheres of algorithmically imposed ignorance in which we can’t tell how the content we see has been skewed to please us and shield us from information that challenges our views [3]. Some play down their significance, but personalization effects have been shown to exist; it’s the opacity of Google’s and similar companies’ algorithms that makes their extent hard to gauge.
In case you need reminding, the Google-run services most of us use include Google Search, YouTube, Google Scholar, Google-run advertisements, Google Home/Assistant, Chromecast, Google News, Google Play podcasts, and more! Combine this with the fact that logging into a Google service also logs you into Chrome by default [4], and your filter bubbles might be shaping your reality more than you realize.
Maybe you’ve experienced this effect yourself. Say you hear clinical psychologist Jordan Peterson mentioned in class and, not knowing who he is, you Google him and then watch some of his YouTube videos. It’s not outside the realm of possibility that you’ll then be recommended more videos and news stories, and see more frequent search results, supporting views similar to Peterson’s, including anti-feminist agendas or claims that white privilege is “a Marxist lie” [5], since the algorithms are now showing you things they know Peterson’s fans like.
The point is that when you have no background in a topic you’re interested in, you’re easier to mislead. This is especially true if you’re given only one side of the argument everywhere you look, whether in your personal life, in ads, or while browsing YouTube. YouTube claims to have tried to address the issue of extremist video recommendations [6], but journalists who created a new Google account and viewed just a few conservative-leaning videos still found recommendations for more misleading or hateful content [7]. Misinformation and intolerance exist in liberal contexts too, and while your results may not be extreme, they might still be warping your perception of certain issues.
How can I pop my potential bubbles?
Use your awareness of personalization to consider whether the information you’re being shown is balanced and genuine, making sure to check a variety of sources and use the techniques listed in this Digital Tattoo post: Fake News! Who Cares!
Next, you could stop Chrome from logging you in automatically, delete your browser data and activity (using this helpful guide), or switch to a more privacy-focused search engine like DuckDuckGo, which is reliably personalization-free [2]. For more in-depth information, you could check out this ex-Google employee’s website, which shows you Google’s current autosuggestions and where YouTube’s recommendations might take you: https://algotransparency.org
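If you want a rough way to test this yourself, the researchers behind [2] measured personalization by issuing the same query from different accounts and checking how much the result lists overlap. Below is a minimal, hypothetical Python sketch of that idea (the function name and placeholder result lists are illustrative, not from the study): copy the top results you see for a query in your normal, logged-in browser, do the same in a fresh private window, and compare.

```python
# A do-it-yourself version of the overlap comparison used in [2]: the same
# query issued from two sessions, with the Jaccard index as the similarity
# measure. The result lists are placeholders to fill in by hand.

def jaccard_overlap(results_a, results_b):
    """Jaccard index of two result lists, from 0.0 (no results in common) to 1.0 (identical)."""
    a, b = set(results_a), set(results_b)
    return len(a & b) / len(a | b) if (a | b) else 1.0

# Placeholder data: paste in the top result URLs (or titles) you actually see.
logged_in_results = ["result_1", "result_2", "result_3", "result_4", "result_5"]
fresh_session_results = ["result_1", "result_2", "result_6", "result_7", "result_8"]

print(f"Overlap between sessions: {jaccard_overlap(logged_in_results, fresh_session_results):.0%}")
# -> Overlap between sessions: 25%
```

A noticeably low overlap for the same query is one hint that your logged-in results are being tailored to you, though it isn’t proof on its own, since results also vary with location and time.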
How is this LEGAL?
US politicians are pushing the ‘Filter Bubble Transparency Act’ to force companies to disclose when algorithms use personal information to tailor results, but the bill has been criticized for being vague and for not actually requiring companies to reveal how that information is used [8]. Canada doesn’t appear to have similar laws in place yet, but watch this space: proposed class actions were filed in British Columbia, Ontario, and Quebec in September 2020 to stop the collection of users’ search histories and other data for building personal profiles without their knowledge [9].
Filter bubbles are widely recognized as a potential threat to the neutrality of the information we access, but the extent of their influence doesn’t yet seem to be fully understood. Have you experienced any filter bubble effects? How do you think filter bubbles and your digital identity might influence each other? Do you think the right solution is to eradicate filter bubbles, or are they just a trade-off for more efficient searches?
References
[1] Hinds, J., & Joinson, A. N. (2018). What demographic attributes do our digital footprints reveal? A systematic review. PLOS ONE, 13(11), e0207112. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0207112
[2] Hannák, A., Sapieżyński, P., Molavi Kakhki, A., Lazer, D., Mislove, A., & Wilson, C. (2017). Measuring Personalization of Web Search. arXiv preprint arXiv:1706.05011. https://arxiv.org/abs/1706.05011
[3] Pariser, E. (2011, February). Beware online “filter bubbles” [TED Talk]. TED2011, Long Beach, CA. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles#t-525308
[4] Cheung, J. (2018, November 6). In the News: Google Automatically Logs Users Into Chrome Browser on Google Sites. Digital Tattoo. https://digitaltattoo.ubc.ca/2018/11/06/in-the-news-google-automatically-logs-users-into-chrome-browser-on-google-sites/
[5] Lynskey, D. (2018, February 7). How dangerous is Jordan B Peterson, the rightwing professor who ‘hit a hornets’ nest’? The Guardian. https://www.theguardian.com/science/2018/feb/07/how-dangerous-is-jordan-b-peterson-the-rightwing-professor-who-hit-a-hornets-nest
[6] Hill, E. (2019, February 5). In the News: YouTube updates algorithm to make extremist content harder to find. Digital Tattoo. https://digitaltattoo.ubc.ca/2019/02/05/in-the-news-youtube-updates-algorithm-to-make-extremist-content-harder-to-find/
[7] Boyd, K. (2019, August 13). YouTube and the Filter Bubble. The Prindle Post. https://www.prindlepost.org/2019/08/youtube-and-the-filter-bubble/
[8] Robertson, A. (2019, November 5). The Senate’s secret algorithms bill doesn’t actually fight secret algorithms. The Verge. https://www.theverge.com/2019/11/5/20943634/senate-filter-bubble-transparency-act-algorithm-personalization-targeting-bill
[9] Branch MacMaster LLP. (2020, September 3). Google Faces Class Action in Canada Alleging it Turns Canadians’ Electronics into Tracking Devices Without Their Consent [Press release]. https://www.newswire.ca/news-releases/google-faces-class-action-in-canada-alleging-it-turns-canadians-electronics-into-tracking-devices-without-their-consent-887865918.html
Written by Joe Wright
Edited by Rachael Bradshaw
Featured image: Social Media Icon from geralt, used under the Pixabay License