Book Review: Shareveillance

Shareveillance: The Dangers of Openly Sharing and Covertly Collecting Data

Clare Birchall (2017)

Digital Tattoo Rating: 2/5


What does it mean to share in the Digital Age? In her book Shareveillance: The Dangers of Openly Sharing and Covertly Collecting Data, Clare Birchall investigates the intricacies of digital sharing on personal and state levels, and attempts to find out who is benefitting from all of our collective data.


The Top 5


1. What is -veillance? (Chapter 1)

Birchall is a Critical Theory scholar who posits that contemporary ‘sharing culture’ is enabling surveillance. This book focuses on the idea that by posting our data on social media sites, and by having our information collected by data professionals, we are becoming a society of aggregate data points instead of individuals. This in itself is not what bothers Birchall; instead, the trouble is in the inequity between the data that we make available and the data that is made available to us. “Government practices that share data with citizens involve veillance because they call on citizens to monitor and act on that data – we are envisioned (watched and hailed) as auditing and entrepreneurial subjects” (Birchall, Chapter 1*). Shareveillance is the state in which the government is able to aggregate and use our data without having to share data in return. Indeed, Birchall goes so far as to theorize that the small amount of data provided to us by the government is purposely made public, not for the public good, but as an incentive for the public to give up even more of their data. Confused? So was I. Let’s break these concepts down a bit.


2. Data (Chapter 2)

The trouble with data is that it isn’t neutral. We like to think that it is, but an entity’s access to data can contribute to its agency. For example, if one country has access to the polio vaccine and another country doesn’t, the country that can eradicate polio has an advantage over the country that can’t. The same can be said on an individual level: someone with access to data may inherently have an advantage over someone who does not. By giving governments and organizations access to all of the public’s data, while receiving little in return, we are giving them an advantage over us.


3. Open & Closed Data (Chapter 5)

Open data is data that has been made available for public use. Examples of open data include the City of Toronto’s Annual Energy Consumption Report (via the City of Toronto’s Open Data Catalogue) and Graffiti Site Data (via the City of Vancouver’s Open Data Catalogue). Closed data is data that has not been made available for public use. Examples of closed data include security files, defense strategies, etc. When a hacker or whistle-blower like Edward Snowden releases ‘government secrets’, they are releasing closed data to the public. The difference between open and closed data is significant, because it signals what information we, as a public, have access to. The information that we have access to directly impacts the political agency that we have.

Birchall explains that while we like to think of open data practices as ‘good’ and closed data practices as ‘bad’, even open government data is not neutral. Birchall’s book explores the ways in which open data provided by government organizations can be used as a lure for individuals to give up more of their own data, and how challenging it can be to make sense of the massive quantity of poorly organized data made available to publics. Neutrality and accessibility are not guarantees in open data practices.


4. Citizens as Data (Chapters 3 & 4)

Birchall is skeptical of hyper-connectivity and current government open-data practices. The concern is that through data professionals’ ability to aggregate and profit from our data, our value as citizens lies not in our ability to meaningfully contribute to democracy, but in our ability to contribute to data sets. Birchall worries that by reducing the public to data sources, the public will, in turn, lose its political agency. This is further complicated by failings of the open and closed data system, in which data is withheld from the public or made available in ways that are not neutral or accessible.


5. So, What Now? (Chapter 6)

How do we get out of the trap of becoming non-agent citizens for overly surveillant capitalist governments and organizations? Well, the short answer is that we have to take it one step at a time. Birchall suggests supporting hacking, whistle-blowing, and open-data initiatives. On a smaller scale, Birchall focuses on the importance of grassroots movements (like U of T’s Guerrilla Archiving event) and the significance of informed choices by an educated public. Try out de-centralized storage for your files, use encryption technology, and ensure your right to opacity by using services that limit how much of your data is shared. By making small changes, and keeping aware, you can “cut” into the system of shareveillance and ensure equitable access to information, as well as the right to keep your information opaque.



While this book was a very interesting read, it was dense. Birchall rushes through topics in the interest of keeping the book short (86 pages), but sacrifices accessibility in the process. While the information in this book could be a wonderful tool for young adults who are digitally and politically engaged, the barrier to entry makes it relatively inaccessible to the average undergraduate student. Further, the examples given are aimed squarely at professors and long-time scholars. While writing a book by scholars, for scholars, is not inherently negative (and I recognize that the goal of the book was not to cater to young adults), I feel that a longer, more accessible version of this book would be of great benefit to scholars, students, and young adults alike.

*This book was read as a digital copy and therefore did not have accurate page numbers. This review will use chapter numbers for citation purposes.

Guest Blog Post: Disconnection

Until very recently, connection was always framed in terms of individual choice. If people took issue with being connected, perpetually distracted, or dulled to their immediate surroundings by their devices, it was the users themselves, and their lack of self-control, that were to blame. Any solution or “escape” was to be found in the individual’s approach to the technology, and not the technology itself.

Yet with the release of books like Jean Twenge’s iGen, there has been a public turn to the designs of our devices and systems of connection themselves to explain the negative features and consequences of our extremely connected lives. Connectivity has begun to shift from a matter of consumer preference and self-control to an issue of public mental health and structural coercion.

Under this new scrutiny, Facebook, Google and other tech companies have begun to accept some responsibility for the present state of things. They have vowed to alter their systems to be less distracting, addictive and consuming, to make it easier for us to find disconnection amongst our connected world. Yet these companies’ business models depend upon the very user interaction they claim they are attempting to curb: that hunger for connection which means every idle (and not so idle) moment can be turned into a micro-transaction or bit of minable data for future ad-targeting. We may wonder if the whole affair is simply a matter of fine-tuning our dependency to be less overtly noticeable.

Outside moderation perhaps holds the most promise. France’s bold move to ban phones from schools (for children under 15) presents a kind of structural counter-measure to constant connectivity, marking out a patch of disconnected space and time for young minds. Similarly, German legislation limiting the checking of work email to business hours suggests another carving out of (partially) disconnected space in favour of work-life balance. In Silicon Valley, the Centre for Humane Technology, growing out of the Time Well Spent movement, represents a direct attempt to create industry-wide standards and practices in the tech field to promote a more measured and less addictive relationship with our connected devices.

Yet even with such developments at play, how free are we still to make the choice to disconnect? Just as access to healthy food, and the leisure and means to exercise, are in practical terms not equally accessible, there is now a class of those who can afford to disconnect, and those who can’t. There are those whose jobs offer the stability and defined working environment to permit concerted disconnection, or the affluence to make a complete retreat from the digital, and the “gigging” whose very livelihood depends on constant digital presence, constant hustle and “flexibility” within a shifting market. The Uber and the Ubered.

This is part of the Digital Tattoo’s Guest Blog Series! You can read Henry’s previous post about Connection here.

If you’re interested in contributing a guest blog post to Digital Tattoo, please contact our editorial team at:

Digital Identity Digest (November 2018)

Have you ever lied on a dating app?

Online dating apps are very popular, but they can have some pitfalls. Irina Manta, a law professor at New York’s Hofstra University and founder of its Center for Intellectual Property Law, wrote in the Washington Post that obtaining sex through fraud on dating apps should be legally penalized. In an interview with CBC, Manta explained that little lies, like weight or height, are not the problem. What concerns her are more substantial lies, such as a person’s marital status. To punish those predators, Manta is asking state lawmakers to create a civil sanction. She brings up the case of Anna Rowe, a woman in the U.K. who was deceived by a married man for over a year. Rowe explained in an interview with CBC that he created social media accounts with false information, and even had a phone dedicated to his affairs. So far, she has found 13 other women who were deceived by the same man.

In November, another case of catfishing — creating a false identity online to pursue deceptive romances — hit headlines. The New York Times reported that a 26-year-old Norwegian man pretended to be a teenage girl to meet boys and young men on online chat forums. He was charged with sexually abusing more than 300 people.

Do you want to keep the discussion going? You can share your thoughts on Digital Tattoo’s online dating section.

Is Facebook’s content removal process fair?

Image: Mark Zuckerberg, by Alessio Jacona, used under CC BY-SA 2.0.

Over 80 organizations have called on Facebook for more transparency and accountability in its process for removing content. “Facebook’s content takedown policies too often backfire and silence the very people that should have their voices heard on the platform,” said the Electronic Frontier Foundation (EFF) in a press release. They request the release of transparency reports and the implementation of appeal standards that average users can easily follow. But as Motherboard highlights, this comes after a year of Facebook claiming to expand its content moderation process to protect its users from hate speech, so the company has to find solutions that address both demands. One such solution could be the creation of an independent body responsible for handling users’ appeals, which Mark Zuckerberg announced recently, according to Business Insider.

Have you ever had a post removed by Facebook that you considered a mistake? If you want to share your story, send us an email:

Should data from smart speakers be released to law enforcement?

CBS News reports that prosecutors in New Hampshire, U.S., have asked Amazon to release recordings from an Echo device found at the scene of an alleged murder. However, the company is refusing to turn over any data “without a valid and binding legal demand properly served on us.” As the Washington Post writes, this puts the company at the centre of a debate around public safety and privacy. Amazon and other tech companies have been questioned about privacy concerns, so they tend to protect customers’ data from external requests. On the other hand, according to EFF senior staff attorney Nate Cardozo, law enforcement has been increasingly requesting the release of data from connected devices.

As Internet of Things (IoT) devices are becoming more integrated into our daily routines, we should reflect on how information about us is being collected and used. In August, CNBC reported that colleges are expanding the use of smart speakers, including installing them in dorm rooms.

Earlier this year Digital Tattoo published a reading guide to authors and filmmakers examining some of those issues. It’s worth taking a look!

Image: Amazon Echo, by Quote Catalog, used under CC BY 2.0.

Read our most recent blog posts:

What’s happening in December:

Lecture – Policy Solutions for Big Data and AI Innovation in Health

When? Dec. 13, from 4 pm to 5:30 pm (EST)

Where? U of T St. George (Downtown) Campus – Women’s College Hospital Conference Centre Auditorium, 2nd Floor, 76 Grenville Street

Written by: Monique Rodrigues

Edited by: Elyse Hill

Guest Blog Post: Cluster What?

Cluster What?

Written By: Henry St. Clair


Information systems literacy comes front and center in a new work of theatre that was unveiled at Toronto’s 2018 Fringe Festival. The play, written by the University of Toronto’s Faculty of Information’s own Professor David Phillips, is affectionately titled “Cluster F****d”. Phillips’s piece attempts to lead the audience into the mechanics and consequences of commercial surveillance practices as they are currently deployed. The play holds particular focus on both the practicality and the absurdity of subjecting populations to cluster analysis.


What is Cluster Analysis?

Cluster analysis is a common statistical process employed in data-mining, by which disparate sources of data on individuals can be made meaningful. This meaning is extrapolated by forming data points into clusters of affinity, and then naming and/or demarcating those clusters. For instance, disparate information like gender, age, education level, and brand of car owned can be arranged to maximize the clotting of individuals into groups along various axes. The trick (and mysticism) of cluster analysis is deciding and weighting the attributes of an analysis so as to create distinct clumps of individuals united by similar attributes, while ensuring that each grouping is distinct from other clumps of individuals.

Image used under CC BY 2.0, from Wikimedia Commons user Wgabrie.

As with any classification practice, cluster analysis does not identify pre-existing groups, but brings those very groups into existence through naming clusters that have been made to occur within the data. From these clusters, “populations” are created. It is expected that, by sharing certain attributes, each member of a population will statistically follow along certain paths and preferences. In short, cluster analysis performs the same kind of stereotyping of people which individuals unconsciously perform daily, but on a massive industrial scale, with information no individual would be able to make sense of.
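For readers who want to see the mechanics behind the play's subject, here is a minimal sketch of one common clustering technique, k-means, in plain Python. The data, attribute names, and cluster count are hypothetical, invented purely for illustration; real commercial cluster analysis runs over far more attributes and far more records.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: group points into k clusters of affinity."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)  # start from k random points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])),
            )
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for j, cluster in enumerate(clusters):
            if cluster:
                centroids[j] = tuple(sum(dim) / len(cluster) for dim in zip(*cluster))
    return centroids, clusters

# Hypothetical individuals described by (age, weekly ad clicks).
people = [(19, 40), (22, 38), (21, 42), (55, 5), (60, 3), (58, 6)]
centroids, clusters = kmeans(people, k=2)
```

Note that the algorithm always produces k groups whether or not the data contains any natural divisions; naming the resulting clusters (say, “engaged youth” versus “offline seniors”) is the interpretive step the play satirizes.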


What is “Cluster F****d”?

Full of gumption, Phillips’s “Cluster F****d” ran a mile a minute through the perpetual indignity we are subjected to as we are speculated upon and (arbitrarily) classified via cluster analysis, for the purposes of better ad-targeting. Upon a minimal but innovatively employed set of a few modular chairs, four plain-clothed actors bombarded the audience with tasty factoids lifted directly from the professor’s own “Interventions in Surveillance Workshop” offered at the Faculty of Information. Perhaps in an attempt to mirror the ceaseless data we forfeit every minute to massive corporations, the play itself appeared to have no beginning, middle, or end. Rather, it was a flailing mass of information, leaving even the information professionals in the audience unclear on the logic of what was occurring. All of this combined into an uneasy simulacrum of a university seminar intermixed with the over-enthused antics of an improv night and the self-righteousness of a slam-poetry gathering.

The highlight of the play was a scene conjuring the bridge dynamics of golden-age Star Trek, as the actors embodied a faceless advertising “they” who were analysing a woman’s Spotify activity to determine what to sell her. The woman’s eventual joy at being sold (and receiving) a container of Häagen-Dazs ice cream seemed to sum up the essential dual nature of the advertising systems that surround us; we are certainly being manipulated, but what else is there to do if our sole means of self-expression is the acquisition and consumption of products?


Will it blend?

The Roman poet Horace advised that the supreme goal of art was to “delight and instruct.” Yet rarely do the motivations of pleasing spectacle and public education align to satisfying effect. “Cluster F****d” did indeed reveal (as stated in the production’s own words) “what happens when a university professor writes a play”.

This is part of the Digital Tattoo’s Guest Blog Series! You can read Henry’s previous post here.

If you’re interested in contributing a guest blog post to Digital Tattoo, please contact our editorial team at:

In the News: Facebook pulls VPN from Apple App Stores over accusations of data harvesting

Virtual Private Networks, or VPNs, allow user internet traffic to be encrypted and redirected through a private server, creating added security for user data. Many people use this form of anonymous browsing to transfer information or mask their geographic location. But what about the companies running the VPNs? Are they not able to see the traffic they are tasked with encrypting and protecting?

This was the issue behind Facebook’s recent decision to pull its VPN app, Onavo, from Apple’s App Store. The app relies on analyzing mobile traffic in order to improve Facebook products and services, while providing users with a private server through which to access the Internet. Facebook’s analysis included the practice of collecting data on its users and their friends, through sites visited or information kept in iPhone users’ address books. As well, Onavo’s data was used for Facebook’s product and acquisitions strategy, aiding in the decision to purchase WhatsApp and to move towards live video streaming capabilities, the Wall Street Journal reports [1]. After the App Store updated its privacy guidelines in June in an effort to prohibit apps that collect data – a move that is believed to have targeted Onavo [2] – Facebook agreed to voluntarily remove the app (though it remains available through the Google Play Store). While it is no longer available for download on Apple devices, existing users are still able to operate Onavo, though without any forthcoming updates.

This is the latest in a long list of issues regarding Facebook’s failure to protect user data, coming within the same year as its reputation-damaging Cambridge Analytica scandal. While Facebook maintains that users were made aware of the app’s usage and abilities, it is telling that the company chose to remove the app rather than update it to suit Apple’s guidelines.

What are your thoughts on Facebook? Do you use the social media platform, or any of their products? How do you navigate concerns surrounding privacy and their misuse of user data?

Be sure to leave a comment in the discussion box below.

For more information, check out these articles:

Facebook pulls its Onavo Protect VPN app from Apple App Stores | The Next Web

Facebook pulls its data harvesting VPN app from app stores | Gizmodo

Facebook pulls Onavo Protect from app store after Apple finds it violates privacy policy | Apple Insider

