
The Digital Tattoo Podcast Project: Coming Soon

We’ve been working on bringing you the same exciting digital identity content in a new format.

Image from publicdomainspictures.net and courtesy of user Circe Denyer

Introducing: The Digital Tattoo Podcast Project

The Digital Tattoo Podcast Project explores digital identity issues through interviews and investigations in an engaging audio format. Our first topic is copyright and open access at Canadian universities.

We’ll explore these themes after considering the life of Aaron Swartz, an American political activist and programmer who co-created the website Reddit, helped launch Creative Commons, and faced a federal indictment for illegally downloading academic journal articles.

Swartz was a passionate advocate for open access. He believed that knowledge belongs to everyone and shouldn’t be hidden behind expensive subscription fees and paywalls. In 2013, after downloading millions of articles from JSTOR, and while facing up to $1 million in fines and 35 years in prison, he died by suicide.

We’ll ask the questions: Could what happened to Aaron Swartz happen in Canada? How well do we understand the laws around copyright? And what is open access all about? We dive into these issues through interviews with leading experts like Michael Geist, editors of open access journals like the University of Toronto Medical Journal, copyright experts both inside and outside of universities, and students at UBC and U of T.

If you’re confused about how copyright law works in Canada, this podcast is for you.

Our first episode will be available for download in early September.

Citizen Lab: Leaks, Hacking, and Fake News

Image used under CC BY 2.0 from Flickr user Marcie Casas

The terms leaks, hacking, and fake news have been tossed around frequently by reporters, government officials, and activists. What do these terms mean, and how do they impact our digital identities? This May, the University of Toronto’s Citizen Lab released a report that sheds some light on these terms, and answers some questions we often wonder about but forget to ask.

Who is the Citizen Lab?

The University of Toronto’s Citizen Lab is an “interdisciplinary laboratory based at the Munk School of Global Affairs” [1], which focuses on “advanced research and development at the intersection of Information and Communication Technologies (ICTs), human rights, and global security” [2]. As a self-described “hacktivist hothouse”, the Citizen Lab works toward uncovering information that matters most to digital citizens [3].

The Report

On May 25, 2017, the Citizen Lab released a report, Tainted Leaks: Disinformation and Phishing with a Russian Nexus. It details how what started as a small investigation of a phishing attack against journalist David Satter grew into an investigation of nearly 200 related phishing attacks and tainted leaks, which may have ties to Russia. The report includes a summary and four segments: how tainted leaks are made; the Tiny.cc discovery; connections to publicly reported operations; and a discussion. What concerns the researchers most is not the presence of hacking, phishing, and disinformation campaigns, but how these campaigns affect the relationship between reporters and civil societies, as well as the daily lives of citizens [4].

Tainted Leaks

The term tainted leaks refers to a specific technique hackers use to facilitate the spread of false news stories. To create a tainted leak, hackers use phishing attacks to lure victims into entering their email addresses and passwords on fake login pages. Once the hacker has access to a victim’s email address and password, they can steal documents found in the victim’s email account and selectively release them. While some of the stolen documents are leaked without tampering, others are subtly altered to promote certain ideologies, ideas, or theories. It is this blending of authentic and falsified information that characterizes a tainted leak. Once the leaked documents are released, they are often picked up by sensationalist news organizations and reported as authentic. These leaks feed a mill of fake news that is difficult to disprove until the genuine documents are released. The tainted leaks scheme that targeted journalist David Satter is the focal point of the Citizen Lab’s report.

David Satter

David Satter, a prominent American journalist, received an email on October 5, 2016 that looked exactly like a Google security warning; in reality, it was part of a cleverly crafted phishing campaign. While Satter wasn’t fooled by that first email, he did fall victim to a second email in the campaign, received on October 7, 2016. The malicious email prompted him to change his Gmail password by following a shortened URL created with Tiny.cc (a URL shortening service). After following the link, Satter entered his login information, changed his password, and continued with his business. It wasn’t until Google registered an unauthorized login to his account that he noticed something was wrong. The unauthorized access was traced to Romania and is presumed to be how Satter’s email documents were stolen.

Once stolen, a selection of Satter’s documents was released by CyberBerkut, “a pro-Russian hacktivist collective” [5]. While most documents were released unaltered, “one document showed extensive evidence of tainting” [6]. In the manipulated document, Satter appeared to have been “paying Russian journalists and anti-corruption activists to write stories critical of the Russian Government” [7]. Once Satter released the authentic document, these claims were disproven.

Related Attacks

Using the Satter investigation as a case study, the Citizen Lab team was able to isolate a series of related attacks using variations of the Tiny.cc phishing scheme. These campaigns all relied on URL shortening services and targeted 198 email addresses. Targets of the phishing campaign included “a former Russian Prime Minister, a global list of government ministers, ambassadors, military and government personnel, CEOs of oil companies, and members of civil society” [8]. Though there is no concrete link between this series of attacks and the attacks on the U.S. and French national elections, there are similarities, including the use of URL shortening services and ties to tainted leaks.

Conclusion

What concerns the Citizen Lab most about these findings is their impact on “civil society”, or democratic societies in general. While tainted leaks may seem like an issue exclusive to national governments, they have real impacts for all digital citizens. The Citizen Lab’s report offers two important takeaways:

  1. Phishing attacks can happen to anyone

While it may seem like government officials are the only targets of these attacks, the Citizen Lab’s report shows that members of civil society, including journalists, can be targets. Ask yourself: are you prepared in the event that you are targeted by a phishing attack? As hackers get better at crafting convincing false links, it is more important than ever to pay attention to the links we follow and to use strategies to protect ourselves from malicious schemes. One simple precaution, expanding a shortened link to see where it really leads before you click it, is sketched after this list.

  2. Fact checking is crucial

Tainted leaks can make it very difficult to spot disinformation. When a leak makes headlines, it is important to think critically about its origin. You can help stop the spread of fake news by engaging critically with the information you read, finding corroborative sources, and trusting the journalistic integrity of large news organizations whose professional practices promote the production of authentic news.
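
Returning to the first takeaway: for readers who want a concrete way to act on that advice, here is a minimal sketch, in Python with the third-party requests library, of how a shortened link can be expanded to reveal its real destination before you click it. The tiny.cc address shown is a placeholder for illustration, not a link from the campaign described above.

    # Sketch: expand a shortened URL to see where it really points before clicking.
    # Requires the third-party "requests" library; the example link is a placeholder.
    import requests

    def expand_url(short_url: str) -> str:
        # A HEAD request follows redirects without downloading the page body.
        response = requests.head(short_url, allow_redirects=True, timeout=10)
        return response.url

    if __name__ == "__main__":
        destination = expand_url("https://tiny.cc/example")
        print("This link actually points to:", destination)
        # If the destination is not the site you expected (for instance, not a
        # google.com address for a supposed Google security notice), treat the
        # message as suspicious.

Many link-checking websites do the same thing in the browser; the point is simply to see the final destination before handing over any credentials.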

To learn more about tainted leaks and the Citizen Lab, visit:

the report Tainted Leaks: Disinformation and Phishing with a Russian Nexus, the Citizen Lab’s website, and Ronald J. Deibert’s TEDx Talk.

30.1: The Politics of Privacy


This is the third post in a blog series that questions the risks we’re willing to assume and examines the hazards present in the current information technology landscape. Although it’s never a one-size-fits-all situation, British Columbia’s current legal framework has a specific provision that affects everyone in the province. You can read the first post here, and the second here.


We’ve heard the arguments for amending the section of BC’s Freedom of Information and Protection of Privacy Act that states that all personal information must reside on Canadian soil. Now we’ll explore some of the reasons why this section shouldn’t be changed.

The emerging need for data sovereignty

When the act was passed into law in the early 1990s, concerns over personal information flowing into the United States were mostly speculative. At that point, section 30.1 was a protective measure against the possibility of a foreign government that didn’t respect privacy.

The act comes up for review quite regularly. Although the recommendations the special committees present to government are non-binding, they’ve consistently recognized the importance of keeping section 30.1 in the act.

And in the 20-plus years since the act was passed, we’ve seen vast changes in technology that have dramatically increased the amount of data we create about ourselves. We’ve also entered an era in which governments invade personal privacy in the name of national security.

As technology rapidly advances and political rhetoric distorts national and international values, 30.1 has become one measure that, in theory, protects the privacy of millions of people living in British Columbia. But as we’ve seen in the last post, it does so at a cost.

I spoke with Mike Larsen, the president of BC’s Freedom of Information and Privacy Association, and Vincent Gogolek, its executive director, to get a better sense of what might happen if that section of the act were to be changed.

What would be the impact of amending 30.1?

When it comes to Mr. Hancock’s argument about the need to access cloud computing services, Mr. Larsen said that the advantages are not entirely clear to him—noting that amending the act would remove the incentive to create servers in Canada.

Mr. Gogolek added that, because of a demand in the private sector, Google has recently opened server farms in Canada. If 30.1 were amended, there wouldn’t be a market for creating this kind of infrastructure in Canada for the public sector.

When it comes to the ‘illusion of consent,’ both Mr. Larsen and Mr. Gogolek said that this has historically been the work-around for public institutions: they’re able to circumvent 30.1 by asking for permission without being able to offer a reasonable alternative.

Just like with QLess, the automated queueing system at UBC that sends personal information into the U.S. and that originally prompted my interest in section 30.1, public bodies are often not able to offer reasonable alternatives in practice.

Does sitting in Brock Hall for 45 minutes and waiting for your name to be called seem equivalent to being able to leave and receive a text message when your turn arrives?

Likewise, if UBC and the other research universities began sending students’ personal information into the U.S. or elsewhere, is it reasonable to expect students to simply enrol at other universities?

“You don’t have a choice,” said Mr. Gogolek, “You don’t get to go to UVIC, or deal with a different ministry of finance, or health ministry.”

So right now, in order to bypass 30.1, there has to be, at the very least, the illusion of consent. But if the proposed amendment was made, that would be gone. And our personal information could be fed directly into “the big NSA suck hole in Utah,” said Mr. Gogolek.

What effect will this have on students?

“We know what will happen in terms of the Patriot Act,” Mr. Gogolek said. “Something in your American politics paper gets sent down there and suddenly you can’t get into the United States.”

The politics of privacy

Although there was a statutory review of FIPPA in 2015, when people, including Mr. Hancock, presented on both sides of 30.1, the government has yet to make a determination on the issue. The province is also currently without a privacy commissioner, and a new coalition government is entering office.

And now, with the NDP and Green Party in power, the future of section 30.1 remains unclear. Prior to the election, FIPA asked each party about their respective privacy policies. Here’s what the NDP and Green parties had to say:

“A BC NDP government will defend the privacy of British Columbians against any move by the Trump administration to undermine these rights and will maintain BC’s requirement that government and other public sector data be stored in Canada. Recent steps in Congress to weaken U.S. privacy provisions only reinforces the need for BC to remain firm.” – NDP

“With numerous data breaches in the last 4 years, it is clear that BC must re-examine the legislation that governs the data storage requirements it uses for housing private personal information. Beyond international trade, BC needs to update its policies and practices to catch up to an increasingly digitized world. A robust review of data policy would take place with a mandate to find and implement needed reforms.” – Green

And with the election of President Trump in the United States, Mr. Hancock said that “the amendments to 30.1 are very unlikely. There’s a lot of concern about data sovereignty. For whatever government that takes power. It is difficult for any government to justify what may be seen as weakening protections in privacy law.”

But, when it comes to section 30.1, the future is uncertain. And that means that the future of data storage and privacy in British Columbia remains unclear.

When Seeing Isn’t Believing

Textual literacy in the North American public has been on the rise since the dawn of the 20th century. More people than ever before are able to understand and critically interpret written statements; however, fewer people are taught visual literacy skills. It is easy to believe the things that we see. The human mind is more likely to believe statements or memories that are accompanied by images, even if the images are unrelated to what is being presented [1]. Our brains are hardwired to trust what we can see, and photos exploit that trust, often without providing context. With the popularization of data visualization techniques, “live” social media updates, accessible cameras and editing equipment, and deceptive advertising, it is more difficult than ever to discern whether or not an image is telling the truth. Here are three of the most common forms of images we encounter, and some questions you can ask to engage critically with the information they present.

Data Visualizations

Image used under CC BY 2.0 from Flickr user nerissa’s ring

Data visualizations (including infographics, charts, and graphs) have been used by the scientific community to disseminate information for centuries, and over the past few decades the practice has been adopted by the popular media. In his book The Visual Display of Quantitative Information, Edward Tufte unravels the mystery of how visualizations are made and outlines common pitfalls that can deceive viewers, such as misleading axes and ink that is not proportional to the values it represents. As summarized by Bergstrom and West in their online course Calling Bullshit, data visualizations are meant to tell a story, and these stories can be compromised by poorly created visualizations, incorrect research methodologies, or chartjunk [2]. When looking at a visualization, ask yourself the following questions (a short sketch after the list shows how a misleading axis works in practice); then test your graph literacy with the BBC, and explore common examples of poorly made graphs with PBS.

  • Does this image include all of the relevant data? If not, what is it excluding? Many graphics that demonstrate financial impact over time do not account for inflation.
  • Does this image feature more information than is necessary to answer the question it is posing? Sometimes the inclusion of unnecessary information can overwhelm the reader, or be used to obscure findings.
  • Does this graphic present information in a way that is intuitively linked to my reading habits? Data can be presented in unusual arrangements to mislead readers, such as graphics that move backwards in time from left to right or present ascending numbers from the top to the bottom of the page.
  • Can I find corroborative sources that speak to the validity and reliability of the information presented in this graphic? Checking for corroborative sources is good practice for any fact-checking endeavor.
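
To make the axis question concrete, here is a minimal sketch in Python using matplotlib, with invented numbers, showing how the same data looks dramatic when the vertical axis is truncated and unremarkable when it starts at zero.

    # Sketch: how a truncated y-axis can exaggerate small differences.
    # Uses matplotlib; the figures here are invented for illustration only.
    import matplotlib.pyplot as plt

    categories = ["2014", "2015", "2016", "2017"]
    values = [49.2, 49.8, 50.1, 50.6]   # hypothetical percentages

    fig, (misleading, honest) = plt.subplots(1, 2, figsize=(8, 3))

    # Left: the axis starts at 49, so a 1.4-point change fills the whole chart.
    misleading.bar(categories, values)
    misleading.set_ylim(49, 51)
    misleading.set_title("Truncated axis (exaggerated)")

    # Right: the axis starts at zero, showing the change in proportion to the totals.
    honest.bar(categories, values)
    honest.set_ylim(0, 100)
    honest.set_title("Zero baseline (in proportion)")

    plt.tight_layout()
    plt.show()

Both charts are technically accurate; only the left one invites the reader to see a dramatic trend where there is barely one.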

Photography/Film

Image used under CC BY 2.0 from Flickr user Curtis Perry

There have been debates about the authenticity of photographs and film since the invention of the camera [3]. While photographs can seem authentic, they can be altered, staged, or presented without context to be deliberately misleading. In 2014, Beyoncé was accused of altering a photo from her Instagram account. When investigating photographs, ask yourself the following questions:

  • Is this photograph presented in the correct context within the work?
  • Can I find other sources that speak to the validity of this photo?
  • Is this photo trying to sell me, or convince me of, something? If so, what?
  • Does this photo feature evidence of altering?

Social Media

Image used under CC BY 2.0 from Flickr user Jason Howle

Aspirational content and social media brand cultivation can lead to highly curated media identities. While we experience our own lives in all their complexity, social media offers us only a “highlights reel” of the best moments from the lives of our peers. To further complicate our social media literacy, the lines between fact and fiction are easily blurred in digital spaces, and it is difficult to determine what is advertised or fictional content. In 2006, the YouTuber lonelygirl15 was outed as a work of fiction, opening the door for discussion about authenticity and reliability online. Ask yourself:

  • Is this post telling me the whole story? What is this content leaving out of the narrative?
  • Is this a personal account, or is it an account run by a business? Beware of aspirational content creators, or social media personalities who promote, partner with, or are sponsored by businesses.
  • Is this “live” content actually happening live?
  • Is this “live” content being edited, or curated in a way that promotes a certain ideology or narrative? What is that narrative saying, and how is it constructed through these images?

Scholars in all fields are studying our changing relationship with images and visualizations; however, some of the most exciting experiments are coming from the art world. Artists around the globe are exploring how we mediate images, technology, and identity. The work of RJ Andrews explores how data graphics tell stories, and our own Artist Project Series investigates how our experience with images influences, informs, and disguises surveillance practices. Ultimately, it is our job as digital citizens to recognize how images impact our lives, and to think critically before sharing the narratives they present with our social networks.

Your Take

Do you feel that images are more trustworthy than text? Why or why not? Tell us in the comments below, or join the discussion on Facebook and Twitter.

30.1: The Argument for Change

This is the second post in a blog series that questions the risks we’re willing to assume and examines the hazards present in the current information technology landscape. Although it’s never a one-size-fits-all situation, British Columbia’s current legal framework has a specific provision that affects everyone in the province. You can read the first post here.

In this blog post, we’re going to review the perspective of those arguing to amend section 30.1 of B.C.’s FIPPA. The Research Universities Council of British Columbia (RUCBC), along with some regional health authorities, is arguing for amendments to the act.

But before we get into that, I want to explain how my interest in a relatively unknown section of a provincial privacy act originated.

Why I care

I’m a UBC student and one day I needed to pick something up from the Enrolment Services Professionals in Brock Hall. So I dropped by and was prepared to wait in a line as I had done on the few occasions that I had been there before. But this time they had a new system called QLess. Instead of waiting in a line, you would input your name and phone number into a computer and receive a text message when it was your turn to be served.

Obviously, this is a better system. But there was a problem. While registering, I noticed that I needed to consent to have my personal information sent into the United States, where the system was hosted. In Canada, our names are considered personal information. I would argue that our phone numbers should be, too.

Because of section 30.1, UBC was required to ask for consent before sending that information to the U.S. However, there didn’t seem to be an equivalent alternative to using that system. So could I really be giving my consent freely? Or was this an example of what is often called “forced consent”? And if so, is that kind of consent meaningful at all?

From there, I began wondering how UBC was dealing with section 30.1 in other contexts, and I discovered that it was causing significant difficulties in many settings. What follows is my investigation into a relatively unknown section of B.C.’s provincial privacy act and its impact on higher education.

Arguments for change

The RUCBC comprises the University of British Columbia, Simon Fraser University, the University of Victoria, the University of Northern British Columbia, Royal Roads University, and Thompson Rivers University.

But they’re not alone. B.C.’s health authorities have also joined the fight. So it’s representatives from both the public health and education sectors that are arguing for a change.

Paul Hancock, UBC’s counsel for information and privacy, gave a presentation on behalf of four other members of the Research Universities Council of British Columbia (SFU was initially absent but later signed on) to a special committee at the provincial legislature whose mandate is to review FIPPA every seven years.

Mr. Hancock’s presentation outlined the arguments for change as the following:

  • Administrative efficiency and security
  • International engagement and student recruitment
  • Online learning offerings
  • Academic integrity

Mr. Hancock made that presentation in November 2015. The special committee and B.C.’s former privacy commissioner later both recommended that section 30.1 not be amended. However, in an interview last month, Mr. Hancock reaffirmed that the RUCBC’s position hasn’t changed and that section 30.1 still needs to be amended.

Let’s have a look at each of the points individually.

Administrative Efficiency and Security

One of the greatest restrictions introduced by 30.1 is the inability to use cloud-based computing services. This causes administrative delays around storing and accessing information through applications that make use of the cloud, like those that deal with payroll and human resources.

But Mr. Hancock argues that the current information technology infrastructure that meets the requirements of section 30.1 is less secure than a system that fully encrypts data both in motion and at rest. In other words, an amendment to 30.1 could make student data more secure than it currently is.

He said that, under the current system, student privacy is at risk from hacking attempts by foreign and state actors. An amendment to 30.1, according to Mr. Hancock, would mean student data could be kept safer by making the act more intelligent and encouraging stronger security.

Additionally, when it comes to attracting faculty to UBC, the limitations created by 30.1 affect the way these world-class educators and researchers are able to work and communicate. Perhaps, Mr. Hancock says, this has an effect on UBC’s international reputation and ability to compete, which brings us to the second point.

International engagement and student recruitment

It’s no secret that UBC recruits a lot of international students. 14,434 to be exact. These efforts are enabled by proactive strategies led by the International Student Initiative, which sends recruiters all around the world. But because of 30.1, overseas offices and recruiters have difficulties accessing information that needs to be stored on Canadian soil.

What does this mean for students? Likely nothing. Recruiters need to employ complicated systems to work around or satisfy section 30.1, but for the everyday student, even the international student, there doesn’t seem to be much inconvenience.

Online learning offerings

This is a big one. UBC, specifically, is relying on outdated technology. Blackboard Connect is the learning management system currently in use at UBC. It’s an outdated, invasive, and clunky system, patched together to supposedly rival the cloud-based systems that universities outside of British Columbia can access.

UBC is currently looking at alternatives as the license for Blackboard Connect is expiring. Due to the limitations created by section 30.1, UBC is in a tougher position to find a new learning management system.

Academic integrity

This point relates to the use of an online plagiarism detection service called Turnitin.com. You may not know this, but most universities scan every essay electronically to detect plagiarism. This can be implemented quite simply through a built-in integration with a learning management system like Blackboard Connect.

However, Mr. Hancock says that Turnitin, a U.S. company, has the authority to reach into the system and pull out data at any time, including names and other personal information, which contravenes section 30.1. Therefore, UBC can’t use Turnitin in this way.

There’s a much more complicated, manual way of using the system. It involves having students submit their essays through an online portal with the option of using a fake name; but their essays, that content, and their IP addresses (there is some debate over whether an IP address can be considered personal information) still travel outside of Canada.

I’ve also heard of another work-around in which Teaching Assistants are asked to remove the names from submitted essays, assign each one a unique identifier, scan them into the system to check for plagiarism, translate the unique identifiers back to names, and then pass the essays to the instructor for grading. But, as you can imagine, this involves a lot of work and time. You might not be surprised to find out that not all Teaching Assistants are diligent enough to protect students’ identities, and some take shortcuts that sacrifice privacy and contravene the law.
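
To illustrate, here is a rough sketch in Python of what the de-identification step in that work-around might look like; the file name, the placeholder student names, and the mapping format are hypothetical, not part of any actual UBC process.

    # Sketch: the de-identification step described above, in rough form.
    # File names, student names, and the mapping format are hypothetical.
    import csv
    import uuid

    def pseudonymize(student_names):
        # Assign each student a random identifier with no connection to their name.
        return {name: uuid.uuid4().hex for name in student_names}

    if __name__ == "__main__":
        students = ["Student A", "Student B"]   # placeholder names
        mapping = pseudonymize(students)

        # The name-to-identifier table stays on a local machine (it is never
        # uploaded with the essays), so results can be translated back to names
        # after the plagiarism check.
        with open("local_id_map.csv", "w", newline="") as handle:
            writer = csv.writer(handle)
            writer.writerow(["name", "identifier"])
            for name, identifier in mapping.items():
                writer.writerow([name, identifier])

Even in this simplified form, you can see why the process is tedious enough that shortcuts become tempting.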

What do you think?

There are a lot of compelling arguments for amending section 30.1 of B.C.’s FIPPA. A change would mean that the university would have access to an array of new tools and technology that could enhance the learning experience at UBC and increase administrative efficiency. But what would be the sacrifice?

In the next instalment of the series, I’ll explore arguments that have been made against amending section 30.1 by an array of concerned parties, including privacy advocates and journalists. Since the act was passed, many have rightfully argued that section 30.1 has only increased in its importance, especially considering the current political situation in the United States.

But based on what you know right now, are you in favour of amending section 30.1?
