Me and My Shadow is a new initiative by Tactical Tech that will examine different aspects of the digital traces we leave behind us online. The project has developed out of a growing concern for the way that privacy issues are affecting social networking users and people who use online services and mobile devices. These are important issues for everyone, but they have a particularly serious impact on rights and transparency advocates, independent journalists, and activists who can be targets of surveillance, censorship and control.
Why do we have digital shadows and should we be afraid of them?
Back when our digital life was dominated by Microsoft, we paid for our tools and services (or pirated them) and all our data sat on our own machines. As we entered the dynamism of the online world and web 2.0, we thought it was a fair trade to hand over our personal information to anyone who would help us stay in touch with our friends, make a call or share our pictures without paying. The easier it was for us to do these things, and the less we had to fight with our computers to make them happen, the happier we were to hand over our information and click 'yes'. The end result is that we don't pay for most services and tools, but we also don't hold most of our information.

We have come to want the same of our digital devices and online services as we do of our washing machines: we don't really want 54 different options and we don't even want to know how it works; we just want to put the things in, press a button and have them emerge the other side ready for use. The problem is that, aside from their love of electricity, there is very little that the digital devices connecting us to the internet, such as mobile phones and laptops, have in common with the devices that help us do our daily chores, like washing machines. That means we have to start the messy job of figuring out how these things work and what part we play when they go wrong.
We have been trained to expect a 'data in return for free services' trade-off without wondering what the catch is. Now that we've put all our data out there, we're finding the job of removing the traces of our online actions very tricky, and we've noticed that over time they start to look like snail trails, creating an unparalleled bird's-eye view of our habits, networks, tics and movements.
The business model of 'your data for our free services', as illustrated by this cartoon from the New York Times, via http://www.reputationdefenderblog.com
For some people this is all very useful. When they take a new photograph on their synchronised smartphone, it magically appears in the collection of photographs that acts as a screensaver on their laptop. When they mention to a friend in their Gmail chat list that they are going to Rome, the next day they get adverts in their browser for hotels in Rome. People they hardly know wish them happy birthday, and according to KLM, their fellow airline passengers will soon be using Facebook to choose whether or not to sit next to them: http://www.wired.com/autopia/2011/12/klm-passengers-can-use-facebook-for-meet-seat/. For some people the world just got smaller, their devices more helpful and their social life easier. For others, everything just got incredibly creepy: they can't figure out where some of this information about them is coming from, and they can't find a way to 'opt out'. For some, however, it is much worse; in fact, it is incredibly dangerous.
If you are a rights or transparency advocate, an activist, independent journalist, blogger, or any other kind of politically active individual trying to increase accountability and address injustice, these features designed to make technology use smoother and more appealing are exactly the opposite of helpful. Combined with the business model of 'your data/content for our free services', these services quickly morph from being things that open new doors to things that create new traps.
This point is underlined by the intellectual property journalist Cory Doctorow (http://www.guardian.co.uk/commentisfree/video/2011/apr/18/cory-doctorow-networking-technologies-video) when talking about privacy and social networking sites: “The fact that Facebook was used in the Middle East uprisings should be treated as a bug that gets fixed over time. Facebook isn't suited to that kind of use... Facebook, convenient as it is, is a bit of a honey pot and once you're stuck in it there's going to be negative consequences”. Facebook has come under a lot of pressure for the degree to which it exposes its users and is consequently known to some as 'stalkbook'.
Graffiti in Colegiales, Buenos Aires. The graffiti reads 'sé lo que haces', which translates into English as 'I know what you are doing'. Via http://www.buenosairesstreetart.com
So why don't activists use different tools, designed just for them?
The paradox is this: the tools that make it easiest for activists to create, publish, coordinate and collaborate are the ones that collect the most data in the most opaque way. Activists face a serious dilemma: they need these tools to work at speed and scale, with limited resources, and to get around otherwise restricted channels of communication; but unless carefully managed and controlled, the data about them and their networks gathered by these tools can be their undoing. Moreover, by design these services criss-cross to weave extremely detailed maps of our actions and associations. Many have commented, such as Evgeny Morozov in his book The Net Delusion, that these patterns are effectively doing the work of the intelligence services for them. If activists moved to special tools and special spaces, they would be the only ones there, which would defeat the point of using these virtual spaces for awareness raising and mobilising people; in addition, such tools could just become another way of marking them out as different.
So the short answer to the question 'why do we have digital shadows and should we be afraid of them?' is this: digital shadows are the data we leave behind us, sometimes intentionally, as when we post a photo on our Facebook page, and sometimes unintentionally, as when we select the delete option on our mobile phone but the information is actually still there. Should we be afraid of these digital shadows? The growing trend says yes. How much you should be afraid of them, and the degree to which you need to start thinking differently about your behaviour, depends on your world view, who you are, where you are and what you are doing.
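The point that 'delete' does not really delete can be made concrete with a toy sketch in Python. This is a deliberately simplified model, not how any real phone or filesystem actually works, but the principle is the same: a quick 'delete' typically removes only the index entry that points to the data, while the underlying bytes stay in storage until they happen to be overwritten.

```python
# Toy model of why 'deleted' data often survives on a device.
# Illustrative sketch only: real flash storage and filesystems differ
# in detail, but a quick delete usually just removes the directory
# entry and leaves the raw bytes in place.

class ToyStorage:
    def __init__(self):
        self.blocks = {}   # block number -> raw bytes actually in storage
        self.index = {}    # filename -> block number (the 'directory')
        self.next_block = 0

    def write(self, name, data):
        self.blocks[self.next_block] = data
        self.index[name] = self.next_block
        self.next_block += 1

    def delete(self, name):
        # A typical quick 'delete': forget where the file was...
        del self.index[name]
        # ...but never touch self.blocks, so the bytes remain.

    def visible_files(self):
        # What the user sees after pressing 'delete'.
        return sorted(self.index)

    def raw_contents(self):
        # What a forensic tool scanning the raw device would see.
        return list(self.blocks.values())


disk = ToyStorage()
disk.write("photo.jpg", b"<image bytes>")
disk.write("notes.txt", b"meeting at 6pm")
disk.delete("notes.txt")

print(disk.visible_files())   # the deleted file no longer appears to the user
print(disk.raw_contents())    # but its bytes are still recoverable
```

In other words, the delete button changes what you can see, not necessarily what is stored; that gap is exactly where part of our digital shadow lives.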
What can we do about this?
Activists have always taken risks, and should take risks. Being surveilled, censored, tracked and intimidated is usually a problem of success; you only really become a direct target when you are on to something. Acknowledging that activists need to take calculated risks is important: it takes us away from the assumption that they need watertight protection and towards a more nuanced approach to the complicated relationship between visibility, privacy and activism.
For example, if you are an investigative journalist following a story, you may need absolute privacy for your contacts and sources, but you may choose a different approach to your own identity depending on where you are from and how well known you are. This can also be complex: it's easy to assume that an investigative journalist working in a repressive environment should always remain absolutely anonymous. There are times, however, when the risky strategy of being absolutely public can be safer; staying in the public eye in the right circumstances can bring an element of protection to extremely high-profile figures. Because of the complex nature of these decisions, and because no single solution works all the time, activists need to be able to make calculated decisions in response to their immediate environment. They need to be able to choose when to be intentionally public and when to stay private, and they need to be able to control which information they give away, to whom and when. This is the crux of the problem: when using digital devices and services, activists are losing their ability to control their risks and make their own decisions, and this is extremely dangerous.
Why this is happening is also pretty complicated. A range of forces are experimenting with technologies and legal boundaries, and the result is that we have continuously diminishing control over our data. These forces range from data-hungry marketeers through to the somewhat veiled world of the new 'arms industry' of digital surveillance. The fact that the universe of things we can't control has become so large and so difficult to influence makes it even more important that we start with the simple steps: thinking about what can be done at a practical level and focusing on what is within our control. This can be broken up into three interrelated parts: our behaviour, our understanding of the services and tools we are using, and our questioning of what is rapidly becoming the norm.
Changing Our Behaviour

We need to take better care of how we set up our devices and services so that we, as users, are controlling them and not the other way around.
Make time for informed decisions about your data and devices. Take care not to just go with the default settings; make the small adjustments that could change the way we look when watched from the outside.
We have to make more careful choices about when to give information away and which tools to use and not to use. We can also develop more nuanced strategies for engaging with different services, differentiating more carefully between interpersonal communication, anonymous information gathering and publishing.
Activists need to take more responsibility for the communities they can unintentionally expose through their work. Many of these tools are double-edged and need to be carefully managed: a photograph taken on a mobile phone and distributed on the internet can be both an act of witnessing and a source of identification for the person who took it, which increases the risk, especially if you are witnessing abuse or injustice.
Understanding Tools & Services
Taking the time to better understand the services and tools we are using is the only way to make more informed decisions. The learning curve is steep for most of us, especially in a rapidly changing environment, but we have to begin to understand these tools because:
These tools have become easy to use and ubiquitous at breathtaking speed; the consequence is that rules, regulations, terms and conditions change in ways that make it very hard for us to keep control of our information. We can't influence this, but we can stay aware of it and use that awareness to devise more artful strategies for using different devices and services.
Users need to be aware of the information politics that are stretching and pulling their personal data into new areas of risk. While we wait for regulators to put stronger checks and balances in place, we need to be proactive in working around the weaknesses and holes in the system. It is a difficult challenge, but we must not let the unpredictability of the sector freeze us out of using tools; instead, it should motivate us to devise more nuanced strategies.
Questioning the Norm
Lastly, we can start to engage in efforts to challenge or question what has become normal for many users. We can begin to support, follow and even start projects that question the potential data nightmare that looms over us. This includes a broad range of initiatives such as:
The 'Web 2.0 Suicide Machine', which helps users 'sign out forever'.
The Europe versus Facebook project, which aims to increase the transparency of Facebook, initiate legal action and help users request their data.
The German Green Party politician Malte Spitz, who filed a lawsuit against Deutsche Telekom to obtain the data held about his own mobile phone use, and then published it broadly as an interactive map showing how his movements were tracked from August 2009 to February 2010 – www.zeit.de/digital/datenschutz/2011-03/data-protection-malte-spitz.
Over the next few months, through Me and My Shadow, we will start working on these three different levels: helping find ways to change our behaviour, providing insights and overviews that help us better understand how these technologies work and promoting and reviewing initiatives that challenge and question the status quo.