Author: Morana Miljanovic
Date: 25th February 2015
Anna* types “Lego gender mini figures” into Google's search engine. She follows a link to the Guardian's article “Lego Still Builds Gender Stereotypes.” Anna is 30 years old and has no kids. She doesn't play games much, but she talked to her sister yesterday and they remembered the occasion when they fought over Lego parts. She exercises at a local gym and tracks her condition through Fitbit, though at times she takes breaks of several weeks or even months to work on her new book in her parents' house outside the city. She takes a train and pays a fare that she does not know is personalized – an offer calculated from how much the company believes she would be willing to pay. She is single and uses dating sites to meet new people, mostly OKCupid. Years ago, when Anna was looking for a scholarship, she answered a questionnaire by a company that matches students with potential scholarship-givers; it included questions on her sexual orientation and her parents' disease histories. She had no idea that the company sells this information to other companies. Anna is applying for jobs. There was an opening she believed was a perfect fit. She did not get it. She does not know it, but the company screening her bought a profile suggesting that she has little discipline and borderline personality disorder – a profile produced by matching her web browsing history, her survey answers, Facebook likes, OKCupid answers, LinkedIn connections, her answers about her parents' medical histories, and her Fitbit data. Anna is a fictional character and this story is fictional. However, it describes practices that take place in reality.
If an artist made a saw out of glass it would be ugly despite the beauty of its appearance, because it could not fulfil its cutting function. (Thomas Aquinas, 1274, Summa Theologica, I, Article 3, 91)
When you query Google for images of a “mask”, its algorithm places the Guy Fawkes mask adopted by Anonymous at the top of the results page across European countries. This mask has become a symbol of anonymity in the digital age. Ironically, anonymity is in short supply for “netizens” – persons online.
Masks have been used by humans for protection, disguise and performance. Today, a new brand of digital masks is manufactured by companies, for profit. These masks – known as profiles - represent us, as users and consumers. Unlike the paper or velvet mask used to disguise one's identity and assume another one, this sort of mask serves the function of disguise no better than the saw made of glass is fit for cutting.
Crafting Glass Masks as data profiles
Imagine a mask made of glass. An Individualized Glass Mask is a profile constructed from data collected about an individual: a patchwork of one's past actions, attitudes and connections to other people. It is a mosaic made of glass pieces (tesserae) – a mosaic of different pieces of data, which can include an individual's photos, hobbies, ethnicity, religion, social network activity, a map of her movements across a city and across countries, her networks, political opinions, purchase history, full name, physical address, phone number, age (range), email address, and more, or only some of these. The mask that holds this data is used by the data industry to predict individuals' desires and capabilities, and to shape consumer behaviour.
Group Glass Masks – group profiles – are created on the basis of specific characteristics of individuals who are grouped together by algorithms. These characteristics can, for example, be a preference for a certain type of bar and an interest in Greek politics. Group masks are used to predict the desires and capabilities of the individuals to whom they are attached – both the individuals who were grouped together by the algorithm and individuals outside the original group who share some of the same characteristics. Attaching a group mask to persons rests on the premise that individuals sharing certain “attributes” are similar: the probability of someone behaving in a certain way can be predicted from their past behaviour, and that prediction extended to other individuals who display some of the same attributes. In both cases – individualized and group profiling – user data originating in the analogue world (offline data)¹ and data born digital (online data)² are combined: matched and poured together into melting pots from which machines shape glass masks.³
It is not artists who craft the glass masks but a new sort of scientist – the “data scientist.” Trained in statistics, data scientists use large volumes of data to solve specific business problems, “training” algorithms on datasets so that they can construct patterns (correlations) in the data. They decide that some patterns are more interesting than others – more relevant and more useful for creating profiles. Their decisions are informed by their prior (domain) knowledge and by the constraints, biases, goals and culture of the organization they work for. The patterns in the data are the chosen pieces of glass from which a Group Mask is made. The Group Mask is then attached to others. For example, an insurance company applies a group profile defined by smoking to guess what other attributes a potential individual client might have. These attributes can be anything at all: for example, a tendency to spend recklessly. Another example: someone living in a neighbourhood where the majority is poor and Muslim will wear the mask of a person who is poor and/or Muslim. Once this profile is sold to marketers, employers and state organizations, it will determine what information she is (not) shown and (not) offered.
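The mechanics of attaching a Group Mask can be sketched in a few lines of code. Everything here – the attributes, the profile, the names – is invented for illustration; real profiling systems are statistical and far more elaborate, but the logic of extending a group's inferred traits to anyone who shares its defining traits is the same.

```python
# Illustrative sketch (all names and attributes invented): how a group
# profile, learned from one set of people, gets attached to a stranger
# who merely shares some of the group's defining attributes.

# A "group mask": attributes the algorithm found occurring together.
group_profile = {
    "defining_attributes": {"smoker", "urban"},
    "inferred_attributes": {"reckless_spender"},  # a correlation, not a fact
}

def attach_mask(person_attributes, profile):
    """If a person shares all of the profile's defining attributes,
    attach the inferred ones too -- whether or not the inference
    actually holds for this individual."""
    if profile["defining_attributes"] <= person_attributes:
        return person_attributes | profile["inferred_attributes"]
    return person_attributes

anna = {"smoker", "urban", "writer"}
anna_masked = attach_mask(anna, group_profile)
# Anna now carries "reckless_spender" on the strength of a correlation
# she had no part in producing and cannot see.
```

The point of the sketch is the asymmetry: the wearer supplies only `anna`, while the profile, the inference and the attachment all happen out of her sight.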
Manufacturers, wearers and users
In ancient Rome, the word persona had two meanings: a mask, and a full citizen. A person online, however, is de facto not a full citizen with rights such as the right to equal and effective protection against discrimination. For example, where it is illegal to discriminate on a basis such as religion (e.g. in France), there are proxies (neighbourhood, for example) on which the glass mask industry relies. More fundamentally, as a human, your freedom to determine and control your own identities is impaired. You are not the one who decides which mask to wear, or not to wear, for which occasions. As long as “the internet” is tied to marketing and offers “free” services at hidden costs, knowledge of the manufacturing process and of the marketplace for these transparent masks is hidden from us, and we have no control over how the glass data masks get attached to our faces. Lack of knowledge and control are built into a system where profiling through big data mining is a business model. As you apply for a job, your potential employer might see you through a mask of references you never asked for, or the mask of a drug dealer, attached by mistake, or they might see online ads suggesting that someone with your name has an arrest record. Your glass masks will also have implications when you are hoping for a raise, seeking housing, a phone contract, insurance, a medical treatment, or entering a university. They may be used by law enforcement, military and secret service agencies. Once amassed data becomes “big data”, many future and previously unimagined uses of data profiles become possible. Already, companies like Palantir are starting to use platforms for collecting and mining shopping data alongside similar analytics platforms used by law enforcement agencies. Whatever the use, the wearer of the glass mask does not see it. She will feel its weight but not know where it came from. Thus, she cannot react when treated unfairly.
Data has come to be seen as the new valuable resource – an economic asset – and is thus highly political. As “users”, we do not get a say in how our glass masks, our data profiles, are used. At best, our consent is manufactured: we “agree” to terms of service of companies whose privacy policies rely on incomprehensible legal text that allows for much more than it seems to the lone reader. The politics of what happens around this new resource, and of who benefits from it, does not provide for our meaningful participation.
Oblivious of our glass digital masks, we are often promised opacity and anonymity by companies and public bodies that claim we needn't worry about our data sitting in “clouds” because “it is anonymized.” We are told that it is only the aggregate level – where our data is a drop in a data ocean – that gives companies “insights” and leads to “improving” their services. We are told by researchers and companies that our medical records will remain confidential because the data is “anonymized.” The trouble is that anonymization does not work.
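One concrete reason it does not work, touched on again in note 3 below, is that common “anonymization” techniques such as hashing an email address are deterministic: the same input always yields the same code, so two databases that both hash the address can still be joined on it. A minimal sketch, with invented data:

```python
# Illustrative sketch (invented data): why a hashed email address is a
# pseudonym, not anonymity. The hash is deterministic, so separate
# databases holding only the hash can still be matched on it.
import hashlib

def pseudonymize(email):
    # Lowercase first, so "Anna@..." and "anna@..." collide on purpose.
    return hashlib.sha256(email.lower().encode()).hexdigest()

# Two separate datasets, each storing only the "anonymized" hash:
social_network = {pseudonymize("anna@example.com"): {"likes": ["lego"]}}
loyalty_card = {pseudonymize("anna@example.com"): {"bought": ["bricks"]}}

# Linking requires no decryption at all -- the hashes simply match:
merged = {}
for h in social_network:
    if h in loyalty_card:
        merged = {**social_network[h], **loyalty_card[h]}
```

Anyone holding a list of candidate addresses can go further and recover the address itself by hashing each guess and comparing, which is why hashing alone offers such a thin disguise.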
The ridiculous, the known, and the unknown
The etymology of the word “mask” is uncertain. It may come from the Arabic verb sakhira – “to ridicule” – or from Provençal mascarar – “to black (the face)”, with cognates in Catalan mascarar and Old French mascurer, or from a pre-Indo-European root mask – “black”. Most human cultures have known masks. They have allowed us to think about what it means to be human through practices of imagining and assuming different identities.⁴ Throughout history, we have worn them knowingly. We have seen the world through them, known that the world could see us wearing them, and behaved accordingly. In the data-heavy world we inhabit today, for-profit actors that use data analytics and smart devices to observe what we do – and thereby to “know” who we are – run a masquerade where our own identities are unknown to us. We are known to them, as customers and, they claim, as humans.
...[F]eatures on an ancient Greek mask...conveyed the person's essential nature, his or her unchanging character, and social status. (Mary Louise Hart, 2010, The Art of Ancient Greek Theatre)
Glass masks have come, once again as in ancient Greece, to associate a wearer with a social status and role. How much autonomy do you have in describing yourself to your friends, colleagues, potential and current employers, bureaucrats, strangers, banks and various sellers in the different situations in which you meet them, unaware of the permanent glass mask you wear? The mask tells them a story about you before you get an opportunity to do so. The question is: how much do you know about the criteria for decisions that affect you – decisions made while looking at you through the glass of your mask? How much control do you have over the shapes of the masks that you wear, defined behind closed corporate doors? How much agency, then, in shaping your social functions? Why do we allow the data industry to create, commodify and monetize our masks – to define our identities in the name of marketing?
A glass saw is useless (and, in Aquinas' books, therefore not beautiful). A glass mask is not. On the contrary, the potential uses of a glass mask are vast, and many of them are not even known at the moment the mask is created. What is ridiculous is that it is not the wearer who exploits those uses.
The masks we know have one thing in common with the glass ones: they are solid and freeze a given identity; they do not allow for fluidity and plurality of identities as long as they are on our faces. The difference, however, is that once we take the wooden or papier-mâché mask off, that identity is gone. The glass masks stay on our faces forever, and we have no means of taking them off.
This blog on profiling stems from research Morana Miljanovic did on the data industry when she was a programme researcher at Tactical Tech in 2014. See other posts from the series here.
1. For example, your gym membership cards, car rentals records, voter records, marriage and divorce register entries, birth certificate, real estate ownership records, criminal records, medical records, utility bills, postal and shipping records, and supermarket loyalty cards.
2. For example, location data from mobile phones, radio frequency identification (RFID) tags, credit card transactions, purchase histories at retailers, and web behaviour histories compiled with the help of tracking technologies – from cookies to Facebook “like” buttons on various websites, which “know” you are there without you even clicking on them.
3. For example, Datalogix partners with physical stores that offer membership or loyalty cards, and with Facebook. It matches the data users provide when setting up Facebook accounts and store loyalty accounts (e.g. an email address). This allows determining whether a certain product was bought after clicking on an ad on Facebook. Scrambling parts of user data (hashing) does not ensure anonymity, and only weak safeguards exist against abuse, though opting out is possible. Integrating online and offline data is pursued by many companies, notably the biggest data brokers.
4. Using masks, and taking them away from others or forcing them upon others, has been used across societies as a tool of asserting power and of resistance. I am thankful to Max Haiven for the following example: in many parts of Canada, one of the first things the colonists did to subjugate the indigenous people and attempt to break down their communities was to ban masked ceremonies.
Profiling: PROtecting citizens' rights and Fighting ILlicit profilING
Hoofnagle, Chris Jay, “Big Brother's Little Helpers: How ChoicePoint and Other Commercial Data Brokers Collect and Package Your Data for Law Enforcement”
Andrews, Lori, “Facebook Is Using You”
Hildebrandt, M. and Gutwirth, S. (eds.), 2010, Profiling the European Citizen, Springer
Source of image: https://4.bp.blogspot.com/_JUtdcDTVuU0/TAtvJOHanUI/AAAAAAAACA0/sC7x6VwikU0/s320/258px-Mascaras_carnaval.jpg