
Vignette 1: LOVEINT

Met her, date her,

Not much later,

She finds out what I meant

'bout keeping her content.

#NSAlovepoems

Quentin Hardy @qhardy[1]

Fig. 1

Karen Jeane Mills, no title, digital image, 2018

© Karen Jeane Mills, commissioned by the author


Access to the world’s largest database lets him search for secrets about his ex-girlfriend. And later, the polygraph test makes him admit that he’s done it. It’s always the machine that’s to blame—first for the uncontrollable urge to know, and then for the unbearable guilt of knowing.

Weeks after his search, when he fails the lie detector test necessary to renew his security clearance, he cites an overwhelming “curiosity” to explain his lack of self-control.[2] He never considers himself at fault for querying the database because that’s what he does for a living: as an analyst for the National Security Agency (NSA), he has been trained to spy. And part of this training means becoming immune to the impacts of spying—and, it would seem, immune to the consequences of breaching another person’s privacy. It means not feeling the breach as a breach. To him, it’s just a query. Just a way to get more information—to confirm a suspicion, to gather more data, to finish a conversation. To become an analyst, after all, he had merged with a system that normalizes these kinds of searches, that frames data as merely data: dry, neutral, and devoid of meaning until aggregated or triangulated into a larger pattern.

The phone records and the metadata he tapped into to spy on his ex-girlfriend were deemed insignificant by most employees at the Agency. Indeed, eleven other NSA agents would later be caught committing similar transgressions; like him, these agents suffered few professional consequences. Internally, the incidents did not constitute a scandal either. Of the known cases of NSA employees breaching the boundaries between work and pleasure that have surfaced since 2013, eight involved snooping on current or past lovers or spouses during the last decade.[3] Five employees quit before being disciplined. The rest received letters of reprimand or short suspensions without pay. Few dropped in rank; when they did, the demotion smacked of symbolism rather than a genuine punishment.

While unfettered access to their present and past lovers’ personal details proved too tantalizing for these eleven employees to resist, the data involved in these cases represents just the tiniest fraction of 1.7 billion communications intercepted by the NSA every day.[4] NSA programs such as PRISM and XKeyscore[5] give analysts open access to American citizens’ private information. It remains unclear to the public what the parameters of use are or how (or if) these are policed internally at the Agency. The network’s reach is huge and tentacular; and because of how metadata is gathered, spying on an ex is spying on their entire network, too.

So what does “curiosity” mean in this context, beyond being enough to justify breaching the privacy of one’s intimate partner(s)? What about privacy, or intimacy, itself? As he queried her data, was he hoping that he could (finally) know her—the real her, the secret her? Was he thinking that he could finally know what she’d kept from him, cross-reference the many versions of the stories she’d told, fill in the interruptions, defragment the threads, and be privy to the details of her private conversations with others, too? Does he feel entitled to these details—not only as an NSA employee, but also as her ex-boyfriend? Primed for surveillance at this scale, does he reason that true intimacy means knowing everything? Do the lines between analyst and lover blur further—does he believe that this, too, is for her safety, for her protection?

Compared to the analyst’s unfettered access to her innermost self, why would he settle for the lover’s partial truths? We cannot know definitive answers to these questions, but we can see how his role at the NSA would facilitate any effort at omniscience. Nobody can know what his motives were, but his actions illustrate the ways in which surveillance is antithetical to intimacy. Maybe he imagined that his training qualified him to keep her data in check. Maybe that training taught him to think of relationships as something to be managed numerically, rationally, analytically: objectively. Maybe he became an NSA employee in order to gain this kind of privileged access to other people’s data—or, perhaps this privileged access is what thwarted his ability to think ethically and empathetically. Or perhaps it’s impossible to resist such a God-like, fly-like, ghost-like viewpoint. Perhaps surveillance is antithetical to intimacy because the data surveillance gathers compromises the deep uncertainties and blind trust that constitute intimacy. Or perhaps surveillance is yet another tentacular extension of the privilege white men afford themselves by building these infrastructures in the first place. Surveillance as insecurity.

As the scandal of NSA agents spying on lovers past and present broke in 2013, the news media acknowledged that the case—known as LOVEINT—was at least potentially troubling. Part of LOVEINT’s power to disturb, they suggested, was that some people could too easily relate to the often violent and controlling desire to breach trust—to seek out truths that are not easily available, and perhaps not meant for us to uncover. Given the chance, however, how many others would do the same as these NSA employees? How many of us do, in fact, do something similar with the means that we have, by checking a lover’s email or phone sub rosa? How many of us spy on each other by way of bureaucratic paperwork or devices that reveal traces of each other’s digital routines? Isn’t social media largely built for legitimated forms of self-tracking and for “following” others? The language certainly has a stalkerish ring to it. And if it becomes increasingly difficult for us to distinguish between a quick flip through a lover’s phone and mass data theft, perhaps we understand that intimacy is a more crucial component—and motive—of surveillance than has so far been made explicit in our technosocial imaginaries.

Vignette 2: Love Handles

It knows the real, inglorious version of me who copy-pasted the same joke to match 567, 568, and 569; who exchanged compulsively with 16 different people simultaneously one New Year’s Day, and then ghosted 16 of them.

Judith Duportail[6]

Fig. 2

Karen Jeane Mills, no title, digital image, 2018

© Karen Jeane Mills, commissioned by the author


It’s a lot of information.[7] When the two of them look at the hard copies they requested, they realize that everything they’ve done online, however fleeting it might have seemed in the moment, amounts to tomes in print. In paper format, it takes on the weight of accumulated evidence. The tally for her is 800 pages from Tinder, a popular dating app; for him, a 1,200-page PDF from Facebook.[8] While two people don’t make a trend, they can make a point—about how Big Tech handles intimacies, with little regard paid to its users’ privacy or intimate lives. She’s a journalist; he’s a privacy activist. The two of them share an interest in user privacy in the context of EU data protection law, at a moment when Big Tech—especially social media companies—is trying its hand at surveilling users.[9]

First: the journalist

She orders her personal data from Tinder, and the company delivers her an 800-page report that details things she’d mostly forgotten regarding her various flirtations, desires, and fears. Embarrassed, she flips through the pages that speak back to her age, education, interests, and tastes. Recorded in these pages are also incredible volumes of information about her whereabouts, habits, proclivities—all things that emerge from patterns in the aggregate data. This is all data she’d willingly shared through the app itself for the purposes of dating. But the guilt and shame she later feels are evidence that you can in fact surprise yourself—not just by encountering a constellation of interpersonal communications that wouldn’t otherwise be read in relation to each other, but also by being confronted, in a company report, with intimate patterns about yourself that you didn’t even know were being collected.

Tinder knows her in ways she doesn’t know herself because while she’s forgotten almost all of her 1,700 Tinder interactions, the app hasn’t, and won’t. Her ability to forget is what has allowed her to move on, to grow, to like new things without having to trace the many trajectories that informed those choices—without having to consider whether they were guided by moments of solitude, longing, boredom, sleepless nights, impulses, rejection, or the restlessness of too much quiet. The more she used the app, the more refined its picture of her became. In information technology studies, aggregated data generates what’s called “secondary implicit disclosed information.”[10] This just means that the app generates new data from patterns in the data volunteered by its users. And because Tinder has 50 million users, her data is cross-referenced with that of many others, which in turn reveals more about everyone using the app, as a group, than it does about each individual.
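
A toy sketch can make “secondary implicit disclosed information” concrete. Nothing below is Tinder’s code: the users, interests, and confidence cutoff are invented for illustration. The point is only that patterns aggregated across a user base can disclose something an individual never volunteered.

```python
# A toy illustration of "secondary implicit disclosed information".
# Each user volunteers a few interests; by aggregating co-occurrence across
# many users, an app can guess interests a person never disclosed.
# All names and data here are invented for the example.

from collections import Counter
from itertools import combinations

volunteered = {
    "user_a": {"climbing", "jazz", "tattoos"},
    "user_b": {"climbing", "jazz", "craft beer"},
    "user_c": {"jazz", "craft beer"},
    "user_d": {"climbing", "tattoos"},
}

# Count how often pairs of interests appear together across the whole user base.
co_occurrence = Counter()
for interests in volunteered.values():
    for pair in combinations(sorted(interests), 2):
        co_occurrence[pair] += 1

def inferred_interests(user: str) -> set[str]:
    """Interests the user never listed, but which frequently co-occur with the ones they did."""
    own = volunteered[user]
    guesses = set()
    for (a, b), count in co_occurrence.items():
        if count >= 2:  # arbitrary confidence cutoff for the toy example
            if a in own and b not in own:
                guesses.add(b)
            if b in own and a not in own:
                guesses.add(a)
    return guesses

print(inferred_interests("user_d"))  # e.g. {'jazz'} -- disclosed by the pattern, not the person
```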

Tinder doesn’t hide the fact that it collects data on its users. It also reserves the right to sell it, trade it, or repurpose it. Tinder is made for matchmaking, and most of its users are more preoccupied with finding lustful connections than with how their data might not be as safe, secure, or private as it feels within the framework of the app. That was true for her until she heard that the app’s algorithm produced a “desirability score” for all its users.[11] Tinder’s internal rating is called “the Elo Score” (a concept borrowed from chess, where it ranks players’ ability), and the app privately determines all of its users’ desirability scores (which are not based, as one might expect, exclusively on the number of right and left “swipes” by others).[12] In turn, this score informs who you are likely to match with and thus to date, placing people in categories based on secret algorithmically generated criteria controlled by Tinder. She knows that her rating limits her pool to people “in her own category.”[13] The more you match with people deemed highly desirable by Tinder, however, the more desirable you become. The app literalizes and reinforces the idea that people should date others in their “league” through algorithmic wizardry that quantifies the unquantifiable.
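
The Elo mechanism itself is public even if Tinder’s adaptation of it is not. The sketch below shows only the textbook chess formula, with an assumed K-factor of 32 and a right swipe treated as a “win”; it illustrates the concept, not the app’s actual, secret scoring.

```python
# A minimal sketch of the chess-style Elo update referred to above.
# This is NOT Tinder's algorithm (which is secret and, per the company, not
# based solely on swipes); it only shows how an Elo-like "desirability score"
# could be nudged by each right or left swipe.

K = 32  # how strongly a single swipe moves the score (assumed value)

def expected(score_a: float, score_b: float) -> float:
    """Probability that A 'wins' (gets swiped right) against B, per the Elo formula."""
    return 1.0 / (1.0 + 10 ** ((score_b - score_a) / 400))

def update(score_a: float, score_b: float, a_got_right_swipe: bool) -> tuple[float, float]:
    """Being liked by a highly rated user raises your score more than being liked by a low-rated one."""
    ea = expected(score_a, score_b)
    sa = 1.0 if a_got_right_swipe else 0.0
    new_a = score_a + K * (sa - ea)
    new_b = score_b + K * ((1.0 - sa) - (1.0 - ea))
    return new_a, new_b

# Example: a profile rated 1200 is right-swiped by a profile rated 1600;
# the lower-rated profile gains the most.
print(update(1200, 1600, a_got_right_swipe=True))
```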

Second: the law student and privacy

Facebook also gathers data on its users. Given that one of its data centres is in Ireland and services the site’s European clientele, the company is subject to different laws there than in the US. Under European law, the “right to access” entitles Europeans to know what a company knows about them. So when he ordered his Facebook details from the company, Facebook had no choice but to comply. It sent him a 1,200-page PDF outlining his clicks, likes, and pokes. He had been on Facebook for only three years when he made the request, but the complexity of the data astounds and worries him. As a privacy activist and a lawyer, he has since posted the contents and an analysis online, revealing the kinds of categories that Facebook is collecting, or willing to admit it’s collecting:

Fig. 3

Data categories collected by Facebook. Max Schrems, “Facebook’s Data Pool,” Europe Versus Facebook, http://europe-v-facebook.org/EN/Data_Pool/data_pool.html (accessed 16 May 2018).


The lawyer wasn’t privy to his own biometric faceprint (considered a trade secret); presumably, the company leaves out other such experiments, which it likewise considers secondary information, a calculated byproduct of its magnificent algorithms. But he keeps pushing and challenging the legal system, insisting that a precedent not be set for companies like Facebook to act above the law. Above all, he wants to break the persistent myth circulated by the industry that nobody cares about their privacy on social media sites.[14] Mass, indiscriminate surveillance has quickly been normalized in data-driven industries—where storage becomes a fortress for ideals that cast Big Tech as knowing best and caring for its users’ well-being.[15]

Vignette 3: Algorithmic Cookies

“No, it isn’t,” Charmaine insists. “Love isn’t like that. With love, you can’t stop yourself.”

Margaret Atwood, The Heart Goes Last[16]

Fig. 4

Karen Jeane Mills, no title, digital image, 2018

© Karen Jeane Mills, commissioned by the author


“Do you know who this is?” she asks her new boyfriend with sincere bewilderment. She looks closely at the stranger’s face that Facebook has suggested to her as a possible connection: a blonde woman in her late twenties. “Who is this?” she asks again, pointing to the blonde woman on her phone. The BF barely looks up. He works up a shrug and says, “I think we hung out a few years ago, I don’t really know…” He keeps eating his cereal, unfazed, as few would be in this situation. Doesn’t he want to know why? Or how she’s come to be asking him about an obscure ex? Nope.

Later, they are driving together. The GPS offers rerouting after rerouting as they make a detour to the liquor store on their way to a party in the suburbs. Time stops as she swipes off the mapping app, revealing a series of texts below. He snatches the phone away. She recognizes the Facebook woman’s name. It’s on his phone. They are driving. The boyfriend keeps his eyes on the road. They both remain silent.

We know where this is headed. But why deny knowing her in the first place? What had become normal for him here? Why couldn’t he speak of their relationship openly? And how had their networks become so entangled?

In the weeks that follow, she creates a series of fake dating accounts to match with his and see which online dating sites he is using (there are five, all of them very active). She confronts him, and he lies again. Later still, while he is in the shower, she goes through his entire contact list and reads all of his texts. She discovers that he has kept up conversations with a dozen or so women, recounting specific details of their personal lives in a way that creates and maintains intimacies. He often asks them for sexy photos, which he just as often receives. When confronted about these conversations, though, he shows no remorse. He just says, “I don’t see the problem. It’s just virtual stuff.”

The line between what counts as “real” emotional contact and what counts as “virtual” flirtation is not for any one person to determine. What is at play here, however, is more significant than ongoing debates about what counts as “cheating.” These are often moral lines in the sand. Online communication has created new ways to connect people; increasingly, it does so using algorithms programmed from a particular moral standpoint. Constantly managing how we are being tracked by our own devices can and will have huge effects on how we socialize and form connections offline.[17]

LinkedIn and Facebook, two of the most popular social networking sites for work and leisure, constantly recommend new connections in an effort to increase your (their) network. LinkedIn offers a sidebar of “People You May Know”; so does Facebook.[18] But how does the platform know who you should know?[19] Officially, it looks for commonalities between members and shared connections in terms of employment, education, and experience (algorithmically defined). It also draws from cookies[20] and contacts imported from users’ address books. The rhizomatic nature of the platform’s growth renders the always-new web of connections almost too vast, as if to confuse and convince its users that it isn’t pulling from things like email, geolocation data, Facebook, or dating apps. But it is. While you might limit your privacy settings and turn off location services, these conscientious choices can be overridden by just one of your contacts offering LinkedIn access to their contacts. Facebook operates in a similar way.[21] Even if you change your phone number or email address, your friend network will reconnect you, insert you back into the social media sphere. Location services will out you based on proximity to another person.[22] And, increasingly, deep-learning facial recognition algorithms (Microsoft Face API, Facebook’s Facial Recognition App, Amazon Rekognition) are starting to do the work of profiling and connecting people, too. For example, Facebook makes a “template” of your face by using “a string of numbers” that is unique to you.[23] It then uses this template to match you, and the people around you, across its platform. The examples have become endless—and normal—and less startling to many as a result.

In this story, the couple is doomed to understand itself within the trappings of white heteronormative society, and to have that self-understanding simultaneously disrupted and reinforced by the impulsive affordances and addictive features of social media.
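
In generic terms, the face “template” described above is an embedding: a photograph reduced to a vector of numbers that can be compared by distance. The sketch below is not Facebook’s system; the embedding function is a stand-in, and the dimension and matching threshold are assumptions chosen only to show how such templates link people.

```python
# A rough sketch of the general technique behind a face "template": an image is
# reduced to a numerical embedding, and two faces are declared a match when
# their embeddings are close enough. This is NOT Facebook's code; the embedding
# function, dimension, and threshold here are illustrative assumptions.

import numpy as np

EMBEDDING_DIM = 128    # a common size for face embeddings (assumption)
MATCH_THRESHOLD = 0.6  # distance below which two templates count as the same person (assumption)

def embed_face(image_pixels: np.ndarray) -> np.ndarray:
    """Stand-in for a deep network mapping a face image to a 128-number template.
    Real systems train the network so photos of the same person land close together."""
    rng = np.random.default_rng(int(image_pixels.sum()))  # placeholder, NOT a real model
    vec = rng.normal(size=EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)  # normalize so distances are comparable

def is_same_person(template_a: np.ndarray, template_b: np.ndarray) -> bool:
    """Compare two stored templates by Euclidean distance."""
    return float(np.linalg.norm(template_a - template_b)) < MATCH_THRESHOLD

# Once templates exist, linking people is a lookup: every new photo is embedded
# and compared against every template already stored in the database.
photo_1 = np.ones((64, 64)) * 0.5
photo_2 = np.ones((64, 64)) * 0.5  # identical toy "photos" produce identical templates here
print(is_same_person(embed_face(photo_1), embed_face(photo_2)))  # True
```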

Surveillance structures intimacy in the present moment, and does so by normalizing and compartmentalizing it, and by flattening desire along the way. This is important in the bigger picture of surveillance studies because concerns over privacy are too often not explored in terms of how its loss changes our lives in profound ways—we all need secrets, and the space to explore our multitudes.