When you can trust nobody, trust the smart machine

I will be at AOIR in Montreal, 10-13 October to present some newer work as I look beyond the book. Below is one brief summary of ongoing investigations:


 

What is the connection between smart machines, self-tracking, and the ongoing mis/disinformation epidemic? They are part of a broader shift in the social rules of truth and trust. Emerging today is a strange alliance of objectivity, technology and the ‘personal’ – often cast in opposition to the aging bastions of institutional expertise. The fantasy of an empowered individual who ‘knows for themselves’ smuggles in a new set of dependencies on opaque and powerful technologies.

 

1.

On one hand, individuals are encouraged to know more, and to take that knowing into their own hands. Emblematic is the growth of the self-tracking industry: measure your own health and productivity, discover the unique correlations that make you tick, and take control of rationalising and optimising your life. Taglines of ‘n=1’ and ‘small data’ sloganise the vision: the intrepid, tech-savvy individual on an empowering and personal quest for self-knowledge. Implicit here is a revalorisation of the personal and experiential: you have a claim to the truth of your body in ways that the doctor cannot, despite all their learned expertise. This is territory that I go into in some detail in the book.

 


And so, Calit2’s Larry Smarr builds a giant 3D projection of his own microbiome – which, he claims, helped him diagnose the onset of Crohn’s disease before the doctors could.

 

But what does it mean to take control and know yourself, if this knowing happens through technologies that operate beyond the limits of the human senses? Subsidiary to the wider enthusiasm for big data, smart machines and machine learning, the value proposition of much (not all) of self-tracking tech is predicated on the promise of data-driven objectivity: the idea that the machines will know us better than we know ourselves, and correct the biases and ‘fuzziness’ of human senses, cognition, memory. And this claim to objectivity is predicated on a highly physical relationship: these smart machines live on the wrist, under the bedsheets, sometimes even in the user’s body, embedding their observations, notifications, recommendations, into the lived rhythms of everyday life. What we find is a very particular mixture of the personal and the machinic, the objective and the experiential: know yourself – through machines that know you better than you do.

 


Jeannine Risley’s Fitbit data is used to help disprove her claims of being raped by an intruder. What is called ‘self-knowledge’ becomes increasingly capable of being dissociated from the control and intentions of the ‘self’.

 

2.

Another transformative site for how we know and how we trust is that of political mis/disinformation. While the comparison is neither simple nor obvious, I am exploring the idea that they are animated by a common, broader shift towards a particular alliance of the objective, machinic and ‘personal’. In the political sphere, its current enemies are well-defined: institutional expertise, bureaucratic truthmaking and, in a piece of historical irony, liberalism as the dishonest face of a privileged elite. Here, new information technologies are leveraged towards what van Zoonen labelled ‘i-pistemology’: the embrace of personal and experiential truth in opposition to top-down and expert factmaking.

 


In such ‘deceptive’ social media postings, we find no comprehensive and consistent message per se, but a more flexible and scattershot method. The aim is not to defeat a rival message in the game of public opinion and truthtelling, but to add noise to the game until it breaks down. It is this general erosion of established rules that allows half-baked, factually incorrect and otherwise suspect information to compete with more official accounts.

 

The ongoing ‘fake news’ epidemic of course has roots in post-Cold War geopolitics, and in the free speech ideology embedded into social media platforms and their corporate custodians. But it is also an extension of a decades-long decline in public trust of institutions and experts, and an unintended consequence of what we thought was the best part about Internet technologies: the ability to give everyone a voice, to break down artificial gatekeepers, and to allow more information to reach more people. It is well known that Dylann Roof, who killed nine in the 2015 Charleston massacre, began that path with a simple online search for ‘black on white crime’. The focus here is on what danah boyd identified as a loss of orienting anchors in the age of online misinformation: emerging generations of media users are taught to assemble their own eclectic mix of truths in a hyper-pluralistic media environment, while also learning a deep distrust of official sources.

 


2017 saw the March for Science: an earnest defence of evidence-based, objective, institutionalised truth as an indispensable tool for the government of self and others. The underlying sentiment: this isn’t an agenda for a particular kind of truth and trust, this is just reality – and anyway, didn’t we already settle this debate? But the debate over what counts as reality and how we get access to it is never quite settled.

 

3.

These are strange and unsettling combinations: the displacement of trust from institutions to technologies in the guise of the empowered ‘I’, and the related proliferation of alternative forms of truthtelling. My current suspicion is that they express an increasingly unstable set of contradictions in our long-running relationship with the Enlightenment. On one hand, we find the enduring belief in better knowledge, especially through depersonalised and inhuman forms of objectivity, as the ticket to rational and informed human subjects. At the same time, this figure of the individual who knows for themselves – found in Kant’s inaugural call of Sapere aude! – is increasingly subject to both deliberate and structural manipulations by sociotechnical systems. We are pushed to discover our ‘personal truths’ in the wilderness of speculation, relying only on ourselves – which, in practice, often means relying on technologies whose workings escape our power to audit. There is nobody you can trust these days, but the smart machine shall not lead you astray.

 


Lecture @ Wattis Institute for Contemporary Arts

I will be at the Wattis Institute for Contemporary Arts, San Francisco, on 20 March 2018 to discuss data, bodies and intimacy, as part of their year-long program on the work of Seth Price. More information here.

 

Data, or, Bodies into Facts

Data never stops accumulating. There is always more of it. Data covers everything and everyone, like skin, and yet different people have different levels of access to it, so it’s never quite fair to call it “objective” or even “truthful.”

Entire industries are built around storing data, and then protecting, organizing, verifying, optimizing, and distributing it. From there, even the most banal pieces of data work to penetrate the most intimate corners of our lives.

For Sunha Hong, the promise of data is the promise to turn bodies into facts: emotions, behavior, and every messy amorphous human reality can be distilled into the discrete, clean cuts of calculable information. We track our exercise, our sexual lives, our relationships, our happiness, in the hope of self-knowledge achieved through machines wrought in the hands of others. Data promises a certain kind of intimacy, but everything about our lived experience constantly violates this serene aesthetic wherein bodies are sanitized, purified, and disinfected into objective and neutral facts. This is the push-pull between the raw and the mediated.

Whether it be by looking at surveillance, algorithmic, or self-tracking technologies, Hong’s work points to the question of how human individuals become the ingredient for the production of truths and judgments about them by things other than themselves.

 

Update: He gives a talk.


[Talk] U. Milano-Bicocca

I will be at the University of Milano-Bicocca next week to give a talk on surveillance, self-tracking and the data-driven life. It will overlap significantly with my presentation last week at the Affect Theory conference; I’ll be posting the full text and slides afterwards. Abstract below.

The Data-Driven Life: The Parameters of Knowing in the Online Surveillance Society

‘Information overload’ is an old cliché, but when it was still fresh, it conveyed a broad and fundamental liquidity in the parameters of our experience. What it meant – and felt like – to remember, to know, was changing. Surveillance today is not simply a question of privacy or governmental power, but a practical extension of such liquidity. Surveillance’s fevered dream of total prediction hinges on its ability to subtend human sensibility – with its forgetfulness, bias, and other problems – to reach ‘raw’ and comprehensive data. This data-hunger, shared by states, corporations and individuals alike, betrays a ‘honeymoon objectivity’. The rise of new technologies for knowledge production is being misconstrued as a discovery of pure and unmediated information. The result is a profound shift in what qualifies as knowledge; who, or what, does the knowing; what decisions and actions are legitimated through that knowledge. Surveillance practices and controversies today host a reparametrisation of what ‘knowing’ entails.

In this talk, I will address two specific cases: the state surveillance of the Snowden Affair, and the self-surveillance of the Quantified Self (QS) movement. I draw on interviews, ethnographic observation and archival research that is part of a larger, ongoing project.

  1. I know we are being watched, Snowden told us so – but I don’t see it, and I don’t feel it. A vast surveillance program withdraws into the recesses of technological systems, denying our capacity to know and experience it. Conventional forms of proof or risk probabilities elude both arguments for and against it. This situation provokes two major patterns of ‘knowing’. First, subjunctivity leverages the recessive unknowability surrounding surveillance as if it were in some way true and certain, producing hypothetical, provisionary bases for real, enduring actions and beliefs. Statistical measures of danger become mathematically negligible, yet affectively overwhelming. Second, interpassivity projects others (human and nonhuman) who believe and experience, in our stead, what we cannot ourselves. Even if the world of surveillance and terror is not real in my back yard, these interpellated others help make it ‘real enough’. Technology’s recession thus provokes an existential dilemma: how do I ‘know’? What is it supposed to feel like to ‘know’?
  2. We cannot stand guard over our judgments without machines to keep us steady. If our knowledge of our own bodies, habits and affects were previously left to unreliable memories and gut feeling, QS promises a data-driven existence where you truly “come into contact with yourself” – through persistent, often wearable, self-surveillance. The Delphic maxim know thyself is applied to a very different existential condition, where my lived relationship with technology becomes the authoritative site for an abstracted relationship with my own body. Yet QSers also acknowledge that data is always incomplete, raising new uncertainties and requiring the intervention of subjective judgment. Here, it is technology’s protrusion which forces the question: how will you ‘know’ yourself through a digitality that subtends your memory and intention?

[ARTICLES] “Subjunctive and Interpassive” and “Presence”

Presence, or the sense of being-there and being-with in the new media society

Open access @First Monday.

This essay argues that the ways in which we come to feel connectivity and intimacy are often inconsistent with and irreducible to traditional markers like physical proximity, the human face or the synchronicity of message transmission. It identifies this non-objective and affective property as presence: conventionalised ways of intuiting sociability and publicness. The new media society is a specific situation where such habits of being affected are socially and historically parametrised. The essay provides two case studies. First: how do we derive a diffuse, indirect, intuitive sense of communicative participation — and yet also manage to convince ourselves of anonymity online? The second describes surveillance and data-mining as a kind of alienation: I am told my personal data is being exploited, but I do not quite ‘feel’ it. Surveillance practices increasingly withdraw from everyday experience, yet this withdrawal actually contributes to its strong presence.

Subjunctive and Interpassive ‘Knowing’ in the Surveillance Society

Open access @Media and Communication.

The Snowden affair marked not a switch from ignorance to informed enlightenment, but a problematisation of knowing as a condition. What does it mean to know of a surveillance apparatus that recedes from your sensory experience at every turn? How do we mobilise that knowledge for opinion and action when its benefits and harms are only articulable in terms of future-forwarded “as if”s? If the extent, legality and efficacy of surveillance is allegedly proven in secrecy, what kind of knowledge can we be said to “possess”? This essay characterises such knowing as “world-building”. We cobble together facts, claims, hypotheticals into a set of often speculative and deferred foundations for thought, opinion, feeling, action. Surveillance technology’s recession from everyday life accentuates this process. Based on close analysis of the public mediated discourse on the Snowden affair, I offer two common patterns of such world-building or knowing. They are (1) subjunctivity, the conceit of “I cannot know, but I must act as if it is true”; and (2) interpassivity, which says “I don’t believe it/I am not affected, but someone else is (in my stead)”.