Smart Machines and Enlightenment Dreams (2)

In part one, I mentioned a ‘nagging suspicion’:

aren’t (1) the fantastically optimistic projections around objective data & AI, and (2) the increasingly high-profile crises of fake news, algorithmic bias, and in short, ‘bad’ machinic information, linked in some way? And aren’t both these hopes and fears rooted, perhaps, in the Enlightenment’s image of the knowing subject?

As usual, we’re caught up in two seemingly opposite fantasies. First, that the human is a biased, stupid, unreliable processor of information, and must be augmented – e.g. by the expanding industry of smart machines for self-tracking. Second, that individuals can know for themselves and find the truth, if only they become more educated and ingest more information – e.g. by watching more Jordan Peterson videos.

Below are some of my still-early thoughts around what we might call the rise of personal truthmaking: an individualistic approach that says technology is going to empower people to know better than the experts, often in cynical and aggressive opposition to institutional truth. It is a style we find in normative discourses around fact-checking and media literacy as well as among redpilled conspiracy theorists, in the mainstream marketisation of smart devices as well as in public concern around the corruption of politics.

 

Smart machines

Let’s start with the relatively celebrated, mainstream instance, a frontrunner in all the latest fads in data futurism. Big data is passé; the contrarian cool is with small data, the n=1, where you measure your exercise, quantify your sleep, analyse your productivity, take pictures of your shit, get an app to listen to you having sex, to discover the unique truths about you and nobody else, and use that data to ping, nudge, gamify yourself to a better place.

Implicit here is a clear message of individual empowerment: you can know yourself in a way that the experts cannot. Take the case of Larry Smarr, whose self-tracking exploits were widely covered by mainstream media as well as self-tracking communities. Smarr made a 3D model of his gut microbiota, and tracked it in minute detail:

[Image: Larry Smarr’s visualisation of his gut microbiota data]

This, Smarr says, helped him diagnose the onset of Crohn’s disease before the doctors could. He speaks about the limitations of the doctor-patient relationship, and how, given the limited personal attention the healthcare system can afford for your own idiosyncratic body and lifestyle, you are the one that has to take more control. Ironically, there is a moment where Kant, in his 1784 What is Enlightenment?, broaches the same theme:

It is so easy to be immature [Unmündigkeit]. If I have […] a doctor who judges my diet for me […] surely I do not need to trouble myself. I have no need to think, if only I can pay.

To be sure, Kant is no proto-anti-vaxxer. Leaving aside for a moment (though a major topic for my research) the many readings of Aufklärung and its place in historicising the Enlightenment, we can glimpse in that text a deep tension between the exhortation to overcome tutelage, to have the courage to use your own understanding, and the pursuit of universally objective truth as the basis for rationalisation and reform. And it is this tension that again animates the contemporary fantasy of ubiquitous smart machines that will know you better than you know yourself, and in the process empower a knowing, rational, happy individual.

Now, it just so happens that Larry Smarr is a director at Calit2, a pioneer of supercomputing tech. He has the money, the tech savvy, the giant room to install his gut in triple-XL. But for everybody else, the promise of personal knowledge often involves a new set of dependencies. As I’ve discussed elsewhere, the selling point of many of these devices is that they will collect the kind of data that lies beyond our own sensory capabilities, such as sleep disturbances or galvanic skin response, and that they will deliver data that is objective and impartial. It’s a kind of ‘personal’ empowerment that works by empowering a new class of personalised machines to, the advertising mantra goes, ‘know us better than we know ourselves’.

The book will focus on how this particular kind of truthmaking begins with the image of the hacker-enthusiast, tracking oneself by oneself using self-made tools, and over time, scales up to the appropriation of these data production lines by insurance companies, law enforcement, and other institutions of capture and control. But here, we might ask: how does this particular dynamic resonate with other contexts of personal truthmaking?

 

Redpilling

We might recall that what’s happening with self-tracking follows a well-worn pattern in technologies of datafication. With the likes of Google and Amazon, ‘personalisation’ meant two things at the same time. We were offered the boon of personal choice and convenience, but what we also got was a personalised form of surveillance, manipulation, and social sorting. In the world of data, there’s always a fine print to the promise of the personal – and often it’s the kind of fine print that lies beyond the reach of ordinary human lives and/or the human senses.

Fast forward a few years, and personalisation is again being raised as a pernicious, antidemocratic force. This time, it’s fake news, and the idea that we’re all falling into our own filter bubbles and rabbit holes, a world of delusions curated by YouTube algorithms. When Russian-manufactured Facebook content looks like this:

[Image: a Russian-manufactured Facebook post]

we find no consistent and directly political message per se, but a more flexible and scattershot method. The aim is not to defeat a rival message in the game of public opinion and truthtelling, but to add noise to the game until it breaks down under the weight of unverifiable nonsense. It is this general erosion of established rules that allows half-baked, factually incorrect and otherwise suspect information to compete with more official accounts.

We recognise here the long, generational decline across many Western nations of public trust in institutions, which folks like Ethan Zuckerman have emphasised as the backdrop for the fake news epidemic. At the same time, as Fred Turner explains, the current disinformation epidemic is also an unintended consequence of what we thought was the best part about Internet technologies: the ability to give everyone a voice, to break down artificial gatekeepers, and to allow more information to reach more people.

Consider the well-known story of how the 2015 Charleston shooter began that path with a simple online search for ‘black on white crime’, stumbling onto a range of sources that showed him an increasingly funneled stream of information around crime and race relations. In a way, he was doing exactly what we asked of the Internet and its users: consult multiple sources of information. Discover unlikely connections. Make up your own mind.

The same goes for Edgar Welch, the man who shot up a pizza restaurant because his research led him to believe Pizzagate was real. In a handwritten letter, Welch showed earnest regret about the harm he had done – because he sought to ‘help people’ and ‘end corruption that he truly felt was harming innocent lives.’

Here we find what danah boyd calls the backfire of media literacy. It’s not that these people ran away from information. The problem was that they dove into it with the confidence that they could read enough, process it properly, and come to the secret truth. Thus the meme is that you need to ‘redpill’ yourself, to see the world in an objective way, to defeat the lies of the mainstream media.


Once again, there is a certain displacement, the fine print, parallel to what we saw with self-tracking. Smart machines promise autonomous self-knowledge, but only by putting your trust in a new set of technological mediators to know you better than you know yourself. Redpilling invites individuals to do their research and figure out their own truth – but you’ll do it through a new class of mediators that help plug you into a network of alternative facts.

 

Charisma entrepreneurs

The Pizzagate shooter, we know, was an avid subscriber to Alex Jones’ Infowars. The trail of dependencies behind the promise of individual empowerment reveals shifting cultural norms around what a trustworthy, authentic, likeable source of information feels like.

America, of course, woke up to this shift in November 2016. And in the days after, the outgoing President offered a stern warning about the complicity of our new media technologies:

An explanation of climate change from a Nobel Prize-winning physicist looks exactly the same on your Facebook page as the denial of climate change by somebody on the Koch brothers’ payroll.

The assumption being, of course, that we would universally still find the Nobel a marker of unquestionable trust, and vice versa for Koch money. But what if a Harvard professorship is no longer such an unquestioned seal of guarantee, and what if being funded by oil money isn’t a death knell for your own credibility about climate change?

To describe these changes in terms of cynicism and paranoia is to capture an important part of this picture, but not all of it. We rarely pass from a world of belief to a world without, but from one set of heuristics and fantasies to another. What recent reports, such as one on the ‘alternative influence network’ of YouTube microcelebrities, reveal is the emergence of a certain charismatic form of truth-peddling.

By charismatic, I am contrasting the more serious, institutionalised, bureaucratic styles with what Weber called ‘charismatic authority’ [charismatische Herrschaft]: that which attracts belief precisely through its appearance as an unorganised, extraordinary form of truth. It’s critical here to distinguish this charisma from some internal psychological power, as if certain people possessed a magical quality to entrance others. Weber considered charisma in more or less relational terms, as an effect of others’ invested belief, and as something which often undergirds more institutionalised forms of power as well. The key is to understand charisma’s self-presentation as an explicitly extra-institutional circuit, through which actors are able to promise truth and action too radical for normal process, and to claim a certain ideological purity or proximity to Truth beyond the messiness of the status quo.

We can immediately recognise how alternative influencers, elements of the far-right, etc. have sought to turn issues like anti-political correctness into a marker of such charismatic authority. And individuals like Jones become exhibits in the emerging performative styles of such charisma, from his regular Mongol horde-like forays into the mainstream to pick up notoriety, to his self-righteous masculine rage as a default emotional state:

[Image: Alex Jones on Infowars]

But we should add to that a couple of slightly less obvious dimensions, ones which make clear the parallels and resonances across the different businesses that sell the fantasy of personal truthmaking.

The first is that influencers like Jones consistently claim that they are the rational ones, they are the ones that go for scientific evidence, they are the true heirs of the Enlightenment. The common refrain is: I’m not gonna tell you what to think: I just want to inform you about what’s happening, about Pizzagate, about fluoride in your water, about the vaccines, and let you make up your own mind. The reams of paper strewn about Jones’ desk, regularly waved at the camera with gusto, are markers of this seeming commitment to Reason and data – even though, in many cases, this ‘evidence’ is simply Infowars articles reprinted to testify on Infowars the show.

Alex Jones doesn’t reject the Enlightenment; he wants to own it.

Second, all this is further complicated by the commercialised structure of this charismatic truthmaking. Alex Jones isn’t just a fearless truthspeaker, but also a full-time vitamin peddler. While Jones works to obscure his exact revenue and audience numbers, his ‘side’ business of dubious supplements has grown into a major source of funding that helps support the continued production of political content. In the many infomercials seeded into the show, Jones touts products like Super Male Vitality – a mostly pointless mixture of common herbal ingredients packaged with a premium price and a phallic rubber stopper.

[Image: Super Male Vitality supplement]

Recently, Jones has updated his stock with products like “Happease” – “Declare war on stress and fatigue with mother nature’s ultimate weapons” – in a clear nod to the dominant, highly feminised wellness market (think Gwyneth Paltrow’s goop and its ‘Psychic Vampire Repellent’). The connection between fake news and fake pills is made clear in one of Jones’ own sales pitches:

You know, many revolutionaries rob banks, and kidnap people for funds. We promote in the free market the products we use that are about preparedness. That’s how we fund this revolution for the new world order.

Such shifts threaten to leave Obama’s earlier warning behind as a quaint reminder of older standards. For instance, exposing someone’s financial conflict of interest used to be a surefire way to destroy their credibility as a neutral, objective truthteller. But how do we adapt if that equation has changed? As Sarah Banet-Weiser has shown in Authentic™, you can now sell out and be authentic; you can brand your authenticity. You can make your name as a mysterious, counter-cultural graffiti artist speaking truth to power, then make a killing auctioning your piece at Sotheby’s, having the piece rip itself up in front of the buyer’s eyes – and they will love how real it is. In such times, can we really win the battle for reason by showing how transparent and independent our fact-checkers are?

 

Truth isn’t truth

Today, we often say truth is in crisis. The emblematic moment was the Marches for Science, which channelled outrage, but also a certain Sisyphean exasperation. Haven’t we been through this already? Surely truth is truth? Surely the correct path of history has already been established, and all that remains is to remind everyone of it?

[Image: signs from the Marches for Science]

Well, Rudy Giuliani has the answer for us: truth isn’t truth. Facts are in the eyes of the beholder, or at least, nowadays they are. Or, to be less glib: the struggle today is not simply between truth and ignorance, science and anger – a binary in which the right side goes without saying, and the wrong side is the dustbin of history screaming and wailing for the hopefully final time. Rather, it is a struggle over what kinds of authorities, what kinds of ways of talking and thinking, might count as rational, and how everybody’s trying to say the data and the technology are on their side.

It’s a twisted kind of Enlightenment, where the call to know for yourself, to use Reason, doesn’t unify us on common ground, but becomes a weapon to wield against the other side. Insisting on the restoration and revalorisation of objective journalism, or on faith in objective science, might be tempting, and is certainly an essential part of any realistic solution. But taken too far, such insistence risks becoming just as atavistic as MAGA: a reference point cast deep enough into the mist that it sustains us as fantasy precisely as something on the cusp of visibility and actuality. A nice dream about Making America Modern Again.

Information has always required an expansive set of emotional, imaginative, irrational investments to keep the engines running. What we see in self-tracking and charismatic entrepreneurs are emerging ‘disruptive’ groups that transform the ecosystem for the production and circulation of such imaginations. We might then ask: what notion of the good life is quietly holding up, and spreading through, the futurism of smart machines or the paranoid reason of charismatic influencers?

 

Interview @ Gas Gallery

I spoke with Ceci Moss at Gas, a mobile art gallery that roams Los Angeles and the web, about different forms of self-tracking: the technological promises and economic precarities, moral injunctions and everyday habits… found here.

An excerpt:

We have to ask not only ‘is this really empowering or not’, but also ‘what is it about our society that makes us feel like we need to empower ourselves in this way?’ In the same way, we have to ask what kind of new labours, new troubles, new responsibilities, new guilts, these empowering activities bring to our doorstep. From an economic perspective, if you are someone who has to constantly sell your productivity to the market, the ‘empowerment’ of self-tracking and self-care becomes a necessary labour for your survival. The injunction to ‘care for yourself’ is a truncated version of ‘you’ve got to care for yourself to stay afloat, because nobody will do it for you.’

The interview is part of their ongoing exhibition:

take care | June 9–July 20, 2018

Featuring: Hayley Barker, Darya Diamond, Ian James, Young Joon Kwak, C. Lavender, Sarah Manuwal, Saewon Oh, Amanda Vincelli, and SoftCells presents: Jules Gimbrone

How do radical ambitions of “self-care” persist or depart from capitalist society’s preoccupation with wellness and the industry surrounding it, particularly when filtered through technological advances? How can we imagine personal wellness that complicates or diverges from capitalist and consumerist tendencies? Taking its name from the common valediction, which is both an expression of familiarity and an instruction of caution, take care is a group exhibition that considers the many tensions surrounding the possibilities of self-care.

 

Lecture @ Wattis Institute for Contemporary Arts

I will be at the Wattis Institute for Contemporary Arts, San Francisco, on 20 March 2018 to discuss data, bodies and intimacy, as part of their year-long program on the work of Seth Price. More information here.

 

Data, or, Bodies into Facts

Data never stops accumulating. There is always more of it. Data covers everything and everyone, like skin, and yet different people have different levels of access to it, so it’s never quite fair to call it “objective” or even “truthful.”

Entire industries are built around storing data, and then protecting, organizing, verifying, optimizing, and distributing it. From there, even the most banal pieces of data work to penetrate the most intimate corners of our lives.

For Sun-ha Hong, the promise of data is the promise to turn bodies into facts: emotions, behavior, and every messy amorphous human reality can be distilled into the discrete, clean cuts of calculable information. We track our exercise, our sexual lives, our relationships, our happiness, in the hope of self-knowledge achieved through machines wrought in the hands of others. Data promises a certain kind of intimacy, but everything about our lived experience constantly violates this serene aesthetic wherein bodies are sanitized, purified, and disinfected into objective and neutral facts. This is the push-pull between the raw and the mediated.

Whether it be by looking at surveillance, algorithmic, or self-tracking technologies, Hong’s work points to the question of how human individuals become the ingredient for the production of truths and judgments about them by things other than themselves.

 

Update: He gives a talk.


Data Epistemologies – Dissertation online

Data Epistemologies: Surveillance and Uncertainty, my dissertation at the University of Pennsylvania, is now online for public access here. It is, in many ways, a working draft for my current book project.

Abstract:

Data Epistemologies studies the changing ways in which ‘knowledge’ is defined, promised, problematised, and legitimated vis-à-vis the advent of digital, ‘big’ data surveillance technologies in early twenty-first century America. As part of the period’s fascination with ‘new’ media and ‘big’ data, such technologies intersect ambitious claims to better knowledge with a problematisation of uncertainty. This entanglement, I argue, results in contextual reconfigurations of what ‘counts’ as knowledge and who (or what) is granted authority to produce it – whether that involves proving that indiscriminate domestic surveillance prevents terrorist attacks, or arguing that machinic sensors can know us better than we can ever know ourselves.

The present work focuses on two empirical cases. The first is the ‘Snowden Affair’ (2013-Present): the public controversy unleashed through the leakage of vast quantities of secret material on the electronic surveillance practices of the U.S. government. The second is the ‘Quantified Self’ (2007-Present), a name which describes both an international community of experimenters and the wider industry built up around the use of data-driven surveillance technology for self-tracking every possible aspect of the individual ‘self’. By triangulating media coverage, connoisseur communities, advertising discourse and leaked material, I examine how surveillance technologies were presented for public debate and speculation.

This dissertation is thus a critical diagnosis of the contemporary faith in ‘raw’ data, sensing machines and algorithmic decision-making, and of their public promotion as the next great leap towards objective knowledge. Surveillance is not only a means of totalitarian control or a technology for objective knowledge, but a collective fantasy that seeks to mobilise public support for new epistemic systems. Surveillance, as part of a broader enthusiasm for ‘data-driven’ societies, extends the old modern project whereby the human subject – its habits, its affects, its actions – becomes the ingredient, the raw material, the object, the target, for the production of truths and judgments about it by things other than itself.

Data’s Intimacy

New piece at the open-access journal Communication+1, titled ‘Data’s Intimacy: Machinic Sensibility and the Quantified Self’. This is the first of, I think, two or three pieces on self-tracking that will be rolling out over the next couple of years.


expereal, a mood visualisation service (now possibly defunct)

 

Abstract

Today, machines observe, record, sense the world – not just for us, but sometimes instead of us (in our stead), and even indifferently to us humans. And yet, we remain human. Correlationism may not be up to a comprehensive ontology, but the ways in which we encounter, and struggle to make some kind of sense of, machinic sensibility matters. The nature of that encounter is not instrumentality, or even McLuhanian extension, but a full-blown ‘relationship’ where the terms by which machines ‘experience’ the world, and communicate with each other, parametrises the conditions for our own experience. This essay will play out one such relationship currently in the making: the boom in self-tracking technologies, and the attendant promise of data’s intimacy.

This essay proceeds in three sections, all of which draw on a larger research project into self-tracking and contemporary data epistemologies. It thus leverages observations from close reading of self-tracking’s publicisation in the mass media between 2007 and 2016; analysis of over fifty self-tracking products, some of it through self-experimentation; and interviews and ethnographic observation, primarily of the ‘Quantified Self’ connoisseur community. The first section examines the dominant public presentations of self-tracking in early twenty-first century discourse. This discourse embraces a vision of automated and intimate self-surveillance, which is then promised to deliver superior control and objective knowledge over the self. Next, I link these promises to the recent theoretical turns towards the agency of objects and the autonomous sensory capacities of new media to consider the implications of such theories – and the technological shifts they address – for the phenomenology of the new media subject. Finally, I return to self-tracking discourse to consider its own idealisation of such a subject – what I call ‘data-sense’. I conclude by calling for a more explicit public and intellectual debate around the relationships we forge with new technologies, and the consequences they have for who – and what – is given which kinds of authority to speak the truth of the ‘self’.

[Talk] U. Milano-Bicocca

I will be at the University of Milano-Bicocca next week to give a talk on surveillance, self-tracking and the data-driven life. It will overlap significantly with my presentation last week at the Affect Theory conference; I’ll be posting the full text and slides afterwards. Abstract below.

The Data-Driven Life: The Parameters of Knowing in the Online Surveillance Society

‘Information overload’ is an old cliché, but when it was still fresh, it conveyed a broad and fundamental liquidity in the parameters of our experience. What it meant – and felt like – to remember, to know, was changing. Surveillance today is not simply a question of privacy or governmental power, but a practical extension of such liquidity. Surveillance’s fevered dream of total prediction hinges on its ability to subtend human sensibility – with its forgetfulness, bias, and other problems – to reach ‘raw’ and comprehensive data. This data-hunger, shared by states, corporations and individuals alike, betrays a ‘honeymoon objectivity’. The rise of new technologies for knowledge production is being misconstrued as a discovery of pure and unmediated information. The result is a profound shift in what qualifies as knowledge; who, or what, does the knowing; what decisions and actions are legitimated through that knowledge. Surveillance practices and controversies today host a reparametrisation of what ‘knowing’ entails.

In this talk, I will address two specific cases: the state surveillance of the Snowden Affair, and the self-surveillance of the Quantified Self (QS) movement. I draw on interviews, ethnographic observation and archival research that is part of a larger, ongoing project.

  1. I know we are being watched, Snowden told us so – but I don’t see it, and I don’t feel it. A vast surveillance program withdraws into the recesses of technological systems, denying our capacity to know and experience it. Conventional forms of proof or risk probabilities elude both arguments for and against it. This situation provokes two major patterns of ‘knowing’. First, subjunctivity leverages the recessive unknowability surrounding surveillance as if it were in some way true and certain, producing hypothetical, provisionary bases for real, enduring actions and beliefs. Statistical measures of danger become mathematically negligible, yet affectively overwhelming. Second, interpassivity projects others (human and nonhuman) who believe and experience what we cannot ourselves in our stead. Even if the world of surveillance and terror is not real in my back yard, these interpellated others help make it ‘real enough’. Technology’s recession thus provokes an existential dilemma: how do I ‘know’? What is it supposed to feel like to ‘know’?
  2. We cannot stand guard over our judgments without machines to keep us steady. If our knowledge of our own bodies, habits and affects were previously left to unreliable memories and gut feeling, QS promises a data-driven existence where you truly “come into contact with yourself” – through persistent, often wearable, self-surveillance. The Delphic maxim know thyself is applied to a very different existential condition, where my lived relationship with technology becomes the authoritative site for an abstracted relationship with my own body. Yet QSers also acknowledge that data is always incomplete, raising new uncertainties and requiring the intervention of subjective judgment. Here, it is technology’s protrusion which forces the question: how will you ‘know’ yourself through a digitality that subtends your memory and intention?