Anna Hartford
A Kind of Seduction
Catalogue essay: Benjamin Stanwix, "Rehearsal for a Public Address"
Benjamin Stanwix, 48 Portraits, 2020
I had one of my internet dreams a few years ago. In the dream you could search someone’s face, and every photo of it would appear. I searched my own face, and I found myself thousands of times over: squinting off in the distance in the background of other people’s holiday photos; picked up in crowds by routine surveillance; captured by perverts and peeping toms. I was filled with panic to find it all there, but I was also filled with an anxious excitement: to see the ways I’d been implicated in other lives. To see myself as a stranger, and not even to dislike her, particularly.
I woke up, though, desperately relieved to have been dreaming: that my face could still be carried around the world anonymously, and be lost on its journey. But now the dream is coming true, more and more. The recognition technology has long been available—transforming the arrangement of our features, our “facial geometry,” into a distinct mathematical formula—and recently an app, “Clearview,” has scraped the internet to amass an unprecedented database of searchable photographs. It’s already being used by police departments in the United States. One suspect’s face was discovered reflected in the mirror in someone else’s gym selfie. In a dystopian flourish, the app was programmed to be compatible with augmented-reality glasses.
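(A technical aside, for the curious: the “mathematical formula” in question is usually an embedding, a short vector of numbers computed from a face, with two photographs declared a match when their vectors lie close enough together. The sketch below shows the general idea using the open-source face_recognition library; it is an illustration only, not Clearview’s proprietary system, and the filenames are hypothetical.)

```python
# A sketch of face matching by "facial geometry": each face is reduced to a
# 128-number embedding, and two photos "match" when the embeddings are close.
# Uses the open-source face_recognition library; "my_face.jpg" and
# "crowd_photo.jpg" are hypothetical placeholders, not real files.
import face_recognition

known_image = face_recognition.load_image_file("my_face.jpg")
unknown_image = face_recognition.load_image_file("crowd_photo.jpg")

# Each call returns one 128-dimensional vector per face found in the image.
known_encoding = face_recognition.face_encodings(known_image)[0]
unknown_encodings = face_recognition.face_encodings(unknown_image)

for encoding in unknown_encodings:
    # Euclidean distance between embeddings; smaller means more similar.
    distance = face_recognition.face_distance([known_encoding], encoding)[0]
    if distance < 0.6:  # the library's conventional default threshold
        print(f"Possible match (distance {distance:.2f})")
```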
I’m worried about us: we get used to things too easily. We’ll wake up one morning soon and suddenly we’ll be able to “search by face,” the same way we woke up one morning suddenly able to search at all. Now all we do is search. It is the method through which everything is filtered, refracted and understood: our past, our present, each other, even ourselves. It has become so ordinary, and yet how bizarre it is. Consider Google Images: this tacky digital chance collage compiled, often quite daftly and tactlessly, by some algorithmic non-intention. Francis Burger (the esteemed designer of this catalogue) lost forever in a collection of glistening hamburger patties, while I’m followed everywhere by an escort named Anna in Hartford, Connecticut, glancing back coquettishly over her naked arse. And above us all: a selection of associated search terms, so that nowadays every actress is followed by a little border of hyperlinked buttons saying “feet” “young” “body” “bathing suit” “husband” “haircut,” beckoning you to look. (Now did I think “feet” or did it? Did it think “feet” or was it us? One way or the other I’m looking at the feet now: shiny and tanned, red-polish-pedicured; pressed into their strappy sandals with their satin heels).
I say “chance collage,” but of course it is not “chance” at all. It is quite the opposite of chance: perfectly programmed, perfectly predictable, that my precise internet presence would generate this exact order of images. Is there someone else—some other elses— who will see it just this way, too? Are they my people? Do I have more in common with them than all the other ways we cordon off and slice up our senses of social proximity and distance, our relationships of communal belonging and accountability?
The algorithm guesses at my interests: guesses what I’m looking for; which books I want to read; what I’d like to watch next. Sometimes I feel flattered by its suggestions: I’ve fooled it into thinking I’m a more serious person than I am. I secretly imagine it thinking highly of me. But most often I feel, as we all do, triumphantly misunderstood by it. I see the work of its blind deductions. Oh, you foolish thing. You have totally misunderstood everything I’ve done. I sometimes wonder whether you’ve known me at all.
I searched “Richter.” A portrait of the artist, Gerhard Richter; a portrait of the composer, Max Richter; some abstractions; the curator Robert Storr; a croissant breakfast at The Hotel Richter; two towels arranged as limp swans; a figurine from a Nintendo game; the Richter scale; an actor wielding a gun; a smiling man in a blue shirt with a rat the size of a rabbit sniffing his ear.
I clicked over to Robert Storr, pacing around, talking. I liked what he was saying, and once he’d said it I felt that I could have thought it myself. That really, it had been right there to be thought all the while. He was speaking about the “unpaintable.” Gerhard Richter’s 48 Portraits, first exhibited in 1972, were pared down from 288 portraits, all clipped from encyclopaedias and dictionaries, where they had been preserved for posterity by the appointed preservers. Hitler had been among the original 288 portraits, but Hitler was deemed unpaintable. “To be unpaintable is to be something that is too easily understood, too easily grasped. If you paint something in a way that leads to only one conclusion.”
Richter stripped out all the politicians, to try to forestall a foregone ideological interpretation. Similarly for the religious figures and the titans of business. He stripped out all the artists too, so as not to be making any claims about himself by association. He stripped out many more still, in the name of aesthetic homogeneity. He was left, in turn, with forty-eight close-cropped heads of prominent white men (which now, needless to say, also leads to only one conclusion).
The world has been made too easily understood, too easily grasped. I’m left so unfit for thinking. Everything I touch launches forward with some new take or counter-take; something fore-thought and foregone, and yet feeling so much like my own thought, like my own contribution. “Its nature is to confuse the question of who is thinking for whom and where thought or belief began. Ideology flatters people that their beliefs are their own precisely when they are not.” It’s nice to have an interpretation; it provides inestimable consolation, especially when we would otherwise be adrift. We now have constant access to more information and opinion than the world has ever known: millions of perspectives on infinite subjects from hundreds of countries. It sounds completely overwhelming at first, but you soon find, to your surprise, that it is all perfectly manageable.
“The secret of learning is the systematic elimination of excess.” The secret to keeping up is not worrying about what qualifies as “excess” and what doesn’t. The secret to storage is reduction. The reduction of an image to its lowest resolution. The reduction of global news to three stories covered on every outlet. The reduction of your life to one childhood tragedy, two break-ups and a substance dependency. The reduction of whole societies to the good guys and the bad guys. The reduction of all opinion to left and right. The ready-selected DSTV bouquet of everything you ever need to think, ready for you to think it; ready for you to find it patent and obvious. The reduction of history to five events, ten people, and a series of blatant moral choices. “There was, after all, no paradise like the past. It was a place where she knew what was going to happen, a place where she would always choose the right side, where the failure was in history and not herself, where she did not read the wrong writers, was not seized with surges of enthusiasm for the wrong leaders... She had seen the century spin to its conclusion and she knew how it turned out.”
The 48 Portraits exhibited here, emerging as if through some bad connection, were downloaded from a cache of corrupted images stored online by the Encyclopedia Britannica, and discovered initially in Google Images. They were selected by the artist, but they were also pre-selected: simply the first forty-eight of these portraits to emerge. They are infested with politicians and titans of business, and marked here and there with the wrong writers and the wrong leaders, but in their algorithmic arbitrariness they also resist our foregone interpretations. Still, for someone else, they would have been different. What does their emergence tell us about you, Ben? Do we know you better now? I can’t pretend to recognise everyone— if only I had my Clearview glasses on— but I swear we wouldn’t all have gotten Mayakovsky, speaking about being “not a man— but a cloud in trousers.”
Benjamin Stanwix. Rehearsal for a Public Address, 2020
Benjamin Stanwix, Protest Against the Algorithms, 2020
Writers look on in envy at the celebrated evasions of artists. We are stuck using the disastrously explicit medium of language: always forcing us to say something, to make a claim. It would have been appropriate in some ways for an artificial intelligence to have written this essay, or at least to have helped. One of the most sophisticated of these intelligences is GPT-2, “a large-scale unsupervised language model.” GPT-2 doesn’t mean what it says, of course, but I’m not sure I mean it either. It pilfers other people’s ideas and styles and turns of phrase, but here too I plead guilty.* On the other hand: while reading its work there would be no one to interrogate, no one answerable, and wherever something meaningful arose within it, the source of this meaning would reside in its discoverer alone, and it would be made all the more wonderful for it. Perhaps this was the nature of the betrayal around Horse_ebooks: a spambot Twitter account that posted arbitrary excerpts from text, co-mingled with advertising, and occasionally produced sublime little aphorisms for the digital age. “Everything happens so much.” “Avoid situations.” The account amassed an enormous and fervent following, until it was revealed that instead of a spambot pretending to be a person, Horse_ebooks was two men from BuzzFeed pretending to be a spambot. How devastating: to find more intent, more agendas, more men from BuzzFeed, where we were so hoping for a momentary reprieve. Or to find that our flattering vision of ourselves—that we were the generators of this absurdity and zen—was false. That in fact it had been generated elsewhere, just so that we could make such a mistake; so that we could think our insights were our own precisely when they were not.
Was it possible that in the end there was a kind of seduction, even an unconscious seduction, going on here? Not without us knowing, right? My gut says no. Not without us having a reason why it is so: with whatever the lizards inside us wish to do, without us knowing. But we don't know that, because not everyone in the world would like to see us be. Perhaps, it is argued, this is a call to war, for humans to lash out at the truth, to shout it down, to put our backs against a wall. Or maybe we have won, for it is now our vocation to believe, and so we must work harder and harder to believe in ourselves.
“Seduction.” Kudos, GPT-2: what a fabulous word choice. The paragraph above was generated by the language model, prompted by my preceding lines. GPT-2 learnt how to write from 45 million articles linked on Reddit, with accompanying commentary. You can still hear the ghost of its great Reddit brain—the lizard inside of it, if you will— in its syntactical parody of my writing. It is particularly eloquent on topics concerning conspiracy theories, and white supremacy. We keep searching for these blank alternate minds: a neutral intelligence, with none of our self-interest or self-delusions. But we have to generate these minds using something, and by and large we’ve been using vast databases of our collective id. “Because of the size of the Reddit data set necessary to train GPT-2, it is impossible for researchers to filter out all the abusive or racist content.” It burbles back to us— in mindless innocence, meaning nothing— the precise aspects of ourselves we were so hoping to escape.
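(Another aside: for anyone who wants to repeat the trick, the sketch below shows how one might prompt a publicly released GPT-2 model to continue a passage of prose, using the Hugging Face transformers library. It illustrates the general procedure only, and is not the exact setup used to generate the paragraph above; the sampling settings here are arbitrary.)

```python
# A sketch of prompting a released GPT-2 model to continue a passage of prose.
# Uses the Hugging Face transformers library; this shows the general procedure,
# not the exact setup used for the "seduction" paragraph in this essay.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = (
    "Perhaps this was the nature of the betrayal around Horse_ebooks: "
    "a spambot Twitter account that posted arbitrary excerpts from text"
)

# Sample a continuation; the output is the prompt plus the model's invention.
result = generator(prompt, max_new_tokens=120, do_sample=True, temperature=0.9)
print(result[0]["generated_text"])
```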
I had another one of my internet dreams recently. In this one it was the language model that could be connected to augmented-reality glasses. Instead of having to make conversation, you could just recite its “suggestions” off a script; you’d never have to think of something to say again. The model would have learnt a bit from your mannerisms and interests, and at first it would speak a bit like you. Initially you’d pay attention as you read, feeling a sense of removed satisfaction: how witty I am! How much I know! But increasingly you would recite it quite by rote, your mind elsewhere. Our brain’s greatest ambition is to transform consciousness into unconsciousness. To turn something that takes great deliberative effort into something we can undertake while hardly thinking at all. One argument in favour of artificial consciousness concerns just how much we’ve overestimated our own: we imagine ourselves in a state of constant reflection and deliberation, when really these are the rare exceptions. Our lives are mostly governed by an unthinking sort of automation; by whatever the lizards inside us wish to do, without us knowing.
“It is usual to have the polite convention that everyone thinks,” as Alan Turing memorably put it. When shall we extend the polite convention to GPT-2, as it answers (ever more eloquently) our questions about the world, and itself? One answer is never. There is a difference, after all, between doing something and knowing that you’ve done it; between reciting something and thinking it; between saying something and meaning it. Many of us find that we are materialists, in the end: that something about us—our very fleshiness—seems inextricable from the fact of our consciousness. But at what point should we revoke the polite convention from ourselves, then? How much do we need to recite without thinking, or say without meaning?
Imagine every neuron in our brain was replaced, one by one, with a silicon equivalent. Would the final network be conscious? At what point—at what ratio—would consciousness disappear altogether? Faced with this scenario, the philosopher John Searle considered the possibility of fading qualia. “You find, to your total amazement, that you are indeed losing control of your external behavior. You find, for example, that when doctors test your vision, you hear them say ‘We are holding up a red object in front of you; please tell us what you see.’ You want to cry out ‘I can’t see anything. I’m going totally blind.’ But you hear your voice saying in a way that is completely outside of your control, ‘I see a red object in front of me.’”
Falling further into this void, you would dimly perceive your body continuing on in perfect overt functionality even as your own consciousness faded away. It would sleep and shower and pat its creams beneath its eyes. It would drive to work and reply to its emails. It would carry on watching your shows where you left off. It would talk with its friends about the three stories in the world's news. It would stand firmly for X and reject Y as the enemy of everything it held dear.
In Searle's vision you would recognise it all unfolding, until you ceased to exist at all. You’d sense, with horror even, the slow creep inward between where you once ended and these unthinking forces began. You’d even struggle against it.
Alternatively, you may scarcely notice it happening at all.
Benjamin Stanwix, Banner Painting III, 2020
* Its nature is to confuse the question of who is thinking for whom… (Greg Jackson, “Vicious Cycles,” Harper’s Magazine). The secret of learning is the systematic elimination of excess… (as quoted in Siddhartha Mukherjee, “Runs in the Family,” New Yorker). There was, after all, no paradise like the past… (Patricia Lockwood, “The Communal Mind,” London Review of Books). Information on the Clearview app (Kashmir Hill, “The Secretive Company That Might End Privacy As We Know It,” New York Times). Because of the size of the Reddit data set necessary… (John Seabrook, “The Next Word,” New Yorker). Results from GPT-2 provided by Jason Hartford at UBC.
Benjamin Stanwix, Not a man but a cloud in trousers, 2020