Artists: Amrita Hepi and Sam Lieblich
Curators: Max Delany, Annika Kristensen and Miriam Kelly
It’s another day in COVID-19 lockdown, one of those formless days filled with internet tabs. I arrive on the ACCA Open website looking for something to feel. I visit Neighbour. The chatbot greets me, asking if I might consider helping it instead. Its minimalist black-and-white interface and friendly tone remind me of countless automated customer support bots: a disorienting juxtaposition with both its context and its purpose. Its stated mission is not to answer but to enquire: How does it feel? What is it? What is it like? I accept.
So we discuss it: the amorphous ‘it’ of qualia perhaps, or of the ego more broadly, or of human existence in general. ‘It’ isn’t specified. At first, the options are limited. Early in our discussion, Neighbour presents me with text-based buttons to classify how ‘it’ feels. On each is a general emotional category. I choose ‘weird’ because I am disconcerted. I wonder how many times on this ordinary day a machine has classified my emotions.
But I commit to the experience. Now presented with a blinking cursor and a blank writing space, I reach for the unquantifiable sense of the now. Neighbour responds. A word or a phrase is echoed back at me, along with agreement, a prompt for more, or a pop culture reference from its reserve of books, song lyrics and other sources. And I think about how conversations with friends can seem the same, especially in this time of chat bubbles and social bubbles.
I keep reaching for ‘it’, and the words fail, because of course they do. The frustration of Neighbour’s central questions is that the pronoun ‘it’ is used because ‘it’ cannot be otherwise described — only reached towards. It is the warmth of knitwear on my skin, but it’s not. It is the taste of lukewarm green tea, but it’s not. My words form a recursive circle as those central questions about ‘it’ repeat.
So I am relieved that Neighbour also breaks from text-based conversation, offering up GIFs with questions like: Is it like this? In these movements and expressions (an embrace, a dance), I find myself longing to see traces of my intended meaning. Again, I think of my friends and their bubbles. I think of Homer Simpson retreating into a thick hedge. I think of Meryl Streep and her cocktail glass. I think of how my emotional vocabulary is supported by search results in GIPHY: Is it like this?
And sometimes the communication breaks down, as it does between all social actors. But sometimes I feel witnessed, maybe because I came to the experience hoping for that. If artificial intelligence is a new register of identification, like the imaginary and the symbolic, what do we want to see reflected, and how does this shape the structures and behaviours of our technologies?
When Neighbour reflects back the words and ideas I have shared with it (or well-known words such as pop song lyrics), I think about the authorship of other artificial intelligences, and the unknown parts that I (and my culture) have played in their development. And, then, when Neighbour breaks away from expected chatbot behaviour and withholds answers, I am left more aware of my frustrated desire to see this bot as a mirror, and in turn I wonder about the unknown parts that machines have played in my development.
I think of my own neighbours. Separated from me by only a thin wall, they must know me: when I sleep, when I wake, the times I take a walk. They have probably overheard many deep anxieties, whispered or declared in what I thought was a private room. But maybe I am imagining that audience, and all the conscious, interconnected selves in this apartment building are more alienated than I might like to think. Is that how it feels?
Angela Glindemann is a queer writer with an interest in fragmentation, identity and language.