“It’s crazy we finish each other’s… sandwiches.” Anna and Hans’s doomed romance is captured in this comical mismatch. But it also points to what is so important in our relationships with one another: empathy.

The song illustrates how impossible relationships are without empathy. Our ability to understand one another, to put ourselves in another’s shoes, is key to what makes us human. Anyone watching the Rugby World Cup recently saw this trait in action: supporters crowded round a JCR screen, cheering and weeping alongside players on the other side of the world. Our capacity to share in the emotions and feelings of others seems to know no bounds.

And it is this very human quality that allows us to assert our superiority over machines. Yes, machines may be able to compute, synthesise and calculate at a level beyond our understanding. But if they can’t “put themselves in our shoes”, there is a whole realm of knowledge that they can have no part of.

Yes, we can watch machines perform an operation, but we still want a doctor to break the news of a diagnosis. A machine may churn out the results of applying a statute, but it takes a judge to determine whether or not the man in the dock really means it when he says he is not going to reoffend. A robot therapist? Inconceivable.

But what happens when the vast quantities of data currently available to machines start to be put to use? Facebook can already predict how you will vote better than your spouse can, with just 300 likes. And think of the volume of data we generate on a daily basis. We walk around watched by CCTV. We reveal our spending habits through our clubcards. A Spotify playlist can give a pretty accurate indication of someone’s ‘vibe’ (are they the Park End, Dancing Queen vibe or a Bully My Nu Leng-er?). And as the quantity of data we output increases, the psychometric profiles generated from it will become ever more complex, nuanced and accurate.

So what happens when there are machines with this level of knowledge about who we are? We can see some of the teething problems already in the use of data in the service industry. Memorably, a Target manager recounted the chain’s trial of personalised coupons. A man rang up, irate that his teenage daughter had been sent money off baby products. He hollered at the manager, only to return three days later, shamefaced and abashed: “The due date is in March…” Target’s shopper data had worked out the pregnancy before the girl’s own father. And that’s just from one data set!

But we will become more sensitised to the use of personalised data. The Facebook ads on the side of your newsfeed, displaying content from the sites you last visited: five years ago they would have seemed Orwellian, yet now they are part of the fabric of everyday life. We will become increasingly used to machines being able to gauge and read our interests, to use data to give an increasingly personalised service, tailor-made to our tastes, our emotions and more. Eventually, interacting with things that understand us will become the norm.

As this happens, expect upheaval in our relationships with one another. Think of the friends who just seem to ‘get’ you. They can read you like a book: they know how to cheer you up, how to push your buttons, when to give you space and when you actually just want a hug. Now imagine this, but with no errors: interacting with things that can perfectly identify how to deliver the optimal outcome.

This is going to bring a host of problems. The first is our tolerance for others. What will it be like when talking to real people means opening ourselves up to the possibility of being read wrong, to conversational missteps, awkward pauses and misjudged intentions, all of which are absent from the world of machines? The current lack of tolerance, this zeitgeist of ‘cancelling’ people for missteps, could in part be a result of our acclimatisation to the personalised world of social media. Being misunderstood, and the friction of interacting with those who think in a radically different way to us, is at odds with the ever-responsive devices we spend so much of our time on.

The second is how to specify what the machines should aim for. What AI offers us is the opportunity to set goals and tasks for a machine at an ever more general level. Think of a basic calculator: you tell it 2 + 2 and it gives you 4. Then come programs, Scratch games where you move icons across a screen, as the instructions become more general. Now we have goals like ‘maximise the amount of user interaction’ or ‘win a game of Go’. So what do we ask machines with vast amounts of data to give us? What is the good life?

The importance of this cannot be overstated; we can see how things go wrong when the goals aren’t specified in the way we would like. It’s why Facebook throws out clickbait, or the guilty-pleasure food porn videos I can’t seem to stop watching: it aims to maximise the time we spend interacting, not the quality of that time. The parameters of the model led to the fake news, the hours wasted on ads, the fact that the person you love to hate-stalk always comes up on your newsfeed. None of this was by design, just a poorly specified goal. Philosophy can seem so far away from the real world, yet here we can see the room for error when we misstate what the machine is set to pursue. Even something prima facie uncontroversial like ‘make people happy’ has unintended effects. Are people going to be shown the news? Or charity advertising? Or allegations against leading figures? Machines and AI will be able to pursue an ever more abstract set of goals. But ideas of what is good, what is right: even as the tech nerd that I am, I struggle to imagine how Facebook could set about answering questions so fundamental to being human. More likely, you end up with Deep Thought’s 42 as The Answer to the Ultimate Question of Life, the Universe, and Everything.
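For the fellow tech nerds, here is a minimal, purely illustrative sketch of the problem. The posts and the ‘predicted minutes’ scores are invented for the example rather than taken from any real feed-ranking system, but they show how a ranker told only to maximise time interacted with will happily put clickbait on top.

```python
# A toy feed ranker. The only goal it has been given is "maximise predicted
# time interacted with"; quality never appears in the objective, so it is ignored.

posts = [
    {"title": "Carefully reported local news", "predicted_minutes": 0.6},
    {"title": "You won't BELIEVE what happened next", "predicted_minutes": 3.1},
    {"title": "Guilty-pleasure food porn video", "predicted_minutes": 2.4},
]

def rank_feed(posts):
    # The machine optimises exactly the goal we wrote down, and nothing more.
    return sorted(posts, key=lambda post: post["predicted_minutes"], reverse=True)

for post in rank_feed(posts):
    print(post["title"])
# The clickbait comes out on top: not malice, just a poorly specified goal.
```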

The third is the need to be challenged. We learn to be accurate in how we speak, and in what we say, because we are called out for a lack of clarity when people misunderstand what we meant. We learn to express ourselves clearly because there is this room for error. But if we are surrounded by things that can understand us better than we know ourselves, why bother aiming for precise language? Yet there are so many advantages to being forced to communicate and express ourselves, to think about how we come across to others. It permits us to clarify our own ideas, to refine them through the act of communication and bring them to the forefront of our consciousness. If the machines we interact with simply respond to our subconscious cues, will our consciousness itself, our very rationality, be imperilled?

Lauren Levine

Image courtesy of @askkell on Unsplash. 
