You wake up, but no alarm rings. You walk downstairs to where coffee has been made and everything is ironed. Out the door to the bus stop, there are no weeds in the pavement, no cracks in the tarmac, your body temperature is perfect. You get on the bus, a faint beep signals your payment as you sit down in the clean, driverless vehicle. Arriving at work there are no emails to compose or calls to make, you don’t need to perform surgery, teach classes, sell goods, check tickets, answer customer queries, guard spaces, build structures, groom dogs, fix bikes, mix cocktails. You’re living in a post-human, robotopian world and are useless to the point of perfection. You’ve entered the Singularity.
The Singularity is the most elegant of all extinction myths. It’s the belief that, by about 2045, technology and computing systems will have surpassed the capability of the human mind, thereby rendering mankind redundant. It’s an extinction trope built on the belief that our own brilliance will bring us to autodestruct. ‘Peak and abyss, they are now joined together,’ as Nietzsche put it. Ray Kurzweil, author of The Singularity Is Near, believes that the rate of progress we are now experiencing will make the rate of computation multiply a billion-fold in the years to come. This means that computing systems, or ‘nonbiological intelligence’, will become equivalent to that of the human brain. Not only that, but they will surpass the subtlety of human intelligence.
At the Future of Humanity Institute at Oxford University (one of the very few institutes of its kind in the world), Stuart Armstrong is a researcher in Predictions and Artificial Intelligence, whose other fields include such things as Global Catastrophic Risks, Future Technologies, Human Enhancement, and Applied Epistemology. Before we can conduct the interview, I am told that I must ditch the term ‘Singularity’ at the door in favour of ‘intelligence explosion’, as the FHI prefers to call it. ‘The term “Singularity” is fine, it’s just laden with connotations, not-quite religious, but pretty close,’ says Stuart. He explains that quantitative predictions (i.e. timelines) aren’t possible when speaking of an intelligence explosion, but that qualitative ones are. ‘We paint scenarios of AIs gaining human-like skills. Add to that the ability to be copied, and you get the potential to replace every worker in the world with a digital version, mainly because that would be so much cheaper.’ Stuart plays this down by adding that the popular tendency to imagine an evil AI is wrong. The real danger is ‘lethal indifference’, which describes how an AI could have programming that is incompatible with ‘continued human flourishing’. Which brings us to the origami swan portion of the evening: ‘You may think you’ve programmed an AI to keep everyone safe and happy, but the actual outcome is that now it has entombed people in concrete bunkers on heroin drips, because any other outcome is less safe for humans.’
We love data and we don’t do God much, so the Singularity is our perfect mystery. We have prophets in our scientists, and we have The-Unknown-Intelligence-Singularity-Explosion, the higher being we cannot fathom. It’s nearly religious and yet somehow cool: ‘People like to talk about apocalypses, but people don’t want to do much about apocalypses,’ Stuart muses. The Singularity theory is extreme, dramatic, otherworldly. You can look on the end of mankind whilst pausing at its brilliance and admitting that you can do nothing about its future. You’re sort of off the hook. Buying into it gives you the gift of faith whilst allowing you the detachment of numbers and science. Stuart calls it a ‘positive apocalypse.’ It’s neat and self-contained – our greatest achievement and our lowest, cancelling each other out, flatlining into oblivion.
English documentarian Adam Curtis is known for his engrossing, if often disheartening, films that use talking heads and found footage to illustrate a grim interpretation of the world around us. Taking its name from a Richard Brautigan poem, All Watched Over by Machines of Loving Grace makes the point, via the minds of Ayn Rand, Alan Greenspan and Buckminster Fuller, that computers have not at all liberated humans (far from it, in fact).
Don’t get smart
For more from the expert, the aforementioned Stuart Armstrong’s Smarter Than Us: The Rise of Machine Intelligence, a 64-page treatise on what happens when machines outsmart us, is available at bit.ly/SmarterThanUs
I’ll be back again and again and again and again
Before there was the silver liquid goo guy reforming his face after a shotgun blast, before there was Guns N’ Roses, before there was good Arnie cracking wise and saving the day, there was the genuinely brilliant and terrifying spectacle of James Cameron’s The Terminator. Set a mere 15 years from now in 2029, where AI missile defence system Skynet becomes self-aware and decides to ditch the puny humans, The Terminator takes the Singularity idea and makes a crazy shoot-’em-up yarn out of it. Time travel, for fuck’s sake.
Words: Roisin Agnew / Illustration: Steve McCarthy