When Stephen Hawking, Bill Gates and Elon Musk all agree on something, it’s worth paying attention.
All three have warned of the potential dangers that artificial intelligence, or AI, can bring. Hawking, the world’s foremost physicist, said that the full development of AI could “spell the end of the human race.” Musk, the tech entrepreneur who brought us PayPal, Tesla and SpaceX, described artificial intelligence as our “biggest existential threat” and said that playing around with AI was like “summoning the demon.” Gates, who knows a thing or two about tech, puts himself in the “concerned” camp when it comes to machines becoming too intelligent for us humans to control.
What are these wise souls afraid of? AI is broadly described as the ability of computer systems to mimic intelligent human behavior. This could be anything from recognizing speech and perceiving images to making decisions and translating languages. Examples run from Deep Blue, which beat chess champion Garry Kasparov, to the supercomputer Watson, which outplayed the world’s best Jeopardy players. Fictionally, we have “Her,” the movie in which the protagonist, played by Joaquin Phoenix, falls in love with his operating system, seductively voiced by Scarlett Johansson. And coming soon, “Chappie” stars a stolen police robot that is reprogrammed to make conscious choices and to feel emotions.
An important component of AI, and a key element in the fears it engenders, is the ability of machines to take action on their own without human intervention. This could take the form of a computer reprogramming itself in the face of an obstacle or restriction. In other words, the machine thinks for itself and acts accordingly.
Needless to say, there are those in the tech world who take a more sanguine view of AI and what it could bring. Kevin Kelly, the founding editor of Wired magazine, does not see a future inhabited by HALs – the homicidal computer on board the spaceship in “2001: A Space Odyssey.” Kelly sees a more prosaic world, one that looks more like Amazon Web Services: a cheap, smart utility that is also exceedingly boring simply because it will run in the background of our lives. He says AI will enliven inert objects in the way that electricity did over a hundred years ago: “Everything that we formerly electrified, we will now cognitize.” And he sees the business plans of the next 10,000 start-ups as easy to predict: “Take X and add AI.”
While he acknowledges the concerns about artificial intelligence, Kelly writes, “As AI develops, we might have to engineer ways to prevent consciousness in them – our most premium AI services will be advertised as consciousness-free” (my emphasis). And this from the author of a book called “What Technology Wants.”
Running parallel to the extraordinary advances in the field of AI is the even bigger development of what is loosely called the Internet of Things, or IoT. This can be broadly described as the emergence of countless objects, animals and even people carrying uniquely identifiable embedded devices that are wirelessly connected to the Internet. These “nodes” can send or receive information without the need for human intervention. By some estimates, there will be 50 billion connected devices by 2020. Current examples of these “smart” devices include Nest thermostats, Wi-Fi-enabled washing machines and increasingly connected cars with built-in sensors that can avoid accidents and even park for you.
The US Federal Trade Commission is sufficiently concerned about the security and privacy implications of the Internet of Things that it has conducted a public workshop and released a report urging companies to adopt best practices, “bake in” procedures to minimize data collection and ensure consumers’ trust in the new networked environment.
Tim O’Reilly, coiner of the phrase “Web 2.0,” sees the Internet of Things as the most important online development yet. He thinks the name is misleading: the IoT will simply mean giving people greater access to human intelligence. It is “really about human augmentation,” and we will shortly “expect our devices to anticipate us in all sorts of ways.” He uses the “intelligent personal assistant” Google Now to make his point.
So what happens when these billions of embedded devices connect to artificially intelligent machines? What does AI + IoT equal? Will it mean the end of civilization as we know it? Will our self-programming computers send out hostile orders to the chips we’ve added to our everyday objects? Or is this just another disruptive moment, similar to the harnessing of steam or the splitting of the atom? An important step in our own evolution as a species, but nothing to be too concerned about?
The answer may lie in some new thinking about consciousness. As a concept, as well as an experience, consciousness has proved remarkably hard to pin down. We all know that we have it (or at least we think we do), but scientists are unable to prove that we have it or, indeed, exactly what it is and how it arises. Dictionaries describe consciousness as the state of being awake and aware of our own existence. It is an “internal knowledge” characterized by sensation, emotions and thought.
Just over 20 years ago, an obscure Australian philosopher named David Chalmers created controversy in philosophical circles by raising what became known as the Hard Problem of Consciousness. He asked how the grey matter inside our heads gives rise to the mysterious experience of being. What makes us different from, say, a very efficient robot, one with, perhaps, artificial intelligence? And are we humans the only ones with consciousness?
Some scientists propose that consciousness is an illusion, a trick of the brain. Others believe we will never solve the consciousness riddle. But a few neuroscientists think we may finally figure it out, provided we accept the remarkable idea that computers or the Internet might one day become conscious.
In an extensive Guardian article, the author Oliver Burkeman writes that Chalmers and others have put forth the notion that anything in the universe might be conscious, or potentially conscious, “providing the information it contains is sufficiently interconnected and organized.” So could an iPhone or a thermostat be conscious? And, if so, could we have a “Conscious Web”?
Back in the earliest days of the web, the author Jennifer Cobb Kreisberg wrote an influential piece entitled “A Globe, Clothing Itself with a Brain.” In it she described the work of a little-known Jesuit priest and paleontologist, Teilhard de Chardin, who fifty years earlier had described a global sphere of thought, the “living unity of a single tissue” containing our collective thoughts, experiences and consciousness.
Teilhard called it the “noosphere” (noos is Greek for mind). He saw it as the evolutionary step beyond our geosphere (physical world) and biosphere (biological world). The informational wiring of a being, whether it is made up of neurons or electronics, gives birth to consciousness. As the diversification of nervous connections increases, de Chardin argued, evolution is led towards greater consciousness. Or as John Perry Barlow, Grateful Dead lyricist, cyber advocate and Teilhard de Chardin fan, said, “With cyberspace, we are, in effect, hard-wiring the collective consciousness.”
So, perhaps we shouldn’t be so alarmed. Maybe we are on the cusp of a breakthrough not just in the field of artificial intelligence and the emerging Internet of Things, but also in our understanding of consciousness itself. If we can resolve the privacy, security and trust issues that both AI and the IoT present, we might make an evolutionary leap of historic proportions. And it’s just possible Teilhard’s remarkable vision of an interconnected “thinking layer” is what the web has been all along.
This article is published in collaboration with LinkedIn. Publication does not imply endorsement of views by the World Economic Forum.
Author: Stephen Balkam is the founder and CEO of the Family Online Safety Institute.
Image: Internet LAN cables are pictured in this photo illustration. REUTERS/Tim Wimborne.