(These are excerpts from my book "Intelligence is not Artificial")
The Consciousness of Super-human Intelligence
(Warning: this chapter and the next one are boring philosophical speculation).
Given that non-human intelligence exists all around us, what would make a particular non-human intelligence also "superhuman"? I haven't seen a definition of "superhuman" (as opposed to simply "non-human").
However, there is at least one feature that i would expect to find in a superhuman intelligence: consciousness. I think, i feel, sometimes i suffer and sometimes i rejoice. If i have consciousness, an intelligence that is superior to mine should have it too.
We know that human brains are conscious, but we don't really know why and how. We don't really know what makes us conscious, how the electrochemical processes inside our brain yield feelings and emotions. (My book "Thinking about Thought" is a survey of the most influential viewpoints on consciousness). An electronic replica of your brain might or might not be conscious, and might or might not be "you". We don't really know how to build conscious beings, nor even how to find out whether something is conscious. If one of the machines that we are building turns out to develop its own consciousness, it will be an amazing stroke of luck.
However, i doubt that i would call "superhuman" something that is less conscious than me, no matter how fast it is at calculating the hundred-millionth digit of the square root of 2, how good it is at recognizing cats, or how good it is at playing go/weiqi.
However, you might object, a super-human intelligence will not need to be conscious. You might object that feelings and emotions are a sign of weakness, not of strength. Consciousness makes us cry. Feelings cause us to make mistakes that we later regret, mistakes that sometimes hurt us or hurt others. Maybe a being that is more intelligent than us and does not feel anything holds the actual secret to outperforming human intelligence.
In fact, "consciousness" for an information-processing machine could be something altogether different from the consciousness of an energy-processing being like us. Our qualia (conscious feelings) measure energy levels: light, sound, etc. If information-processing machines ever develop qualia, it would make sense that those qualia be about information levels; not qualia related to physical life, but qualia related to "virtual" life in the universe of information.
It is not even clear whether superhuman intelligence requires human intelligence first: can human-level intelligence be skipped on the way to superhuman intelligence? Do machines need to be as smart as us before becoming smarter than us, or can they find a shortcut to superhuman intelligence?
We cannot answer this question by looking at biological intelligence, because the progress of machine intelligence is happening in a completely different way from the way that biological intelligence evolved. The way Nature works is simple: new species don't need to climb the ladder of intelligence. They start out at a given level of intelligence, bypassing all the lower ones. For example, humans have never been as unintelligent as bacteria. The way Artificial Intelligence works is different: it tweaks software programs, making them more and more intelligent, and these software programs can run on any computer that is powerful enough. A.I. is about the progress of software (which can run on any hardware), whereas Nature is about the progress of hardware (a hardware that also includes a brain which, in turn, somehow includes a software called "mind").