(These are excerpts from my book "Intelligence is not Artificial")
A Brief History of Bionic Humans, Cyborgs and Neuroengineering
(The pictures are here)
The first electrical implant in an ear was the work of French surgeons Andre Djourno and Charles Eyries in 1957. Building upon their work, in 1961 William House invented the "cochlear implant", an electronic implant that sends signals from the ear directly to the auditory nerve (as opposed to hearing aids that simply amplify the sound in the ear).
Spanish-born neuroscientist Jose Delgado is credited with publishing the first paper on implanting electrodes into human brains: "Permanent Implantation of Multi-lead Electrodes in the Brain" (1952). In 1965 he famously managed to control a bull via a remote device, injecting fear at will into the bull's brain. He then published his dystopian vision in the book "Physical Control of the Mind - Toward a Psychocivilized Society" (1969). In 1969 he created the first bidirectional brain-machine-brain interface when he implanted devices in the brain of a monkey and then sent signals in response to the brain's activity.
For a while the discipline of brain-machine interfaces lay dormant because the machines were just not up to the task. Nonetheless, Jacques Vidal at UCLA was implanting simple sensors within the brains of rats, mice, monkeys, and eventually humans. He published the first academic paper about brain-machine interfaces ("Toward direct brain-computer communication", 1973). Fifteen years later, Emanuel Donchin and Larry Farwell at the University of Illinois introduced the concept of "brain fingerprinting" ("Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials", 1988).
In 2000 William Dobelle in Portugal developed an implanted vision system that allowed blind people to see outlines of the scene. His patients Jens Naumann and Cheri Robertson became "bionic" celebrities as Dobelle continued to refine his artificial vision system.
The electrical interfacing of semiconductors and neurons is not trivial because neurons communicate using ions whereas semiconductors use electrons. In 1991 Peter Fromherz at the Max Planck Institute in Munich solved the problem of sensing the electrical field of a neuron on an electronic chip, and in 1995 he solved the problem of stimulating a neuron with an electronic chip (he used neurons of leeches). In 2001 he was therefore able to build a hybrid circuit of electronics and neurons (a snail's neurons).
In 2002 John Chapin and Sanjiv Talwar at the State University of New York debuted their "roborats", rats whose brains were fed electrical signals via a remote computer to guide their movements.
As for getting data out of the brain into a machine (output neuroprosthetics), in 1998 the Irish-born scientist Philip Kennedy at Georgia Tech developed a brain implant that could capture the "will" of a paralyzed man (Johnny Ray) to move an arm. Back in 1987 Kennedy had founded Neural Signals, the first bionic startup, to develop a brain-computer interface. (Ray died in 2002, and in 2014 Kennedy himself almost died when he courageously chose to have electrodes surgically implanted in his own brain.)
In 1998 Kevin Warwick at the University of Reading in Britain implanted a transmitter in his arm to activate computer-controlled devices, a bionic precursor of the "Internet of Things". (In the same year Warwick also created an artificial intelligence to compose pop songs). In 2002 Warwick used a BrainGate device to connect his nervous system to the Internet.
In 2002 Brown University spun off Cyberkinetics, a startup charged with developing its BrainGate technology. In 2005 John Donoghue's team implanted a BrainGate device in the brain of a paralyzed woman, Cathy Hutchinson, which allowed her to operate a robotic arm.
In 2002 the Brazilian-born scientist Miguel Nicolelis at Duke University implanted a microchip into a monkey's brain that allowed the monkey to control a robotic arm.
In 2004 Theodore Berger at the University of Southern California in Los Angeles demonstrated a hippocampal prosthesis conceived to replace the long-term-memory function lost by a damaged hippocampus. His lab would become another major hub of bionic research. In 2011 Berger developed "memory chips" that can turn memories on and off in a mouse's brain, and in 2015 Berger and Dong Song built a brain prosthesis to help people suffering from memory loss.
In 2004 color-blind artist Neil Harbisson (born in Britain, raised in Spain, relocated to New York) became the first person in the world to have an antenna implanted in his skull, a device that transformed color into sound.
(In 2010 Harbisson founded the Cyborg Foundation to defend "cyborg rights", the equivalent of "human rights" for cyborgs like him).
In 2004 PositiveID in Florida started selling VeriChip, an RFID chip implant for humans developed in Texas at Destron Fearing, a company that manufactures RFID tags for animal identification.
In 2003 the psychologist Marcel Just and the machine-learning guru Tom Mitchell at Carnegie Mellon University started collaborating on a system to read minds by identifying the patterns of fMRI brain activity associated with different objects ("Predicting Human Brain Activity Associated with the Meanings of Nouns", 2008).
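The core idea behind Just and Mitchell's experiments is that each noun evokes a reproducible spatial pattern of voxel activity, so a classifier trained on labeled scans can guess which noun a subject is thinking about. Here is a minimal sketch of that idea, using synthetic "voxel" vectors and a nearest-centroid classifier; the data, dimensions, and classifier are illustrative assumptions, not the authors' actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
N_VOXELS = 200          # toy stand-in for the voxels of an fMRI scan
NOUNS = ["hammer", "house", "dog"]

# Each noun gets a characteristic activity pattern; trials are noisy copies.
prototypes = {noun: rng.normal(size=N_VOXELS) for noun in NOUNS}

def trial(noun):
    """Simulate one noisy fMRI trial for a given noun."""
    return prototypes[noun] + rng.normal(scale=0.5, size=N_VOXELS)

# Train: average 20 training trials per noun into a centroid.
centroids = {n: np.mean([trial(n) for _ in range(20)], axis=0) for n in NOUNS}

def decode(pattern):
    """Predict the noun whose centroid best correlates with the pattern."""
    return max(NOUNS, key=lambda n: np.corrcoef(pattern, centroids[n])[0, 1])

# Test on fresh trials the classifier has never seen.
correct = sum(decode(trial(n)) == n for n in NOUNS for _ in range(10))
print(f"accuracy: {correct}/30")
```

With the noise level chosen here the three nouns separate easily; real fMRI data is far noisier, and the actual study used much richer regression models over semantic features.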
This is when the government stepped in. In 2006 the Defense Advanced Research Projects Agency (DARPA) asked scientists to submit "innovative proposals to develop technology to create insect-cyborgs".
And this is also when the transhumanist movement adopted bionics. In 2006 Seattle-based transhumanist Amal Graafstra boasted a microchip in each hand: one for storing data (that could be uploaded and downloaded from/to a smartphone) and one for a code that unlocked his front door and logged him into his computer. In 2012 Graafstra implanted chips in attendees of Toorcamp for $50 each, and in 2013 he started a website to sell implants for home use, dangerousthings.com.
In 2010 Emotiv in Australia released EPOC, a neuroheadset that lets people play videogames with their brain waves.
The laboratory of Finnish-born engineer Arto Nurmikko at Brown University had inherited the BrainGate project from Cyberkinetics. By 2008 this device had become a wireless transmitter for paralyzed patients with a neural implant that bypassed the spinal cord. In 2011 Leigh Hochberg of that team used BrainGate to make a paralyzed woman operate a robotic arm simply by thinking about the movement.
Experiments on brains became more and more ambitious. In 2011 Matti Mintz in Israel replaced a rat's cerebellum with a computerized cerebellum. In 2012 the brain implant designed by Sam Deadwyler at Wake Forest University managed to improve the long-term memory of monkeys.
At the same time some independents began to view implants as the tattoos of the 21st century. In 2013 biohacker Rich Lee in Utah hired Steve Haworth in Arizona to implant headphones into his ears. Haworth had pioneered "body modification", a high-tech evolution of "body piercing" that implants devices (typically magnets) under the skin.
Two-way transmission was just a matter of combining existing technologies. In 2013 Nicolelis made two rats communicate (and they were located in two different countries) by capturing the "thoughts" of one rat's brain and sending them to the other rat's brain over the Internet and an electrode. In 2015 Nicolelis connected the brains of monkeys so that they could collaborate to perform a task.
In 2013 the Indian-born computer scientist Rajesh Rao and the Italian-born psychologist Andrea Stocco at the University of Washington devised a way to send a signal from Rao's brain to Stocco's hand over the Internet, i.e. Rao made Stocco's hand move, probably the first time that a human controlled a body part of another human. That same year Rao (also a scholar of the ancient Indus script and of classical Indian painting) published the book "Brain-Computer Interfacing" (2013).
It was just a matter of time before someone thought of expanding the cyborgs beyond vision, sound and movement. In 2014 the team led by Italian-born electrical engineer Silvestro Micera at the Federal Institute of Technology (EPFL) in Switzerland designed an artificial hand for an amputee, Dennis Aabo-Soerensen. This hand sends electrical signals to the nervous system so as to create the sensation of touch.
In 2014 Chinese-born wireless scientist Ada Poon at Stanford invented a safe way to transfer energy to chips implanted in the body, so-called "electroceutical" devices.
In 2015 Zoran Nenadic and An Do of the University of California at Irvine attached an electroencephalograph device to the head of a paraplegic man and made him walk a few steps.
In 2015 EPFL built a robotic wheelchair for paralyzed people. This chair combines brain control with artificial intelligence. In 2016 Gregoire Courtine at EPFL used BrainGate to restore movement to a monkey's paralyzed leg.
In 2016 Nick Ramsey's team in Holland (at University Medical Center Utrecht) inserted wireless electrodes into the skull of a paralyzed patient (unable to speak or move) so that she could control a computer mouse simply by thinking of moving her fingers.
Meanwhile, also in 2016, Niels Birbaumer at the University of Tuebingen worked with patients affected by complete motor paralysis but perfectly lucid as far as mental processes go (the state called "complete locked-in"). Using functional near-infrared spectroscopy (fNIRS), these patients were able to answer yes/no questions with their thoughts.
It may soon be possible to do the same things without a brain implant. In 2016 Bin He at the University of Minnesota demonstrated an EEG cap fitted with 64 electrodes that can convert the "thoughts" of a person into the movement of a robotic arm for grasping objects in a room.
In 2017 Bill Kochevar, a man with complete paralysis, was able to feed himself thanks to a brain-controlled arm designed by Bolu Ajiboye at Case Western Reserve University in Ohio.
The idea of developing neuroprostheses to enhance the human brain became popular with rich entrepreneurs, who in fact started the two most hyped bionic ventures of the age: in 2016 Elon Musk launched Neuralink in San Francisco and Bryan Johnson founded Kernel in Los Angeles. (Johnson had founded the online and mobile payment platform Braintree in 2007, acquired by PayPal in 2013.)
But they were not the first startups in this field. Among the pioneers were Thomas Oxley's Synchron and Matt Angle's Paradromics.
In 2017 Facebook announced a project (led by Mark Chevillet) to decode speech directly from the brain, and Eberhard Fetz at the University of Washington began adapting an ARM chip to design chips for brains.
In 2017 ARM, maker of the most popular chips for smartphones, joined the Center for Sensorimotor Neural Engineering.
In 2018 Stefan Harrer of IBM Australia announced GraspNet, a system that uses Deep Learning (running on an embedded Nvidia chip) to decode EEG signals and control a robotic arm. This had been done before, but deciphering the very weak EEG signals was extremely difficult; by using A.I., Harrer's team managed to extract a clearer signal.
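A common way to decode EEG, which deep-learning systems like the one above build on, is to extract band-power features from the raw signal and feed them to a classifier; deep learning simply replaces the hand-crafted features with learned ones. The following is a toy sketch of the classical pipeline on synthetic signals; every parameter and waveform here is invented for illustration and has nothing to do with Harrer's actual system:

```python
import numpy as np

rng = np.random.default_rng(1)
FS = 128                               # sampling rate (Hz)
t = np.arange(0, 2.0, 1 / FS)          # one 2-second trial

def eeg_trial(label):
    """Toy EEG trace: class 1 carries a stronger 10 Hz (mu-band) rhythm."""
    amp = 2.0 if label == 1 else 0.5
    return amp * np.sin(2 * np.pi * 10 * t) + rng.normal(size=t.size)

def mu_power(x):
    """Log power in the 8-12 Hz band, computed from the FFT."""
    freqs = np.fft.rfftfreq(x.size, 1 / FS)
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    return np.log(spectrum[(freqs >= 8) & (freqs <= 12)].sum())

labels = rng.integers(0, 2, size=200)
feats = np.array([mu_power(eeg_trial(y)) for y in labels])
feats = (feats - feats.mean()) / feats.std()          # standardize the feature
X = np.column_stack([np.ones_like(feats), feats])     # bias term + feature

# A logistic-regression "decoder" trained by plain gradient descent.
w = np.zeros(2)
for _ in range(500):
    p = 1 / (1 + np.exp(-X @ w))
    w -= 0.1 * X.T @ (p - labels) / len(labels)

acc = np.mean((X @ w > 0) == labels)
print(f"training accuracy: {acc:.2f}")
```

Real EEG buries the rhythm far deeper in noise, which is exactly why the learned features of a deep network help.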
In 2017 bionic projects were carried out by surgeon Eric Leuthardt at Washington University; by Newton Howard at Oxford University (formerly the director of the MIT Mind Machine Project), whose neural implant used technology from Intel and Qualcomm; and by Dong Song at the University of Southern California, whose brain implant boosted human memory.
In 2013 Michel Maharbiz and Jose Carmena at UC Berkeley presented "neural dust": dust-sized, wireless, battery-less sensors that can be implanted in the nervous system, as well as in muscles and organs, and activated with ultrasound.
In 2016 DARPA's Neural Engineering System Design (NESD) program, run by Phillip Alvelda, funded several bionic projects, including the project by Edward Chang at UC San Francisco to treat mental illnesses.
In 2018 Andrea Stocco's team at the University of Washington, the pioneers of non-invasive direct brain-to-brain communication, demonstrated a network of brains interacting via electroencephalography (EEG), which records electrical activity in the brain, and transcranial magnetic stimulation (TMS), which transmits information into the brain ("BrainNet - A Multi-Person Brain-to-Brain Interface for Direct Collaboration Between Brains", 2018).
Electroceuticals are implantable devices that deliver electrical impulses to the neural circuits of organs in order to treat ailments. The most famous "electroceutical" is the pacemaker. The first implantable pacemaker was invented by Rune Elmqvist in Sweden and placed into Arne Larsson by the surgeon Ake Senning in 1958 (and Larsson went on to outlive Senning by five years). In 1998 Kevin Tracey at Northwell Health near New York discovered that the nervous system and the immune system communicate via the vagus nerve: in particular, the vagus nerve emits chemicals that regulate the immune system. Tracey and others then began developing the technology of "vagal nerve stimulation" or VNS for therapeutic uses. In other words, VNS is a tool for "hacking" the nervous system.
Can A.I. visualize your thoughts? Yukiyasu Kamitani's lab at Kyoto University in Japan experimented with "deep image reconstruction" ("Deep Image Reconstruction from Human Brain Activity", 2017). And what about dreams? Can A.I. visualize your dreams? Wouldn't it be exciting if we could videotape our dreams and then project them on a screen? The same Kamitani lab decoded visual imagery during sleep ("Neural Decoding of Visual Imagery During Sleep", 2013), and Jack Gallant's lab at UC Berkeley captured the brain activity related to watching movies ("Decoding the Semantic Content of Natural Movies from Human Brain Activity", 2016).
Mind-reading to generate speech (to literally "listen" to what you are thinking) is not science-fiction anymore. For example, Edward Chang's team at UC San Francisco used a recurrent neural network to decode cortical signals into simulations of movements of the vocal tract, and then to transform these virtual movements into spoken sentences ("Speech synthesis from neural decoding of spoken sentences", 2019).
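Underneath these decoding results sits a common recipe: record brain activity while the stimulus (an image, a movie, a sentence) is known, fit a model mapping activity to stimulus features, then apply the model to new activity. Here is a deliberately simplified sketch using a linear ridge-regression decoder on synthetic data, a stand-in for the recurrent networks and feature spaces the actual studies used; the "encoder", dimensions, and noise levels are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
N_CHANNELS, N_FEATURES, N_TRIALS = 64, 8, 300

# Pretend the brain linearly encodes 8 stimulus features into 64 channels.
encoder = rng.normal(size=(N_FEATURES, N_CHANNELS))
features = rng.normal(size=(N_TRIALS, N_FEATURES))      # e.g. spectrogram bins
recordings = features @ encoder + rng.normal(scale=0.3,
                                             size=(N_TRIALS, N_CHANNELS))

# Decoder: ridge regression from recordings back to stimulus features.
lam = 1.0
W = np.linalg.solve(recordings.T @ recordings + lam * np.eye(N_CHANNELS),
                    recordings.T @ features)

# Evaluate on fresh trials the decoder never saw.
test_feats = rng.normal(size=(50, N_FEATURES))
test_recs = test_feats @ encoder + rng.normal(scale=0.3, size=(50, N_CHANNELS))
pred = test_recs @ W
r = np.corrcoef(pred.ravel(), test_feats.ravel())[0, 1]
print(f"decoded-feature correlation: {r:.2f}")
```

In the real systems the decoded features are then rendered back into an image or, in Chang's case, synthesized into audible speech.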
This is certainly an exciting field that can restore movement to paralyzed people, but there are concerns about how it will be used. Most people worry that governments could use this technology to control people, but maybe we should be more concerned about how ordinary people will use it.
Here is an example from a very similar field.
In 1946 the psychiatrist Robert Heath at Tulane University invented a technique to implant electrodes into brains through small holes drilled into the skull in order to stimulate specific brain regions. There is a book about Heath's invention: Lone Frank's "The Pleasure Shock". Why "the pleasure shock"? Because this procedure tends to induce a feeling of pleasure in the patient.
Approved in the USA in 2002, this procedure is now called "deep brain stimulation" and has been performed on about 100,000 people with brain diseases such as Parkinson's disease.
Deep brain stimulation costs about $50,000, so it is performed only in serious cases, but one can foresee a future in which it becomes cheaper and people start using it not to treat mental illness but simply to get pleasure. In other words, there is a risk of creating an epidemic of "pleasure addicts" who spend their spare time self-stimulating their brains with this procedure, the same way that some people take drugs or watch pornographic videos.
We are moving closer to developing telepathy. The future of "man-machine interfaces" could be just thought. In that case thought will also become the natural form of communication between people. Telepathy will have become reality. Then maybe spammers will start flooding my brain with thoughts about useless products, and government agencies will start listening to my thoughts. History tends to repeat itself.
Back to the Table of Contents
Purchase "Intelligence is not Artificial"