Among the various fearsome enemies that humanity faces in the Star Trek universe, the hostile and menacing Borg stand out, and not just because of their intimidating assertion that “resistance is futile.” “Borg” is an abbreviation of “cyborg,” itself a contraction of “cybernetic organism.” Cyborgs occupy a special place in the roster of artificial creatures seen in fiction and, increasingly, in real life. Robots are artificial creations that do not necessarily look human; androids are also artificial, but deliberately designed to look human. Both, however, are pure machines. Cyborgs bring in a new element with eerie connotations, for they are hybrid creatures that combine artificial nonliving elements with natural living ones.
Fictional cyborgs and human-machine interfaces generally have bad reputations. The Borg were once organic, but evolved to incorporate artificial enhancements such as singularly ugly ocular implants, and now want humanity to join them. In the film Sleep Dealer (2008), set in the near future, Mexican workers are totally barred from the U.S. Instead they operate machinery across the border by remotely “jacking in” their neural networks, with devastating consequences for the workers. Even a film that is not science fiction, Stanley Kubrick’s Dr. Strangelove (1964), shows an uncontrollable human-machine interface when Strangelove cannot prevent his attached artificial arm from snapping out a “Heil Hitler” salute at a crucial moment.
Another powerful cyborgian theme appears in C. L. Moore’s science fiction story No Woman Born (1944) about a famous and much beloved dancer, Deirdre, who is terribly burned in a fire. Her undamaged brain is placed in a new golden metallic body that resembles a Brancusi sculpture. Displaying unique grace and beauty, Deirdre-as-cyborg captivates audiences with her dancing, but her metal shell walls her off from humanity, and she sinks into profound loneliness and despair. Similarly, in the 1987 film RoboCop, the brain of a dead policeman is incorporated into a formidable metal body. RoboCop is a tireless and dedicated law officer whose built-in weapon never misses its criminal targets; but trapped halfway between man and machine, he feels deep, lingering sadness over what he has lost.
A brain in a box or a rebellious machine attachment is still mostly a fantasy, but significant steps toward these scenarios are being realized in laboratories around the world. While in science fiction such experimentation transgresses the natural order, in reality these activities are carried out for mostly benign reasons: to better understand the brain-body linkage and to help victims of disease, accident, and war. But the morality becomes more complex when cyborgian research overlaps with military needs.
The humanitarian reasons for the research are clear, even when the result is a true cyborg, such as the one created in 2000 by Sandro Mussa-Ivaldi at Northwestern University. His hybrid creature consisted of a motorized wheeled cart controlled by the brain stem of a fish called a sea lamprey, held in a nutrient solution. Implanted electrodes connected the brain to light sensors; other electrodes sent the brain’s responses to the cart’s motors. Depending on where the electrodes were placed in the brain, the cyborg consistently moved toward or away from a light source, convincingly demonstrating that a living brain removed from its natural body can control an artificial one.
Mussa-Ivaldi’s aim was not to create a RoboCop, but to study how neural signals control movement. Such knowledge is essential for developing advanced prosthetic devices, controlled by thought alone, that can replace missing limbs. The same motivation has driven other researchers to link machines with the brains of monkeys and humans, though these brains remain within their original bodies rather than being transplanted into artificial ones.
Pioneering work on humans has been carried out by neuroscientist Philip Kennedy, affiliated with Emory University and his own company, Neural Signals. His subject was a Vietnam War veteran named Johnny Ray, who at age 53 was left with full brain function but an almost totally paralyzed body after a stroke. With FDA approval, in 1998 Kennedy implanted fine electrodes in the part of Ray’s brain devoted to moving the hands (these insertions are painless because the brain lacks pain receptors). The signals from the brain controlled a cursor that Ray could watch on a computer screen. After some months of training, Ray could position the cursor by pure thought, with no bodily movement. By picking out letters from an alphabet on the screen, he was finally able to communicate from within his isolated state.
Then, beginning in 2000, Miguel Nicolelis at Duke University showed that a brain could do more than move a computer cursor. He implanted electrodes in the brains of living macaque monkeys and trained two of them to move a cursor on a computer screen, and then to move a robot arm, merely by willing it. Both Kennedy and Nicolelis interpret their results as showing that after suitable training, subjects come to feel that neurally controlled artificial devices are natural parts of their own bodies.
A decade later, these promising beginnings have been extended. A major impetus for prosthetic limbs has come out of the U.S. military involvement in Iraq and Afghanistan, where the widespread use of body armor has protected soldiers’ lives but not always their limbs. Out of 33,000 injured military personnel, some 1,200 are amputees. In response, the Defense Advanced Research Projects Agency (DARPA) of the Department of Defense has committed millions to support research in prosthetics. This effort has produced prototype prosthetic arms with different forms of neural control and other advanced features, which are now undergoing clinical trials.
Improved prosthetics are important for military medicine and for medicine in general, but there are also purely military applications of neural interfaces. DARPA has funded research to connect a brain directly to an external weapon or device, such as a powered exoskeleton that would greatly amplify the strength and speed of an infantry soldier. Such devices would change the face of war in unpredictable ways.
In contrast, another purely beneficial use of neural interfaces is to replace lost or damaged senses of hearing and vision. Cochlear implants for the deaf are the great success story of these neural prostheses. The cochlea is a small spiral shell-like structure (its name comes from the Latin for “snail”) in the inner ear. It houses the auditory nerves, which after excitation by sound vibrations send impulses to the brain’s auditory cortex to be interpreted as sound. In a cochlear implant, a small external microphone and processor placed behind an ear detect sounds and convert them into electronic impulses. These travel along wires that have been embedded in the cochlea, stimulating the auditory nerves to produce the sensation of sound in the brain.
In the U.S., 22,000 of these devices have been implanted in adults and nearly 15,000 in children. The implants give significant improvement in most cases, with one survey noting that individuals with the latest models correctly hear over 80% of what is said to them. But because the sound quality still lags behind natural hearing, and because of differing views about whether deafness should be considered the mark of a special subculture rather than a handicap, some implantees feel uncomfortably suspended between the hearing and the deaf communities, an effect reminiscent of the alienated cyborgs Deirdre and RoboCop.
Still, cochlear implants are less challenging than visual prostheses. The need for such devices is great: millions of Americans over the age of 40 are blind or functionally so. A common cause is macular degeneration, an age-related condition that will become more prevalent as the general population ages. But the eye is a complex structure. Its retina contains nearly 130 million photoreceptors, the rods and cones. The interpretation of visual information by the brain’s visual cortex is equally intricate, requiring ten times as many neurons as does the interpretation of auditory information.
For these reasons, the development of visual implants has been difficult. In 2000, biomedical researcher William Dobelle tried a truly cyborg-like approach, implanting an array of electrodes on the surface of the brain in each of several blind volunteers. The array was connected to an electrical socket mounted on the subject’s skull, into which a video camera was plugged. As signals from the camera stimulated the visual cortex, subjects reported rudimentary glimmers of vision. Other scientists are continuing research on direct stimulation of the visual cortex in monkeys, but a practical device based on cortical implants has yet to emerge.
If the optic nerve that runs from retina to brain is intact, implantation can also be performed at the retina, where electrical stimulation can produce flashes of light. The first clinical trials of such a retinal implant occurred in 2000, and others have been tested since. One, an array of 60 electrodes connected to a video camera, is produced by Second Sight Medical Products. It can give only low-resolution images and would cost up to $100,000, but the company is seeking federal approval for the device. The long-term hope is to produce an array of 1,000 electrodes, which would offer resolution adequate for reading.
The recipients of these retinal implants do not look nearly as dramatic as one of William Dobelle’s subjects, who, with an electrical cable exiting his skull and a tiny camera near his eye, could almost be taken for a Borg. But the joyous reactions of blind people who can suddenly see something, if only flashes of light, override any bizarre science-fictional aspects of the cyborgs among us. Unlike the Borg, they are to be welcomed, not resisted.
• Sidney Perkowitz, the Candler Professor of Physics at Emory University, is the author of Digital People. His latest book is Hollywood Science: Movies, Science, and the End of the World.
Posted: April 19, 2012 at 6:05 pm