Verified by Psychology Today

Neuroscience

Brain Implants: What Can We Expect?

Some think brain implants could be the start of a new era for medicine.

Key points

  • Brain implants are no longer science fiction.
  • Prostheses that interface the nervous system with electronic devices have been around for decades.
  • Developments in brain-computer interfaces are reminiscent of cochlear implant development 40 years ago.

The idea of using implanted electrodes to control or influence the mind has existed since at least the 1960s. "Mind tech" has long hovered somewhere between science and fantasy. Neuralink’s recent announcement of the successful implantation of the company’s new chip in a human brain seems to move the needle. The device should one day permit people to operate a computer using only thought.

When recruitment of volunteers for a clinical trial began, people suffering from Amyotrophic Lateral Sclerosis (ALS or Lou Gehrig’s disease) were specifically invited to enroll. The late Stephen Hawking, diagnosed at the age of 21, became one of the world’s most famous theoretical physicists. Hawking interacted with his environment using devices controlled with his eye and cheek muscles. An implantable chip, controlled by thought, would be a quantum leap forward.

Medicine is constantly pushing at its own boundaries. Various forces drive it forwards. One is physicians’ desire to help their patients. Another is hope, which motivates patients and their families to keep going. And for medical entrepreneurs, there’s the economic incentive. It’s hardly surprising that announcements of "promising new treatments" are frequently smothered in hype.

What should we expect of the neural implant? Though not quite mind tech, neuroprosthetic devices have been in use for 40 years. They became possible thanks to advances in microelectronics, to the development of biocompatible materials, and to advances in surgical techniques. Though not all have been successful, the overall trend is clear. Below are some examples.

Years of progress

Myoelectric devices are used to control artificial limbs. They use electrical signals generated by the peripheral nervous system, so they don’t involve implants in the brain. The most famous early example, known as the Boston Arm, appeared in the late 1960s. Early users—above-elbow amputees—didn’t much like them. Improvements are constantly being made. Users want better sensory feedback. They want to feel the position of their artificial limb, rather than having to check with their eyes.

Then there’s the cochlear implant, sometimes considered the first neural prosthetic. Research began in the 1950s. An electrode implanted in the ear would transform sound waves into electrical impulses transmitted to the auditory nerve. Various models appeared, using different software, different electrodes, and requiring different surgical approaches.

The device is intended to help deaf people who get little or no benefit from conventional hearing aids. The first models received FDA approval for implantation in adults in the mid-1980s. With the start of pediatric implantation, a few years later, controversy emerged. Deaf communities’ fears for the future of deaf culture had some justification. Fears that the device could be hacked, and users’ actions controlled from outside, did not.

Then there’s deep brain stimulation (DBS), which has been under development since the late 1990s. This does involve implantation into the brain. DBS received FDA approval for the treatment of Parkinson’s disease in 2002. Thousands of people with PD and essential tremor have since received implants. A pacemaker-like device stimulates an area of the brain selected by the surgeon, depending on the patient’s symptoms. There are many DBS success stories, along with charges of exaggerated optimism and various attempts to explore other possible applications of the technology.

Brain-computer interfaces

Other neuroprosthetic devices were also under development in the 1990s. Richard Normann and colleagues at the University of Utah developed a tiny electrode that became known as a "Utah array." Implanted into the visual cortex of the brain, it would give visual sensations to blind people. The search for other possible applications followed once long-term safety and stability had been established. In 2008, Blackrock Neurotech was founded, and its version of the Utah array received FDA approval as an investigational device in 2021.

As with the cochlear implant previously, different groups are now taking different paths, though with similar therapeutic goals. The BrainGate Neural Interface System, also approved by the FDA as an investigational device, has its origins at Brown University. It uses a brain-implantable sensor to detect neural signals, which are transformed into signals to control assistive technologies. Today, the BrainGate team involves scientists, clinicians, and engineers from four leading universities, Massachusetts General Hospital, and VA Providence Healthcare System. The consortium intends its neurotechnologies to help people with neurologic disease, injury, or limb loss.

Personal experience

The various designs of brain-computer interface are all highly experimental. Nathan Copeland is one of very few people who’ve had one for some years. He broke his spine in a car accident and is paralyzed from the chest down. As a participant in a research project at the University of Pittsburgh, he has four implanted Blackrock Utah arrays connected to sockets on top of his head. With these, he can control robots and computers, and they send sensations back into his brain. In a fascinating 2019 interview, he talks about his experiences.

Controversy revisited?

Current developments in the brain implant field are highly reminiscent of previous debates in the cochlear implant field. When is it acceptable to implant the device routinely? Cochlear implant pioneers didn’t agree on that. There was a time when some were ready to start offering a clinical service to desperate patients. Others believed more research was needed first.

The second controversy focused on design. Some thought a fully implantable device—if technically possible—would be better. It would be more aesthetically acceptable. Others considered it riskier. Replacing a part would require surgery. Thirty-odd years later, that discussion has been reopened. A fully implantable cochlear implant is now being investigated in clinical studies.

The question is coming up again with the brain-computer chips. Unfortunately, there’s never an ideal time or an ideal design. Comparative assessment of benefits and risks is subjective. So it’s hard not to wonder what determines how things work out.

References

Joseph Dumit, "Brain-mind machines and American technological dream marketing," pp. 347–362 in Chris Hables Gray (ed.), The Cyborg Handbook. New York and London: Routledge, 1995.

Stuart Blume, The Artificial Ear: Cochlear Implants and the Culture of Deafness. Rutgers University Press, 2010.

More from Stuart Blume D.Phil.