How Can You Experience Touch With an Artificial Hand?
A robotic arm lets its user feel touch through its robotic fingers.
Posted March 1, 2018
When we experience the feeling of touch—the touch of our clothes against our bodies, for example, or the cold floor under our feet—we experience this sensation as something happening to our body. Because these sensations feel so connected with particular body parts, it’s easy to think our sense of touch is completely dependent on the body.
Seemingly flying in the face of this idea, however, scientists at the University of Pittsburgh Medical Center have recently designed a robotic arm with a serious difference: It allows the user to feel touch applied to the robotic fingers.
But how can this be? How can you feel a sensation coming from a pair of robotic fingers—fingers that aren’t connected to your body?
To understand this, we need to know more about how we experience touch.
The "little man" in our brains
Starting all the way at the bottom, when an object contacts your finger, it activates specialized cells under the surface of your skin called mechanoreceptors. These cells are mechanically activated—that is, pressure on the skin deforms the outer membrane of the cell. This starts an impulse that travels up the arm, through the spine, and on to the brain.
The part of the brain that receives and processes this input is called the somatosensory cortex. This area is one of the most fascinating places in the brain (though I confess I may have some bias, as this is the area I do my research on). The reason it is so interesting is its stunning organization: It is composed of a multitude of tiny segments, each one representing a separate part of the body.
These segments are highly organized. They are laid out basically the same way in the brain as they are on the body, forming a little "person" when visualized, with feet at the top of the scalp and the face at the bottom.
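One way to picture this map is as an ordered sequence of body-part segments along the cortical surface. A toy sketch in code—the segment names and their ordering are simplified for illustration and are not anatomically exact:

```python
# Toy model of the somatosensory "body map": segments laid out in order
# along the cortex, from the feet (near the top of the scalp) down to
# the face. Simplified for illustration; not anatomically exact.
BODY_MAP = ["foot", "leg", "trunk", "arm", "hand", "fingers", "face"]

def body_part_at(position):
    """Return the body part represented at a normalized cortical
    position, where 0.0 is the top of the scalp and 1.0 the bottom."""
    index = min(int(position * len(BODY_MAP)), len(BODY_MAP) - 1)
    return BODY_MAP[index]

print(body_part_at(0.0))   # segment nearest the top of the scalp
print(body_part_at(0.99))  # segment nearest the bottom
```

The point of the sketch is simply that position in the cortex predicts position on the body—which is exactly what made Penfield's mapping possible.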
This tiny man in the brain and how he was discovered give us our first clue to how you might feel without a body.
Dr. Penfield and the 'Montreal procedure'
In the 1940s, a neurosurgeon named Wilder Penfield and his team were performing some pretty intense surgeries to detect and remove bits of brain that were causing seizures in epileptic patients. Further raising the stakes, these patients were not unconscious during the procedures. They were, in fact, wide awake (though numbed with local anesthetic so they couldn’t feel the pain of the surgery).
With the skull open, the doctors artificially activated parts of the brain by stimulating them with a tiny electrical probe (dubbed "the Montreal procedure"). They needed the patients conscious so they could describe what they were experiencing—allowing the doctors to identify the areas of the brain where seizures originated, and also to work out what exactly those areas of the brain were doing. They found that stimulating the somatosensory area of the brain caused a sensation of touch at a particular location on the body (though usually as an unnatural tingling).
These dramatic surgeries were more than just exceedingly useful for the patients. They equipped us with two foundational facts about the brain: first, that a detailed body map for touch exists in the brain; second, and importantly for our robot arm, that touch can be artificially induced without any stimulation from the body—by direct stimulation of the brain.
And this is precisely what the new robotic hand does for its user, Nathan Copeland.
In 2004, Nathan was involved in a serious car accident that destroyed much of his ability to move and to feel touch below the neck. Years later, Nathan underwent surgery to implant devices into his brain—allowing a computer to communicate with it directly (a brain-machine interface, or BMI).
Four tiny grids of electrodes—prongs that can send and receive electrical stimulation—were implanted in his brain. One grid sits in the movement area, allowing him to send motor signals to control the robotic arm with an incredible level of precision (10 degrees of freedom, meaning 10 independent points of motion). But BMI in this brain-to-body direction is old hat now (as funny as that sounds, given it's still pretty bloody amazing).
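A 10-degree-of-freedom command is just a vector of ten independent values (joint velocities, say) decoded from recorded neural activity. A common approach in the BMI literature is a linear decoder; here is a minimal sketch, where the channel count and weights are made up for illustration and not taken from the actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

N_CHANNELS = 96   # electrodes on one recording grid (illustrative)
N_DOF = 10        # independent points of motion on the arm

# Decoder weights would normally be fit from calibration data in which
# the user attempts or imagines specific movements; random here.
W = rng.normal(size=(N_DOF, N_CHANNELS))

def decode(firing_rates):
    """Map a vector of per-electrode firing rates to a 10-DOF
    velocity command for the robotic arm (linear decoder sketch)."""
    return W @ firing_rates

command = decode(rng.normal(size=N_CHANNELS))
print(command.shape)  # one value per degree of freedom
```

The real decoding pipeline is far more elaborate, but the shape of the problem is this: many recorded channels in, ten movement commands out.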
The reversal: Body-to-brain BMI
What is revolutionary about this man-machine link is that it sends messages the other way—from body to brain. The robotic arm itself has tiny sensors implanted in its fingers.
Much like our normal mechanoreceptors, pressure on these sensors creates an electrical impulse that is sent up the robotic arm by wires, through a computer, back into the other two electrode grids that are implanted in the somatosensory cortex.
Critical to the success of this process, the computer modulates the pressure signals into a message the brain can understand as ‘touch’—transforming it to best match what a normal touch message looks like when received by this area of the brain.
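In outline, that transformation might look like scaling the sensor's pressure reading into a stimulation amplitude the cortex can safely receive. A minimal sketch, assuming a simple linear mapping with made-up threshold values—the real system's encoding is considerably more sophisticated:

```python
# Hypothetical values: a minimum amplitude below which stimulation is
# not felt, and a maximum kept under a safety limit for the tissue.
MIN_AMP_UA = 10.0   # microamps, illustrative detection threshold
MAX_AMP_UA = 60.0   # microamps, illustrative safety ceiling

def pressure_to_stimulation(pressure, max_pressure=1.0):
    """Map a fingertip sensor's pressure reading (0..max_pressure)
    to a stimulation amplitude, linearly interpolated between the
    detection threshold and the safety ceiling."""
    if pressure <= 0:
        return 0.0  # no contact, no stimulation
    fraction = min(pressure / max_pressure, 1.0)
    return MIN_AMP_UA + fraction * (MAX_AMP_UA - MIN_AMP_UA)

print(pressure_to_stimulation(0.5))  # a mid-range press
```

The key design constraint is the one in the article: the signal must arrive in a form the somatosensory cortex already interprets as touch, which is why the computer reshapes it rather than passing the raw sensor voltage through.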
The researchers have already demonstrated that touch is localized to specific fingers with impressive accuracy: Nathan is 73 to 97 percent accurate when asked to report which finger is being touched. More recently, the team has conveyed touches of different intensities by varying the strength of the brain stimulation.
This brain-to-body-to-brain invention points toward the future of prosthetic limbs for people who are paralyzed or missing limbs. Before you get too excited, however, such a device is not yet ready for the two-limbed amongst us.
This is because the implantation of these electrodes has the potential to degrade the brain tissue into which they are implanted. In Nathan, this is less concerning because, due to his paralysis, the implanted tissue is no longer performing its main function.
With the development of better ways of reading brain activity from outside the skull, or of less invasive soft electrodes, this may change. So for those of you looking to become a cyborg, watch this space…