Bringing the sensation of touch to artificial limbs sounds like something from science fiction, but researchers at Case Western Reserve University have developed a way to do it with new sensors and the innovative use of haptic technology. Oliver Hotham speaks to researcher Dustin Tyler about the manufacturing and development process behind the project, how the technology works, and the importance of sensation in learning and development.


From our earliest moments as children, touch informs how we understand the world and serves a critical survival role: we can detect pain, heat and cold, and learn about our surroundings. Now, thanks to research by Case Western Reserve University, a new spin on prosthetics may be able to help amputees regain sensation – or, at least, do a pretty good job of simulating it.

“We just don’t appreciate how much we use touch in everyday life,” says Dustin Tyler, an associate professor in the department of biomedical engineering at Case Western Reserve University. “It is what connects us to the world. The biggest single organ we have is for touch: our entire skin.

“When you’re a baby, you’re going around, exploring the world, feeling everything, and your brain is learning the patterns of activation.”

Tyler has spent much of the past five years working to bring that sense back to those who have, through accidents or combat, lost limbs. In his previous career in the private sector, he developed devices able to communicate with the nervous systems of patients affected by paralysis or stroke.

A stint at DARPA (the research and development arm of the US Department of Defense) investigating new prosthetics and the shift from motor-oriented mechanical limbs to sensory-oriented ones got him working on his most important – and well-publicised – project to date.

“I was struck by people coming back from conflicts in the Middle East, not just with an arm missing, or a hand, but two hands and a leg,” he says. “I thought ‘man, we’ve got to do something about this’.”

Updated prosthetics

This thinking resulted in a collaboration between Case Western Reserve University and the Department of Veterans Affairs, and the beginning of four years of work on prototype sensors attached to an artificial hand that – it was thought – could one day restore, or at least simulate, the sense of touch.

“That was really how it began: with the question of whether we can use the technology we’ve been developing for 20 years and go from the motor system, the muscles, to the sensory side,” says Tyler.

By speaking to patients about the daily difficulties they faced, Tyler and his team learned something quite extraordinary: despite all the recent innovations in prosthetics, the old-fashioned hook remained one of the most practical prosthetics for everyday tasks.

“The reason for this is simple: we rely on touch to grab things,” he says. “When a person has a hand prosthesis, for example, they can’t feel it, so it becomes very difficult to manipulate. Ultimately, the hook was just a more functional device.”

It vindicates ideas that are central to this research: that sensation is critical to exploring and understanding the world. If a prosthetic is to work effectively, it must recreate touch, connecting the brain with the sensations of the outside world.

“It’s all part of a bigger picture. When you look at it, there really was a significant gap in how we were dealing with prostheses,” says Tyler. “This really drove home to me the importance of sensation, not only for function, but also just for exploring and understanding the world. Then, from a subjective perspective, the exploration patients really want is to be able to reconnect with loved ones.”

Recent developments in haptic (touch-related) technology were essential to achieving an accurate imitation of sensation. Anyone who has played a video game with a ‘rumbling’ controller has experienced rudimentary haptic feedback: the vibration of the device is usually a sign of damage being sustained, or impending doom. Haptics also have an altogether more revolutionary use, and can be harnessed to transform the sense of touch into a digital signal to help recreate the sensation of grip.

“We started from the motor perspective,” says Tyler. “Our initial information and knowledge was from the way we had been stimulating muscles.”

The device works by having surgeons implant tiny, electrode-laden cuffs around nerve bundles within a patient’s arm. Insulated wires run from the cuffs inside the forearm up and out of the upper arm. When in the lab and hooked up to sensors, a patient can then feel 19 discrete locations in their fingertips, palm and the back of their hand. Electric pulses sent through the wires feed signals to the brain, recreating the feeling of touch in the prosthetic hand.

The brain reads the different signal patterns, passed through the cuffs, as different stimuli. The scientists then fine-tune the patterns and the patient eventually becomes attuned to them.

The patient controls this ‘myoelectric’ device by flexing the muscles in their arm. In tests to see whether adding sensation would improve control over a prosthetic hand, the researchers also put thin-film force sensors in the device’s index and middle fingers, and thumb, and used the signals from those sensors to trigger the corresponding nerve stimulation. Dexterity and sensitivity to held objects markedly increased when haptics were involved – patients were able to perform delicate physical tasks 93% of the time with the haptic system, but just 43% without it.
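The feedback loop described above – fingertip force sensors driving stimulation on matching nerve-cuff channels – can be sketched in a few lines of Python. Every name, range and the simple linear mapping below is an invented assumption for illustration; this is not the team’s actual implementation.

```python
# Hypothetical sketch of the sensory feedback loop: force readings from the
# prosthetic digits are mapped to pulse rates on matching nerve-cuff channels.
# Full-scale force and maximum pulse rate are illustrative assumptions.

MAX_FORCE_N = 10.0         # assumed full-scale reading of a thin-film sensor
MAX_PULSE_RATE_HZ = 100.0  # assumed upper bound for comfortable stimulation

def force_to_pulse_rate(force_n: float) -> float:
    """Map a fingertip force reading to a stimulation pulse rate, clamped."""
    clamped = max(0.0, min(force_n, MAX_FORCE_N))
    return (clamped / MAX_FORCE_N) * MAX_PULSE_RATE_HZ

def feedback_frame(sensor_forces: dict) -> dict:
    """One control-loop tick: per-digit force -> per-channel pulse rate."""
    return {digit: force_to_pulse_rate(f) for digit, f in sensor_forces.items()}

# Example: a light pinch registers mostly on the thumb and index finger.
rates = feedback_frame({"thumb": 2.5, "index": 3.0, "middle": 0.0})
```

In a real device the mapping would be tuned per patient and per channel; the point of the sketch is only that each sensor reading is continuously translated into a stimulation parameter on its corresponding nerve site.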

“The great thing about muscles is that they give an immediate response; put one pulse in and the muscle twitches once; put a lot of pulses in, and it twitches constantly, or becomes a tetanic contraction,” adds Tyler.

It was a lo-fi attempt, he admits, but it worked, at least rudimentarily. A great deal of research remained to be done before the knowledge could be transferred to the brain, which is far more complex than muscle; information travels through it very differently than through other organs.

“We went back to the haptic literature and realised we had been a little naive in our approach,” admits Tyler, “so we started to look at how information from the hand was normally conveyed to the brain.”

Fingertip sensors have a range of jobs. Some detect rapid changes in pressure, others register constant pressure, and the differential information between them means a great deal. Translating this to a simulated system was no easy task.

“When you touch something, those rapidly adapting fibres fire a little bit at first, then they turn off, and then the constant ones change to a slightly different firing range,” says Tyler. “The difference is that one fires quickly to show you hit contact, and another follows long term, and your brain is used to that pattern of information.

“At first, we were speaking the wrong language; we were putting in gibberish and the brain was interpreting it as such.”
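Tyler’s description of the two fibre types can be sketched as a toy firing model: rapidly adapting (RA) fibres burst briefly when pressure changes and then fall silent, while slowly adapting (SA) fibres keep firing for as long as pressure is held. The rates below are invented for illustration, not measured data.

```python
# Toy model of the two fibre responses described above. RA fibres fire a
# brief burst at contact or release; SA fibres track sustained pressure.
# Burst and sustain rates are illustrative assumptions.

def fiber_rates(pressure, burst_rate=120.0, sustain_rate=40.0):
    """Return (ra, sa) firing rates per time step for a pressure trace."""
    ra, sa = [], []
    prev = 0.0
    for p in pressure:
        # RA: responds only to a change in pressure (contact/release events)
        ra.append(burst_rate if p != prev else 0.0)
        # SA: fires steadily while pressure is maintained
        sa.append(sustain_rate if p > 0.0 else 0.0)
        prev = p
    return ra, sa

# Touch an object at step 2 and hold it: RA fires once at contact,
# SA fires for the whole hold.
ra, sa = fiber_rates([0.0, 0.0, 1.0, 1.0, 1.0])
```

The “pattern of information” the brain expects is exactly this differential between the two channels: an onset burst followed by a sustained, lower-rate signal.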


The team headed back to the drawing board, devising a neuroengineering experiment to work out how to duplicate the brain’s natural patterns of activation. This yielded a major breakthrough: the insight that the information lay not simply in the rate at which the nerve is stimulated, but also in the pattern of intensity and timing. “It was the connection of the basic neuroscience and mixing it with our engineering,” adds Tyler.
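A minimal sketch of that insight, with invented values: two stimulation trains can share exactly the same pulse rate yet carry different information through the pattern of per-pulse intensity.

```python
# Sketch of the rate-vs-pattern insight above: same pulse rate, different
# intensity envelopes -> different "words". All values are illustrative.

def pulse_train(rate_hz, intensities):
    """Return (time_s, intensity) pairs for a train at a fixed pulse rate."""
    period = 1.0 / rate_hz
    return [(i * period, amp) for i, amp in enumerate(intensities)]

# Both trains run at 50 Hz; only the intensity pattern differs.
flat = pulse_train(50.0, [1.0, 1.0, 1.0, 1.0])
tapered = pulse_train(50.0, [1.0, 0.6, 0.4, 0.3])  # onset burst, then decay
```

Under the team’s earlier, rate-only view, these two trains would have been indistinguishable; encoding the envelope as well is what let the stimulation start to resemble natural touch.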

Language barrier

These findings represent something of a missing link in prosthetics. As motor control becomes increasingly advanced, fusing touch with more advanced artificial limbs (something that hasn’t yet been achieved) will be the next step. Tyler believes that it’s a little like developing language as a child.

“Ultimately, we’re learning the language, but we’ve got a lot of lexicon to go,” he says. “From a neuroscience perspective, we’ve learned a few patterns that tend to work, but have no idea how many patterns there are or how we can control them in a reliable manner.

“It’s like opening a dictionary having learned one of the thousands of words: now we’ve got to learn the rest of them.”

The research has also yielded a surprise benefit that raises even more questions about the complex relationship between touch, the central nervous system and neurology. Unexpectedly, many of the subjects found themselves free of what’s known as ‘phantom pain’, an enigmatic and complex phenomenon whereby patients who have lost limbs continue to feel sensation in their missing extremities. Tyler and his team had discussed the phenomenon during development and thought the device might have some impact on it, but they weren’t expecting such a dramatic effect.

“It’s been a really surprisingly huge benefit to us,” he says. “I didn’t fully appreciate phantom pain until we started to work with the subjects and they described their experiences.”

Test subjects in trials also said that their phantom pain disappeared almost entirely after using the prosthetic.

“There’s a lot of evidence for certain types of pain being centrally mediated,” adds Tyler. “When you lose sensory input, the brain has to make up that space. There’s no more input coming to the sensory places, so the brain fills it with what it remembers; if you have a traumatic accident in which you lose your hand, the brain recalls this huge, painful event.”

Get a grip

So, what’s next? The next stage of understanding is moving beyond single-digit touch to multiple points across the hand simultaneously, as well as developing the control and motor side of the equation.

“We want patients to think in terms of having a hand, rather than a prosthesis,” says Tyler.

Monitoring intensity of grip and stimulating pressure will also be essential to maximising the utility of the research in everyday life. Understanding how these impulses move from the hand to the brain and continuing to make the “stimulation paradigm”, as Tyler calls it, as sophisticated as possible, will hopefully lead to improvements being built into future models.

It will be a while before the technology is commercially available. Tyler and his team plan to have a viable prototype within three years and, beyond that, to move into feasibility trials and the nuts and bolts of early-stage FDA work. Getting something like this on to the market isn’t cheap, but Tyler sees applications for patients with amputated lower extremities, too: diabetics, for example. Beyond that, the neural interfacing and new stimulation techniques may prove useful in controlling tremors, in deep-brain stimulation and more.

“That might actually be the route to get us commercially viable,” he says. “But there’s an awful lot to think about, beyond the science stuff.”