by Matt Simon: Surgeons use muscle grafts to amplify nerve signals—allowing amputees to control a new prosthetic with incredible precision…
In an ordinary lab at the University of Michigan, Joseph Hamilton, of Flint, does the also-ordinary: He grabs a shiny ball and a bottle; he presses buttons and stacks little cubes; he zips and unzips zippers. Well, it would be ordinary if Hamilton weren’t an amputee doing all this with a robotic hand—à la Luke Skywalker—and if he weren’t a test subject for a major advance in the control of robotic limbs.
“It worked awesome,” Hamilton says of his test run with the robotic hand. “If it was something that I had access to for daily use, it would make life so much easier.”
Up until this point, researchers have succeeded in giving amputees control over robotic hands by measuring nerve activity in the residual limb. That signal is extremely faint, resulting in clunky control of the prosthesis—the wearer may need to flex their shoulder to get the device’s thumb to move, for instance. But writing today in the journal Science Translational Medicine, researchers describe a clever way to amplify these signals for users like Hamilton. It’s so effective, participants can put on the robotic hand and pull off fine motor functions right away, no training required.
It all comes down to how the patients regrow their nerves. When a person loses, say, their arm from the elbow down, all their nerves want to grow back where they were before. “Patients get this big ball of nerves called a neuroma,” says University of Michigan plastic surgeon Paul Cederna, who codeveloped this new system. “And that can lead to pain and can prevent them from wearing their prosthesis and severely impact their quality of life.”
The team’s solution was to wrap the severed nerve ending in a small graft of muscle tissue, giving the regrowing nerve fibers something natural to plug into. “So we were able with this approach to not only treat the end of the nerve to prevent the nerve from getting the neuroma pain and phantom pain,” Cederna says, “but also at the same time take those tiny little signals and amplify them with that piece of muscle.” They also added electrodes to the muscle to detect the signals, which were now up to 100 times more powerful than before the nerves grew into the muscle. By this point, the nerves were downright shouting. (By the way, a skin graft wouldn’t work as well as a muscle graft, because motor nerves don’t go to skin.)
In their experiments with Hamilton and three other subjects, the team found that after doing the grafts, the nerves that control the thumb interacted with this new muscle just as they would if the person still had their thumb. “We know the intent—in that case, to flex the thumb—just like the nerve and muscle interacted when there was a thumb,” says Cederna.
Next, the team had the subjects simply imagine a bunch of different hand movements. As they did so, the electrodes in the grafted muscle picked up the signals of their nerves activating, just as the nerves would have done before the person lost their limb. The researchers recorded these sessions to pair particular nerve signals with particular movements. “The anatomy is making these signals very different from one another, and is very finger-specific,” says University of Michigan biomedical engineer Cindy Chestek, who codeveloped the system. One nerve might be highly active for controlling the thumb, for instance, but remain silent when another finger is moving.
All of this information is fed to algorithms, which learn to detect the nerve signals involved in making a fist, for example. The system then translates that collection of signals into commands telling the robotic hand how to scrunch all five fingers together.
“With about 15 minutes of training data, we train our algorithm and we start running online,” says Chestek. Then the subjects can try controlling the robotic hand. “And they can do it on the first try,” she says. That’s a big difference from what they may have experienced before with other prosthetics, which require more practice and are less intuitive to use. “The learning is in the algorithms, not in the people,” she says.
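The pairing of signal patterns with movements can be pictured as a simple classification problem. This is a hypothetical sketch, not the team’s actual code or algorithm: a minimal nearest-centroid decoder in Python, where each channel of a toy feature vector loosely stands in for one electrode, and the gesture labels (`flex_thumb`, `flex_index`) are invented for illustration.

```python
# Hedged illustration only: a nearest-centroid decoder that maps
# muscle-signal feature vectors to intended gestures. The real system's
# algorithms are far more sophisticated; this just shows the idea of
# learning one signal "signature" per movement, then matching new
# signals to the closest one.
import math
from collections import defaultdict

def train(samples):
    """samples: list of (feature_vector, gesture_label) pairs.
    Returns one centroid (mean feature vector) per gesture."""
    sums = {}
    counts = defaultdict(int)
    for features, label in samples:
        if label not in sums:
            sums[label] = [0.0] * len(features)
        sums[label] = [s + f for s, f in zip(sums[label], features)]
        counts[label] += 1
    return {label: [s / counts[label] for s in sums[label]]
            for label in sums}

def decode(centroids, features):
    """Pick the gesture whose centroid is closest to the incoming signal."""
    return min(centroids, key=lambda g: math.dist(centroids[g], features))

# Toy "training data": two channels, two imagined movements.
training = [
    ([0.9, 0.1], "flex_thumb"), ([0.8, 0.2], "flex_thumb"),
    ([0.1, 0.9], "flex_index"), ([0.2, 0.8], "flex_index"),
]
model = train(training)
print(decode(model, [0.85, 0.15]))  # → flex_thumb
```

Once trained, such a decoder can run online: each new burst of signal is matched against the learned signatures, which is why, as Chestek puts it, “the learning is in the algorithms, not in the people.”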
The hardware closely mimics the natural movements of a human limb, allowing subjects like Hamilton to pull off fine manipulations like closing zippers. “It was pretty much a sense of having a real hand back, almost, as far as usefulness,” says Hamilton. “It worked very well, very seamlessly.”
The hardware is technically called the DEKA, after DEKA Research and Development, which invented the robotic hand. But the team also fondly calls it the Luke hand, after Luke Skywalker. It’s made of a semitranslucent white sheath over a robotic skeleton, and was attached to the participants’ residual limbs using a specially designed socket.
“The very exciting part about this work is that it’s a biological interface,” says biomedical engineer Paul Marasco of the Cleveland Clinic, who wasn’t involved in the research. “They do the amplification biologically, and so once they’ve done that surgery, the interface itself is actually really pretty solid.” That means strong, clear signals that translate into complex manipulations of the robotic arm.
The system is still in its early days, with a major caveat being that the subject has to be hooked up to a computer in the lab—at-home or on-the-job use isn’t available yet. But the payoff of perfecting this technology and moving it out of the lab could be huge. Instead of having to learn how to control a robotic hand, prosthesis wearers would just have to think of the movement, and the hand would instantly pinch or grasp or let go.
Cederna says it’s important to make the prosthetic as easy to use as possible, because even if the device is ultimately helpful, people won’t want to adopt it if it’s a hassle. “They need to wake up in the morning, put this thing on, turn it on, and it works,” says Cederna. “We can’t expect people to spend two or three hours calibrating the device before they can use it. You can’t get people to floss their teeth every night, and that takes 10 minutes.”