The Next Generation of Prostheses

By Hayley Dott

It is a blustery and bleak day outside Clark Hall at Johns Hopkins University, but inside, students are working on something big and bright on the horizon. Under the fluorescents of one classroom, students are 3-D printing, programming, and soldering; it is a symphony of movement in the orchestra of prostheses, the artificial limbs made for amputees. The soldering iron hums, the solder sizzles, and the printer buzzes, all in perfect harmony. This complex rhythm is research coming together into a sensor database that could lead to unprecedented prosthetic technology. In this lab, the students hope to meld sensor data with brain signals in prosthetic machinery, as no one has done before.

In a lab science course called Bio-Instrumentation and Neuro-Engineering, led by graduate student Joseph Betthauser under lab head Nitish Thakor, students Eli Pivo, Jerry Li, Dani Kiyasseh and Scott Sterrett are researching how to activate and receive responses from sensors, to collect more accurate data on the signals the brain sends the arm to create movement. The research aims to create a probabilistic database showing how different signals in the body and brain evoke responses in hand or arm movements. Once the researchers compile the data and complete the database, it will be made available for use in any kind of research or in actual prosthetic devices. The database will then most likely be sent to an outside laboratory and used to create a cuff that detects and records arm movements, the basis for a more natural and advanced prosthesis.

Approximately 185,000 amputations occur each year in the United States, and one in every 190 Americans currently lives with an amputated limb, according to PBS. This technology could help the roughly 50,000 people who lose an arm or hand each year by giving them a more advanced device.

The researchers first explain their work with a stick-figure drawing on the whiteboard in Clark Hall, Classroom 101. Pivo, a sophomore majoring in Electrical and Computer Engineering, describes the first, crude prostheses as hook-type devices that resembled a pirate's hook. As time progressed, prostheses were made more visually appealing and equipped with basic movements, like the ability to grab. The problem with these prostheses was control: how could an amputee with a robotic hand actually tell the hand what to do?

According to Isaac Perry Clements, a writer at HowStuffWorks, the first account of a prosthetic limb can be traced back to Greek and Roman times. Marcus Sergius was a Roman general who lost his right hand in combat during the Second Punic War. In order to keep fighting, he had a prosthetic hand forged out of iron with the ability to carry a shield.

By 1812, a prosthetic arm had been created that could be controlled by the opposite shoulder with connecting straps, and in 1945 the Artificial Limb Program was created in response to the influx of World War II veteran amputees, “for the purpose of advancing scientific progress in artificial limb development,” per the National Academy of Sciences.

Later solutions were devices that required the wearer to flex certain muscles to activate the hand, but this was counterintuitive. Other solutions now exist, including electrical implants in the brain, but these can be uncomfortable, invasive, and dangerous.

Modern prosthetic limbs are composed of three components: the pylon, the socket, and the suspension system. The pylon is the internal frame of the prosthesis, traditionally built from metal rods, and provides structural support. The socket is the segment of the device that connects with the patient's limb stump, called the residual limb. The suspension system keeps the prosthesis attached to the body, and can take the form of straps, belts, sleeves, or a harness.

State-of-the-art prostheses now exist, but the average amputee cannot access them because of their exorbitant costs. A myoelectric arm, a device that uses muscle contractions in the amputee's residual limb to control the device, is priced at up to $100,000. Because of these expenses, most people opt for a device with a gripping split hook that can be opened or closed, covered in a glove-like material to look more natural.

Other current prosthetic technology involves the highly risky implanting of electrodes in the brain or chest, referred to as neuroprosthetics. These implants allow amputees to move limbs without consciously directing them, because the device sits in the area of the brain where movement intentions form.

The hippocampus is the part of the brain that receives and analyzes information and relays the appropriate response to the cerebral cortex. Many people who undergo such neuroprosthetic implant procedures later suffer from infections that can only be addressed with high-risk brain surgery. According to Stephanie Desmon and Lauren Nelson of Johns Hopkins Medicine, the rate of infection from the prosthetic procedure is 21 to 40 percent.

Since 2004, scientists have been working to create a link between the “body and the machine,” a phrase coined by Hugh Herr and Roy Kornbluh, researchers in the MIT Media Laboratory. They believe in the potential for “lifeless mechanisms” to become “intimate extensions of the human body.” Kornbluh and Herr conducted their research by developing actuator technologies that behave like muscle and facilitate biological movement, which they used to construct a prosthetic device for leg rehabilitation.

Today, students in the lab at JHU are just as optimistic. They are trying to connect the neurons that remain after an amputation to a new prosthesis that may move on its own. Because motor neurons run all the way from the brain down the arm, the neurons that once reached the hand are still there after the hand is removed. The students' work will contribute to a prosthesis that extends to these neurons, reading and decoding their activity through electromyography (EMG) sensors attached to the arm in a cuff. Their problem now is creating an algorithm that can decode that neural activation and tell a hand what to do.

Pivo explains to me that, theoretically, you could use this knowledge to make a hand like “Luke Skywalker's”: a prosthesis that could receive information from the brain and send it back.

Malhotra, another student in the lab, offers a glimpse of this life-changing prosthetics technology when he tells me he is getting his “Raspberry Pi,” something I mistakenly believe to be a quick treat during an extensive session of lab work; instead, he draws a tiny computer from his pocket. The Raspberry Pi is the miniature computer the researchers use to program. It is the “motherboard” of a computer, as Malhotra describes it: it doesn't have a keyboard, mouse, or monitor, but it has all the other essential parts of a computer. The researchers access it remotely from a laptop.

The Raspberry Pi Foundation created the Raspberry Pi in the United Kingdom in 2012. The foundation's website defines it as a “low cost, credit-card sized computer that plugs into a computer monitor or TV, and uses a standard keyboard and mouse.”

The researchers use the Raspberry Pi instead of a conventional Mac or PC because it is more affordable, and its small size makes it practical for collecting data during sensor trials on the go. In this experiment, the Raspberry Pi sends different signals to the sensors being tested; if a sensor sends back a response, the researchers have been successful. Once all the sensors have responded, meaning they are working, they can be worn on a vest or cuff to detect and record natural arm movements. That data will be embedded in future prosthetic technology to make it more lifelike. For now, the database is kept on a separate computer.
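To make that query-and-response step concrete, here is a minimal sketch of the kind of check described above, written in Python as it might run on a Raspberry Pi. The I2C addresses and register number are illustrative assumptions, not the lab's actual wiring.

```python
# A minimal sketch of the check the researchers describe: the Pi queries
# each sensor and records whether it responds. The I2C addresses and
# register below are hypothetical placeholders, not the lab's hardware map.
from smbus2 import SMBus

SENSOR_ADDRESSES = {"imu": 0x68, "emg": 0x48, "force": 0x49}  # hypothetical
WHO_AM_I_REGISTER = 0x00  # hypothetical identity register

def ping_sensors(bus_number=1):
    """Return a dict mapping each sensor name to True if it responded."""
    responses = {}
    with SMBus(bus_number) as bus:
        for name, address in SENSOR_ADDRESSES.items():
            try:
                # Reading any register proves the device acknowledged us.
                bus.read_byte_data(address, WHO_AM_I_REGISTER)
                responses[name] = True
            except OSError:  # no ACK on the bus: sensor absent or miswired
                responses[name] = False
    return responses

if __name__ == "__main__":
    for sensor, ok in ping_sensors().items():
        print(f"{sensor}: {'responding' if ok else 'no response'}")
```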

Currently, new prosthesis users are taught to use a robotic hand by watching a television that instructs the amputee to open their hand, close it, and so on, in a video-game-type sequence. This is still a counterintuitive and confusing process, and it doesn't account for the differing neurological readings of different amputees, readings that can be shifted by external factors and ultimately cause the prosthetic technology to misclassify a movement. Kiyasseh describes the sensation of experiencing such a misclassification as the confusion of two very different, but simple, actions.

“Imagine trying to open the door for your neighbor, but the prostheses reads it as something else, and you end up slapping your neighbor across the face?” he explained.

Kiyasseh, Pivo, Malhotra, Sterrett and Li are attempting to lower the incidence of these misclassifications. Their technique distinguishes several classes of action, including closing, opening, rotating in one direction or the other, and pointing the hand.

“We want to improve the program that decodes the information and spits that into the robotic hand,” Pivo said.
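Below is a hedged sketch of what such a decoding program could look like: it classifies short windows of multi-channel EMG into the gesture classes named above. The feature (mean absolute value per channel) and the classifier (linear discriminant analysis) are common choices in the EMG literature, but here they are illustrative assumptions, not the lab's actual pipeline.

```python
# Sketch of an EMG gesture decoder: featurize short signal windows,
# train a classifier, then map new windows to gesture names.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

GESTURES = ["open", "close", "rotate_cw", "rotate_ccw", "point"]

def mean_absolute_value(window):
    """One feature per EMG channel; window has shape (samples, channels)."""
    return np.mean(np.abs(window), axis=0)

def train_decoder(windows, labels):
    """windows: list of (samples, channels) arrays; labels: gesture indices."""
    features = np.array([mean_absolute_value(w) for w in windows])
    decoder = LinearDiscriminantAnalysis()
    decoder.fit(features, labels)
    return decoder

def decode(decoder, window):
    """Map one new EMG window to a gesture name the hand can execute."""
    features = mean_absolute_value(window).reshape(1, -1)
    return GESTURES[int(decoder.predict(features)[0])]
```

In a pipeline like this, reducing misclassifications of the kind Kiyasseh describes comes down to better features, better training data, or a better classifier.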

The lab team compares their research to speech recognition work done at Yale University. Li describes machine learning with an analogy: when we do something good, we receive good feedback from the environment, and the reverse when we don't. In computer science, machine learning refers to the study and building of algorithms that can learn from and make predictions on data. In a Yale experiment headed by principal investigator Louis Goldstein, called “Landmark-based robust speech recognition using prosody-guided models of speech variability,” researchers constructed a machine that listens to a person talking, then guesses what he or she is going to say by assembling letters according to what is most likely to follow. This sort of “auto-complete” technology is superior to machines that only listen, because those always lag a few seconds behind.
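As a toy illustration of that auto-complete idea (not the Yale system itself), the sketch below counts which symbol most often follows each symbol in past sequences and predicts the likeliest continuation. The gesture sequences are invented for the example.

```python
# A toy "auto-complete": count symbol-to-next-symbol transitions, then
# predict the most frequent follower instead of waiting for full input.
from collections import Counter, defaultdict

def train_bigram(sequences):
    """Count which symbol follows which across many training sequences."""
    transitions = defaultdict(Counter)
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next(transitions, current):
    """Return the most likely next symbol, or None if unseen."""
    if current not in transitions:
        return None
    return transitions[current].most_common(1)[0][0]

# Example: invented sequences of hand actions recorded during daily activity.
history = [["reach", "open", "close"], ["reach", "open", "rotate"],
           ["reach", "open", "close"]]
model = train_bigram(history)
print(predict_next(model, "open"))  # -> "close", the most frequent follower
```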

The Hopkins students hope to incorporate aspects of the speech recognition research into their own. They are creating technology that will detect and record millions of people's arm movements and compile them into a database, like the Yale researchers' giant database of spoken words. The device they are creating will track people's arm movements and frequency of movement across different professions and at different times of the day.
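One way such a movement database could be organized is sketched below; the record fields and example values are illustrative guesses, since the article does not detail the lab's actual schema.

```python
# A sketch of one possible movement-database record: who moved, what
# gesture was detected, and when, so that frequency can later be tallied
# by profession and time of day. Field names are hypothetical.
from dataclasses import dataclass
from collections import Counter
import datetime

@dataclass
class MovementRecord:
    subject_id: str
    profession: str       # e.g., "surgeon", "carpenter" (illustrative)
    gesture: str          # e.g., "open", "close", "point"
    timestamp: datetime.datetime

def frequency_by_hour(records):
    """Tally how often each gesture occurs in each hour of the day."""
    counts = Counter()
    for r in records:
        counts[(r.gesture, r.timestamp.hour)] += 1
    return counts
```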

Before they can create the database, they must finish the cuff that will detect the sensor responses to be put in the database and later used to construct the actual prosthesis. The sensors are the CyberGlove, the IMU sensor, the EMG sensor, and the force sensor. The CyberGlove is part of a system with small vibrotactile stimulators on each finger and on the palm of the glove. IMU stands for inertial measurement unit, a sensor that uses accelerometers, gyroscopes, and sometimes magnetometers to measure a body's angular rate, specific force, or the magnetic field around the body. EMG stands for electromyography; the sensor measures the electrical activity produced by skeletal muscles. A force sensor is a resistor whose resistance changes when force or pressure is applied to it.
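To make the force sensor's principle concrete: in a typical circuit (an assumption here; the lab's actual circuit isn't described), the force-sensing resistor forms a voltage divider with a fixed resistor, and the measured voltage is converted back into the sensor's resistance.

```python
# Sketch of reading a force-sensing resistor (FSR) through a voltage
# divider. Component values are illustrative, not the lab's circuit.
SUPPLY_VOLTAGE = 3.3       # volts, typical for a Raspberry Pi circuit
FIXED_RESISTOR = 10_000.0  # ohms, the divider's known resistor

def fsr_resistance(measured_voltage):
    """Invert the divider equation Vout = Vcc * Rfixed / (Rfsr + Rfixed)."""
    if measured_voltage <= 0:
        return float("inf")  # no current flowing: effectively open circuit
    return FIXED_RESISTOR * (SUPPLY_VOLTAGE - measured_voltage) / measured_voltage

# Pressing harder lowers the FSR's resistance, raising the measured voltage.
for volts in (0.5, 1.65, 3.0):
    print(f"{volts:.2f} V -> {fsr_resistance(volts):,.0f} ohms")
```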

These sensors will be built into the devices that volunteers wear to record their arm movements during trials: the force sensor in a cuff; the IMU sensors in devices on the chest, elbow, and cuff; the EMG sensor on the upper arm; and the CyberGlove on the hand.

The lab work began in April 2015, and the researchers hope to be done by the end of the semester. Currently, they are finishing the prototype of the machine, a fully programmed version of the Raspberry Pi mentioned earlier, intended for trials. After the trials and data collection, the information will be sent to the actual manufacturers of prostheses, who will put the database the Hopkins students compiled to use.

So far, the researchers have some results. They have collected data from the CyberGlove, the IMU sensor, and the EMG sensor, but have yet to get results from the force sensor. In the lab, they are going through iterations of 3-D printing the actual cuff a person will wear during trials, perfecting it each time. This is a lengthy process: the printer can take up to 30 hours to print each cuff.

Although the work is rigorous and time-consuming, the technology could impact the lives of prosthesis users everywhere. Current users of mechanical prostheses would no longer have to wear heavy, constrictive devices, and could control their prostheses automatically. Likewise, prosthesis users who were considering neurological surgery would no longer have to risk serious brain infection to get feeling and control back in their hands.

This research is important not only to amputees but to the students, as Pivo explained.

“That’s probably what brought us into the lab, the technical challenge. It’s a very interesting technological problem; it uses a lot of different data-collecting techniques. The thing that makes this lab special, and what keeps us interested in it, would probably be that it’s doing a good thing, helping amputees,” Pivo said. “The project that we’re working on is a significant project, applying speech recognition technology to robotic prosthesis will actually help the technology significantly, and no one’s done it. So there’s the aspect of it being new and innovative, it being for a good purpose.”

Kiyasseh added that their work could be built upon, helping future students in the process.

“It’s not just something you do and you end there. What we can do can actually help our Ph.D. students: if we make a database that is open to all, anyone can tap into it and ultimately use the work we have done,” he said.

Malhotra agreed, saying the project also “has real-world applications that we can potentially see in our lifetime.”

However, before the technology can enter widespread use, it must first pass through trials and receive medical clearance from the FDA, a process that spans an average of 12 years. In May 2014, the FDA allowed the marketing of the DEKA Arm System, a prosthetic arm that translates signals from a person’s muscles to perform tasks, after an approval process that took eight years. The student researchers at Hopkins hope the FDA will approve their technology in the near future.

If you want to be a part of the evolving project, you can apply to become a test subject or volunteer by e-mailing Nitish Thakor at [email protected], or Joseph Betthauser at [email protected].