
How Did Scientists Make a Brain-Controlled Robotic Arm for a Paralyzed Man?

Contrary to the quibble that technology and robotics are penetrating too far into everyday life, or to the claim that robots and AI plunder human jobs, this feat by researchers in Pittsburgh let a paralyzed man handle objects almost as if he were doing it himself, with the help of a robotic arm.

The researchers were able to restore a sense of touch to a paralyzed man, and he could handle objects via the robotic arm just by thinking about it. This isn’t a fairytale or a medical miracle, but a purely technological capability.

Nathan Copeland was 18 years old in 2004, when a car accident left him paralyzed. He could barely move his body apart from his fingers and shoulders. When a team from the University of Pittsburgh needed a volunteer to test their robotic arm, Nathan joined the group in 2015.

This kind of research demanded brain-computer interface (BCI) technology, of the sort companies like Kernel and Elon Musk’s Neuralink are pursuing, since the team’s objective was to convert the man’s thoughts into the robot’s actions. Because the brain’s electrical activity had to be read to analyze his thoughts, Copeland had to agree to have chips implanted in his brain, and in fact he was excited and had zero hesitation. “The inclusion criteria for studies like this is very small. Everything is in right position, so how can I not help push the Science forward?” he said.

Implanting BCI Electrodes in the Brain

The research began when Copeland underwent surgery to have 4 mm electrode arrays implanted onto the motor cortex and somatosensory cortex of his brain. The arrays read the electrical patterns flickering in his brain, revealing his intentions to move his wrist and fingers. Through the brain-computer interface, these impulses were used to control a robotic limb that sat atop a vertical stand beside him in the lab.

The robotic wrist would lift when Copeland thought of lifting his hand; the hand would twist when he intended to twist it, and close when he wanted to grab.

This was the beginning phase of the study, in which they tested whether Nathan could grasp and hold things with the robotic arm just by using his thoughts. Over the years, Nathan learned to control the robotic hand with his thoughts while watching what the hand was doing in response. He couldn’t do it quickly, however: with no direct sense of touch on the object, he was never sure whether the robot had fully gripped it or whether it was about to slip from the robotic hand.
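For readers curious about what “controlling the hand with thoughts” looks like on the software side, here is a minimal, purely illustrative sketch of a linear BCI decoder. The channel count, weights and degrees of freedom below are invented for the example; this is not the Pittsburgh team’s actual decoder.

```python
import numpy as np

# Hypothetical dimensions: 88 motor-cortex channels, 4 controlled
# degrees of freedom (wrist lift, wrist twist, grasp, reach).
N_CHANNELS = 88
N_DOF = 4

# A linear decoder: in a real system the weights are fit during
# calibration, while the user imagines specific movements.
# Here they are random, purely for illustration.
rng = np.random.default_rng(0)
decoder_weights = rng.normal(scale=0.01, size=(N_DOF, N_CHANNELS))

def decode_intent(firing_rates: np.ndarray) -> np.ndarray:
    """Map per-channel firing rates to velocity commands for each
    degree of freedom of the robotic arm."""
    return decoder_weights @ firing_rates

# One simulated bin of neural activity (spike counts per channel).
firing_rates = rng.poisson(lam=5.0, size=N_CHANNELS).astype(float)
velocities = decode_intent(firing_rates)
print("wrist-lift, wrist-twist, grasp, reach velocities:", velocities)
```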

Stimulating the Sense of Touch from the Robotic Arm

Just as we don’t rely only on vision to hold a thing in our hand, but on the sense of touch, the team worked to incorporate both motion commands and touch feedback in real time in a robotic prosthetic arm. This is the first time the world has seen such a system, and it hints at how BCIs might help circumvent the limits of paralysis.

The brain is bidirectional: it takes information in while also sending signals out to the rest of the body, telling it to act. Even a motion that seems as straightforward as grabbing a cup calls on your brain to both command your hand muscles and listen to the nerves in your fingers. The researchers wanted Copeland to drive the robotic arm and, at the same time, be stimulated by electrical signals coming back from it. The challenge, the team says, was to make it all feel natural.
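A toy sketch of such a bidirectional loop might look like the following; the function names and the 20 ms cycle are assumptions for illustration, not the study’s real software, but they show how intent flows out to the arm while touch flows back into the brain on every cycle.

```python
import time

def read_motor_cortex():
    """Placeholder for reading movement intent from the motor-cortex arrays."""
    return {"grasp_velocity": 0.3}

def send_arm_command(command):
    """Placeholder for sending the decoded command to the robotic arm."""
    print("arm command:", command)

def read_arm_touch_sensors():
    """Placeholder for reading contact/force information from the arm."""
    return {"index_finger_force": 0.8}

def stimulate_sensory_cortex(feedback):
    """Placeholder for stimulating the somatosensory arrays."""
    print("stimulating for feedback:", feedback)

# Both directions run on every cycle: intent goes out to the arm while
# touch information comes back in, with no pause in between.
for _ in range(3):                      # a few cycles, for illustration
    send_arm_command(read_motor_cortex())
    stimulate_sensory_cortex(read_arm_touch_sensors())
    time.sleep(0.02)                    # e.g. a 20 ms update cycle
```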

https://theinnerdetail.com/wp-content/uploads/2022/06/roboticarm.mp4

Identifying the Sense-of-Touch Area in the Brain & the BCI Implant

Of the four micro-electrode arrays implanted in Copeland’s brain, two grids read movement intentions from his motor cortex to command the robot arm, and two stimulate his sensory system whenever the robot touches something.

Fortunately, Copeland retained sensation in the tips of his fingers – his right thumb, index and middle fingers, to be precise. Equipped with this, the researchers had Copeland sit in a magnetic brain scanner and found which specific contours of the brain correspond to the touch sense of those fingers. They then decoded his intentions to move by recording brain activity from individual electrodes while he imagined specific movements. And when they switched on the current to specific electrodes in his sensory system, he felt it.


To him, the sensation seems like it’s coming from the base of his fingers, near the top of his right palm. It can feel like natural pressure or warmth, or weird tingling—but he’s never experienced any pain. “I’ve actually just stared at my hand while that was going on like, ‘Man, that really feels like someone could be poking right there,’” Copeland says.

But the touch feedback was nothing like real, direct touch, as the robotic touch lagged in time. As a result, maneuvering objects took a couple of seconds longer than it usually would.

https://theinnerdetail.com/wp-content/uploads/2022/06/roboticarm1.mp4

Finally…

To alleviate the lag, the team attached torque sensors at the base of the mechanical fingers. Normally, every action we make comes down to the rotational movement of a joint: if you raise your hand, shake it, or stretch or bend your fingers, there has to be some rotational force at the hinges. The researchers took advantage of that to access the sense of touch. So, if you hold anything in your fingers, you are applying pressure at the joints corresponding to the fingers in contact.

“It is maybe not the most obvious sensor to use,” says Robert Gaunt, who co-led the study with Jennifer Collinger, but it proved to be very reliable. Electrical signals from that torque sensor flash to the BCI, which then stimulates the implanted brain electrode linked to Copeland’s corresponding finger.
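Conceptually, that feedback path can be pictured with a small sketch like the one below; the electrode labels, torque range and stimulation ceiling are made up for illustration and are not the study’s actual parameters.

```python
# Hypothetical mapping from finger-joint torque readings to a stimulation
# amplitude on the sensory electrode assigned to each finger.
FINGER_ELECTRODES = {"thumb": "S1-e12", "index": "S1-e27", "middle": "S1-e41"}

MAX_TORQUE = 2.0       # assumed sensor range, newton-metres
MAX_AMPLITUDE = 60.0   # assumed stimulation ceiling, microamps

def torque_to_stimulation(finger: str, torque: float) -> tuple[str, float]:
    """Scale the torque at one robotic finger joint into a stimulation
    amplitude for that finger's electrode: a firmer grip, a stronger
    sensation."""
    level = max(0.0, min(torque, MAX_TORQUE)) / MAX_TORQUE
    return FINGER_ELECTRODES[finger], level * MAX_AMPLITUDE

# Gripping a hard block produces firm resistance at the index finger joint.
electrode, amplitude = torque_to_stimulation("index", 1.6)
print(f"stimulate {electrode} at {amplitude:.0f} uA")
```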

When Copeland gripped a hard block, the firm resistance occurring at the robotic joint gave him a stronger sensation. Routing this sense of touch directly to Copeland’s hand meant he didn’t have to rely so much on vision. For the first time, he could feel his way through the robot tasks. “It just worked,” Copeland says. “The first time we did it, I was like, magically better somehow.”

The accomplishment with Copeland opens up the possibility of offering robotic arms with a sense of touch to people who are paralyzed or otherwise disabled, which, in my opinion, could be the best use of this technology. And now that it has worked for one person, it shouldn’t take long to spread across the world, as long as people don’t hesitate to embrace it.

Let’s hope for the best! What are your thoughts? Comment down!

(Credits to @this_suhas_b for suggesting the topic)
