I know a young man who lost his left arm from the elbow down in combat. He prefers to use a double hook to the clunky simulated hands that are currently on the market.
This development represents a new generation of prosthetic limbs, which allow the wearer to reach for objects automatically, without thinking, just like a real hand, and which are to be trialled for the first time. The bionic hand is fitted with a camera which instantaneously takes a picture of the object in front of it, assesses its shape and size and triggers a series of movements in the hand.
Bypassing the usual processes which require the user to see the object, physically stimulate the muscles in the arm and trigger a movement in the prosthetic limb, the hand ‘sees’ and reacts in one fluid movement.
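As a rough illustration of that "sees and reacts in one fluid movement" idea, the sketch below wires together a capture, classify and actuate step in Python. The function names and the four grasp types are assumptions made for illustration, not the Newcastle team's actual implementation.

```python
# Minimal sketch of the "see and react" loop described above.
# Everything here is illustrative: the grasp types, function names and
# random outputs stand in for the real camera, model and motor control.
from dataclasses import dataclass
import random

GRASP_TYPES = ["palmar_neutral", "palmar_pronated", "tripod", "pinch"]  # assumed set


@dataclass
class Frame:
    """Stand-in for a single image from the camera on the prosthesis."""
    pixels: list


def capture_frame() -> Frame:
    # In the real device a small camera mounted on the hand grabs this image.
    return Frame(pixels=[random.random() for _ in range(64 * 64)])


def classify_grasp(frame: Frame) -> str:
    # Placeholder for the trained vision model: it maps the image of the
    # object in front of the hand to one of a few grasp types.
    return random.choice(GRASP_TYPES)


def actuate_hand(grasp: str) -> None:
    # Placeholder for the motor commands that pre-shape the hand.
    print(f"Pre-shaping hand for a {grasp} grasp")


if __name__ == "__main__":
    # One pass of the loop: see the object, pick a grasp, move the hand.
    actuate_hand(classify_grasp(capture_frame()))
```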
A small number of amputees have already trialled the new technology and now the Newcastle University team are working with experts at Newcastle upon Tyne Hospitals NHS Foundation Trust to offer the ‘hands with eyes’ to patients at Newcastle’s Freeman Hospital.
Ghazal Ghazaei, Ali Alameer, Patrick Degenaar, Graham Morgan, Kianoush Nazarpour. Deep learning-based artificial vision for grasp classification in myoelectric hands. Journal of Neural Engineering, 2017; 14(3): 036025. DOI: 10.1088/1741-2552/aa6802
Publishing their findings in the Journal of Neural Engineering, co-author on the study Dr Kianoush Nazarpour, a Senior Lecturer in Biomedical Engineering at Newcastle University, explains:
“Prosthetic limbs have changed very little in the past 100 years — the design is much better and the materials are lighter weight and more durable, but they still work in the same way.
“Using computer vision, we have developed a bionic hand which can respond automatically — in fact, just like a real hand, the user can reach out and pick up a cup or a biscuit with nothing more than a quick glance in the right direction.
“Responsiveness has been one of the main barriers to artificial limbs. For many amputees the reference point is their healthy arm or leg so prosthetics seem slow and cumbersome in comparison.
“Now, for the first time in a century, we have developed an ‘intuitive’ hand that can react without thinking.”
In the UK there are around 600 new upper-limb amputees every year, of whom 50% are aged 15 to 54. In the US there are 500,000 upper-limb amputees a year.
Current prosthetic hands are controlled via myoelectric signals — that is, the electrical activity of the muscles recorded from the skin surface of the stump. Controlling them, says Dr Nazarpour, takes practice, concentration and, crucially, time.
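For contrast, here is a hedged sketch of that conventional myoelectric route, in which surface EMG is rectified and smoothed and a simple threshold decides whether the hand opens or closes. The signal, window length and threshold are made-up illustrative values, not taken from any real controller.

```python
# Toy model of conventional myoelectric control: rectify the surface EMG,
# average it over a short window, and compare against a threshold.
import numpy as np


def muscle_activity(emg: np.ndarray, window: int = 50) -> float:
    """Mean absolute value of the most recent samples: a crude EMG envelope."""
    return float(np.mean(np.abs(emg[-window:])))


def hand_command(activity: float, threshold: float = 0.2) -> str:
    """Simple on/off rule: strong muscle activity closes the hand."""
    return "close" if activity > threshold else "open"


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    emg = 0.05 * rng.standard_normal(1000)                       # baseline noise
    emg[600:] += 0.5 * np.sin(np.linspace(0, 40 * np.pi, 400))   # a muscle burst
    print(hand_command(muscle_activity(emg)))                    # expected: "close"
```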
Using neural networks — the basis for Artificial Intelligence — lead author on the study Ghazal Ghazaei showed the computer numerous object images and taught it to recognise the ‘grip’ needed for different objects.
“We would show the computer a picture of, for example, a stick,” explains Miss Ghazaei, who carried out the work as part of her PhD in the School of Electrical and Electronic Engineering at Newcastle University. “But not just one picture, many images of the same stick from different angles and orientations, even in different light and against different backgrounds and eventually the computer learns what grasp it needs to pick that stick up.
“So the computer isn’t just matching an image, it’s learning to recognise objects and group them according to the grasp type the hand has to perform to successfully pick it up.

“It is this which enables it to accurately assess and pick up an object which it has never seen before — a huge step forward in the development of bionic limbs.”
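As a rough sketch of that training idea, the snippet below defines a tiny convolutional network in PyTorch and runs a few training steps that map images to grasp classes rather than to object identities. The architecture, the four class names and the random stand-in images are assumptions for illustration only; they are not the network or dataset used in the study.

```python
# Illustrative grasp-classification training loop: a small CNN learns to map
# an image to one of a few grasp types. The data here is random noise; in the
# study many real views of each object (angles, lighting, backgrounds) were shown.
import torch
import torch.nn as nn

GRASP_CLASSES = ["palmar_neutral", "palmar_pronated", "tripod", "pinch"]  # assumed


class GraspNet(nn.Module):
    """Tiny convolutional classifier: greyscale image in, grasp-type logits out."""

    def __init__(self, n_classes: int = len(GRASP_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(16 * 16 * 16, n_classes)  # 64x64 input -> 16x16 maps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))


if __name__ == "__main__":
    model = GraspNet()
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    # Stand-in for many augmented views of objects with grasp-type labels.
    images = torch.rand(32, 1, 64, 64)
    labels = torch.randint(0, len(GRASP_CLASSES), (32,))

    for _ in range(3):  # a few toy training steps
        optimiser.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimiser.step()
    print("toy training loss:", float(loss))
```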
Using a 99p camera fitted to the prosthesis, the hand ‘sees’ an object, picks the most appropriate grasp and sends a signal to the hand — all within a matter of milliseconds and ten times faster than any other limb currently on the market.
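To make the "matter of milliseconds" point concrete, the toy benchmark below times a single forward pass of a small untrained network on one stand-in camera frame. The model and the measured figure are illustrative and say nothing about the actual device or its hardware.

```python
# Toy latency check: time one forward pass of a small classifier on one frame.
# The network is untrained; the timing reflects only this sketch, not the device.
import time
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(8 * 32 * 32, 4),  # 4 assumed grasp classes
)
model.eval()

frame = torch.rand(1, 1, 64, 64)  # stand-in for one camera frame
with torch.no_grad():
    model(frame)  # warm-up pass
    start = time.perf_counter()
    grasp_logits = model(frame)
    elapsed_ms = (time.perf_counter() - start) * 1000

print(f"chosen grasp class: {int(grasp_logits.argmax())}, took {elapsed_ms:.1f} ms")
```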
“One way would have been to create a photo database of every single object but clearly that would be a massive task and you would literally need every make of pen, toothbrush, shape of cup — the list is endless,” says Dr Nazarpour.
“The beauty of this system is that it’s much more flexible and the hand is able to pick up novel objects — which is crucial since in everyday life people effortlessly pick up a variety of objects that they have never seen before.”
It’s the first step towards a fully connected bionic hand
Neat tech and very needed. Who knows, one day we might have fully functioning android cyborgs, as opposed to the obviously malfunctioning units we live with today, like John "Woodentop" Kerry.
Come on, DARPA.
Watching with interest here…
It could signal real change. And a new life for so many of us who served.
Or when Blue Avenger tears the arm off the postman, he can get a cyborg arm – and Blue Prisoner won't have to go back into the County lock-up.
That is a very good point!
This is an amazing advancement. I hope they can get it on the market quickly, and at a somewhat reasonable cost for the people who will need it.
God bless them.
What enters my mind right away is how exactly that electronic processor knows whether the object one is reaching for needs grabbing, like a puppy who just peed on the carpet; or squashing, like a spider on your girlfriend's arm; or squeezing, like a teenage zit.
That's what I want to know.
Good news. Sometimes progress is wonderful.
I asked myself the same question, and the clinical trials are beginning. It remains to be seen how this all works out, but for the first time ever, there is hope of a genuine solution. It's worth following to see whether the hype meets reality.
If it works, the cost will come down; besides, the article already cites the 'low cost' nature of the technology.
We're not exactly living the way we thought that the Jetsons would. However, when you look at computers, cell phones, etc., the world we have is clearly not the world you and I grew up in (for good and bad).
Those titanium fangs would also tear up a cyborg/bionic arm, so there is no reason for the UPS guy to let down his guard.