Scientists Create an AI-Driven Prosthetic Hand Prototype

Scientists have achieved another major breakthrough in bionic limbs. A new type of AI-powered bionic hand knows exactly what it is grabbing at all times. The hand is equipped with computer vision to ensure it does not crush whatever it is picking up. It is an intriguing development that could improve the way humans operate their prosthetics in the future.

Making Prosthetics Smarter Is A Huge Development

Prosthetic limbs have been a great invention, to say the least. Losing a limb is still a traumatic experience, but a prosthetic arm or leg can help people function almost normally. However, some kinks still need to be worked out, such as controlling the force exerted when picking up things with a bionic hand. Exerting too much strength may shatter objects, or even bones.

It is certainly possible for the human “host” to control how their prosthetic hand behaves. The gripping mechanism can be controlled mechanically, or by using myoelectric sensors, which read muscle activity through the skin. People who have money to burn can also rely on sensors implanted into their muscles to deliver sensory input. However, a new solution is in the works that aims to improve upon all of these options.
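To give a rough idea of how myoelectric control works, here is a minimal sketch: raw muscle activity is rectified and smoothed into an "envelope," and the hand closes while that envelope stays above a threshold. The signal values, window size, and threshold are all invented for illustration; real prosthetic controllers are far more sophisticated.

```python
# Hypothetical myoelectric control sketch: map muscle activity (EMG)
# to a simple open/close command. All numbers here are illustrative.

def emg_envelope(samples, window=5):
    """Rectify raw EMG samples and smooth them with a moving average."""
    rectified = [abs(s) for s in samples]
    return [
        sum(rectified[max(0, i - window + 1): i + 1])
        / len(rectified[max(0, i - window + 1): i + 1])
        for i in range(len(rectified))
    ]

def grip_command(envelope, threshold=0.6):
    """Close the hand while smoothed muscle activity exceeds the threshold."""
    return ["close" if level >= threshold else "open" for level in envelope]

# A brief muscle contraction embedded in low-level background activity.
raw = [0.05, -0.1, 0.9, -0.95, 1.0, -0.85, 0.1, -0.05]
commands = grip_command(emg_envelope(raw))
```

The smoothing step matters: acting on every raw spike would make the hand twitch, so controllers typically react to sustained activity rather than instantaneous readings.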

Researchers at Newcastle University have come up with a bold and radical plan to enhance interaction between a prosthetic hand and everything it touches. A prototype has been developed which is controlled by an AI-powered camera. This camera is mounted on top of the hand, which makes it look rather alien. However, this is still an early prototype, and cosmetic refinements can always be made once the technology has proven to work as expected.

The camera feeds into computer vision software, and its imagery is analyzed by the artificial intelligence running the show. The algorithm has been trained to recognize several hundred objects through weeks of intensive deep learning. When the wearer moves to pick up an object, the camera takes a picture of it and cross-references it against this database.
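The cross-referencing step can be pictured as matching a captured image's features against prototypes of known objects. The sketch below uses a nearest-prototype lookup with made-up feature vectors and object names; the actual system uses a trained deep network rather than anything this simple.

```python
# Illustrative sketch of cross-referencing a camera snapshot against a
# database of recognized objects. Feature vectors (e.g. width, height,
# roundness) and object names are invented for the example.
import math

OBJECT_DATABASE = {
    "mug":  (0.08, 0.10, 0.7),
    "pen":  (0.01, 0.15, 0.9),
    "ball": (0.07, 0.07, 1.0),
}

def recognize(features, database=OBJECT_DATABASE):
    """Return the database object whose prototype is nearest to the features."""
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(database, key=lambda name: distance(features, database[name]))

# Features extracted from the camera's snapshot of the object being grabbed.
label = recognize((0.075, 0.09, 0.72))
```

In practice the network outputs a confidence score as well, which is what makes the "just below 90%" recognition rate mentioned later a meaningful measure.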

Depending on the object in question, the AI automatically adjusts the hand’s grasp. The wearer then confirms the gripping action through a myoelectric signal. As a result, the bionic hand can act on its own in real time. All it takes is briefly glancing in the right direction with the camera mounted on top of the hand. Not only does this provide more flexibility, but it also gives the wearer natural interaction movements regardless of which object is being touched. Plus, the new prototype offers a cost advantage as well, since it is relatively cheap to make.
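The decision flow described above can be sketched as a simple two-stage rule: the recognized object selects a candidate grip, which is only executed once the wearer confirms via a myoelectric signal. The grip names and the object-to-grip mapping here are assumptions for illustration, not the researchers' actual taxonomy.

```python
# Hypothetical grasp-planning sketch: recognized object -> candidate grip,
# executed only after myoelectric confirmation. Mapping is illustrative.

GRIP_FOR_OBJECT = {
    "mug":  "power grip",
    "pen":  "precision pinch",
    "ball": "spherical grip",
}

def plan_grip(object_label, confirmed):
    """Pick a grip for the recognized object; act only on confirmation."""
    grip = GRIP_FOR_OBJECT.get(object_label, "neutral")
    return grip if confirmed else "awaiting confirmation"
```

Requiring explicit confirmation is a sensible design choice: it keeps the wearer in the loop, so a misclassified object cannot trigger an unwanted grip on its own.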

Unfortunately, novel solutions like these always come with a set of difficulties that need to be overcome. The success rate for recognizing objects properly is just below the 90% threshold, which is not yet good enough. Additionally, overriding the hand’s action can prove quite difficult at times. For a prototype, though, the hand does the job just fine, and now is the time to start tweaking it further.
