AI is rapidly learning how to respond to all five senses
There are a lot of things that artificial intelligence (AI) needs to learn before it can resemble a human brain, but as time passes, the technology keeps adding new capabilities. One long-standing challenge has been teaching AI machines about human senses; until now, sight and hearing were the only ones the technology could respond to. Now, using haptic feedback, a team of researchers has developed a new prosthetic arm that can give users the sensation of having touched, picked up, or felt something.
The engineering of robotic prosthetics is set to dramatically change the lives of those who have lost a limb or were born without one. This new prosthetic arm, "Mantis," created by engineers at the University of Bristol, uses haptic feedback to transmit touch: electromagnets generate a force that mimics the feeling of contact. The device also tackles another common issue with prosthetics: the price. The arm was designed to be durable and lightweight and is built from low-cost materials, so that devices like it can reach more people.
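To make the idea concrete, here is a minimal sketch of how a haptic-feedback loop of this kind could work in principle. This is not the Mantis firmware: the names, gain, and current limit are hypothetical, and it simply assumes a pressure sensor on the fingertip and an electromagnet coil whose current sets the strength of the sensation.

```python
# Hypothetical sketch: map a sensed fingertip contact force (newtons)
# to an electromagnet coil current (amps), so harder contact produces
# a stronger "touch" sensation. All constants are assumptions.

MAX_COIL_CURRENT_A = 0.5   # assumed safe limit for the coil
GAIN_A_PER_N = 0.05        # assumed amps of coil current per newton of force

def coil_current(contact_force_n: float) -> float:
    """Proportionally map contact force to coil current, clamped for safety."""
    current = GAIN_A_PER_N * max(contact_force_n, 0.0)  # ignore negative readings
    return min(current, MAX_COIL_CURRENT_A)             # never exceed the limit
```

In a real device, this loop would run continuously, reading the sensor and updating the coil many times per second so the feedback feels immediate.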
With this development, AI-driven devices can now replicate touch as well as sight and hearing. Through haptic feedback, the device applies a small force back to the user, so they can tell that an object has been touched. Mantis will now be supported by Senmag Robotics, which hopes to bring the invention to the open market. Engineering and AI are bound to produce more advances like this, and hopefully will one day replicate the human senses one-to-one.