AI and Prosthetics
Prosthetic limbs are remarkable medical devices. They give thousands of people the ability to manage activities that would otherwise not be feasible for them. But imagine if a prosthetic hand could feel, or if a robotic leg could adapt its behaviour to the terrain. Researchers around the globe are working to make this level of sophistication a reality, and they are doing so with the help of AI.
There already exist prosthetics that can respond to the user’s thoughts, much like a natural arm. These work by tracking the electrical impulses sent between the brain and the target muscle, and interpreting those signals to produce the desired action, say, clenching a fist. However, such devices are still a long way from behaving like a human arm. Striving to create something that ‘feels like an extension of the body’, Robert Armiger at Johns Hopkins University's Applied Physics Lab has designed a robotic arm with ‘sensation’. This arm has sensors in each fingertip that provide feedback on conditions such as temperature and vibration. Armiger’s system deploys AI algorithms to interpret the sensory data, anticipate the object being interacted with, and behave accordingly. This allows the prosthetic limb to simulate a sense of touch, mimicking more closely the behaviour of a human arm.
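To illustrate the general idea, here is a minimal sketch of how fingertip sensor readings might be mapped to an object class and then to a grip response. This is not Armiger’s actual system; the sensor features, object classes, and grip-force values below are invented for illustration, and the ‘AI’ is reduced to a simple nearest-prototype classifier.

```python
import math

# Hypothetical fingertip readings: (vibration_hz, temperature_c, pressure_kpa).
# The prototype values for each object class are invented for this sketch.
PROTOTYPES = {
    "ceramic_mug": (5.0, 40.0, 30.0),
    "soft_ball":   (2.0, 22.0, 10.0),
    "metal_tool":  (12.0, 18.0, 45.0),
}

# Illustrative grip force (0..1) the hand would apply per recognised object.
GRIP_FORCE = {"ceramic_mug": 0.6, "soft_ball": 0.3, "metal_tool": 0.9}


def classify(reading):
    """Return the object class whose prototype is nearest to the reading."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(PROTOTYPES, key=lambda label: dist(reading, PROTOTYPES[label]))


def grip_for(reading):
    """Choose a grip force based on what the sensors suggest is being held."""
    return GRIP_FORCE[classify(reading)]
```

A real system would replace the hand-written prototypes with a model trained on labelled sensor data, but the pipeline, sense, classify, act, is the same shape.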
However, real-world implementation of this technology poses a major challenge: profitability. The high cost of research, combined with a relatively small consumer base, means that smart prosthetics are still far from becoming a reality for the people who need them.
That said, the union of AI and prosthetics shows real promise. If such projects can overcome the hurdle of cost, they could mark a significant step forward in the world of prosthetics.
Thumbnail Credit: GETTY IMAGES