A new AI learning scheme, combined with a spray-on smart skin, can decipher the movements of human hands to recognize typing, sign language, and even the shapes of simple familiar objects.
The technology quickly recognizes and interprets hand motion with limited data and minimal training and should work for all users, its developers say.
Besides finding use in gaming and virtual reality, the new hand-task-cognition technology could allow people to communicate with others and with machines using gestures. Other applications the technologists envision include surgeons remotely controlling medical devices, as well as a new modality for robots and prosthetics to achieve object and motion recognition.
[...] There are two key parts of the new system, which the team reported in the journal Nature Electronics. One is a mesh made of millions of gold-coated silver nanowires embedded in a polyurethane plastic coating. The mesh, says Korea Advanced Institute of Science and Technology professor Sungho Jo, is both durable and stretchy and helps the sensor stick to skin. "It conforms intimately to the wrinkles and folds of each human finger that wears it," Jo says.
[...] The team directly printed the mesh onto the back of a user's hand going down the index finger. The nanowire network senses tiny changes to electrical resistance as the skin underneath stretches. As the hand moves, the nanomesh creates unique signal patterns that it wirelessly sends via a lightweight Bluetooth unit to a computer for processing.
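The sensing principle described above can be illustrated with a small sketch. The function name, baseline value, and sample readings below are all hypothetical; the idea is simply that each raw resistance sample is converted into a fractional change from a rest baseline, producing the signal pattern that gets streamed to the computer.

```python
# Hypothetical sketch: turning raw nanomesh resistance readings (ohms)
# into a normalized signal pattern. Values are illustrative, not from
# the paper.

def normalized_signal(readings, baseline):
    """Convert raw resistance samples to fractional change (R - R0) / R0."""
    return [(r - baseline) / baseline for r in readings]

# Skin stretch raises resistance slightly as the finger bends, then
# relaxes back toward the baseline.
baseline_ohms = 120.0
samples = [120.0, 121.8, 124.5, 123.0, 120.6]
signal = normalized_signal(samples, baseline_ohms)
print(signal)
```

Normalizing against a per-user baseline like this is one common way to make strain signals comparable across wearers, which is in the spirit of the system's claim to work for all users.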
This is where the AI kicks in. A machine-learning system maps the changing patterns in electrical conductivity to specific physical tasks and gestures. The researchers first use random hand and finger motions from three different users to help the AI learn the general correlation between hand motions and the nanomesh's signal patterns.
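As a toy illustration of the mapping step, the sketch below assigns an incoming signal window to the gesture whose stored example it most resembles. The gesture labels, signal windows, and the nearest-neighbour rule are all assumptions for illustration; the actual system uses a learned model trained on random motions from several users rather than this simple distance comparison.

```python
# Hypothetical nearest-neighbour sketch of mapping signal patterns to
# gestures. Labels and windows are made up; the real system trains a
# machine-learning model on multi-user motion data.

def distance(a, b):
    """Squared Euclidean distance between two signal windows."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(window, examples):
    """Return the label of the stored example closest to `window`.

    examples: list of (label, signal_window) pairs.
    """
    return min(examples, key=lambda ex: distance(window, ex[1]))[0]

# Toy "training" examples: one averaged dR/R window per gesture.
examples = [
    ("pinch", [0.00, 0.02, 0.04, 0.02]),
    ("tap",   [0.00, 0.05, 0.00, 0.00]),
    ("fist",  [0.03, 0.06, 0.08, 0.07]),
]

print(classify([0.01, 0.04, 0.01, 0.00], examples))
```

A real deployment would replace the distance rule with a trained classifier, but the interface is the same: a window of conductivity changes goes in, a gesture label comes out.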
Originally spotted on The Eponymous Pickle.