Researchers are designing artificial limbs to be more sensational, with the emphasis on sensation. They have developed a language of touch that can be “felt” by computers and humans alike.
The engineers and students in the University of California, Los Angeles (UCLA) Biomechatronics Lab, led by mechanical engineer Veronica J. Santos, are building that language by quantifying touch: mechanical sensors interact with objects of various shapes, sizes and textures, and an array of instrumentation translates each interaction into data a computer can interpret.
This data is used to create an algorithm that gives the computer the ability to recognize patterns, matching a new sensation against the items in its library of experiences or flagging it as something it has never felt before. The research will help the team develop artificial haptic intelligence, which is, essentially, giving robots, as well as prostheses, the "human touch."
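To make the idea concrete, here is a minimal sketch of the kind of pattern matching described above: comparing a new sensor reading against a library of known objects and flagging anything too far from every stored example as novel. The object names, feature values, and distance threshold are all illustrative assumptions, not the lab's actual method or data.

```python
import math

# Hypothetical feature vectors (surface roughness, hardness) from
# touch-sensor readings; objects and values are illustrative only.
library = {
    "sponge":    (0.2, 0.1),
    "wood":      (0.6, 0.9),
    "sandpaper": (0.9, 0.7),
}

def identify(reading, library, threshold=0.3):
    """Return the closest known object for a sensor reading,
    or 'unknown' if nothing in the library is close enough."""
    best_name, best_dist = None, float("inf")
    for name, features in library.items():
        dist = math.dist(reading, features)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    # A reading far from every stored experience is treated as novel.
    if best_dist > threshold:
        return "unknown"
    return best_name

print(identify((0.25, 0.15), library))  # near the stored "sponge"
print(identify((0.10, 0.90), library))  # far from every known object
```

A real haptic system would use far richer features (vibration, slip, temperature) and a learned classifier rather than a fixed threshold, but the structure of the decision, match against experience or declare novelty, is the same.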
The research is supported by NSF award #1208519, NRI-Small: Context-Driven Haptic Inquiry of Objects Based on Task Requirements for Artificial Grasp and Manipulation. NRI stands for the National Robotics Initiative.