April 6, 2015

Giving robots and prostheses the human touch


Research engineers and students in the University of California, Los Angeles (UCLA) Biomechatronics Lab are designing artificial limbs to be more sensational, with the emphasis on sensation. With support from the National Science Foundation (NSF), the team, led by mechanical engineer Veronica J. Santos, is constructing a language of touch that both a computer and a human can understand.

The researchers quantify this language of touch with mechanical touch sensors that interact with objects of various shapes, sizes and textures. Using an array of instrumentation, Santos' team translates each interaction into data a computer can understand. From that data, the team builds algorithms that let the computer recognize patterns among the items in its library of experiences and distinguish them from objects it has never felt before. This research will help the team develop artificial haptic intelligence, which is, essentially, giving robots, as well as prostheses, the "human touch."
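To make the idea concrete, below is a minimal sketch of how a "library of experiences" might be used: each touched object is summarized as a feature vector derived from sensor data, and a new reading is either matched to its nearest known object or flagged as something never felt before. The feature names, example values and threshold are illustrative assumptions, not the lab's actual pipeline.

```python
# Minimal sketch: matching a tactile reading against a library of experiences
# and flagging novel objects. All data and thresholds here are assumed for
# illustration only.
import numpy as np

# Hypothetical library: each known object is a feature vector summarizing
# touch-sensor data (e.g., mean contact force, vibration energy, compliance).
library = {
    "foam_ball":  np.array([0.8, 0.1, 0.9]),
    "steel_cube": np.array([2.5, 0.6, 0.1]),
    "sponge":     np.array([0.5, 0.2, 0.95]),
}

NOVELTY_THRESHOLD = 0.5  # assumed cutoff on feature-space distance


def identify(sample: np.ndarray) -> str:
    """Return the closest known object, or 'unknown' if nothing is close enough."""
    best_name, best_dist = None, float("inf")
    for name, features in library.items():
        dist = np.linalg.norm(sample - features)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= NOVELTY_THRESHOLD else "unknown"


if __name__ == "__main__":
    print(identify(np.array([0.78, 0.12, 0.88])))  # matches "foam_ball"
    print(identify(np.array([5.0, 3.0, 0.0])))     # "unknown": never felt before
```

In practice, a system like this would use far richer tactile features and learned classifiers rather than a fixed distance threshold, but the core loop is the same: sense, compare with prior experience, and recognize when an object is new.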

Credit: National Science Foundation


Images and other media in the National Science Foundation Multimedia Gallery are available for use in print and electronic material by NSF employees, members of the media, university staff, teachers and the general public. All media in the gallery are intended for personal, educational and nonprofit/non-commercial use only.

Videos credited to the National Science Foundation, an agency of the U.S. Government, may be distributed freely. However, some materials within the videos may be copyrighted. If you would like to use portions of NSF-produced programs in another product, please contact the Video Team in the Office of Legislative and Public Affairs at the National Science Foundation.

Additional information about general usage can be found under Conditions.