
NSF Org: | CCF Division of Computing and Communication Foundations |
Recipient: | |
Initial Amendment Date: | February 28, 2023 |
Latest Amendment Date: | February 28, 2023 |
Award Number: | 2239764 |
Award Instrument: | Continuing Grant |
Program Manager: | Karl Wimmer, kwimmer@nsf.gov, (703) 292-2095, CCF Division of Computing and Communication Foundations, CSE Directorate for Computer and Information Science and Engineering |
Start Date: | March 1, 2023 |
End Date: | February 29, 2028 (Estimated) |
Total Intended Award Amount: | $522,702.00 |
Total Awarded Amount to Date: | $305,073.00 |
Funds Obligated to Date: | |
History of Investigator: | |
Recipient Sponsored Research Office: | 80 GEORGE ST MEDFORD MA US 02155-5519 (617)627-3696 |
Sponsor Congressional District: | |
Primary Place of Performance: | 169 HOLLAND ST FL 3 SOMERVILLE MA US 02144-2401 |
Primary Place of Performance Congressional District: | |
Unique Entity Identifier (UEI): | |
Parent UEI: | |
NSF Program(s): | FRR-Foundationl Rsrch Robotics |
Primary Program Source: | 01002627DB NSF RESEARCH & RELATED ACTIVIT; 01002728DB NSF RESEARCH & RELATED ACTIVIT |
Program Reference Code(s): | |
Program Element Code(s): | |
Award Agency Code: | 4900 |
Fund Agency Code: | 4900 |
Assistance Listing Number(s): | 47.041, 47.070 |
ABSTRACT
Advances in visual and non-visual sensing technologies (e.g., an artificial sense of touch) have enabled robots to greatly improve their object manipulation skills. Understanding how objects move, sound, and feel can improve human-robot collaboration in tasks such as assembling components in manufacturing environments or sorting objects in warehouses and distribution centers. However, object knowledge learned by one robot cannot easily be used by a different robot with a different body, sensors, and movement actions. In practice, this means that when a new robot is deployed, it has to learn many of its skills and much of its knowledge from scratch. This Faculty Early Career Development (CAREER) project will develop methods for transferring object knowledge across robots so that a newly deployed robot can make sense of the experiences of other robots that have operated in the same or similar environments. This project will facilitate the ability of collaborative robots in homes and workplaces to perceive and reason about the properties of objects. Robots in assistive settings will be better at learning tasks that require the sense of touch, for example, helping a disabled person take off their shoes. The project will also improve robots' ability to connect language to visual and non-visual perception, for example, helping robots recognize that a particular object can be referred to as "soft", which is important when humans and robots use language to communicate about objects.
Multisensory object knowledge includes recognition models that ground language in multiple sensory modalities (e.g., a classifier that recognizes whether an object is "soft" given haptic readings produced when pressing the object) as well as forward models that predict changes in the robot's environment as a result of its actions. The research objective of this project is to enable multiple heterogeneous robots to learn and share multisensory object knowledge, reducing the amount of interaction data each individual robot needs to collect. This project hypothesizes that two or more robots with different embodiments and sensors can learn to transfer multisensory representations through shared embedding spaces, to which robots map their own experiences and from which they learn using the experiences of other robots. This research will develop the theoretical framework for such transfer along with algorithms and representations that scale to large numbers of robots, sensory modalities, objects, and interaction behaviors. Experimental evaluation will initially be conducted using existing datasets, followed by new datasets of increasing complexity collected with multiple robotic platforms.
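As a purely illustrative sketch (not part of the award, with all model choices, dimensions, and names being hypothetical assumptions), one way to realize such a shared embedding space is to train per-robot encoders that map embodiment-specific sensor readings into a common latent space, align paired interactions with the same objects, and train a single language-grounding classifier (e.g., for the word "soft") in that space so it can be reused by a different robot through its own encoder.

```python
# Hypothetical sketch: per-robot encoders map heterogeneous haptic features into a
# shared embedding space; one property classifier ("soft" vs. not) is trained there
# and reused across robots. All dimensions and data below are synthetic placeholders.
import torch
import torch.nn as nn


class RobotEncoder(nn.Module):
    """Maps one robot's embodiment-specific sensor features to the shared space."""

    def __init__(self, input_dim: int, shared_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(),
            nn.Linear(128, shared_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


# Two robots with different sensor dimensionalities (hypothetical numbers).
encoder_a = RobotEncoder(input_dim=32)   # robot A's haptic feature vector
encoder_b = RobotEncoder(input_dim=80)   # robot B's haptic feature vector

# One classifier grounding the word "soft" in the shared 64-d space.
soft_classifier = nn.Linear(64, 1)


def alignment_loss(z_a: torch.Tensor, z_b: torch.Tensor) -> torch.Tensor:
    # Paired interactions with the same object should embed close together,
    # so knowledge learned from robot A's data transfers to robot B.
    return ((z_a - z_b) ** 2).mean()


# Toy training step on synthetic paired data (16 objects explored by both robots).
x_a = torch.randn(16, 32)
x_b = torch.randn(16, 80)
labels = torch.randint(0, 2, (16, 1)).float()   # is each object "soft"?

z_a, z_b = encoder_a(x_a), encoder_b(x_b)
bce = nn.BCEWithLogitsLoss()
loss = bce(soft_classifier(z_a), labels) + alignment_loss(z_a, z_b)

params = (list(encoder_a.parameters()) + list(encoder_b.parameters())
          + list(soft_classifier.parameters()))
optimizer = torch.optim.Adam(params, lr=1e-3)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# After training, robot B reuses the grounding classifier via its own encoder.
with torch.no_grad():
    predicted_soft = torch.sigmoid(soft_classifier(encoder_b(x_b)))
```

In this sketch the classification loss is computed only on robot A's embeddings, while the alignment term pulls robot B's embeddings of the same objects toward them; many other alignment objectives (e.g., contrastive losses) could serve the same role.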
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH