Award Abstract # 2239764
CAREER: Learning and Sharing Transferable Grounded Object Knowledge for Collaborative Robots

NSF Org: CCF, Division of Computing and Communication Foundations
Recipient: TRUSTEES OF TUFTS COLLEGE
Initial Amendment Date: February 28, 2023
Latest Amendment Date: February 28, 2023
Award Number: 2239764
Award Instrument: Continuing Grant
Program Manager: Karl Wimmer
  kwimmer@nsf.gov
  (703) 292-2095
  CCF, Division of Computing and Communication Foundations
  CSE, Directorate for Computer and Information Science and Engineering
Start Date: March 1, 2023
End Date: February 29, 2028 (Estimated)
Total Intended Award Amount: $522,702.00
Total Awarded Amount to Date: $305,073.00
Funds Obligated to Date: FY 2023 = $305,073.00
History of Investigator:
  • Jivko Sinapov (Principal Investigator)
    Jivko.Sinapov@tufts.edu
Recipient Sponsored Research Office: Tufts University
80 GEORGE ST
MEDFORD
MA  US  02155-5519
(617)627-3696
Sponsor Congressional District: 05
Primary Place of Performance: Tufts University
169 HOLLAND ST FL 3
SOMERVILLE
MA  US  02144-2401
Primary Place of Performance Congressional District: 07
Unique Entity Identifier (UEI): WL9FLBRVPJJ7
Parent UEI: WL9FLBRVPJJ7
NSF Program(s): FRR-Foundational Research in Robotics
Primary Program Source: 01002324DB NSF RESEARCH & RELATED ACTIVITIES
                        01002627DB NSF RESEARCH & RELATED ACTIVITIES
                        01002728DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 1045, 6840
Program Element Code(s): 144Y00
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.041, 47.070

ABSTRACT

Advances in visual and non-visual sensing technologies (e.g., an artificial sense of touch) have enabled robots to greatly improve their object manipulation skills. Understanding how objects move, how they sound, and what they feel like can improve human-robot collaboration in tasks such as assembling components in manufacturing environments or sorting objects in warehouses and distribution centers. However, object knowledge learned by one robot cannot easily be used by a different robot with a different body, different sensors, and different movement actions. In practice, this means that when a new robot is deployed, it has to learn many of its skills and much of its knowledge from scratch. This Faculty Early Career Development (CAREER) project will develop methods for transferring object knowledge across robots so that a newly deployed robot can make sense of the experiences of other robots that have operated in the same or similar environments. The project will help collaborative robots in homes and workplaces perceive and reason about the properties of objects. Robots in assistive settings will be better at learning tasks that require the sense of touch, for example, helping a disabled person take off their shoes. The project will also improve robots' ability to connect language to visual and non-visual perception, for example, helping robots recognize that a particular object can be referred to as "soft," which is important when humans and robots use language to communicate about objects.

Multisensory object knowledge includes recognition models that ground language in multiple sensory modalities (e.g., a classifier that recognizes whether an object is "soft" given the haptic readings produced when pressing it) as well as forward models, which predict changes in the robot's environment as a result of its actions. The research objective of this project is to enable multiple heterogeneous robots to learn and share multisensory object knowledge, reducing the amount of interaction data each individual robot needs to collect. The project hypothesizes that two or more robots with different embodiments and sensors can learn to transfer multisensory representations through shared embedding spaces, to which robots map their own experiences and from which they learn using the experiences of other robots. This research will develop the theoretical framework for such transfer, along with algorithms and representations that scale to large numbers of robots, sensory modalities, objects, and interaction behaviors. Experimental evaluation will initially use existing datasets, followed by new datasets of increasing complexity collected with multiple robotic platforms.
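The shared-embedding-space idea can be sketched in code. The following PyTorch snippet is a minimal, hypothetical illustration rather than the project's actual method: all dimensions, feature sizes, and the simple mean-squared alignment objective are assumptions. Two robots with differently sized haptic features encode paired interactions with the same objects into one shared space; an alignment loss pulls paired embeddings together, and a classifier grounding the word "soft" is trained using only the first robot's labels, then reused by the second robot.

import torch
import torch.nn as nn

SHARED_DIM = 32  # size of the shared embedding space (illustrative)

class Encoder(nn.Module):
    """Maps one robot's modality-specific features into the shared space."""
    def __init__(self, in_dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, SHARED_DIM)
        )

    def forward(self, x):
        return self.net(x)

# Hypothetical sensor suites: robot A produces 120-D haptic features,
# robot B produces 48-D haptic features for the same pressing behavior.
enc_a, enc_b = Encoder(120), Encoder(48)

# Paired experiences: both robots interacted with the same 256 objects
# (random stand-ins here; real data would come from robot interaction).
x_a, x_b = torch.randn(256, 120), torch.randn(256, 48)
labels_soft = torch.randint(0, 2, (256,)).float()  # 1 = object is "soft"

# Classifier that grounds "soft" in the shared space, not in either
# robot's raw sensor space.
clf = nn.Linear(SHARED_DIM, 1)
opt = torch.optim.Adam(
    list(enc_a.parameters()) + list(enc_b.parameters()) + list(clf.parameters()),
    lr=1e-3,
)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    z_a, z_b = enc_a(x_a), enc_b(x_b)
    # Alignment: embeddings of the same object from the two robots
    # should coincide in the shared space.
    align_loss = ((z_a - z_b) ** 2).mean()
    # Grounding: learn "soft" from robot A's labeled experience only.
    ground_loss = bce(clf(z_a).squeeze(1), labels_soft)
    loss = align_loss + ground_loss
    opt.zero_grad()
    loss.backward()
    opt.step()

# Robot B now reuses the "soft" classifier with no labels of its own.
with torch.no_grad():
    soft_prob_b = torch.sigmoid(clf(enc_b(x_b)).squeeze(1))

Under the same assumptions, a forward model could be handled analogously: map each robot's state-action experiences into the shared space and train a single predictor of action effects there, so that one robot's interaction data reduces what the others must collect.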

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Shukla, Y., Kesari, B., Goel, S., Wright, R., and Sinapov, J. "A Framework for Few-Shot Policy Transfer through Observation Mapping and Behavior Cloning." Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2023. https://doi.org/10.1109/IROS55552.2023.10342477
Tatiya, G., Francis, J., and Sinapov, J. "Transferring Implicit Knowledge of Non-Visual Object Properties Across Heterogeneous Robot Morphologies." IEEE International Conference on Robotics and Automation (ICRA), 2023. https://doi.org/10.1109/ICRA48891.2023.10160811
