Award Abstract # 2220868
Collaborative Research: Visual Tactile Neural Fields for Active Digital Twin Generation

NSF Org: CNS
Division of Computer and Network Systems
Recipient: TRUSTEES OF THE UNIVERSITY OF PENNSYLVANIA, THE
Initial Amendment Date: August 31, 2022
Latest Amendment Date: August 31, 2022
Award Number: 2220868
Award Instrument: Standard Grant
Program Manager: Ralph Wachter
rwachter@nsf.gov
 (703)292-8950
CNS
 Division of Computer and Network Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2022
End Date: September 30, 2025 (Estimated)
Total Intended Award Amount: $272,290.00
Total Awarded Amount to Date: $272,290.00
Funds Obligated to Date: FY 2022 = $272,290.00
History of Investigator:
  • Kostas Daniilidis (Principal Investigator)
    kostas@cis.upenn.edu
Recipient Sponsored Research Office: University of Pennsylvania
3451 WALNUT ST STE 440A
PHILADELPHIA
PA  US  19104-6205
(215)898-7293
Sponsor Congressional District: 03
Primary Place of Performance: University of Pennsylvania
Research Services
Philadelphia
PA  US  19104-6205
Primary Place of Performance Congressional District: 03
Unique Entity Identifier (UEI): GM1XX56LEP58
Parent UEI: GM1XX56LEP58
NSF Program(s): NRI-National Robotics Initiative
Primary Program Source: 01002223DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 6840
Program Element Code(s): 801300
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Robots will perform better at everyday activities when they can quickly combine their sensory data into a model of their environment, much as humans instinctively use all their senses and knowledge to accomplish daily tasks. Robots, however, must be explicitly programmed to build the models that humans form intuitively, effortlessly, and robustly. This robotics project explores a novel algorithmic approach that combines visual and tactile sensory data with knowledge of physics and a capacity to learn, making robot planning and reasoning more effective, efficient, and adaptable. The project includes the development and testing of research prototypes, preparation of new curriculum, and outreach to high school students, teachers, and the general public.

This project introduces a new data representation, called a Visual Tactile Neural Field (VTNF), that allows robots to combine data from visual and tactile sensors into a unified model of an object. The VTNF is designed to be used in a closed-loop manner: a robot may use data from its physical interactions with an object to create or improve a model, and may use its current understanding of that model to inform how best to interact with the physical object. Toward this end, the investigators create the mathematical techniques, computational tools, and robot hardware necessary to generate a VTNF model. The investigators also develop techniques to quantify uncertainty about an object and use this uncertainty to learn search policies that allow robots to generate accurate models as quickly as possible. The VTNF, which allows for the easy addition of new object properties, provides a flexible representational foundation that other researchers and practitioners can use to give robots a more detailed understanding of both the surrounding environment and their interactions with it, enabling faster learning.
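The closed loop described above can be sketched in a few lines. The toy example below is purely illustrative and is not the project's actual VTNF implementation: it uses a small random MLP as a stand-in "neural field" mapping 3D points to visual (RGB) and tactile (signed-distance) predictions, an ensemble as a stand-in for uncertainty quantification, and picks the next touch location where the ensemble disagrees most. All names, architecture choices, and the ensemble-variance heuristic are assumptions made for this sketch.

```python
# Toy sketch of a visual-tactile neural field with uncertainty-driven probing.
# NOT the project's VTNF: architecture and uncertainty measure are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def make_field(in_dim=3, hidden=32, out_dim=4):
    """Random two-layer MLP: 3D point -> (R, G, B, signed distance)."""
    W1 = rng.normal(0.0, 1.0 / np.sqrt(in_dim), (in_dim, hidden))
    W2 = rng.normal(0.0, 1.0 / np.sqrt(hidden), (hidden, out_dim))
    return lambda x: np.tanh(x @ W1) @ W2

# An ensemble of fields stands in for model uncertainty.
ensemble = [make_field() for _ in range(5)]

def predict(points):
    """Mean prediction and per-point ensemble variance (uncertainty proxy)."""
    preds = np.stack([f(points) for f in ensemble])   # shape (K, N, 4)
    return preds.mean(axis=0), preds.var(axis=0).sum(axis=-1)

# One step of the closed loop: among candidate probe locations, touch the
# point the model is least certain about, then (in a real system) use the
# resulting measurement to update the field.
candidates = rng.uniform(-1.0, 1.0, (100, 3))
mean_pred, uncertainty = predict(candidates)
next_touch = candidates[np.argmax(uncertainty)]
```

In a full system the random MLP would be a trained field, the ensemble would be replaced by a principled uncertainty estimate, and each probe's measurement would feed a training update before the next probe is selected.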

This project is supported by the cross-directorate Foundational Research program in Robotics and the National Robotics Initiative, jointly managed and funded by the Directorates for Engineering (ENG) and Computer and Information Science and Engineering (CISE).

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Jiang, Wen and Lei, Boshu and Daniilidis, Kostas. "FisherRF: Active View Selection and Mapping with Radiance Fields Using Fisher Information." 2024.
