
NSF Org: IIS Division of Information & Intelligent Systems
Initial Amendment Date: September 9, 2019
Latest Amendment Date: September 9, 2019
Award Number: 1925157
Award Instrument: Standard Grant
Program Manager: Cang Ye, cye@nsf.gov, (703) 292-4702, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2019
End Date: September 30, 2024 (Estimated)
Total Intended Award Amount: $750,000.00
Total Awarded Amount to Date: $750,000.00
Recipient Sponsored Research Office: 615 W 131st St, New York, NY 10027-7922, US, (212) 854-6851
Primary Place of Performance: 530 West 120th Street, New York, NY 10027-7922, US
NSF Program(s): NRI-National Robotics Initiative
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
This project studies robots that use nearby, readily available physical objects to perform tasks, such as building a bridge out of miscellaneous rubble in a disaster area. Resourceful robots have the potential to enable many new applications in emergency response, healthcare, and manufacturing, improving welfare, security, and efficiency for the population at large. The research investigates how patterns across multiple senses, such as vision, sound, and touch, can help a robot learn to solve interaction tasks without a human teacher, which is expected to improve the flexibility and versatility of autonomous robots. The project will provide research and educational opportunities for graduate and undergraduate students in computer science and mechanical engineering. Outcomes from this research will translate into new educational materials in computer vision, machine learning, and robotics.
This research investigates robots that interact with realistic environments in order to learn reusable representations for navigation and manipulation tasks. While there have been significant advances in leveraging machine learning for computer vision and robotics problems, a central challenge in both fields is generalizing to the realistic complexity and diversity of the physical world. Although simulation has proved instrumental in developing platforms for machine interaction, the unconstrained world is vast and computationally difficult to simulate. Instead, the investigators aim to capitalize on the inherent structure of physical environments, in particular the natural synchronization of modalities and context, to efficiently learn self-supervised representations and policies for interaction with unconstrained environments. The investigators also plan several evaluations to analyze the generalization capabilities of such algorithms.
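To make the idea of learning from the natural synchronization of modalities concrete, the sketch below shows one standard self-supervised recipe of this kind: a cross-modal contrastive (InfoNCE) objective in which time-synchronized vision and audio clips form positive pairs and other clips in the batch serve as negatives. This is a minimal illustrative example, not the project's actual code; the encoder architectures, feature dimensions, and random stand-in features are all assumptions.

```python
# Minimal sketch of cross-modal contrastive learning (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Maps one modality's features into a shared embedding space."""
    def __init__(self, in_dim: int, embed_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, embed_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.net(x), dim=-1)  # unit-length embeddings

def cross_modal_nce(z_vision: torch.Tensor, z_audio: torch.Tensor,
                    temperature: float = 0.07) -> torch.Tensor:
    """InfoNCE: the i-th vision clip should match the i-th audio clip."""
    logits = z_vision @ z_audio.t() / temperature   # (B, B) similarities
    targets = torch.arange(z_vision.size(0))        # diagonal = positives
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Toy training step on random stand-in features for synchronized clips.
vision_enc, audio_enc = Encoder(in_dim=512), Encoder(in_dim=128)
opt = torch.optim.Adam(list(vision_enc.parameters()) +
                       list(audio_enc.parameters()), lr=1e-4)
vision_feats = torch.randn(32, 512)   # e.g., pooled video features
audio_feats = torch.randn(32, 128)    # e.g., pooled audio features
loss = cross_modal_nce(vision_enc(vision_feats), audio_enc(audio_feats))
opt.zero_grad(); loss.backward(); opt.step()
print(f"contrastive loss: {loss.item():.3f}")
```

Because the supervision comes from synchronization itself, no human labels are required: any recorded interaction provides training pairs for free.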
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
This project demonstrated how robots can learn about their environment through direct physical interaction rather than relying on pre-programmed simulations. The project developed new approaches that allow robots to gain an understanding of physical properties and dynamics through hands-on experimentation, similar to how humans learn by interacting with the world around them.
A major achievement of this research was the development of PaperBot, an innovative system that learns to design and optimize tools made from paper through real-world trial and error. Unlike traditional approaches that depend heavily on computer simulation, PaperBot learns directly from physical experiments, using vision systems and force sensors to evaluate its designs. This represents an important advance in robotics, showing that robots can effectively learn complex physical properties through direct experience rather than theoretical models.
The research produced results in two challenging test cases. First, PaperBot learned to design and fold paper airplanes that flew farther than human-designed versions after just 100 trials, mastering complex aerodynamic principles through experimentation. Second, the system created paper-based grippers capable of carefully handling delicate objects such as fruit, demonstrating practical applications for fields like food processing and medical device handling.
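As an illustration of the kind of simulation-free, trial-and-error design optimization described above, the sketch below runs a cross-entropy-method search over a small vector of design parameters. This is not PaperBot's published algorithm; the parameterization is invented for illustration, and measure_flight_distance is a hypothetical stand-in for the real physical evaluation (fold the design, launch it, and measure the flight with a camera).

```python
# Minimal sketch: black-box optimization of a paper design from trials.
import numpy as np

rng = np.random.default_rng(0)
N_PARAMS = 4          # e.g., fold angles/positions; illustrative only

def measure_flight_distance(design: np.ndarray) -> float:
    """Hypothetical stand-in for a physical trial; in reality the robot
    would fold, throw, and measure the flight with a vision system."""
    target = np.array([0.6, -0.2, 0.4, 0.1])   # unknown "good" design
    return -np.sum((design - target) ** 2)      # higher is better

# Cross-entropy method: propose designs, keep the best, refit, repeat.
mean, std = np.zeros(N_PARAMS), np.ones(N_PARAMS)
for batch in range(10):                         # ~100 physical trials total
    designs = rng.normal(mean, std, size=(10, N_PARAMS))
    scores = np.array([measure_flight_distance(d) for d in designs])
    elite = designs[np.argsort(scores)[-3:]]    # top 3 designs this batch
    mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-3
    print(f"batch {batch}: best score {scores.max():.4f}")

print("optimized design parameters:", np.round(mean, 3))
```

The appeal of this style of search is that it treats the physical world as the evaluator, so no aerodynamic model or simulator needs to be built or trusted; its cost is that every candidate design consumes a real trial, which is why sample efficiency (here, roughly 100 trials) matters.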
The broader impacts of this research extend well beyond robotics. The project advances sustainable manufacturing by showing how recyclable materials like paper can be transformed into functional tools. The development of low-cost, customizable paper-based tools could improve healthcare accessibility, particularly in resource-limited settings. The research also promotes accessible technological development by using readily available materials and sharing findings openly. Together, these advances lay the groundwork for more adaptable and resourceful robotic systems that can learn from their environment and create custom solutions for specific tasks, with the potential to transform fields ranging from manufacturing and healthcare to environmental sustainability.
Last Modified: 02/03/2025
Modified by: Carl M Vondrick