Award Abstract # 2238968
CAREER: Robust and Collaborative Perception and Navigation for Construction Robots

NSF Org: IIS
Division of Information & Intelligent Systems
Recipient: NEW YORK UNIVERSITY
Initial Amendment Date: February 16, 2023
Latest Amendment Date: July 12, 2024
Award Number: 2238968
Award Instrument: Continuing Grant
Program Manager: Cang Ye
cye@nsf.gov
 (703)292-4702
IIS
 Division of Information & Intelligent Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: September 1, 2023
End Date: August 31, 2028 (Estimated)
Total Intended Award Amount: $599,999.00
Total Awarded Amount to Date: $599,999.00
Funds Obligated to Date: FY 2023 = $475,000.00
FY 2024 = $124,999.00
History of Investigator:
  • Chen Feng (Principal Investigator)
    cfeng@nyu.edu
Recipient Sponsored Research Office: New York University
70 WASHINGTON SQ S
NEW YORK
NY  US  10012-1019
(212)998-2121
Sponsor Congressional District: 10
Primary Place of Performance: New York University
70 WASHINGTON SQ S
NEW YORK
NY  US  10012-1019
Primary Place of Performance Congressional District: 10
Unique Entity Identifier (UEI): NX9PXMKW5KW8
Parent UEI:
NSF Program(s): FRR-Foundational Research in Robotics
Primary Program Source: 01002324DB NSF RESEARCH & RELATED ACTIVITIES
01002425DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 075Z, 1045, 6840, 9102
Program Element Code(s): 144Y00
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.041, 47.070

ABSTRACT

The construction industry is responsible for maintaining aging civil infrastructure and building new facilities that meet the social needs of the 21st century, while also addressing long-standing problems in occupational safety, labor productivity, costs, and labor shortages. A promising technical solution is to introduce mobile robots on construction jobsites, leveraging recent advances in robotics and artificial intelligence (AI) to tackle these challenges. However, unlike manufacturing automation or self-driving cars, construction robots face unique challenges: they must navigate dynamic environments, work closely with humans in a variety of tasks, and often handle heavy payloads. This award supports fundamental robotics research to enable better perception and navigation for construction jobsite monitoring robots. It will produce an intelligent mobile robot team equipped with cameras to autonomously monitor construction progress and operations, improving jobsite efficiency and safety. The results of this research will be widely applicable to scenarios beyond construction, ranging from connected and autonomous vehicles to service robotics in smart and accessible cities. The project will foster collaboration among robotics, artificial intelligence, and civil and mechanical engineering. Furthermore, it aims to broaden participation of underrepresented groups in engineering via educational games, a multi-disciplinary robotics curriculum, and workforce training workshops.

Mobile robots on construction jobsites are often limited by perception challenges due to occlusion and limited fields of view. In dynamic jobsites, limited perception leads to unreliable navigation and inefficient assistance. To improve the robustness, reliability, and scalability of the vision system in mobile robots, novel self-supervised and graph-based representation learning will be used to extract, organize, and reason about places and objects from high-dimensional sensory inputs. This research will advance the state of the art along three directions: (1) robust navigation from topological representations for monitoring in dynamic and ever-changing jobsites, (2) collaborative perception for providing safer operation monitoring and collision warnings on busy jobsites, and (3) integrated perception and navigation at the algorithm, system, and dataset levels. The research will be validated on real construction jobsites through industry partners, and the resulting software, hardware designs, and datasets will be open-sourced to stimulate future research.
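For illustration only, the sketch below shows one minimal way a topological place representation of the kind named in direction (1) could be organized: each node stores a learned feature embedding of a place, edges record traversability between places, and localization is nearest-neighbor matching in feature space rather than metric pose estimation. The class name, the 128-dimensional embeddings, and the cosine-similarity threshold are assumptions made for this example; they are not drawn from the project's actual methods or software.

    # Illustrative sketch only: a toy topological map with learned place embeddings.
    # All names, dimensions, and thresholds here are assumptions for illustration.
    from typing import Optional
    import numpy as np

    class TopologicalMap:
        """Graph of place nodes connected by traversability edges."""

        def __init__(self, match_threshold: float = 0.8):
            self.embeddings = []        # one unit-norm feature vector per place node
            self.edges = {}             # node id -> set of neighboring node ids
            self.match_threshold = match_threshold

        def add_place(self, embedding: np.ndarray, neighbor: Optional[int] = None) -> int:
            """Insert a new place node; optionally link it to an existing node."""
            node_id = len(self.embeddings)
            self.embeddings.append(embedding / np.linalg.norm(embedding))
            self.edges[node_id] = set()
            if neighbor is not None:
                self.edges[node_id].add(neighbor)
                self.edges[neighbor].add(node_id)
            return node_id

        def localize(self, embedding: np.ndarray) -> Optional[int]:
            """Return the best-matching node id, or None if nothing is similar enough."""
            if not self.embeddings:
                return None
            query = embedding / np.linalg.norm(embedding)
            sims = np.stack(self.embeddings) @ query   # cosine similarities
            best = int(np.argmax(sims))
            return best if sims[best] >= self.match_threshold else None

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        tmap = TopologicalMap()
        entrance = tmap.add_place(rng.normal(size=128))
        hallway = tmap.add_place(rng.normal(size=128), neighbor=entrance)
        # A revisit observation close to the "hallway" embedding should match it.
        revisit = tmap.embeddings[hallway] + 0.05 * rng.normal(size=128)
        print(tmap.localize(revisit))  # -> 1 (the hallway node)

In such a metric-free formulation, loop closure and goal-directed navigation reduce to matching and traversing nodes in the graph, which is one reason topological representations are attractive for cluttered, ever-changing jobsites.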

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Chen, Chao and Liu, Xinhao and Li, Yiming and Ding, Li and Feng, Chen "DeepMapping2: Self-Supervised Large-Scale LiDAR Map Optimization" IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023 https://doi.org/10.1109/CVPR52729.2023.00898
He, Yuhang and Fang, Irving and Li, Yiming and Shah, Rushi and Feng, Chen "Metric-Free Exploration for Topological Mapping by Task and Motion Imitation in Feature Space" Robotics: Science and Systems (RSS), 2023 https://doi.org/10.15607/RSS.2023.XIX.099
Li, Yiming and Fang, Qi and Bai, Jiamu and Chen, Siheng and Juefei-Xu, Felix and Feng, Chen "Among Us: Adversarially Robust Collaborative Perception by Consensus" IEEE/CVF International Conference on Computer Vision (ICCV), 2023 https://doi.org/10.1109/ICCV51070.2023.00024
Su, Sanbao and Han, Songyang and Li, Yiming and Zhang, Zhili and Feng, Chen and Ding, Caiwen and Miao, Fei "Collaborative Multi-Object Tracking With Conformal Uncertainty Propagation" IEEE Robotics and Automation Letters, v.9, 2024 https://doi.org/10.1109/LRA.2024.3364450
Zhang, Ruixuan and Han, Wenyu and Bian, Zilin and Ozbay, Kaan and Feng, Chen "Learning When to See for Long-Term Traffic Data Collection on Power-Constrained Devices" IEEE International Conference on Intelligent Transportation Systems (ITSC), 2023 https://doi.org/10.1109/ITSC57777.2023.10422352

