
NSF Org: CNS Division of Computer and Network Systems
Recipient:
Initial Amendment Date: June 29, 2023
Latest Amendment Date: June 29, 2023
Award Number: 2235231
Award Instrument: Standard Grant
Program Manager: Pavithra Prabhakar, pprabhak@nsf.gov, (703) 292-2585, CNS Division of Computer and Network Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: July 1, 2023
End Date: June 30, 2026 (Estimated)
Total Intended Award Amount: $700,000.00
Total Awarded Amount to Date: $700,000.00
Funds Obligated to Date:
History of Investigator:
Recipient Sponsored Research Office: 426 Auditorium Rd Rm 2, East Lansing, MI, US 48824-2600, (517) 355-5040
Sponsor Congressional District:
Primary Place of Performance: 428 South Shaw Lane, East Lansing, MI, US 48824-1226
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): CPS-Cyber-Physical Systems
Primary Program Source:
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
Autonomous driving is on the verge of revolutionizing the transportation system and significantly improving people's well-being. An autonomous vehicle relies on multiple sensors and AI algorithms for the sensing and perception needed to navigate the world. While the automotive industry has focused primarily on increasing autonomy levels and enhancing perception performance in largely benign environments, the security and safety of perception technologies against physical attacks have yet to be thoroughly investigated. Specifically, adversaries creating physical-world perceptual illusions may pose a significant threat to the sensing and learning systems of autonomous vehicles, potentially undermining trust in these systems. This research project aims to deepen our understanding of the security and safety risks posed by physical attacks, and endeavors to bolster the resilience of sensing and learning in autonomous driving against malicious perceptual-illusion attacks. The success of the project will significantly advance the security and safety of autonomous driving in the face of emerging physical-world threats, paving the way for the safe deployment of autonomous vehicles in next-generation transportation systems.
The goal of this project is to investigate advanced sensing and learning technologies to enhance the precision and robustness of autonomous driving in intricate and hostile environments. The team's approach includes: (i) a comprehensive framework to evaluate key vulnerabilities in software/hardware components of autonomous driving systems and devise effective attack vectors for generating false and deceptive perceptions; (ii) a real-time super-resolution radar sensing technology and a data fusion approach that integrates features from various sensor types at both the middle and late stages to effectively bolster the robustness of each sensing modality against illusions; and (iii) a systematic framework to enhance algorithmic generality and achieve robust perception against multi-modal attacks using multi-view representation learning. The proposed solutions will undergo rigorous testing in simulations and experiments to validate their effectiveness and robustness. These solutions contribute to the development of more secure and robust autonomous driving systems, capable of withstanding perceptual-illusion attacks in real-world scenarios. The project will also offer research training opportunities for underrepresented students across diverse levels and age groups. The resulting technology will be released as open source for broader dissemination and advancement of the knowledge developed through this project.
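The middle- and late-stage fusion idea in item (ii) can be sketched in simplified form. In this hypothetical illustration (not the project's actual method; all names, scores, and thresholds are invented for exposition), middle fusion concatenates per-sensor feature vectors into a joint representation, while late fusion averages independent per-sensor detection scores; requiring both stages to agree makes it harder for a spoofed single modality to force a detection.

```python
# Illustrative sketch only: toy middle/late sensor fusion, not the
# award's implementation. Sensor names, features, and thresholds are
# hypothetical.
from typing import Dict, List


def extract_features(readings: List[float]) -> List[float]:
    """Toy per-sensor feature extractor: mean and peak of the raw signal."""
    mean = sum(readings) / len(readings)
    peak = max(readings)
    return [mean, peak]


def middle_fusion(sensor_feats: Dict[str, List[float]]) -> List[float]:
    """Middle-stage fusion: concatenate per-sensor feature vectors."""
    joint: List[float] = []
    for name in sorted(sensor_feats):  # fixed order keeps the layout stable
        joint.extend(sensor_feats[name])
    return joint


def late_fusion(sensor_scores: Dict[str, float]) -> float:
    """Late-stage fusion: average independent per-sensor detection scores."""
    return sum(sensor_scores.values()) / len(sensor_scores)


def detect(raw: Dict[str, List[float]], threshold: float = 0.5) -> bool:
    """Report an object only if the joint (middle-fused) features and the
    averaged (late-fused) per-sensor scores both clear the threshold, so a
    spoofed single modality is less likely to force a false detection."""
    feats = {name: extract_features(r) for name, r in raw.items()}
    joint = middle_fusion(feats)
    joint_score = min(1.0, sum(joint) / len(joint))
    per_sensor = {name: min(1.0, f[1]) for name, f in feats.items()}
    return joint_score >= threshold and late_fusion(per_sensor) >= threshold
```

For example, consistent camera and radar evidence yields a detection, whereas a high camera signal paired with a silent radar is rejected by both stages.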
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH