Award Abstract # 2131511
Collaborative Research: DASS: Accountable Software Systems for Safety-Critical Applications

NSF Org: CCF
Division of Computing and Communication Foundations
Recipient: RECTOR & VISITORS OF THE UNIVERSITY OF VIRGINIA
Initial Amendment Date: August 30, 2021
Latest Amendment Date: August 30, 2021
Award Number: 2131511
Award Instrument: Standard Grant
Program Manager: Anindya Banerjee
abanerje@nsf.gov
 (703)292-7885
CCF
 Division of Computing and Communication Foundations
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2021
End Date: September 30, 2025 (Estimated)
Total Intended Award Amount: $374,876.00
Total Awarded Amount to Date: $374,876.00
Funds Obligated to Date: FY 2021 = $374,876.00
History of Investigator:
  • Lu Feng (Principal Investigator)
    lf9u@virginia.edu
Recipient Sponsored Research Office: University of Virginia Main Campus
1001 EMMET ST N
CHARLOTTESVILLE
VA  US  22903-4833
(434)924-4270
Sponsor Congressional District: 05
Primary Place of Performance: University of Virginia Main Campus
151 Engineer's Way
Charlottesville
VA  US  22904-4259
Primary Place of Performance Congressional District: 05
Unique Entity Identifier (UEI): JJG6HU8PA4S5
Parent UEI:
NSF Program(s): DASS-Dsgng Accntble SW Systms
Primary Program Source: 01002122DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s): 175Y00
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Safety-critical software systems are entering the market in large numbers and are expected to transform many industries, including healthcare, transportation, and manufacturing. In response to the rising societal impact and complexity of software systems, lawmakers and regulatory authorities are implementing new laws, regulations, and guidelines to hold software accountable for its harmful effects. These legal approaches differ across jurisdictions and across application domains, and they will evolve over time as lawmakers and regulators continue to study and address emerging software capabilities. Despite this mounting regulatory pressure, state-of-the-art software-design methodologies fall short of providing the desired accountability in safety-critical systems. The project's novelties are twofold: (1) developing principled approaches and tools for assuring and demonstrating accountability of safety-critical software systems with respect to laws and regulations that evolve over time, and (2) advancing a legal framework that harmonizes regulatory oversight of software systems across heterogeneous safety-critical domains. The project's impacts are facilitating the design of safety-critical software systems that are accountable with respect to various regulations, and providing legal insight on how to extend or amend current regulatory approaches to enhance software accountability. In addition, the investigators will organize a series of interdisciplinary workshops and symposia to bring together experts in software design and law to discuss open research questions and potential solutions to software accountability. The investigators also plan to develop new course materials in computer science and law that integrate the proposed research outcomes, and to actively recruit underrepresented students for positions on the project.

The project includes three research thrusts that seek to make fundamental contributions to both software design and law. The first thrust creates novel approaches and tools for developing compositional dynamic assurance cases throughout the software development lifecycle to assure and demonstrate accountability. The second thrust develops novel formal-verification techniques for generating provable and certifiable regulation compliance guarantees, which can be used as evidence in assurance cases. The third thrust develops legal insight on how lawmakers and regulators should extend or amend current regulatory approaches to incorporate advances in software accountability methods.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Maryam Bagheri, Josephine Lamp, Xugui Zhou, Lu Feng, and Homa Alemzadeh. "Towards Developing Safety Assurance Cases for Learning-Enabled Medical Cyber-Physical Systems." Proceedings of the Workshop on Artificial Intelligence Safety 2023 (SafeAI 2023), co-located with the Thirty-Seventh AAAI Conference on Artificial Intelligence (AAAI 2023), 2023.
