Archived funding opportunity: This solicitation is archived.

NSF 19-566: Real-Time Machine Learning (RTML)

Program Solicitation

Document Information

Document History

  • Posted: March 8, 2019

Program Solicitation NSF 19-566

National Science Foundation

Directorate for Computer and Information Science and Engineering
     Division of Computing and Communication Foundations

Directorate for Engineering
     Division of Electrical, Communications and Cyber Systems

Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

     June 06, 2019

Important Information And Revision Notes

Any proposal submitted in response to this solicitation should be submitted in accordance with the revised NSF Proposal & Award Policies & Procedures Guide (PAPPG) (NSF 19-1), which is effective for proposals submitted, or due, on or after February 25, 2019.

Summary Of Program Requirements

General Information

Program Title:

Real-Time Machine Learning (RTML)

Synopsis of Program:

A grand challenge in computing is the creation of machines that can proactively interpret and learn from data in real time, solve unfamiliar problems using what they have learned, and operate with the energy efficiency of the human brain. While complex machine-learning algorithms and advanced electronic hardware (henceforth referred to as 'hardware') that can support large-scale learning have been realized in recent years and support applications such as speech recognition and computer vision, emerging computing challenges require real-time learning, prediction, and automated decision-making in diverse domains such as autonomous vehicles, military applications, healthcare informatics and business analytics.

A salient feature of these emerging domains is the large and continuously streaming data sets that these applications generate, which must be processed efficiently enough to support real-time learning and decision making based on these data. This challenge requires novel hardware techniques and machine-learning architectures. This solicitation seeks to lay the foundation for next-generation co-design of RTML algorithms and hardware, with the principal focus on developing novel hardware architectures and learning algorithms in which all stages of training (including incremental training, hyperparameter estimation, and deployment) can be performed in real time.

The National Science Foundation (NSF) and the Defense Advanced Research Projects Agency (DARPA) are teaming up through this Real-Time Machine Learning (RTML) program to explore high-performance, energy-efficient hardware and machine-learning architectures that can learn from a continuous stream of new data in real time, through opportunities for post-award collaboration between researchers supported by DARPA and NSF.

Cognizant Program Officer(s):

Please note that the following information is current at the time of publishing. See program website for any updates to the points of contact.

Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s):

  • 47.041 --- Engineering
  • 47.070 --- Computer and Information Science and Engineering

Award Information

Anticipated Type of Award: Continuing Grant

Estimated Number of Awards: 8 to 12

Anticipated Funding Amount: $10,000,000

Award size: Small Awards: up to $500,000 for 3 years; Large Awards: up to $1,500,000 for 3 years.

Estimated program budget, number of awards and average award size/duration are subject to the availability of funds.

Eligibility Information

Who May Submit Proposals:

Proposals may only be submitted by the following:

  • Institutions of Higher Education (IHEs) - Two- and four-year IHEs (including community colleges) accredited in, and having a campus located in the US, acting on behalf of their faculty members. Special Instructions for International Branch Campuses of US IHEs: If the proposal includes funding to be provided to an international branch campus of a US institution of higher education (including through use of subawards and consultant arrangements), the proposer must explain the benefit(s) to the project of performance at the international branch campus, and justify why the project activities cannot be performed at the US campus.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization:

There are no restrictions or limits.

Limit on Number of Proposals per PI or Co-PI: 2

An individual can participate as PI, co-PI, Senior Personnel, or Consultant on no more than two proposals submitted in response to this solicitation.

These eligibility constraints will be strictly enforced in order to ensure fair and consistent treatment for everyone. In the event that an individual exceeds the two-proposal limit for this solicitation, the first two proposals received will be accepted and the remainder will be returned without review. No exceptions will be made.

Additionally, proposals submitted in response to this solicitation may not duplicate or be substantially similar to other proposals concurrently under consideration by DARPA. Duplicate or substantially similar proposals will be returned without review.

Proposal Preparation and Submission Instructions

A. Proposal Preparation Instructions

  • Letters of Intent: Not required
  • Preliminary Proposal Submission: Not required
  • Full Proposals:

B. Budgetary Information

  • Cost Sharing Requirements:

    Inclusion of voluntary committed cost sharing is prohibited.

  • Indirect Cost (F&A) Limitations:

    Not Applicable

  • Other Budgetary Limitations:

    Other budgetary limitations apply. Please see the full text of this solicitation for further information.

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

    June 06, 2019

Proposal Review Information Criteria

Merit Review Criteria:

National Science Board approved criteria. Additional merit review considerations apply. Please see the full text of this solicitation for further information.

Award Administration Information

Award Conditions:

Additional award conditions apply. Please see the full text of this solicitation for further information.

Reporting Requirements:

Standard NSF reporting requirements apply.

I. Introduction

A grand challenge in computing is the creation of machines that can proactively interpret and learn from data in real time, solve unfamiliar problems using what they have learned, and operate with the energy efficiency of the human brain. The exponential increase in hardware performance, the development of complex machine-learning (ML) algorithms, and their realization using high-performance electronic hardware (henceforth referred to as 'hardware') have facilitated learning from very large data sets in recent years. While older problems of artificial intelligence (AI), such as speech recognition and language translation, have already made their way into everyday use (e.g., in smartphones), current and future computing challenges must be addressed to enable real-time prediction and automated decision-making in diverse domains such as autonomous vehicles, military applications, healthcare informatics and business analytics.

A salient feature of these emerging domains is the large and continuously streaming data sets that these applications generate, which must be processed efficiently enough to support real-time learning and decision making based on these data. This challenge requires novel hardware techniques and machine-learning architectures. NSF and DARPA are teaming up through this Real-Time Machine Learning (RTML) program to explore high-performance, energy-efficient hardware and ML architectures that can learn from a continuous stream of new data in real time.

II. Program Description

The need to process the large data sets arising in many practical problems that require real-time learning from data streams makes high-performance hardware necessary, and yet the very nature of these problems, along with currently known algorithms for addressing them, imposes significant hardware challenges. Current versions of deep-learning algorithms rely on millions of parameters whose optimal values must be determined in real time on high-performance hardware to achieve good performance.

Conversely, the availability of fast hardware implementations can enable fuller use of Bayesian techniques, attractive for their ability to quantify prediction uncertainty and thus give estimates of reliability and prediction breakdown. The abilities of ML systems to self-assess for reliability and predict their own breakdowns (and also recover without significant ill effects) constitute critical areas for algorithm development as autonomous systems become widely deployed in both decision support and embodied AI agents. Only with attention to these challenges can we construct systems that are robust when they encounter novel situations or degradation and failure of sensors.
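
As a purely illustrative sketch (the data, settings, and names below are assumptions made for this example, not program requirements), the following Python fragment shows the kind of self-assessment a Bayesian model can provide: closed-form Bayesian linear regression returns a predictive variance alongside each prediction, which a real-time system could monitor to flag inputs on which its output is unreliable.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: y = 2x - 1 + noise, observed only on part of the input range.
    X = np.c_[np.ones(50), rng.uniform(0.0, 1.0, 50)]
    y = X @ np.array([-1.0, 2.0]) + rng.normal(scale=0.1, size=50)

    alpha, beta = 1.0, 100.0                                 # prior precision, noise precision
    S = np.linalg.inv(alpha * np.eye(2) + beta * X.T @ X)    # posterior covariance
    m = beta * S @ X.T @ y                                   # posterior mean of the weights

    def predict(x_new):
        """Return the predictive mean and variance at a new input."""
        phi = np.array([1.0, x_new])
        return phi @ m, 1.0 / beta + phi @ S @ phi

    for x_new in (0.5, 3.0):                                 # inside vs. far outside the training range
        mean, var = predict(x_new)
        print(f"x = {x_new}: prediction {mean:.2f} +/- {np.sqrt(var):.2f}")

The widening predictive uncertainty far from the training data is exactly the signal a self-assessing system could use to detect impending prediction breakdown.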

While ML algorithms need to account for the capabilities of hardware as well as real-time learning and inference constraints, hardware also needs to be redesigned from the ground up to optimize ML architectures. To address this dual challenge, NSF and DARPA are teaming up to explore advances in energy-efficient hardware and ML architectures that can learn from a continuous stream of new data in real time. While this NSF program, called Real-Time Machine Learning (RTML), is distinct from the DARPA RTML program, the NSF program offers collaboration opportunities to awardees from DARPA (and DARPA offers similar opportunities to the NSF awardees) throughout the duration of their projects, as described in Section II.D.

II.A. Machine Learning:

As part of this program, various ML paradigms and architectures (including deep-learning) that can support real-time inference and rapid learning are of interest. These include, but are not limited to:

  1. feed forward neural networks (including convolutional nets);
  2. recurrent networks and specialized versions (e.g., liquid-state machines);
  3. neuroscience-inspired architectures, such as spike time-dependent neural nets including their stochastic counterparts;
  4. non-neural ML architectures inspired by psychophysics and derived from classical statistical methods;
  5. classical supervised learning (e.g., regression and decision trees and related ensemble techniques);
  6. unsupervised learning (e.g., clustering and manifold learning) approaches;
  7. semi-supervised learning methods;
  8. adversarial learning; and
  9. improved transfer learning, reinforcement learning, and one-shot learning algorithms and architectures appropriate for hardware implementation.

Centralized learning from aggregated data over time in a cloud environment often does not lend itself to real-time inference and adaptation to new, unlabeled datasets. These situations are often found in distributed settings such as autonomous vehicles, arrays of sensors, and adversarial settings where resources for exporting the newly-encountered data might be scarce or unavailable. In these cases, one can expect that data are being collected and processed by distributed arrays of sensors with some limited learning capabilities, though the communication among these sensors and to a centralized cloud could be highly constrained. Therefore, approaches to RTML in a distributed setting that can closely approximate ML performance in a centralized cloud setting are highly desired in this program.
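
One illustrative way to approximate centralized learning under such communication constraints is federated averaging, sketched below in Python; the node count, learning rate, and the local_stream helper are assumptions made only for this example, and this is not a prescribed approach. Each node takes a few local gradient steps on its own data stream, and only the small weight vectors are periodically exchanged and averaged.

    import numpy as np

    rng = np.random.default_rng(0)
    true_w = np.array([1.5, -2.0, 0.5])

    def local_stream(n=32):
        """One sensor's freshly observed batch of (x, y) pairs (synthetic here)."""
        X = rng.normal(size=(n, 3))
        return X, X @ true_w + rng.normal(scale=0.1, size=n)

    n_nodes, rounds, lr = 4, 50, 0.05
    w = np.zeros(3)                                 # shared model

    for _ in range(rounds):                         # one constrained communication round
        local_models = []
        for _ in range(n_nodes):
            w_i = w.copy()
            X, y = local_stream()
            for _ in range(5):                      # a few local gradient steps per round
                w_i -= lr * 2.0 * X.T @ (X @ w_i - y) / len(y)
            local_models.append(w_i)
        w = np.mean(local_models, axis=0)           # only the small weight vectors are exchanged

    print("learned:", np.round(w, 2), "target:", true_w)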

II.B. Hardware Design and Realization:

The race to improve and accelerate machine learning through hardware realizations has been largely driven by several emerging hardware technologies that require significant additional research, development, and evaluation activities. Radical innovations spanning multiple layers of the design stack from device to circuit to architecture levels and involving both memory and switching elements are pertinent. At the circuit level, while the memristor has been reinvented as the fourth circuit element suitable for neural modeling, physical stochasticity inherent in some emerging devices can be used to realize spike-timing behavior of neurons. Photonic and spintronic architectures are other promising examples that have been explored for architecting deep-learning machines. At a higher level of the stack, in- or near-memory processing, possibly in a three-dimensional integrated circuit architecture, provides an example of non-von Neumann computing relevant for solving challenging AI problems as well.
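
As a software-level illustration only (the solicitation does not prescribe any particular device model, and the level count and noise scale below are assumptions), the sketch that follows approximates an in-memory analog crossbar multiply-accumulate by mapping a weight matrix onto a small number of conductance levels and adding read-out noise.

    import numpy as np

    rng = np.random.default_rng(0)

    def crossbar_matvec(W, x, levels=16, noise_std=0.01):
        """Model W @ x as an analog crossbar would compute it: quantize the weights
        to a few conductance levels, then add Gaussian read-out noise."""
        step = 2.0 * np.abs(W).max() / (levels - 1)
        G = np.round(W / step) * step               # quantized conductance map
        y = G @ x                                   # ideal analog multiply-accumulate
        return y + rng.normal(scale=noise_std * np.abs(y).max(), size=y.shape)

    W = rng.normal(size=(8, 16))
    x = rng.normal(size=16)
    print("digital:       ", np.round(W @ x, 2)[:4])
    print("crossbar model:", np.round(crossbar_matvec(W, x), 2)[:4])

In this framing, co-design amounts to choosing learning algorithms whose accuracy degrades gracefully under precisely these quantization and noise constraints.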

Several of the neuronal models of computation currently being explored are not purely digital but are also analog or mixed-signal. Several emerging technologies lend themselves more naturally to analog/mixed-signal realizations. Attempts to solve computationally hard, purely combinatorial problems by reformulating the discrete problem as a continuous-time analog dynamical system are also underway. Such systems that are naturally realized in analog/mixed-signal hardware show much promise for faster run time and/or significantly less energy consumption for problems inspired by RTML.
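
The following minimal sketch (an illustration only, not a statement of the program's expected approach; the graph size and time constants are arbitrary) shows the flavor of such a reformulation: a continuous-time, Hopfield-style dynamical system relaxes toward a low-energy spin configuration of a small MAX-CUT instance, the kind of discrete combinatorial problem referenced above.

    import numpy as np

    rng = np.random.default_rng(0)

    # Random undirected graph (symmetric adjacency matrix, zero diagonal).
    n = 12
    A = np.triu(rng.integers(0, 2, size=(n, n)), 1)
    A = A + A.T

    u = rng.normal(scale=0.1, size=n)               # analog internal state
    dt, tau = 0.05, 1.0

    # Continuous-time relaxation: drives tanh(u) toward a configuration in which
    # neighboring spins disagree, i.e., toward a large cut.
    for _ in range(2000):
        v = np.tanh(u)
        u += dt * (-u / tau - A @ v)

    spins = np.where(u >= 0, 1, -1)
    cut = np.sum(A[spins[:, None] != spins[None, :]]) // 2
    print("edges in cut:", int(cut), "of", int(A.sum() // 2))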

Each alternative hardware paradigm has its unique constraints that can be satisfied in multiple ways by exploiting flexibility in algorithm design, thus giving rise to possibilities of hardware-software-algorithm co-design. This possibility is not just for hardware realizations using emerging technologies but can also be envisioned in conventional silicon CMOS platforms. For example, while the implementation of some deep-learning algorithms is practically infeasible within the limits of current CMOS technology, approximate versions of such algorithms are being explored as suitable for practical realization. Such research could inspire new algorithmic innovations from a foundational or analytical perspective, or could alternatively be driven by purely empirical or practical considerations. Both foundational and operational innovations in hardware-software-algorithm co-design of RTML architectures are of interest to this program.
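
As one concrete, purely illustrative instance of such an approximation (the bit width, scale, and synthetic data are assumptions chosen only for this example), the sketch below trains a logistic-regression classifier with gradients rounded to a 4-bit fixed-point grid before each update, the kind of accuracy/efficiency trade-off that hardware-software-algorithm co-design would navigate.

    import numpy as np

    rng = np.random.default_rng(0)

    def quantize(g, bits=4, scale=1.0):
        """Round a gradient vector to a signed fixed-point grid of the given width."""
        levels = 2 ** (bits - 1) - 1
        return np.clip(np.round(g / scale * levels), -levels, levels) * scale / levels

    # Toy binary classification data.
    X = rng.normal(size=(200, 5))
    y = (X @ np.array([1.0, -1.0, 0.5, 0.0, 2.0]) > 0).astype(float)

    w = np.zeros(5)
    for _ in range(300):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))          # logistic prediction
        grad = X.T @ (p - y) / len(y)               # full-precision gradient
        w -= 0.5 * quantize(grad)                   # update with the 4-bit gradient

    accuracy = np.mean(((X @ w) > 0) == (y > 0.5))
    print("training accuracy with 4-bit gradient updates:", round(float(accuracy), 3))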

The overall expectation of this program is to lay the foundation for next-generation co-design of RTML algorithms and hardware. Some program guidelines include:

  1. Learning algorithms in which all stages of training (including incremental training, hyperparameter estimation, and deployment) can be performed in real time will receive higher priority, recognizing the asymmetry in real-time learning versus real-time inference;
  2. Proposals submitted to this program should seek to demonstrate radical improvement in the metrics of performance, e.g., latency and energy efficiency, with minimal degradation in predictive performance, through hardware-software co-design;
  3. Hardware-software cross-layer co-design is required;
  4. Approximate algorithms for efficient implementation, e.g., low-precision gradient computation and sparsely-connected neuronal nodes, are within scope;
  5. Approaches to self-assessing systems are within scope;
  6. Distributed ML algorithms and hardware for real-time performance are within scope;
  7. Efficient and novel utilization of data and memory paths, e.g., in- or near-memory computations, are within scope;
  8. In addition to digital architecture, the program is also interested in analog/mixed-signal architectures; and
  9. Hardware technologies in silicon or other novel technologies, possibly on heterogeneous platforms, to the extent that is consistent with the goals of the DARPA RTML program, are encouraged.

Proposers should be aware that routine implementations of existing AI/ML algorithms in standard hardware are not within scope of this program.

II.C. Classes of Projects:

Proposals for the following two classes of projects will be accepted. Each proposer is expected to explain in the Project Description how the project fits within the selected category in terms of its scope and goals.

  • Small Projects: Small projects may be requested with total budgets of up to $500,000 for a period of up to three years. They are intended to support exploration of emerging and innovative ideas with substantial potential for impact. Proposals for Small projects are required to clearly describe the design and realization of the proposed RTML approach. The physical implementation is optional, but a roadmap of future development, e.g., in Field-Programmable Gate Arrays (FPGA), or in Application-Specific Integrated Circuits (ASIC) if pertinent, should be discussed. Small projects are not eligible for partnership supplements resulting from the DARPA collaboration (see details in Section II.D).
  • Large Projects: Large projects may be requested with total budgets up to $1,500,000 for a period of three years. They are intended to support multi-disciplinary efforts spanning ML, circuit, and hardware-software-algorithm co-design that accomplish clear goals requiring an integrated perspective spanning the disciplines. Proposals for Large projects are required to cover the design of the proposed RTML approach and the physical implementation in FPGA or ASIC. Collaborative teaming demonstrating appropriate multi-disciplinary expertise is required for Large projects. Large projects are also eligible for requesting partnership supplements resulting from the DARPA collaboration (see details in Section II.D).

II.D Structure of DARPA Collaboration:

The NSF-DARPA collaboration for this program seeks to enable cross-pollination of ideas that are being funded through the awards individually made by NSF and DARPA. The DARPA program will select project teams from submissions to its Broad Agency Announcement, and will award Phase 1 (18 months) and Phase 2 (18 months, for a total of 36 months including Phase 1) teams. NSF will independently select projects for 36-month awards. The DARPA Phase 1 objective is a RTML hardware silicon compiler, and the outcome will be made available by DARPA to the NSF awardees as an option to evaluate their proposed new RTML approaches. In the meantime, new techniques and results produced by NSF awardees during the first 18 months will be made available to DARPA project teams for them to implement in their Phase 2 efforts to explore novel ML architectures and circuits that will enable RTML. There will be four joint NSF-DARPA workshops during the 36-month program: at the initial program kick-off, and then at the 9-month, 18-month, and 27-month marks. Representatives of each NSF project are required to attend all four workshops to engage with DARPA project teams. These joint workshops are expected to promote knowledge-sharing and collaboration opportunities among the teams supported by both agencies. In particular, the first three workshops at the program kick-off and then the 9-month and 18-month marks (and the time in between these engagements) are critical to the expected synergy in the second half of the program.

Before starting Phase 2 work, DARPA performer teams are expected to synchronize expectations with the NSF RTML program to ensure that the latest techniques that are being produced by NSF awardees are being tried. As an option, DARPA performer teams can propose inclusion of researchers working in the NSF RTML awards as part of their DARPA Phase 2 efforts.

Any DARPA Phase 1 performers who do not qualify for Phase 2 support from DARPA can work with NSF awardees under this program to request supplemental funding from NSF through an existing NSF RTML Large award. Each such "partnership supplement" will be requested via an NSF awardee near the 18-month mark from the start date of their NSF project. To qualify for this supplemental funding to support the new project partner(s), the NSF RTML awardee will have to demonstrate that a collaboration with the DARPA performers will add value to the NSF project, show alignment with and enhancement of the goals of the NSF project, and demonstrate sufficient intellectual merit.

III. Award Information

Anticipated Type of Award: Continuing Grant

Estimated Number of Awards: 8 to 12

Anticipated Funding Amount: $10,000,000

Award size: Small Awards: up to $500,000 for 3 years; Large Awards: up to $1,500,000 for 3 years.

Estimated program budget, number of awards and average award size/duration are subject to the availability of funds.

IV. Eligibility Information

Who May Submit Proposals:

Proposals may only be submitted by the following:

  • Institutions of Higher Education (IHEs) - Two- and four-year IHEs (including community colleges) accredited in, and having a campus located in the US, acting on behalf of their faculty members. Special Instructions for International Branch Campuses of US IHEs: If the proposal includes funding to be provided to an international branch campus of a US institution of higher education (including through use of subawards and consultant arrangements), the proposer must explain the benefit(s) to the project of performance at the international branch campus, and justify why the project activities cannot be performed at the US campus.

Who May Serve as PI:

There are no restrictions or limits.

Limit on Number of Proposals per Organization:

There are no restrictions or limits.

Limit on Number of Proposals per PI or Co-PI: 2

An individual can participate as PI, co-PI, Senior Personnel, or Consultant on no more than two proposals submitted in response to this solicitation.

These eligibility constraints will be strictly enforced in order to ensure fair and consistent treatment for everyone. In the event that an individual exceeds the two-proposal limit for this solicitation, the first two proposals received will be accepted and the remainder will be returned without review. No exceptions will be made.

Additionally, proposals submitted in response to this solicitation may not duplicate or be substantially similar to other proposals concurrently under consideration by DARPA. Duplicate or substantially similar proposals will be returned without review.

Additional Eligibility Info:

Subawards are not permitted to overseas branch campuses/offices of US-based proposing organizations eligible to submit to this solicitation.

There are no restrictions on an institution submitting to both the DARPA RTML program and this NSF RTML program; though these two funding opportunities are coordinated by NSF and DARPA, submissions are independently reviewed.

V. Proposal Preparation And Submission Instructions

A. Proposal Preparation Instructions

Full Proposal Preparation Instructions: Proposers may opt to submit proposals in response to this Program Solicitation via Grants.gov or via the NSF FastLane system.

  • Full proposals submitted via FastLane: Proposals submitted in response to this program solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG). The complete text of the PAPPG is available electronically on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg. Paper copies of the PAPPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov. Proposers are reminded to identify this program solicitation number in the program solicitation block on the NSF Cover Sheet For Proposal to the National Science Foundation. Compliance with this requirement is critical to determining the relevant proposal processing guidelines. Failure to submit this information may delay processing.

  • Full proposals submitted via Grants.gov: Proposals submitted in response to this program solicitation via Grants.gov should be prepared and submitted in accordance with the NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov. The complete text of the NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: (https://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide). To obtain copies of the Application Guide and Application Forms Package, click on the Apply tab on the Grants.gov site, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link and enter the funding opportunity number, (the program solicitation number without the NSF prefix) and press the Download Package button. Paper copies of the Grants.gov Application Guide also may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov.

In determining which method to utilize in the electronic preparation and submission of the proposal, please note the following:

Collaborative Proposals. All collaborative proposals submitted as separate submissions from multiple organizations must be submitted via the NSF FastLane system. PAPPG Chapter II.D.3 provides additional information on collaborative proposals.

See PAPPG Chapter II.C.2 for guidance on the required sections of a full research proposal submitted to NSF. Please note that the proposal preparation instructions provided in this program solicitation may deviate from the PAPPG instructions.

The following information SUPPLEMENTS (note that it does NOT replace) the guidelines provided in the NSF Proposal & Award Policies & Procedures Guide (PAPPG).

Proposal Titles:

Proposal titles should take the form RTML followed by a colon, then the project class followed by a colon, then "Collaborative" followed by a colon (if a collaborative proposal), and then the title. For example, the title of each proposal of a collaborative set of proposals for a Large project would be RTML: Large: Collaborative: Title. Proposals from PIs in institutions that have RUI (Research in Undergraduate Institutions) eligibility should also include "RUI" followed by a colon immediately before the project title, for example, RTML: Small: RUI: Title. Similarly, GOALI (Grant Opportunities for Academic Liaison with Industry) proposals should include "GOALI" followed by a colon as the last identifier before the project title.

Project Summary:

The Project Summary consists of an overview, a statement on the intellectual merit of the proposed activity, a statement on the broader impacts of the proposed activity, and a set of keywords.

Please provide between 2 and 6 keywords at the end of the overview in the Project Summary. This information will be used in implementing the merit review process. The keywords should describe the main scientific/engineering areas explored in the proposal. Keywords should be prefaced with "Keywords" followed by a colon and keywords should be separated by semi-colons.
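
For example (the specific keywords here are illustrative only), the last line of the overview might read: Keywords: real-time machine learning; hardware-software co-design; energy-efficient hardware.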

Project Description:

Length of Project Description - Describe the research and education activities to be undertaken in up to 15 pages for Small Projects, and in up to 20 pages for Large Projects. Proposals that exceed these limits will be returned without review.

All proposals are strongly encouraged to include meaningful plans to broaden and increase participation by underrepresented groups in computer science and engineering. These plans should be described within the Broader Impacts sections of the Project Description, should be clearly identifiable within the Project Description text, and should represent a clear, actionable effort with an evaluation plan. If a PI plans to become a part of an institutional broadening participation effort, then the PI must report on her/his specific contribution within that effort. An intervention that appeals to "all students" can be considered a broadening participation effort if the content is relevant to specific, identified underrepresented groups within the student body.

The Project Description must include the following subsections specifically labeled as below. Proposals that fail to include one or more of these sections will be returned without review (RWR), without exception.

Research Description: This is the intellectual heart of the Project Description and must also include "Intellectual Merit" as a subsection, as required in the PAPPG. The Research Description section must describe the technical rationale and technical approach of the RTML research. It should describe the challenges that drive the research problem. It must identify how the research integrates ML and hardware components. This section should also explain how the project research fits the Program Description for the selected class of proposal (Small or Large), as described in Section II.C, Classes of Projects. Specific activities for performing the research should be described as well. The section should additionally provide the project research plan, including descriptions of major tasks, the primary organization responsible for each task, and milestones. The Research Description must include a Gantt chart that lays out the sequence of major activities and their interdependencies.

Evaluation/Experimentation Plan: This section should describe how the proposed research concepts will be demonstrated and validated. It should present metrics for success, identify design choices and critical experiments, and describe how the research will be demonstrated, including through simulation, prototyping, and testing using synthetic or real-life datasets. For Large projects, the validation plan must include the physical implementation in FPGA or ASIC.

Broader Impacts: In addition to the specific information required in the PAPPG, this section should provide plans for disseminating the research outcomes (including the design and any reference implementation) and for integrating research outcomes into education. Information about broadening participation, as described above, should also be given here.

Project Management and Collaboration Plan [For Large Projects Only]: This section should summarize how the project team is appropriate to realize the project goals and how the team will assure effective collaboration. It should provide a compelling rationale for any multi-institution structure of the project, if appropriate. The plan should identify organizational responsibilities and how the project will be managed, including approaches for meeting project goals. It should also include: 1) the specific roles of the project participants in all involved organizations; 2) information on how the project will be managed across all the investigators, institutions, and/or disciplines; 3) approaches for integration of research components throughout the project; and 4) identification of the specific coordination mechanisms that will enable cross-investigator, cross-institution, and/or cross-discipline scientific integration.

Supplementary Documents:

In the Supplementary Documents Section, upload the following:

(1) A list of Project Personnel and Partner Institutions (Note: In collaborative proposals, the lead institution should provide this information for all participants):

Provide current, accurate information for all personnel and institutions involved in the project. NSF staff will use this information in the merit review process to manage reviewer selection. The list should include all PIs, co-PIs, Senior Personnel, paid/unpaid Consultants or Collaborators, Subawardees, Postdocs, and project-level advisory committee members. This list should be numbered and include (in this order) Full name, Organization(s), and Role in the project, with each item separated by a semi-colon. Each person listed should start a new numbered line. For example:

  1. Mary Smith; XYZ University; PI
  2. John Jones; University of PQR; Senior Personnel
  3. Jane Brown; XYZ University; Postdoc
  4. Bob Adams; ABC Community College; Paid Consultant
  5. Susan White; DEF Corporation; Unpaid Collaborator
  6. Tim Green; ZZZ University; Subawardee

(2) Data Management Plan (required):

Proposals must include a Supplementary Document of no more than two pages labeled "Data Management Plan." This Supplementary Document should describe how the proposal will conform to NSF policy on the dissemination and sharing of research results.

See Chapter II.C.2.j of the PAPPG for full policy implementation.

For additional information on the Dissemination and Sharing of Research Results, see: https://www.nsf.gov/bfa/dias/policy/dmp.jsp.

For specific guidance for Data Management Plans submitted to the Directorate for Computer and Information Science and Engineering (CISE) see: https://www.nsf.gov/cise/cise_dmp.jsp.

Single Copy Documents:

Collaborators and Other Affiliations Information:

Proposers should follow the guidance specified in Chapter II.C.1.e of the NSF PAPPG.

Note the distinction from item (1) under Supplementary Documents above: the listing of all project participants is collected by the project lead and entered as a Supplementary Document, which is then automatically included with all proposals in a project. The Collaborators and Other Affiliations information is entered for each participant within each proposal and, as Single Copy Documents, is available only to NSF staff.

Submission Checklist:

In an effort to assist proposal preparation, the following checklists are provided as a reminder of the items that should be checked before submitting a proposal to this solicitation. These are a summary of the requirements described above. For the items marked with (RWR), the proposal will be returned without review if the required item is non-compliant at the submission deadline.

For all proposals:

  • The last line of the Project Summary should consist of the word "Keywords" followed by a colon and between 2 and 6 keywords, separated by semi-colons.
  • (RWR) Project Description not to exceed 15 pages for Small Projects, and not to exceed 20 pages for Large Projects.
  • (RWR) A section labeled "Research Description" is required within the Project Description.
  • (RWR) A section labeled "Evaluation/Experimentation Plan" is required within the Project Description.
  • (RWR) A section labeled "Broader Impacts" is required within the Project Description.
  • (RWR) For Large Projects, a section labeled "Project Management and Collaboration Plan" is required within the Project Description.
  • A subsection labeled "Intellectual Merit" is required in the "Research Description" section of the Project Description.
  • Project Personnel and Partner Institutions list as a Supplementary Document must be included.

Proposals that do not comply with the requirements marked as RWR will be returned without review.

B. Budgetary Information

Cost Sharing:

Inclusion of voluntary committed cost sharing is prohibited.

Other Budgetary Limitations:

Small Awards up to $500,000 and Large Awards up to $1,500,000.

Budget Preparation Instructions:

Proposals should budget for up to two project personnel to attend four joint NSF-DARPA workshops over the duration of the project, likely to be held in the Washington DC area.

C. Due Dates

  • Full Proposal Deadline(s) (due by 5 p.m. submitter's local time):

    June 06, 2019

D. FastLane/Grants.gov Requirements

For Proposals Submitted Via FastLane:

To prepare and submit a proposal via FastLane, see detailed technical instructions available at: https://www.fastlane.nsf.gov/a1/newstan.htm. For FastLane user support, call the FastLane Help Desk at 1-800-673-6188 or e-mail fastlane@nsf.gov. The FastLane Help Desk answers general technical questions related to the use of the FastLane system. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this funding opportunity.

For Proposals Submitted Via Grants.gov:

Before using Grants.gov for the first time, each organization must register to create an institutional profile. Once registered, the applicant's organization can then apply for any federal grant on the Grants.gov website. Comprehensive information about using Grants.gov is available on the Grants.gov Applicant Resources webpage: http://www.grants.gov/web/grants/applicants.html. In addition, the NSF Grants.gov Application Guide (see link in Section V.A) provides instructions regarding the technical preparation of proposals via Grants.gov. For Grants.gov user support, contact the Grants.gov Contact Center at 1-800-518-4726 or by email: support@grants.gov. The Grants.gov Contact Center answers general technical questions related to the use of Grants.gov. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.

Submitting the Proposal: Once all documents have been completed, the Authorized Organizational Representative (AOR) must submit the application to Grants.gov and verify the desired funding opportunity and agency to which the application is submitted. The AOR must then sign and submit the application to Grants.gov. The completed application will be transferred to the NSF FastLane system for further processing.

Proposers that submitted via FastLane are strongly encouraged to use FastLane to verify the status of their submission to NSF. For proposers that submitted via Grants.gov, until an application has been received and validated by NSF, the Authorized Organizational Representative may check the status of an application on Grants.gov. After proposers have received an e-mail notification from NSF, Research.gov should be used to check the status of an application.

VI. NSF Proposal Processing And Review Procedures

Proposals received by NSF are assigned to the appropriate NSF program for acknowledgement and, if they meet NSF requirements, for review. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF Program Officer, and usually by three to ten other persons outside NSF either as ad hoc reviewers, panelists, or both, who are experts in the particular fields represented by the proposal. These reviewers are selected by Program Officers charged with oversight of the review process. Proposers are invited to suggest names of persons they believe are especially well qualified to review the proposal and/or persons they would prefer not review the proposal. These suggestions may serve as one source in the reviewer selection process at the Program Officer's discretion. Submission of such names, however, is optional. Care is taken to ensure that reviewers have no conflicts of interest with the proposal. In addition, Program Officers may obtain comments from site visits before recommending final action on proposals. Senior NSF staff further review recommendations for awards. A flowchart that depicts the entire NSF proposal and award process (and associated timeline) is included in PAPPG Exhibit III-1.

A comprehensive description of the Foundation's merit review process is available on the NSF website at: https://www.nsf.gov/bfa/dias/policy/merit_review/.

Proposers should also be aware of core strategies that are essential to the fulfillment of NSF's mission, as articulated in Building the Future: Investing in Discovery and Innovation - NSF Strategic Plan for Fiscal Years (FY) 2018 – 2022. These strategies are integrated in the program planning and implementation process, of which proposal review is one part. NSF's mission is particularly well-implemented through the integration of research and education and broadening participation in NSF programs, projects, and activities.

One of the strategic objectives in support of NSF's mission is to foster integration of research and education through the programs, projects, and activities it supports at academic and research institutions. These institutions must recruit, train, and prepare a diverse STEM workforce to advance the frontiers of science and participate in the U.S. technology-based economy. NSF's contribution to the national innovation ecosystem is to provide cutting-edge research under the guidance of the Nation's most creative scientists and engineers. NSF also supports development of a strong science, technology, engineering, and mathematics (STEM) workforce by investing in building the knowledge that informs improvements in STEM teaching and learning.

NSF's mission calls for the broadening of opportunities and expanding participation of groups, institutions, and geographic regions that are underrepresented in STEM disciplines, which is essential to the health and vitality of science and engineering. NSF is committed to this principle of diversity and deems it central to the programs, projects, and activities it considers and supports.

A. Merit Review Principles and Criteria

The National Science Foundation strives to invest in a robust and diverse portfolio of projects that creates new knowledge and enables breakthroughs in understanding across all areas of science and engineering research and education. To identify which projects to support, NSF relies on a merit review process that incorporates consideration of both the technical aspects of a proposed project and its potential to contribute more broadly to advancing NSF's mission "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes." NSF makes every effort to conduct a fair, competitive, transparent merit review process for the selection of projects.

1. Merit Review Principles

These principles are to be given due diligence by PIs and organizations when preparing proposals and managing projects, by reviewers when reading and evaluating proposals, and by NSF program staff when determining whether or not to recommend proposals for funding and while overseeing awards. Given that NSF is the primary federal agency charged with nurturing and supporting excellence in basic research and education, the following three principles apply:

  • All NSF projects should be of the highest quality and have the potential to advance, if not transform, the frontiers of knowledge.
  • NSF projects, in the aggregate, should contribute more broadly to achieving societal goals. These "Broader Impacts" may be accomplished through the research itself, through activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. The project activities may be based on previously established and/or innovative methods and approaches, but in either case must be well justified.
  • Meaningful assessment and evaluation of NSF funded projects should be based on appropriate metrics, keeping in mind the likely correlation between the effect of broader impacts and the resources provided to implement projects. If the size of the activity is limited, evaluation of that activity in isolation is not likely to be meaningful. Thus, assessing the effectiveness of these activities may best be done at a higher, more aggregated, level than the individual project.

With respect to the third principle, even if assessment of Broader Impacts outcomes for particular projects is done at an aggregated level, PIs are expected to be accountable for carrying out the activities described in the funded project. Thus, individual projects should include clearly stated goals, specific descriptions of the activities that the PI intends to do, and a plan in place to document the outputs of those activities.

These three merit review principles provide the basis for the merit review criteria, as well as a context within which the users of the criteria can better understand their intent.

2. Merit Review Criteria

All NSF proposals are evaluated through use of the two National Science Board approved merit review criteria. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities.

The two merit review criteria are listed below. Both criteria are to be given full consideration during the review and decision-making processes; each criterion is necessary but neither, by itself, is sufficient. Therefore, proposers must fully address both criteria. (PAPPG Chapter II.C.2.d(i). contains additional information for use by proposers in development of the Project Description section of the proposal). Reviewers are strongly encouraged to review the criteria, including PAPPG Chapter II.C.2.d(i), prior to the review of a proposal.

When evaluating NSF proposals, reviewers will be asked to consider what the proposers want to do, why they want to do it, how they plan to do it, how they will know if they succeed, and what benefits could accrue if the project is successful. These issues apply both to the technical aspects of the proposal and the way in which the project may make broader contributions. To that end, reviewers will be asked to evaluate all proposals against two criteria:

  • Intellectual Merit: The Intellectual Merit criterion encompasses the potential to advance knowledge; and
  • Broader Impacts: The Broader Impacts criterion encompasses the potential to benefit society and contribute to the achievement of specific, desired societal outcomes.

The following elements should be considered in the review for both criteria:

  1. What is the potential for the proposed activity to
    1. Advance knowledge and understanding within its own field or across different fields (Intellectual Merit); and
    2. Benefit society or advance desired societal outcomes (Broader Impacts)?
  2. To what extent do the proposed activities suggest and explore creative, original, or potentially transformative concepts?
  3. Is the plan for carrying out the proposed activities well-reasoned, well-organized, and based on a sound rationale? Does the plan incorporate a mechanism to assess success?
  4. How well qualified is the individual, team, or organization to conduct the proposed activities?
  5. Are there adequate resources available to the PI (either at the home organization or through collaborations) to carry out the proposed activities?

Broader impacts may be accomplished through the research itself, through the activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. NSF values the advancement of scientific knowledge and activities that contribute to achievement of societally relevant outcomes. Such outcomes include, but are not limited to: full participation of women, persons with disabilities, and underrepresented minorities in science, technology, engineering, and mathematics (STEM); improved STEM education and educator development at any level; increased public scientific literacy and public engagement with science and technology; improved well-being of individuals in society; development of a diverse, globally competitive STEM workforce; increased partnerships between academia, industry, and others; improved national security; increased economic competitiveness of the United States; and enhanced infrastructure for research and education.

Proposers are reminded that reviewers will also be asked to review the Data Management Plan and the Postdoctoral Researcher Mentoring Plan, as appropriate.

Additional Solicitation Specific Review Criteria

The following additional review criteria will be applied to all proposals:

  • synergy in machine learning, software, algorithm, and hardware co-design to meet the real-time machine learning goals of this program.

For Large proposals, the following additional review criteria will also be applied:

  • strength of the Project Management and Collaboration Plan in the Project Description.
  • strength of the Evaluation/Experimentation plan (e.g., in FPGA/ASIC) in the Project Description.

B. Review and Selection Process

Proposals submitted in response to this program solicitation will be reviewed by Ad hoc Review and/or Panel Review.

Reviewers will be asked to evaluate proposals using two National Science Board approved merit review criteria and, if applicable, additional program specific criteria. A summary rating and accompanying narrative will generally be completed and submitted by each reviewer and/or panel. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation.

After scientific, technical and programmatic review and consideration of appropriate factors, the NSF Program Officer recommends to the cognizant Division Director whether the proposal should be declined or recommended for award. NSF strives to be able to tell applicants whether their proposals have been declined or recommended for funding within six months. Large or particularly complex proposals or proposals from new awardees may require additional review and processing time. The time interval begins on the deadline or target date, or receipt date, whichever is later. The interval ends when the Division Director acts upon the Program Officer's recommendation.

After programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements for review of business, financial, and policy implications. After an administrative review has occurred, Grants and Agreements Officers perform the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with a NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at their own risk.

Once an award or declination decision has been made, Principal Investigators are provided feedback about their proposals. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers or any reviewer-identifying information, are sent to the Principal Investigator/Project Director by the Program Officer. In addition, the proposer will receive an explanation of the decision to award or decline funding.

VII. Award Administration Information

A. Notification of the Award

Notification of the award is made to the submitting organization by a Grants Officer in the Division of Grants and Agreements. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF Program administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See Section VI.B. for additional information on the review process.)

B. Award Conditions

An NSF award consists of: (1) the award notice, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award notice; (4) the applicable award conditions, such as Grant General Conditions (GC-1)*; or Research Terms and Conditions* and (5) any announcement or other NSF issuance that may be incorporated by reference in the award notice. Cooperative agreements also are administered in accordance with NSF Cooperative Agreement Financial and Administrative Terms and Conditions (CA-FATC) and the applicable Programmatic Terms and Conditions. NSF awards are electronically signed by an NSF Grants and Agreements Officer and transmitted electronically to the organization via e-mail.

*These documents may be accessed electronically on NSF's Website at https://www.nsf.gov/awards/managing/award_conditions.jsp?org=NSF. Paper copies may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov.

More comprehensive information on NSF Award Conditions and other important information on the administration of NSF awards is contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG) Chapter VII, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

Special Award Conditions:

DARPA will host four joint workshops for the awardees from both the DARPA and NSF RTML programs. At least one of the PIs/co-PIs for each project along with a student should attend these workshops.

C. Reporting Requirements

For all multi-year grants (including both standard and continuing grants), the Principal Investigator must submit an annual project report to the cognizant Program Officer no later than 90 days prior to the end of the current budget period. (Some programs or awards require submission of more frequent project reports). No later than 120 days following expiration of a grant, the PI also is required to submit a final project report, and a project outcomes report for the general public.

Failure to provide the required annual or final project reports, or the project outcomes report, will delay NSF review and processing of any future funding increments as well as any pending proposals for all identified PIs and co-PIs on a given award. PIs should examine the formats of the required reports in advance to assure availability of required data.

PIs are required to use NSF's electronic project-reporting system, available through Research.gov, for preparation and submission of annual and final project reports. Such reports provide information on accomplishments, project participants (individual and organizational), publications, and other specific products and impacts of the project. Submission of the report via Research.gov constitutes certification by the PI that the contents of the report are accurate and complete. The project outcomes report also must be prepared and submitted using Research.gov. This report serves as a brief summary, prepared specifically for the public, of the nature and outcomes of the project. This report will be posted on the NSF website exactly as it is submitted by the PI.

More comprehensive information on NSF Reporting Requirements and other important information on the administration of NSF awards is contained in the NSF Proposal & Award Policies & Procedures Guide (PAPPG) Chapter VII, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=pappg.

VIII. Agency Contacts

Please note that the program contact information is current at the time of publishing. See program website for any updates to the points of contact.

General inquiries regarding this program should be made to:

For questions related to the use of FastLane, contact:

For questions relating to Grants.gov contact:

  • Grants.gov Contact Center: If the Authorized Organizational Representatives (AOR) has not received a confirmation message from Grants.gov within 48 hours of submission of application, please contact via telephone: 1-800-518-4726; e-mail: support@grants.gov.

IX. Other Information

The NSF website provides the most comprehensive source of information on NSF Directorates (including contact information), programs and funding opportunities. Use of this website by potential proposers is strongly encouraged. In addition, "NSF Update" is an information-delivery system designed to keep potential proposers and other interested parties apprised of new NSF funding opportunities and publications, important changes in proposal and award policies and procedures, and upcoming NSF Grants Conferences. Subscribers are informed through e-mail or the user's Web browser each time new publications are issued that match their identified interests. "NSF Update" also is available on NSF's website.

Grants.gov provides an additional electronic capability to search for Federal government-wide grant opportunities. NSF funding opportunities may be accessed via this mechanism. Further information on Grants.gov may be obtained at http://www.grants.gov.

About The National Science Foundation

The National Science Foundation (NSF) is an independent Federal agency created by the National Science Foundation Act of 1950, as amended (42 USC 1861-75). The Act states the purpose of the NSF is "to promote the progress of science; [and] to advance the national health, prosperity, and welfare by supporting research and education in all fields of science and engineering."

NSF funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the US. The Foundation accounts for about one-fourth of Federal support to academic institutions for basic research.

NSF receives approximately 55,000 proposals each year for research, education and training projects, of which approximately 11,000 are funded. In addition, the Foundation receives several thousand applications for graduate and postdoctoral fellowships. The agency operates no laboratories itself but does support National Research Centers, user facilities, certain oceanographic vessels and Arctic and Antarctic research stations. The Foundation also supports cooperative research between universities and industry, US participation in international scientific and engineering efforts, and educational activities at every academic level.

Facilitation Awards for Scientists and Engineers with Disabilities (FASED) provide funding for special assistance or equipment to enable persons with disabilities to work on NSF-supported projects. See the NSF Proposal & Award Policies & Procedures Guide Chapter II.E.6 for instructions regarding preparation of these types of proposals.

The National Science Foundation has Telephonic Device for the Deaf (TDD) and Federal Information Relay Service (FIRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation about NSF programs, employment or general information. TDD may be accessed at (703) 292-5090 and (800) 281-8749, FIRS at (800) 877-8339.

The National Science Foundation Information Center may be reached at (703) 292-5111.

The National Science Foundation promotes and advances scientific progress in the United States by competitively awarding grants and cooperative agreements for research and education in the sciences, mathematics, and engineering.

To get the latest information about program deadlines, to download copies of NSF publications, and to access abstracts of awards, visit the NSF Website at https://www.nsf.gov

  • Location:

2415 Eisenhower Avenue, Alexandria, VA 22314

  • For General Information
    (NSF Information Center):

(703) 292-5111

  • TDD (for the hearing-impaired):

(703) 292-5090

  • To Order Publications or Forms:
 

Send an e-mail to:

nsfpubs@nsf.gov

or telephone:

(703) 292-7827

  • To Locate NSF Employees:

(703) 292-5111

Privacy Act And Public Burden Statements

The information requested on proposal forms and project reports is solicited under the authority of the National Science Foundation Act of 1950, as amended. The information on proposal forms will be used in connection with the selection of qualified proposals; and project reports submitted by awardees will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process; to proposer institutions/grantees to provide or obtain data regarding the proposal review process, award decisions, or the administration of awards; to government contractors, experts, volunteers and researchers and educators as necessary to complete assigned work; to other government agencies or other entities needing information regarding applicants or nominees as part of a joint application review process, or in order to coordinate programs or policy; and to another Federal agency, court, or party in a court or Federal administrative proceeding if the government is a party. Information about Principal Investigators may be added to the Reviewer file and used to select potential candidates to serve as peer reviewers or advisory committee members. See Systems of Records, NSF-50, "Principal Investigator/Proposal File and Associated Records," 69 Federal Register 26410 (May 12, 2004), and NSF-51, "Reviewer/Proposal File and Associated Records," 69 Federal Register 26410 (May 12, 2004). Submission of the information is voluntary. Failure to provide full and complete information, however, may reduce the possibility of receiving an award.

An agency may not conduct or sponsor, and a person is not required to respond to, an information collection unless it displays a valid Office of Management and Budget (OMB) control number. The OMB control number for this collection is 3145-0058. Public reporting burden for this collection of information is estimated to average 120 hours per response, including the time for reviewing instructions. Send comments regarding the burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to:

Suzanne H. Plimpton
Reports Clearance Officer
Office of the General Counsel
National Science Foundation
Alexandria, VA 22314