This document has been archived and replaced by NSF 17-500 (https://www.acpt.nsf.gov/publications/pub_summ.jsp?ods_key=nsf17500). Data Infrastructure Building Blocks (DIBBs) Program Solicitation NSF 16-530 Replaces Document(s): NSF 14-530 National Science Foundation Directorate for Biological Sciences Directorate for Computer & Information Science & Engineering Directorate for Education & Human Resources Directorate for Engineering Directorate for Geosciences Directorate for Mathematical & Physical Sciences Directorate for Social, Behavioral & Economic Sciences Office of International Science and Engineering Full Proposal Deadline(s) (due by 5 p.m. proposer's local time): April 04, 2016 IMPORTANT INFORMATION AND REVISION NOTES This solicitation updates the previous Data Infrastructure Building Blocks (DIBBs) solicitation, NSF 14-530, that was issued on January 8, 2014. As a cross-directorate program focused upon data challenges confronting NSF's scientific and engineering communities, this solicitation is seeking scalable cyberinfrastructure capabilities that build upon existing community infrastructure and address programmatic areas of interest across the participating directorates. Any proposal submitted in response to this solicitation should be submitted in accordance with the revised NSF Proposal & Award Policies & Procedures Guide (PAPPG) (NSF 16-1), which is effective for proposals submitted, or due, on or after January 25, 2016. Please be advised that proposers who opt to submit prior to January 25, 2016, must also follow the guidelines contained in NSF 16-1. SUMMARY OF PROGRAM REQUIREMENTS General Information Program Title: Data Infrastructure Building Blocks (DIBBs) Synopsis of Program: The NSF vision for a Cyberinfrastructure Framework for 21st Century Science and Engineering (CIF21) considers an integrated, scalable, and sustainable cyberinfrastructure to be crucial for innovation in science and engineering (see www.nsf.gov/cif21). The Data Infrastructure Building Blocks (DIBBs) program is an integral part of CIF21. The DIBBs program encourages development of robust and shared data-centric cyberinfrastructure capabilities, to accelerate interdisciplinary and collaborative research in areas of inquiry stimulated by data. DIBBs investments enable new data-focused services, capabilities, and resources to advance scientific discoveries, collaborations, and innovations. The investments are expected to build upon, integrate with, and contribute to existing community cyberinfrastructure, serving as evaluative resources while developments in national-scale access, policy, interoperability and sustainability continue to evolve. Effective solutions will bring together cyberinfrastructure expertise and domain researchers, to ensure that the resulting cyberinfrastructure addresses researchers' data needs. The activities should address the data challenges arising in a disciplinary or cross-disciplinary context. (Throughout this solicitation, 'community' refers to a group of researchers interested in solving one or more linked scientific questions, while 'domains' and 'disciplines' refer to areas of expertise or application.) The projects should stimulate data-driven scientific discoveries and innovations, and address broad community needs. This solicitation includes two classes of science data pilot awards: 1.
Early Implementations are large "at scale" evaluations, building upon cyberinfrastructure capabilities of existing research communities or recognized community data collections, and extending those data-focused cyberinfrastructure capabilities to additional research communities and domains with broad community engagement. 2. Pilot Demonstrations address advanced cyberinfrastructure challenges across emerging research communities, building upon recognized community data collections and disciplinary research interests, to address specific challenges in science and engineering research. Prospective PIs should be aware that DIBBs is a multi-directorate activity, and are encouraged to submit proposals that have broad, interdisciplinary interest. PIs are encouraged to refer to NSF core program descriptions, Dear Colleague Letters, and recently posted initiatives on directorate and divisional home pages to gain insight into the areas of science and engineering in which their proposals may be responsive. It is strongly recommended that a prospective PI contact a Cognizant Program Officer in the organization(s) closest to the major disciplinary impact of the proposed work to ascertain whether the scientific focus and budget of the proposed work are appropriate for this solicitation. Cognizant Program Officer(s): Please note that the following information is current at the time of publishing. See program website for any updates to the points of contact. * Amy Walton, Program Director, CISE/ACI and DIBBs Solicitation Manager, telephone: (703) 292-8970, email: DIBBsQueries@nsf.gov * Robert Chadduck, Program Director, CISE/ACI, telephone: (703) 292-8970, email: DIBBsQueries@nsf.gov * Anita Nikolich, Program Director, CISE/ACI, telephone: (703) 292-8970, email: DIBBsQueries@nsf.gov * Peter H. McCartney, Program Director, BIO/DBI, telephone: (703) 292-8470, email: DIBBsQueries@nsf.gov * Sylvia Spengler, Program Director, CISE/IIS, telephone: (703) 292-8930, email: DIBBsQueries@nsf.gov * John C. Cherniavsky, Senior Advisor, EHR, telephone: (703) 292-5136, email: DIBBsQueries@nsf.gov * Dimitrios V. Papavassiliou, Program Director, ENG/CBET, telephone: (703) 292-4480, email: DIBBsQueries@nsf.gov * Joanne D. Culbertson, Program Director, ENG/CMMI, telephone: (703) 292-4602, email: DIBBsQueries@nsf.gov * Eva Zanzerkia, Program Director, GEO/EAR, telephone: (703) 292-8556, email: DIBBsQueries@nsf.gov * Lin He, Program Director, MPS/CHE, telephone: (703) 292-4956, email: DIBBsQueries@nsf.gov * Bogdan Mihaila, Program Director, MPS/PHY, telephone: (703) 292-8235, email: DIBBsQueries@nsf.gov * Cheryl L. Eavey, Program Director, SBE/SES, telephone: (703) 292-7269, email: DIBBsQueries@nsf.gov * Seta Bogosyan, Program Director, OD/OISE, telephone: (703) 292-4766, email: DIBBsQueries@nsf.gov Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s): * 47.041 --- Engineering * 47.049 --- Mathematical and Physical Sciences * 47.050 --- Geosciences * 47.070 --- Computer and Information Science and Engineering * 47.074 --- Biological Sciences * 47.075 --- Social, Behavioral and Economic Sciences * 47.076 --- Education and Human Resources * 47.079 --- Office of International Science and Engineering Award Information Anticipated Type of Award: Standard Grant or Continuing Grant Estimated Number of Awards: 12 * Early Implementation Awards: up to 6 awards, pending availability of funds. * Pilot Demonstration Awards: up to 6 awards, pending availability of funds.
Anticipated Funding Amount: $23,500,000 pending availability of funds. * The award size for Early Implementation Awards is anticipated to be up to $4,000,000 total per award for up to 5 years. * The award size for Pilot Demonstration Awards is anticipated to be up to $500,000 total per award for up to 3 years. Project size should be commensurate with the size / breadth of the community served. Eligibility Information Who May Submit Proposals: Proposals may only be submitted by the following: * Universities and Colleges - Universities and two- and four-year colleges (including community colleges) accredited in, and having a campus located in, the US acting on behalf of their faculty members. Such organizations also are referred to as academic institutions. * Non-profit, non-academic organizations: Independent museums, observatories, research labs, professional societies and similar organizations in the U.S. associated with educational or research activities. * NSF-funded Federally Funded Research and Development Centers (FFRDCs). Who May Serve as PI: There are no restrictions or limits. Limit on Number of Proposals per Organization: There are no restrictions or limits. Limit on Number of Proposals per PI or Co-PI: 1 An individual may propose as a PI or Co-PI on only one proposal; however, the individual may be included among the listed senior personnel on more than one proposal in more than one class. In the event that an individual exceeds this limit, any proposal submitted to this solicitation with this individual listed as a PI or Co-PI, after the first proposal is received at NSF, will be returned without review. No exceptions will be made. Proposal Preparation and Submission Instructions A. Proposal Preparation Instructions * Letters of Intent: Not required * Preliminary Proposal Submission: Not required * Full Proposals: + Full Proposals submitted via FastLane: NSF Proposal and Award Policies and Procedures Guide, Part I: Grant Proposal Guide (GPG) Guidelines apply. The complete text of the GPG is available electronically on the NSF website at: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg. + Full Proposals submitted via Grants.gov: NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov Guidelines apply (Note: The NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide) B. Budgetary Information * Cost Sharing Requirements: Inclusion of voluntary committed cost sharing is prohibited. * Indirect Cost (F&A) Limitations: Not Applicable * Other Budgetary Limitations: Not Applicable C. Due Dates * Full Proposal Deadline(s) (due by 5 p.m. proposer's local time): April 04, 2016 Proposal Review Information Criteria Merit Review Criteria: National Science Board approved criteria. Additional merit review considerations apply. Please see the full text of this solicitation for further information. Award Administration Information Award Conditions: Additional award conditions apply. Please see the full text of this solicitation for further information. Reporting Requirements: Standard NSF reporting requirements apply. TABLE OF CONTENTS Summary of Program Requirements I. Introduction II. Program Description III. Award Information IV. Eligibility Information V. Proposal Preparation and Submission Instructions A. Proposal Preparation Instructions B. Budgetary Information C.
Due Dates D. FastLane/Grants.gov Requirements VI. NSF Proposal Processing and Review Procedures A. Merit Review Principles and Criteria B. Review and Selection Process VII. Award Administration Information A. Notification of the Award B. Award Conditions C. Reporting Requirements VIII. Agency Contacts IX. Other Information I. INTRODUCTION NSF's Cyberinfrastructure Framework for 21st Century Science and Engineering (CIF21) (http://www.nsf.gov/pubs/2010/nsf10015/nsf10015.jsp) focuses investment on the interconnected cyberinfrastructure components necessary to realize the research potential of theoretical, experimental, observational, and simulation-based efforts in science and engineering. The Data Infrastructure Building Blocks (DIBBs) program is an integral part of CIF21, supporting interdisciplinary and collaborative research in areas of inquiry stimulated by data through the development of robust, shared resources and the means for enabling partnerships across diverse communities. DIBBs investments are expected to develop the robust, scalable, well-designed cyberinfrastructure (the 'building blocks') contributing to future discovery and innovation across the various scientific and engineering disciplines. These investments result in clear, tangible cyberinfrastructure products -- early demonstrations of new or expanded capabilities, evaluated by relevant communities. The capabilities will integrate with and leverage existing campus, institutional, and regional cyberinfrastructure, enhancing the governance and long-term sustainability of the data infrastructure as well as the data. This solicitation seeks partnerships that bring together cyberinfrastructure expertise and domain researchers, to expand and contribute to the cyberinfrastructure resources that serve the community. It builds upon existing community development activities in and across NSF directorates, and experience with previous DIBBs investments. Proposed projects will expand the scale and scope of directorate and multi-directorate pilot investments in science and engineering data infrastructure. The DIBBs program will be guided by the research needs and priorities of the science, engineering, and education communities. Submissions to the DIBBs solicitation are expected to be grounded in well-vetted, community-based plans. Based on individual directorates' investment priorities, expanded scope could include data reproducibility; interoperability of specific research data; sustainability; data policy and governance; data security, privacy, integrity and trustworthiness; exploration of innovative economic/operating models for archiving and curation; and learning and workforce development. Depending on scientific and engineering priorities, increased scale toward national-level and multi-agency activity will be explored. Alternative funding mechanisms should be used for concepts still in exploratory development. For example, EArly-concept Grants for Exploratory Research (EAGER) are available for untested, but potentially transformative, concepts; and Research Coordination Networks (RCNs) grants may be used to advance a field or create new directions in research or education by supporting groups of investigators to communicate and coordinate their research, training and educational activities across disciplinary, organizational, geographic and international boundaries.
Prospective PIs should consult with the Cognizant Program Officers in the relevant research area(s) prior to submitting a proposal to ascertain whether the focus and budget of the proposed work are appropriate for this solicitation. Successful proposals are expected to be of interest to multiple directorates/offices participating in the DIBBs program, and are expected to be responsive to programmatic areas of interest to the participating directorates/offices. Additional information on the programmatic areas of interest for each participating NSF organization is provided at the end of the Program Description section. II. PROGRAM DESCRIPTION Program Goals The Data Infrastructure Building Blocks (DIBBs) program supports CIF21 goals by providing: * Opportunities for scientific disciplines to collectively define data requirements and develop prototypes and evaluative implementations to meet those needs; * Advanced cyberinfrastructure components meeting the needs of multiple disciplines, expanding the data resources available to scientific and engineering communities; and * Connections among existing data cyberinfrastructure capabilities, to maximize effective sharing of resources and support a broader range of scientific and engineering disciplines. Specific goals of the DIBBs program are to: * Address data sharing issues and capabilities across scientific and engineering domains, by fostering collaborations between researchers in scientific domains and cyberinfrastructure experts; and * Extend those data capabilities to other research communities and domains, through development or expansion of data-focused cyberinfrastructure, and building upon the capabilities of existing research communities, community-recognized data collections, and disciplinary research interests. Program investments are expected to: * Encourage the development of robust and shared data-centric cyberinfrastructure capabilities, contributing to the acceleration of interdisciplinary and collaborative research in areas of inquiry stimulated by data; * Build upon the advanced cyberinfrastructure capabilities of existing research communities, community-recognized data collections, and disciplinary research interests, to address specific challenges in science and engineering research; * Extend the data-focused cyberinfrastructure capabilities to additional research communities and domains with broad community engagement; and * Result in tangible outputs -- pilot systems evaluated by relevant communities, or early demonstrations of new or expanded capabilities. DIBBs projects support development and implementation of technologies across the data access and preservation lifecycle, including acquisition; documentation; security and integrity; storage; access, analysis and dissemination; migration; and de-accession. The awards can also support creation of governance structures that respond to community input on data infrastructure needs, promote solutions to domain cyberinfrastructure problems, and avoid unnecessary duplication of resources. Projects focused on data privacy, confidentiality, and protection from loss or corruption are also responsive to this solicitation. Data is often reused and combined with other data in ways that go beyond the intent of the original collection. Projects should demonstrate the use of methods by which to maintain the privacy of personally identifiable information when applied to very large data sets, where traditional methods of privacy protections are often unsuccessful. 
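As one illustrative, hedged example of the kind of method the preceding paragraph refers to (offered only as a sketch under stated assumptions, not as a requirement of this solicitation), differential privacy releases noisy aggregate statistics rather than record-level data, and remains workable on very large data sets because only summary queries are perturbed. The function name, synthetic records, and epsilon value below are hypothetical.

    # Minimal sketch, assuming differential privacy is one acceptable approach:
    # release a noisy count so that no single individual's record can be inferred.
    import random

    def dp_count(records, predicate, epsilon=0.5):
        """Differentially private count of records satisfying `predicate`.

        A counting query changes by at most 1 when one record is added or
        removed (sensitivity = 1), so Laplace noise with scale 1/epsilon
        gives epsilon-differential privacy for this single query.
        """
        true_count = sum(1 for r in records if predicate(r))
        # The difference of two i.i.d. exponentials is Laplace(0, 1/epsilon).
        noise = random.expovariate(epsilon) - random.expovariate(epsilon)
        return true_count + noise

    # Hypothetical usage: report how many of 100,000 synthetic records are age 65+.
    records = [{"age": i % 90} for i in range(100_000)]
    print(dp_count(records, lambda r: r["age"] >= 65))

Smaller epsilon values add more noise and give stronger privacy, and repeated queries consume a privacy budget; these are the kinds of trade-offs a project addressing privacy at scale would need to analyze.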
Successful projects must address a range of topics: * The need within and across the scientific, engineering and education community for the proposed data cyberinfrastructure; * Data elements and frameworks relevant to the specified community and the sustainability challenges to be addressed; * Data storage architectures and lifecycle processes, development, testing and deployment methodologies, validation and verification of proposed data management techniques, and any additional measures addressing trustworthiness and data security; * Usability and interface considerations, data curation and required infrastructure and technologies; * The required organizational, personnel and management structures, project plans and operational processes; * A plan for governance and long-term sustainability of the data infrastructure as well as the data themselves; * The likely impact on the target communities, through direct engagement with the affected community; * How adoption and usage will be monitored, and how effectiveness of the new capabilities will be measured (the composition of collaborative teams should include the skills and expertise to implement, test and evaluate the data technologies and approaches being proposed); and * Plans for data management and sharing of the products of research. Data accessibility, across a broad community, is an important attribute of crosscutting research. Data Management Plans should explicitly state how the data generated by a given project will be managed, stored, and made accessible. They should also clearly define rights, obligations, roles and responsibilities of all parties, and any anticipated intellectual property (IP) issues associated with expanded access. Consistent with CIF21, the DIBBs program encourages partnerships between academia, government laboratories and industry for the development and stewardship of a data infrastructure that can sustain and accelerate innovation and productivity, nationally and internationally. NSF recognizes the importance of enabling U.S. researchers and educators to advance their work through international partnerships, where the proposed collaboration can provide unique advantages of scope, scale, flexibility, or facilities, enabling advances that would not readily occur otherwise. In view of this, U.S. investigators may include international components in DIBBs proposals. Furthermore, strong, well-defined international collaborations may incorporate opportunities for U.S. students and early-career researchers to participate in substantive international research experiences abroad. The inclusion of new researchers, post-docs, graduate students, and undergraduates in relevant activities, as well as participation by underrepresented groups (women, persons with disabilities, and underrepresented minorities), is also encouraged. Such activity would contribute to the creation of career paths for data scientists who are necessarily multi-disciplinary. Classes of Investment The DIBBs program will accomplish its goals by making awards in two science data pilot classes: 1. Early Implementation Awards 2. Pilot Demonstration Awards 1.Early Implementation Awards Early Implementation Awards are large "at scale" evaluations, building upon the data resource capabilities of existing research communities or recognized community data collections, enhancing and extending the capabilities with broad community engagement. 
These awards will develop frameworks that provide consistency or commonality of design across communities and implementation for data acquisition, management, preservation, sharing, dissemination, etc. This includes data and metadata format and content conventions, standardized constructs or protocols, taxonomies, or ontologies. The development of interoperability frameworks through community-based mechanisms provides a means for ensuring that existing conventions and practices are appropriately recognized and integrated, that implementation is made realistic and feasible, and that, most importantly, the real needs of the community are identified and met. Specific Requirements for Early Implementation Awards: * Collaboration/Synergy: Early Implementation Awards will address data challenges by fostering collaborations between cyberinfrastructure experts and existing data resource efforts across scientific and engineering domains. Proposals must identify specific scientific and engineering communities and data/cyberinfrastructure facilities that will participate in the efforts. Similarly, proposals must identify participating cyberinfrastructure, computer science, industry, international and agency partners. * Benefits to Research Communities: Proposals should articulate the rationale for the proposed capability: its responsiveness to community needs, and the anticipated impact on advancing science, engineering, and education. Proposals should also describe the potential for extending the capabilities to other research communities. Proposals will be evaluated on the innovation and breadth of the science outcomes, as well as how other researchers will benefit from the developed capabilities. * Innovativeness of the Cyberinfrastructure Approach: Proposals should describe how the technology/cyberinfrastructure addresses data-related scientific and engineering challenges. Proposals should provide evidence that the approach is a creative solution to community-wide, data-related issues and policies, including comparisons with alternative approaches. Describe the approach for enhancing interoperability (access controls, discoverability, connectivity). Discuss the ability to address challenges in data quality and assurance (data privacy, integrity, confidentiality, and security). As applicable, include a rationale for the governance, platform architecture (network, cloud, reusability), and economic/operating models for archiving and curation. Identify strategies for adapting to new opportunities and technologies, and plans for sustainability beyond the scope of the award. * Management and metrics: Projects are expected to result in clear, tangible cyberinfrastructure outputs - new or expanded capabilities, evaluated by relevant communities. Proposals must clearly articulate specific steps that will be taken to implement the proposed capabilities, how project outcome(s) relate to specific needs articulated by the scientific and engineering communities involved, and what mechanisms will be used to engage users, particularly those in other communities, in adopting the capability. Early Implementation Awards must produce a yearly demonstration of technical capabilities, with an associated written document describing the capabilities that becomes a public, shared resource. These demonstrations will give the broader community the opportunity to understand and assess capabilities and synergies. Proposals must discuss what will be demonstrated and evaluated.
A discussion of a given project's lifecycle/sustainability must be presented. Proposers should describe in their Management Plans how their anticipated structure, personnel, and work plan and management accommodate these responsibilities. Additional Review Criteria for Early Implementation Awards: * What are the science outcomes described in the proposal? Are they innovative and made possible by the development? How are outcomes tied to grand challenges, and of interest to and involving multiple science and engineering domains? Are the science outcomes possible given the team and work plan? * How does the implementation expand and contribute to the set of resources that serve the community? Are the components extensible and potentially useful to other communities? Is there a clear description of the data, software, or standards that will be produced by the project? (Software is intended in this instance to refer to scientific analysis, visualization or modeling tools necessary to achieve scientific outcomes). * Is the management plan and team appropriate for the goals of the project? What is the plan to demonstrate the proposed capability or resource? * Characterize the community that will benefit from the project: How many researchers and which domains will directly benefit from the outcomes of the project? How does the project involve and serve more than one research field? Are participants from various communities explicitly identified, and are their roles clear? How does the project clearly demonstrate end user involvement in development and use of a community capability? * Indicate how the community is represented in governance of the resulting capability, including data management and deaccession. A sustainability plan must be included describing how any capabilities developed by the implementation project could be supported beyond the award duration. This may include integration into long-term data or cyberinfrastructure resources either supported by NSF or other institutions, agencies or partners. Sustainability plans will be evaluated on the viability of the sustainable resource, community representation in governance, the fit to the infrastructure being developed, and the likelihood of ingestion into the long-term system. 2. Pilot Demonstration Awards Pilot Demonstration Awards address advanced cyberinfrastructure challenges across emerging research communities, building upon recognized community data collections and disciplinary research interests, to address specific challenges in science and engineering research. A small number of awards will be made in this category; awards will target small groups that create and deploy robust data capabilities for which there is a demonstrated need that will advance one or more significant areas of science and engineering. It is expected that the created capabilities will be designed so as to demonstrate potential for addressing issues of interoperability, usability, manageability, and sustainability, and will be disseminated into the community as reusable data resources. Proposed funding amounts should be commensurate with the work being proposed, the size of the community that will be affected, and the level of impact anticipated. As with all proposals, projects should be discussed with program officers from the divisions that would be impacted. Not all directorates are inviting Pilot Demonstration proposals under this solicitation. Alternative funding mechanisms should be used where they currently exist. 
For example, EArly-concept Grants for Exploratory Research (EAGER) are available for untested, but potentially transformative, concepts and Research Coordination Networks (RCNs) may be used to advance a field or create new directions in research or education by supporting groups of investigators to communicate and coordinate their research, training and educational activities across disciplinary, organizational, geographic and international boundaries. PIs with predominantly biological-sciences-related concepts should consider the Advances in Biological Informatics (ABI) solicitation, and PIs with predominantly geosciences-related concepts should consider the EarthCube Capabilities activity within the EarthCube solicitation. Prospective PIs should consult with the Cognizant Program Officers in the relevant research area(s) prior to submitting any proposal to ascertain whether the focus and budget of the proposed work are appropriate for this solicitation. Pilot Demonstration Awards for a community are opportunities for scientific and engineering disciplines to collectively define data requirements, metadata, community-recognized data collections and data types, outputs from models and other codes that will improve the community's ability to address identified research challenges. Projects may develop prototypes and evaluative pilots to meet common data access and discovery needs and to extend data resources, of interest either to a large number of researchers within a research domain or extending beyond to encompass other domains. These projects are not intended for long-term resource or infrastructure support, but rather are meant to serve as initial pilots to make the products more openly and readily available to the identified community. Successful proposals will include detailed descriptions of options for sustaining infrastructure or community data after the end of the awards. This may be accomplished through existing data facilities, ingestion into other infrastructure at the institutional, regional, national or international levels, or through other mechanisms that successfully demonstrate long-term maintenance. Re-use of existing modern tools and resources is highly encouraged, and proposals should describe elements of reuse. Specific Requirements for Pilot Demonstration Awards: * Collaboration: Pilot Demonstration Awards will address data challenges across emerging research communities by fostering collaborations between cyberinfrastructure experts and any existing data resource efforts in the scientific and engineering domains. Proposals must identify specific scientific and engineering communities and data/cyberinfrastructure facilities that will participate in the efforts. Similarly, proposals must identify participating cyberinfrastructure, computer science, industry, international and agency partners. * Benefits to Research Communities: Proposals should articulate the rationale for the proposed capability including its responsiveness to well-defined community needs and the anticipated impact on advancing science, engineering, and education. Proposals should also describe the potential for extending the capabilities to other research communities. Proposals will be evaluated on the innovation and breadth of the science outcomes, as well as how other researchers will benefit from the developed capabilities. 
* Innovativeness of the Cyberinfrastructure Approach: Proposals should describe how the technology/cyberinfrastructure addresses data-related scientific and engineering challenges. Provide evidence that the approach is a creative solution to community-wide, data-related issues and policies including comparisons with alternative approaches. * Outcomes and metrics: Projects are expected to result in outputs -- new or expanded capabilities, evaluated by relevant communities. Proposals must clearly articulate specific steps that will be taken to implement the proposed capabilities, how project outcome(s) relate to specific needs articulated by the scientific and engineering communities involved, and what mechanisms will be used to engage users, particularly those in other communities, in adopting the capability. Pilot Demonstration Awards must produce a yearly demonstration of technical capabilities, with an associated written document describing the capabilities that becomes a public, shared resource. These demonstrations will give the broader community the opportunity to understand and assess capabilities and synergies. Proposals must discuss what will be demonstrated and evaluated. A discussion of a given project's lifecycle/sustainability must be presented. Proposers should describe in their Management Plans how their anticipated structure, personnel, and work plan and management accommodate these responsibilities. Additional Review Criteria for Pilot Demonstration Awards: * Is there a clear description of the community data infrastructure need that will be met by this project? Is any prototype, pilot, platform or tool development appropriately conceived for the intended outcomes of the project? What is the likelihood of successful creation and adoption of any product? How extensible is the technology or capability development? Is the resource development modern, robust and responsive to community needs? * Is the management plan and team appropriate for the goals of the project? What is the plan to demonstrate the proposed capability or resource? * Characterize the community that will benefit from the project: How many researchers and which domains will benefit from the outcomes of the project? How does the project involve and serve more than one research field? Are participants from appropriate science and engineering communities explicitly identified, and are their roles clear? How does the project clearly demonstrate end user involvement in development and use of a community capability? * Indicate how the community would be represented in governance of the resulting capability, including data management and de-accession. A sustainability plan must be included for any cyberinfrastructure component of the project that is intended to continue. The sustainability plan must describe how the cyberinfrastructure will be supported beyond the award duration, and may include integration into long-term data or cyberinfrastructure resources either supported by NSF or other institutions, agencies or partners. Sustainability plans will be evaluated on the viability of the sustainable resource, community representation in governance, the fit to the infrastructure being developed and the likelihood of ingestion into the long-term system. Programmatic Areas of Interest Successful proposals are expected to be of interest to multiple directorates/offices participating in the DIBBs program, and are expected to be responsive to programmatic areas of interest for these participating directorates/offices.
In particular: The Directorate for Biological Sciences (BIO) has identified a number of grand-challenge problems, including, but not limited to: * environmental research at macro scales; * predicting phenotypes from genotypes; * characterizing and understanding dimensions of biodiversity on the planet; * understanding complexity in biological systems; and * understanding the brain. When considering such emerging topics, it is possible to identify cyberinfrastructure challenges that, if addressed, could enable one or more areas to make significant advances. These include overcoming limits to data storage and transport, achieving high-throughput performance in critical analytic workflows, extracting complex data from multi-media sources, and multi-scale modeling and analysis. BIO has a number of programs through which it supports research and development of data cyberinfrastructure to meet these challenges and advance biological discovery (Advances in Biological Informatics, Advancing Digitization of Biodiversity Collections, Protein Data Bank, National Ecological Observatory Network, and others). What is expected to come from the DIBBs program is the development of robust, fundamental infrastructure (or building blocks) that could contribute to ongoing efforts to develop data solutions sought by current and future BIO-funded efforts. The foundational research divisions within the Directorate for Computer and Information Science and Engineering (CISE) -- Computing and Communication Foundations (CCF), Computer and Network Systems (CNS), and Information and Intelligent Systems (IIS) -- are interested in Pilot Demonstrations and Early Implementation projects to support investigator-initiated research in all areas of computer and information science and engineering. Programmatic areas of interest include the acceleration of discovery and innovation in computing foundations, communication and network systems, and information and intelligent systems. Data building blocks for energy sustainability, smart and connected health, cyber-physical systems, cooperative robotic systems and secure cyberspace are of interest. Data from target areas could include systems of security and monitoring devices, annotated corpora, spectrum and protocol analyzers, system testbeds, suites of robots, networks of wireless and mobile devices, data clusters, integrated systems of sensors, and data repositories for CISE programs. The Directorate for Education and Human Resources (EHR) encourages proposals to utilize DIBBs to follow up on activities begun by other CIF21 initiatives such as Building Community and Capacity for Data-Intensive Research in Education and Human Resources (BCC-EHR, https://www.nsf.gov/publications/pub_summ.jsp?WT.z_pims_id=505161&ods_key=nsf15563). In particular, EHR is interested in fostering novel, transformative, multidisciplinary approaches that address the use of large data sets to create actionable knowledge for improving STEM teaching and learning environments (formal and informal) in the medium term, and to revolutionize learning in the longer term.
The Directorate for Engineering (ENG) is interested in effective strategies for capturing, storing, characterizing and providing access to experimental and computational data, models, and software tools in support of advancing research in areas that include: * Innovations at the Nexus of Food, Energy, and Water Systems (INFEWS); * Advanced Manufacturing; * Understanding the Brain; * Energy, Sustainability, and Infrastructure; * Microelectronics, Sensing, Communications and Information Technology; and * Nanoscale Science and Engineering. The ENG Division of Civil, Mechanical and Manufacturing Innovation (CMMI) is interested in supporting pilot demonstrations that define scalable approaches to building and sustaining data infrastructures that address data capture and characterization, curation, storage and sharing of experimental and computational data and models that will advance fundamental research in the areas of: * Multiscale transformation and use of engineering materials; and * Mitigation, preparedness, response and recovery for potential or actual hazard events. CMMI would be interested in supporting an Early Implementation project that advances fundamental research on modeling the underlying physics of the performance of buildings and other structures subjected to multi-hazards over their lifetime. The Geosciences (GEO) Directorate is interested in the following research fields: atmospheric and geospace science, earth science, ocean science, and polar science. GEO looks to support projects that integrate proven technologies and interfaces into the existing, robust geosciences cyberinfrastructure, such as national data centers, large-scale community facilities, MREFCs, and/or research operations, to significantly improve discovery and access for a broader set of geoscientists, as well as other scientists, such as biologists, materials scientists, social scientists or others. PIs must demonstrate the proven effectiveness of the approach and science outcomes that are significant to geoscientists and demonstrate the utility for the broader community. GEO is interested in projects that: * Demonstrate strong connections to academic geoscientists through existing resources, community organizations, and documented needs in community reports, such as EarthCube reports or National Academies reports; * Aim to develop new resources that are compatible and interoperable with existing geoscience cyberinfrastructure and that serve broad communities; * Demonstrate awareness of existing cyberinfrastructure and describe the relationship and connection to established geosciences infrastructure; and * Demonstrate and commit to involvement in community activities, including EarthCube forums, websites and meetings. Across the Mathematical and Physical Sciences (MPS) Directorate there is an acute need for cyberinfrastructure that supports the data generated by the scientific community in ways that ensure longevity, coherence, and accessibility in order to increase scientific productivity and enable new insights within and across scientific domains. Data here will include experimental and observational information, the results of calculations and simulations, and software. The MPS priorities can be ascribed to three basic areas: * Data Analysis: Support algorithms, software, and computational and network infrastructure for extracting information from complex and heterogeneous data, especially where the analysis is too large for researchers to perform locally. 
* Data Curation: Archive and manage data to ensure long-term data preservation and reuse. This includes managing large data sets, collecting data from distributed sources, and accessing and integrating multi-source data. Active data management will require standards, data certification and new models for balancing data distribution and intellectual property rights. * Data Access/"Knowledge Portals": Ensure access to data at the appropriate stages and in organized forms. Support a multilevel system that enables central processing of large data sets, with or without remote processing from smaller sites, when downloading a data set is prohibitive. Support middleware to make data accessible with user interfaces that provide visualization and context for the data requested. MPS is interested in DIBBs proposals that address one or more of these priority areas. The Directorate for Social, Behavioral and Economic Sciences (SBE) is interested in proposals that support the Directorate's research priorities, such as those outlined in SBE 2020 (http://www.nsf.gov/sbe/sbe_2020/). In particular, SBE is interested in using DIBBs to support follow-up activities begun by other CIF21 activities such as Metadata for Long-standing Large-Scale Social Science Surveys (META-SSS) (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=504705) and Resource Implementations for Data Intensive Research in the Social, Behavioral and Economic Sciences (RIDIR) (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=505168&org=SES&from=home). SBE also welcomes innovative approaches to big data problems in SBE-focused domains consistent with the cross-directorate Critical Techniques, Technologies and Methodologies for Advancing Foundations and Applications of Big Data Sciences and Engineering (BIGDATA) solicitation (http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=504767). Important Note: Prospective PIs should consult with Cognizant Program Officers in the relevant research area(s) prior to submitting a proposal to ascertain whether the focus and budget of the proposed work are appropriate for this solicitation. III. AWARD INFORMATION Anticipated Funding Amount: up to $23,500,000, pending availability of funds. * The award size for Early Implementation Awards is anticipated to be up to $4,000,000 total per award for up to 5 years. * The award size for Pilot Demonstration Awards is anticipated to be up to $500,000 total per award for up to 3 years. Project size should be commensurate with the size / breadth of the community served. IV. ELIGIBILITY INFORMATION Who May Submit Proposals: Proposals may only be submitted by the following: * Universities and Colleges - Universities and two- and four-year colleges (including community colleges) accredited in, and having a campus located in, the US acting on behalf of their faculty members. Such organizations also are referred to as academic institutions. * Non-profit, non-academic organizations: Independent museums, observatories, research labs, professional societies and similar organizations in the U.S. associated with educational or research activities. * NSF-funded Federally Funded Research and Development Centers (FFRDCs). Who May Serve as PI: There are no restrictions or limits. Limit on Number of Proposals per Organization: There are no restrictions or limits.
Limit on Number of Proposals per PI or Co-PI: 1 An individual may propose as a PI or Co-PI on only one proposal; however, the individual may be included among the listed senior personnel on more than one proposal in more than one class. In the event that an individual exceeds this limit, any proposal submitted to this solicitation with this individual listed as a PI or Co-PI, after the first proposal is received at NSF, will be returned without review. No exceptions will be made. Additional Eligibility Info: * Organizations eligible to serve as lead are U.S. academic institutions, U.S. non-profit research organizations directly associated with educational and/or research activities, and NSF-funded FFRDCs. * Organizations eligible to serve as subawardees are all those organizations eligible under the provisions of the NSF Grant Proposal Guide (GPG). * In the interest of project management, there must be a single centralized award with subawardees as needed. * Proposals involving non-NSF FFRDCs or Federal agency personnel must be approved prior to submission to ensure appropriate submission parameters related to funding personnel at these institutions. PIs should contact the cognizant PO; in all cases non-NSF FFRDC or Federal agency contributors must appear as subawardees on a proposal submitted by an academic or non-profit institution. V. PROPOSAL PREPARATION AND SUBMISSION INSTRUCTIONS A. Proposal Preparation Instructions Full Proposal Preparation Instructions: Proposers may opt to submit proposals in response to this Program Solicitation via Grants.gov or via the NSF FastLane system. * Full proposals submitted via FastLane: Proposals submitted in response to this program solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Grant Proposal Guide (GPG). The complete text of the GPG is available electronically on the NSF website at: http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg. Paper copies of the GPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov. Proposers are reminded to identify this program solicitation number in the program solicitation block on the NSF Cover Sheet For Proposal to the National Science Foundation. Compliance with this requirement is critical to determining the relevant proposal processing guidelines. Failure to submit this information may delay processing. * Full proposals submitted via Grants.gov: Proposals submitted in response to this program solicitation via Grants.gov should be prepared and submitted in accordance with the NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov. The complete text of the NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: (http://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide). To obtain copies of the Application Guide and Application Forms Package, click on the Apply tab on the Grants.gov site, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link and enter the funding opportunity number (the program solicitation number without the NSF prefix) and press the Download Package button. Paper copies of the Grants.gov Application Guide also may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov.
See Chapter II.C.2 of the [47]GPG for guidance on the required sections of a full research proposal submitted to NSF. Please note that the proposal preparation instructions provided in this program solicitation may deviate from the GPG instructions. Proposals must be received as a single submission with one organization serving as the lead and all others as subawardees. Linked proposals submitted using the collaborative mechanism of Fastlane will be returned without review. While the lead institution submitting the proposal must be a U.S. academic institution, a U.S. non-profit, non-academic organization or a NSF-funded FFRDC, subawardees may be any entity eligible under the provisions of the NSF GPG. The following instructions supplement and/or deviate from the guidance in the GPG or NSF Grants.gov Application Guide. Title - The title of the proposed project must begin with the words "CIF21 DIBBs." Early Implementation projects must begin with "CIF21 DIBBs: EI:" followed by the title of the project; and Pilot Demonstration projects must begin with "CIF21 DIBBs: PD:" followed by the title of the project. Cover Sheet: FastLane allows one PI and at most four Co-PIs to be designated for each proposal. If the project involves international partners, check the international activities box and list the countries involved. If needed, additional lead personnel should be designated as Senior Personnel on the Budget form. Project Summary (1-page limit, 4,600 characters in total for all three text boxes) - In the Overview text box provide a summary description of the project, including its research and education goals, the innovative data infrastructure being proposed, and the community (communities) that will be impacted. The Project Summary must include a list of all the other senior personnel and collaborating institutions involved in the proposal irrespective of whether they are receiving funds. In separate text boxes, provide a succinct summary of the intellectual merit and broader impacts of the proposed project. Project Description - This section may be no longer than 15 pages. In addition to intellectual merit and broader impacts, the project description should describe how the work meets the specific requirements and any additional review criteria indicated. The Project Description must include a Management Plan that describes plans and procedures for the development and assessment of the proposed activity. The plan should include a list of all participating members of the collaboration, including non-funded participants, their institutions and roles in the project. A clear timeline of expected outcomes should be included. A Sustainability Plan must also be included for any infrastructure developments that may need support beyond the term of the project. Results from Prior Research - Describe only prior research of the PI or Co-PIs funded by NSF that is directly and immediately relevant to this proposal. References Cited - Indicate with an asterisk any cited publications resulting from prior research funded by NSF for the PI or Co-PIs when following the guidelines for all references cited. Biographical Sketches - Provide biographical sketches for the PI, Co-PIs, and other Senior Personnel listed on the Project Summary page. Current and Pending Support - Provide this information for the PI, Co-PIs and other senior personnel listed on the Project Summary page. Budget - Participation in the DIBBs annual meeting will be a requirement of an award. 
Funds for travel by up to two project personnel to this annual meeting to be held in the Washington, DC area, must be included in the budget. For Early Implementation proposals, proposal budgets should also include travel funds for the PI to attend an annual reverse site visit in the Washington, DC area. (Pilot Demonstration proposals do not need to include travel funds for the PI to attend an annual reverse site visit.) Follow the instructions in the GPG or NSF Grants.gov Application Guide for preparing the budget. Multi-institutional proposals must be submitted through the lead organization with a single budget including all other participating organizations as subawardees (see GPG guidelines, Chapter II.D.3). Provide a detailed budget justification separately for the lead organization (up to 3 pages) and for each subawardee budget (up to 3 pages each). Funds for facility construction or renovation may not be requested. Special Information and Supplementary Documentation - In addition to the Data Management Plan (please follow the CISE Data Management Plan Guidance - [48]http://www.nsf.gov/cise/cise_dmp.jsp) and the Postdoctoral Research Mentoring Plan (if required), the following items are the only items permitted as Supplementary Documentation: Project Management - System Architecture Diagram. Proposals may include in Supplementary Documents a 1-page system architecture design diagram specifying all critical components, including hardware and/or software and any necessary dependencies affecting system use by the scientific community. The diagram should be referenced in the Project Description. Letters of Collaboration: These must be provided for any organization or individuals mentioned in the Project Description and Management Plan but not receiving funds (i.e., mentioned in the proposal and not listed in any of the associated budgets). Letters of Collaboration must list the personnel participating in the project and their affiliations and describe the work that the unfunded collaborators will be conducting for the project. Single Copy Documents: The following information is required in addition to that included within the provisions of the GPG or NSF Grants.gov Application Guide: Integrated Conflicts of Interests Lists: (1) A list of Project Personnel and Partner Institutions: Provide current, accurate information for all personnel and institutions involved in the project. NSF staff will use this information in the merit review process to manage conflicts of interest. The list should include all PIs, Co-PIs, Senior Personnel, paid/unpaid Consultants or Collaborators, Subawardees, Postdocs, and project-level advisory committee members. This list should be numbered and include (in this order) Full name, Organization(s), and Role in the project, with each item separated by a semi-colon. Each person listed should start a new numbered line. For example: 1. Mary Smith; XYZ University; PI 2. John Jones; University of PQR; Senior Personnel 3. Jane Brown; XYZ University; Postdoc 4. Bob Adams; ABC Community College; Paid Consultant 5. Susan White; DEF Corporation; Unpaid Collaborator 6. Tim Green; ZZZ University; Subawardee (2) A list of past and present Collaborators not related to this proposal: Provide current, accurate information for all active or recent collaborators of personnel and institutions involved in the project. NSF staff will use this information in the merit review process to manage conflicts of interest. 
This list -- distinct from (1) above -- must include all active or recent Collaborators of all personnel involved with the proposed project. Collaborators include any individual with whom any member of the project team -- including PIs, Co-PIs, Senior Personnel, paid/unpaid Consultants or Collaborators, Subawardees, Postdocs, and project-level advisory committee members -- has collaborated on a project, book, article, report, or paper within the preceding 48 months; or co-edited a journal, compendium, or conference proceedings within the preceding 24 months. This list should include (in this order) Full name and Organization(s), with each item separated by a semi-colon. Each person listed should start a new numbered line. The following is a sample format; other similar formats are acceptable. 1. Collaborators for Mary Smith; XYZ University; PI a. Helen Gupta; ABC University b. John Jones; University of PQR c. Fred Gonzales; DEF Corporation d. Susan White; DEF Corporation 2. Collaborators for John Jones; University of PQR; Senior Personnel a. Tim Green; ZZZ University b. Ping Chang; ZZZ University c. Mary Smith; XYZ University 3. Collaborators for Jane Brown; XYZ University; Postdoc a. Fred Gonzales; DEF Corporation 4. Collaborators for Bob Adams; ABC Community College; Paid Consultant a. None 5. Collaborators for Susan White; DEF Corporation; Unpaid Collaborator a. Mary Smith; XYZ University b. Harry Nguyen; Welldone Institution 6. Collaborators for Tim Green; ZZZ University; Subawardee a. John Jones; University of PQR In addition to the Conflict of Interest List, other correspondence to the program not intended to be sent to reviewers, such as a list of potential reviewers, can be sent through the Single Copy Document section of FastLane or Grants.gov. Please note that key project personnel may be required, prior to an award decision, to submit copies of any intellectual property agreements or material transfer agreements they have signed, or are planning to sign, that would have an impact on the unrestricted and timely distribution of the outcomes of NSF-funded research. Submission of a Single Copy Document will allow these materials to be reviewed by NSF officials only, and they will remain confidential. B. Budgetary Information Cost Sharing: Inclusion of voluntary committed cost sharing is prohibited. Budget Preparation Instructions: Participation in the DIBBs annual meeting will be a requirement of an award. Funds for travel by up to two project personnel to this annual meeting to be held in the Washington, DC area, must be included in the budget. For Early Implementation proposals, proposal budgets should also include travel funds for the PI to attend an annual reverse site visit in the Washington, DC area. (Pilot Demonstration proposals do not need to include travel funds for the PI to attend an annual reverse site visit.) Follow the instructions in the GPG or NSF Grants.gov Application Guide for preparing the budget. Multi-institutional proposals must be submitted through the lead organization with a single budget including all other participating organizations as subawardees (see GPG guidelines, Chapter II.D.3). Provide a detailed budget justification separately for the lead organization (up to 3 pages) and for each subawardee budget (up to 3 pages each). Funds for facility construction or renovation may not be requested. C. Due Dates * Full Proposal Deadline(s) (due by 5 p.m. proposer's local time): April 04, 2016 D. 
D. FastLane/Grants.gov Requirements

For Proposals Submitted Via FastLane: To prepare and submit a proposal via FastLane, see detailed technical instructions available at: [49]https://www.fastlane.nsf.gov/a1/newstan.htm. For FastLane user support, call the FastLane Help Desk at 1-800-673-6188 or e-mail [50]fastlane@nsf.gov. The FastLane Help Desk answers general technical questions related to the use of the FastLane system. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this funding opportunity.

For Proposals Submitted Via Grants.gov: Before using Grants.gov for the first time, each organization must register to create an institutional profile. Once registered, the applicant's organization can then apply for any federal grant on the Grants.gov website. Comprehensive information about using Grants.gov is available on the Grants.gov Applicant Resources webpage: [51]http://www.grants.gov/web/grants/applicants.html. In addition, the NSF Grants.gov Application Guide (see link in Section V.A) provides instructions regarding the technical preparation of proposals via Grants.gov. For Grants.gov user support, contact the Grants.gov Contact Center at 1-800-518-4726 or by email: [52]support@grants.gov. The Grants.gov Contact Center answers general technical questions related to the use of Grants.gov. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.

Submitting the Proposal: Once all documents have been completed, the Authorized Organizational Representative (AOR) must submit the application to Grants.gov and verify the desired funding opportunity and agency to which the application is submitted. The AOR must then sign and submit the application to Grants.gov. The completed application will be transferred to the NSF FastLane system for further processing. Proposers that submitted via FastLane are strongly encouraged to use FastLane to verify the status of their submission to NSF. For proposers that submitted via Grants.gov, until an application has been received and validated by NSF, the Authorized Organizational Representative may check the status of an application on Grants.gov. After proposers have received an e-mail notification from NSF, Research.gov should be used to check the status of an application.

VI. NSF PROPOSAL PROCESSING AND REVIEW PROCEDURES

Proposals received by NSF are assigned to the appropriate NSF program for acknowledgement and, if they meet NSF requirements, for review. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF Program Officer, and usually by three to ten other persons outside NSF who are experts in the particular fields represented by the proposal, serving as ad hoc reviewers, panelists, or both. These reviewers are selected by Program Officers charged with oversight of the review process. Proposers are invited to suggest names of persons they believe are especially well qualified to review the proposal and/or persons they would prefer not review the proposal. These suggestions may serve as one source in the reviewer selection process at the Program Officer's discretion. Submission of such names, however, is optional. Care is taken to ensure that reviewers have no conflicts of interest with the proposal. In addition, Program Officers may obtain comments from site visits before recommending final action on proposals.
Senior NSF staff further review recommendations for awards. A flowchart that depicts the entire NSF proposal and award process (and associated timeline) is included in the GPG as [53]Exhibit III-1. A comprehensive description of the Foundation's merit review process is available on the NSF website at: [54]http://www.nsf.gov/bfa/dias/policy/merit_review/. Proposers should also be aware of core strategies that are essential to the fulfillment of NSF's mission, as articulated in [55]Investing in Science, Engineering, and Education for the Nation's Future: NSF Strategic Plan for 2014-2018. These strategies are integrated in the program planning and implementation process, of which proposal review is one part. NSF's mission is particularly well-implemented through the integration of research and education and broadening participation in NSF programs, projects, and activities. One of the strategic objectives in support of NSF's mission is to foster integration of research and education through the programs, projects, and activities it supports at academic and research institutions. These institutions must recruit, train, and prepare a diverse STEM workforce to advance the frontiers of science and participate in the U.S. technology-based economy. NSF's contribution to the national innovation ecosystem is to provide cutting-edge research under the guidance of the Nation's most creative scientists and engineers. NSF also supports development of a strong science, technology, engineering, and mathematics (STEM) workforce by investing in building the knowledge that informs improvements in STEM teaching and learning. NSF's mission calls for the broadening of opportunities and expanding participation of groups, institutions, and geographic regions that are underrepresented in STEM disciplines, which is essential to the health and vitality of science and engineering. NSF is committed to this principle of diversity and deems it central to the programs, projects, and activities it considers and supports. A. Merit Review Principles and Criteria The National Science Foundation strives to invest in a robust and diverse portfolio of projects that creates new knowledge and enables breakthroughs in understanding across all areas of science and engineering research and education. To identify which projects to support, NSF relies on a merit review process that incorporates consideration of both the technical aspects of a proposed project and its potential to contribute more broadly to advancing NSF's mission "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes." NSF makes every effort to conduct a fair, competitive, transparent merit review process for the selection of projects. 1. Merit Review Principles These principles are to be given due diligence by PIs and organizations when preparing proposals and managing projects, by reviewers when reading and evaluating proposals, and by NSF program staff when determining whether or not to recommend proposals for funding and while overseeing awards. Given that NSF is the primary federal agency charged with nurturing and supporting excellence in basic research and education, the following three principles apply: * All NSF projects should be of the highest quality and have the potential to advance, if not transform, the frontiers of knowledge. * NSF projects, in the aggregate, should contribute more broadly to achieving societal goals. 
These "Broader Impacts" may be accomplished through the research itself, through activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. The project activities may be based on previously established and/or innovative methods and approaches, but in either case must be well justified. * Meaningful assessment and evaluation of NSF funded projects should be based on appropriate metrics, keeping in mind the likely correlation between the effect of broader impacts and the resources provided to implement projects. If the size of the activity is limited, evaluation of that activity in isolation is not likely to be meaningful. Thus, assessing the effectiveness of these activities may best be done at a higher, more aggregated, level than the individual project. With respect to the third principle, even if assessment of Broader Impacts outcomes for particular projects is done at an aggregated level, PIs are expected to be accountable for carrying out the activities described in the funded project. Thus, individual projects should include clearly stated goals, specific descriptions of the activities that the PI intends to do, and a plan in place to document the outputs of those activities. These three merit review principles provide the basis for the merit review criteria, as well as a context within which the users of the criteria can better understand their intent. 2. Merit Review Criteria All NSF proposals are evaluated through use of the two National Science Board approved merit review criteria. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities. The two merit review criteria are listed below. Both criteria are to be given full consideration during the review and decision-making processes; each criterion is necessary but neither, by itself, is sufficient. Therefore, proposers must fully address both criteria. ([56]GPG Chapter II.C.2.d.i. contains additional information for use by proposers in development of the Project Description section of the proposal.) Reviewers are strongly encouraged to review the criteria, including [57]GPG Chapter II.C.2.d.i., prior to the review of a proposal. When evaluating NSF proposals, reviewers will be asked to consider what the proposers want to do, why they want to do it, how they plan to do it, how they will know if they succeed, and what benefits could accrue if the project is successful. These issues apply both to the technical aspects of the proposal and the way in which the project may make broader contributions. To that end, reviewers will be asked to evaluate all proposals against two criteria: * Intellectual Merit: The Intellectual Merit criterion encompasses the potential to advance knowledge; and * Broader Impacts: The Broader Impacts criterion encompasses the potential to benefit society and contribute to the achievement of specific, desired societal outcomes. The following elements should be considered in the review for both criteria: 1. What is the potential for the proposed activity to a. Advance knowledge and understanding within its own field or across different fields (Intellectual Merit); and b. Benefit society or advance desired societal outcomes (Broader Impacts)? 2. To what extent do the proposed activities suggest and explore creative, original, or potentially transformative concepts? 3. 
Is the plan for carrying out the proposed activities well-reasoned, well-organized, and based on a sound rationale? Does the plan incorporate a mechanism to assess success?
4. How well qualified is the individual, team, or organization to conduct the proposed activities?
5. Are there adequate resources available to the PI (either at the home organization or through collaborations) to carry out the proposed activities?

Broader impacts may be accomplished through the research itself, through the activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. NSF values the advancement of scientific knowledge and activities that contribute to achievement of societally relevant outcomes. Such outcomes include, but are not limited to: full participation of women, persons with disabilities, and underrepresented minorities in science, technology, engineering, and mathematics (STEM); improved STEM education and educator development at any level; increased public scientific literacy and public engagement with science and technology; improved well-being of individuals in society; development of a diverse, globally competitive STEM workforce; increased partnerships between academia, industry, and others; improved national security; increased economic competitiveness of the United States; and enhanced infrastructure for research and education.

Proposers are reminded that reviewers will also be asked to review the Data Management Plan and the Postdoctoral Researcher Mentoring Plan, as appropriate.

Additional Solicitation Specific Review Criteria

In addition to the standard NSF review criteria for Intellectual Merit and Broader Impacts, DIBBs proposals will also be reviewed using the program-specific criteria identified in the Program Description section (Section II) of this solicitation. The program-specific review criteria for each type of award are reiterated below:

Additional Review Criteria for Early Implementation Awards:

* What are the science outcomes described in the proposal? Are they innovative and made possible by the development? How are the outcomes tied to grand challenges, and how do they interest and involve multiple science and engineering domains? Are the science outcomes achievable given the team and work plan?
* How does the implementation expand and contribute to the set of resources that serve the community? Are the components extensible and potentially useful to other communities? Is there a clear description of the data, software, or standards that will be produced by the project? (Software in this instance refers to the scientific analysis, visualization, or modeling tools necessary to achieve scientific outcomes.)
* Are the management plan and team appropriate for the goals of the project? What is the plan to demonstrate the proposed capability or resource?
* Characterize the community that will benefit from the project: How many researchers and which domains will directly benefit from the outcomes of the project? How does the project involve and serve more than one research field? Are participants from various communities explicitly identified, and are their roles clear? How does the project clearly demonstrate end-user involvement in the development and use of a community capability?
* Indicate how the community is represented in governance of the resulting capability, including data management and de-accession.
A sustainability plan must be included describing how any capabilities developed by the implementation project could be supported beyond the award duration. This may include integration into long-term data or cyberinfrastructure resources supported either by NSF or by other institutions, agencies, or partners. Sustainability plans will be evaluated on the viability of the sustained resource, community representation in governance, the fit to the infrastructure being developed, and the likelihood of ingestion into the long-term system.

Additional Review Criteria for Pilot Demonstration Awards:

* Is there a clear description of the community data infrastructure need that will be met by this project? Is any prototype, pilot, platform, or tool development appropriately conceived for the intended outcomes of the project? What is the likelihood of successful creation and adoption of any product? How extensible is the technology or capability development? Is the resource development modern, robust, and responsive to community needs?
* Are the management plan and team appropriate for the goals of the project? What is the plan to demonstrate the proposed capability or resource?
* Characterize the community that will benefit from the project: How many researchers and which domains will benefit from the outcomes of the project? How does the project involve and serve more than one research field? Are participants from appropriate science and engineering communities explicitly identified, and are their roles clear? How does the project clearly demonstrate end-user involvement in the development and use of a community capability?
* Indicate how the community would be represented in governance of the resulting capability, including data management and de-accession.

A sustainability plan must be included for any cyberinfrastructure component of the project that is intended to continue. The sustainability plan must describe how the cyberinfrastructure will be supported beyond the award duration, and may include integration into long-term data or cyberinfrastructure resources supported either by NSF or by other institutions, agencies, or partners. Sustainability plans will be evaluated on the viability of the sustained resource, community representation in governance, the fit to the infrastructure being developed, and the likelihood of ingestion into the long-term system.

B. Review and Selection Process

Proposals submitted in response to this program solicitation will be reviewed by Ad hoc Review and/or Panel Review. Reviewers will be asked to evaluate proposals using the two National Science Board approved merit review criteria and, if applicable, additional program-specific criteria. A summary rating and accompanying narrative will generally be completed and submitted by each reviewer and/or panel. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation. After scientific, technical, and programmatic review and consideration of appropriate factors, the NSF Program Officer recommends to the cognizant Division Director whether the proposal should be declined or recommended for award. NSF strives to be able to tell applicants whether their proposals have been declined or recommended for funding within six months. Large or particularly complex proposals or proposals from new awardees may require additional review and processing time. The time interval begins on the deadline or target date, or receipt date, whichever is later.
The interval ends when the Division Director acts upon the Program Officer's recommendation. After programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements for review of business, financial, and policy implications. After an administrative review has occurred, Grants and Agreements Officers perform the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations, or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with an NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at their own risk. Once an award or declination decision has been made, Principal Investigators are provided feedback about their proposals. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers or any reviewer-identifying information, are sent to the Principal Investigator/Project Director by the Program Officer. In addition, the proposer will receive an explanation of the decision to award or decline funding.

VII. AWARD ADMINISTRATION INFORMATION

A. Notification of the Award

Notification of the award is made to the submitting organization by a Grants Officer in the Division of Grants and Agreements. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF Program administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See Section VI.B. for additional information on the review process.)

B. Award Conditions

An NSF award consists of: (1) the award notice, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award notice; (4) the applicable award conditions, such as Grant General Conditions (GC-1)* or Research Terms and Conditions*; and (5) any announcement or other NSF issuance that may be incorporated by reference in the award notice. Cooperative agreements also are administered in accordance with NSF Cooperative Agreement Financial and Administrative Terms and Conditions (CA-FATC) and the applicable Programmatic Terms and Conditions. NSF awards are electronically signed by an NSF Grants and Agreements Officer and transmitted electronically to the organization via e-mail. *These documents may be accessed electronically on NSF's Website at [58]http://www.nsf.gov/awards/managing/award_conditions.jsp?org=NSF. Paper copies may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827, or by e-mail from [59]nsfpubs@nsf.gov. More comprehensive information on NSF Award Conditions and other important information on the administration of NSF awards is contained in the NSF Award & Administration Guide (AAG) Chapter II, available electronically on the NSF Website at [60]http://www.nsf.gov/publications/pub_summ.jsp?ods_key=aag.
Special Award Conditions:

* Awardees are expected to participate in annual PI meetings, with travel costs supported by the award.
* PIs of Early Implementation Awards also participate in annual reverse site visits to be held in the Washington, DC area.

C. Reporting Requirement

For all multi-year grants (including both standard and continuing grants), the Principal Investigator must submit an annual project report to the cognizant Program Officer no later than 90 days prior to the end of the current budget period. (Some programs or awards require submission of more frequent project reports.) No later than 120 days following expiration of a grant, the PI also is required to submit a final project report, and a project outcomes report for the general public. Failure to provide the required annual or final project reports, or the project outcomes report, will delay NSF review and processing of any future funding increments as well as any pending proposals for all identified PIs and co-PIs on a given award. PIs should examine the formats of the required reports in advance to assure availability of required data.

PIs are required to use NSF's electronic project-reporting system, available through Research.gov, for preparation and submission of annual and final project reports. Such reports provide information on accomplishments, project participants (individual and organizational), publications, and other specific products and impacts of the project. Submission of the report via Research.gov constitutes certification by the PI that the contents of the report are accurate and complete. The project outcomes report also must be prepared and submitted using Research.gov. This report serves as a brief summary, prepared specifically for the public, of the nature and outcomes of the project. This report will be posted on the NSF website exactly as it is submitted by the PI.

More comprehensive information on NSF Reporting Requirements and other important information on the administration of NSF awards is contained in the NSF Award & Administration Guide (AAG) Chapter II, available electronically on the NSF Website at [61]http://www.nsf.gov/publications/pub_summ.jsp?ods_key=aag.
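To make the reporting timeline concrete, the short Python sketch below (illustrative only, with hypothetical dates, not an NSF system) computes the latest submission dates implied by the rules above: annual project reports are due no later than 90 days before the end of the current budget period, and the final project report and project outcomes report are due no later than 120 days after grant expiration.

    # Minimal sketch with hypothetical dates: compute the reporting deadlines
    # described above (90 days before the end of the current budget period for the
    # annual report; 120 days after grant expiration for the final and outcomes reports).
    from datetime import date, timedelta

    budget_period_end = date(2017, 3, 31)   # hypothetical end of the current budget period
    grant_expiration = date(2019, 3, 31)    # hypothetical grant expiration date

    annual_report_due = budget_period_end - timedelta(days=90)
    final_report_due = grant_expiration + timedelta(days=120)

    print(f"Annual project report due no later than: {annual_report_due:%Y-%m-%d}")
    print(f"Final and outcomes reports due no later than: {final_report_due:%Y-%m-%d}")

PIs should of course rely on the dates shown in Research.gov for their specific award rather than any locally computed estimate.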
VIII. AGENCY CONTACTS

Please note that the program contact information is current at the time of publishing. See program website for any updates to the points of contact. General inquiries regarding this program should be made to:

* Amy Walton, Program Director, CISE/ACI and DIBBs Solicitation Manager, telephone: (703) 292-8970, email: [62]DIBBsQueries@nsf.gov
* Robert Chadduck, Program Director, CISE/ACI, telephone: (703) 292-8970, email: [63]DIBBsQueries@nsf.gov
* Anita Nikolich, Program Director, CISE/ACI, telephone: (703) 292-8970, email: [64]DIBBsQueries@nsf.gov
* Peter H. McCartney, Program Director, BIO/DBI, telephone: (703) 292-8470, email: [65]DIBBsQueries@nsf.gov
* Sylvia Spengler, Program Director, CISE/IIS, telephone: (703) 292-8930, email: [66]DIBBsQueries@nsf.gov
* John C. Cherniavsky, Senior Advisor, EHR, telephone: (703) 292-5136, email: [67]DIBBsQueries@nsf.gov
* Dimitrios V. Papavassiliou, Program Director, ENG/CBET, telephone: (703) 292-4480, email: [68]DIBBsQueries@nsf.gov
* Joanne D. Culbertson, Program Director, ENG/CMMI, telephone: (703) 292-4602, email: [69]DIBBsQueries@nsf.gov
* Eva Zanzerkia, Program Director, GEO/EAR, telephone: (703) 292-8556, email: [70]DIBBsQueries@nsf.gov
* Lin He, Program Director, MPS/CHE, telephone: (703) 292-4956, email: [71]DIBBsQueries@nsf.gov
* Bogdan Mihaila, Program Director, MPS/PHY, telephone: (703) 292-8235, email: [72]DIBBsQueries@nsf.gov
* Cheryl L. Eavey, Program Director, SBE/SES, telephone: (703) 292-7269, email: [73]DIBBsQueries@nsf.gov
* Seta Bogosyan, Program Director, OD/OISE, telephone: (703) 292-4766, email: [74]DIBBsQueries@nsf.gov

For questions related to the use of FastLane, contact:

* FastLane Help Desk, telephone: 1-800-673-6188; e-mail: [75]fastlane@nsf.gov.

For questions relating to Grants.gov, contact:

* Grants.gov Contact Center: If the Authorized Organizational Representative (AOR) has not received a confirmation message from Grants.gov within 48 hours of submission of the application, please contact via telephone: 1-800-518-4726; e-mail: [76]support@grants.gov.

IX. OTHER INFORMATION

The NSF website provides the most comprehensive source of information on NSF Directorates (including contact information), programs and funding opportunities. Use of this website by potential proposers is strongly encouraged. In addition, "NSF Update" is an information-delivery system designed to keep potential proposers and other interested parties apprised of new NSF funding opportunities and publications, important changes in proposal and award policies and procedures, and upcoming NSF [77]Grants Conferences. Subscribers are informed through e-mail or the user's Web browser each time new publications are issued that match their identified interests. "NSF Update" also is available on [78]NSF's website. Grants.gov provides an additional electronic capability to search for Federal government-wide grant opportunities. NSF funding opportunities may be accessed via this mechanism. Further information on Grants.gov may be obtained at [79]http://www.grants.gov.

ABOUT THE NATIONAL SCIENCE FOUNDATION

The National Science Foundation (NSF) is an independent Federal agency created by the National Science Foundation Act of 1950, as amended (42 USC 1861-75). The Act states the purpose of the NSF is "to promote the progress of science; [and] to advance the national health, prosperity, and welfare by supporting research and education in all fields of science and engineering." NSF funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the US. The Foundation accounts for about one-fourth of Federal support to academic institutions for basic research. NSF receives approximately 55,000 proposals each year for research, education and training projects, of which approximately 11,000 are funded. In addition, the Foundation receives several thousand applications for graduate and postdoctoral fellowships. The agency operates no laboratories itself but does support National Research Centers, user facilities, certain oceanographic vessels and Arctic and Antarctic research stations. The Foundation also supports cooperative research between universities and industry, US participation in international scientific and engineering efforts, and educational activities at every academic level.
Facilitation Awards for Scientists and Engineers with Disabilities provide funding for special assistance or equipment to enable persons with disabilities to work on NSF-supported projects. See Grant Proposal Guide Chapter II, Section D.2 for instructions regarding preparation of these types of proposals.

The National Science Foundation has Telephonic Device for the Deaf (TDD) and Federal Information Relay Service (FIRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation about NSF programs, employment or general information. TDD may be accessed at (703) 292-5090 and (800) 281-8749, FIRS at (800) 877-8339. The National Science Foundation Information Center may be reached at (703) 292-5111.

The National Science Foundation promotes and advances scientific progress in the United States by competitively awarding grants and cooperative agreements for research and education in the sciences, mathematics, and engineering. To get the latest information about program deadlines, to download copies of NSF publications, and to access abstracts of awards, visit the NSF Website at [80]http://www.nsf.gov.

* Location: 4201 Wilson Blvd., Arlington, VA 22230
* For General Information (NSF Information Center): (703) 292-5111
* TDD (for the hearing-impaired): (703) 292-5090
* To Order Publications or Forms: Send an e-mail to: [81]nsfpubs@nsf.gov or telephone: (703) 292-7827
* To Locate NSF Employees: (703) 292-5111

PRIVACY ACT AND PUBLIC BURDEN STATEMENTS

The information requested on proposal forms and project reports is solicited under the authority of the National Science Foundation Act of 1950, as amended. The information on proposal forms will be used in connection with the selection of qualified proposals; and project reports submitted by awardees will be used for program evaluation and reporting within the Executive Branch and to Congress. The information requested may be disclosed to qualified reviewers and staff assistants as part of the proposal review process; to proposer institutions/grantees to provide or obtain data regarding the proposal review process, award decisions, or the administration of awards; to government contractors, experts, volunteers and researchers and educators as necessary to complete assigned work; to other government agencies or other entities needing information regarding applicants or nominees as part of a joint application review process, or in order to coordinate programs or policy; and to another Federal agency, court, or party in a court or Federal administrative proceeding if the government is a party. Information about Principal Investigators may be added to the Reviewer file and used to select potential candidates to serve as peer reviewers or advisory committee members. See Systems of Records, [82]NSF-50, "Principal Investigator/Proposal File and Associated Records," 69 Federal Register 26410 (May 12, 2004), and [83]NSF-51, "Reviewer/Proposal File and Associated Records," 69 Federal Register 26410 (May 12, 2004). Submission of the information is voluntary. Failure to provide full and complete information, however, may reduce the possibility of receiving an award. An agency may not conduct or sponsor, and a person is not required to respond to, an information collection unless it displays a valid Office of Management and Budget (OMB) control number. The OMB control number for this collection is 3145-0058.
Public reporting burden for this collection of information is estimated to average 120 hours per response, including the time for reviewing instructions. Send comments regarding the burden estimate and any other aspect of this collection of information, including suggestions for reducing this burden, to: Suzanne H. Plimpton, Reports Clearance Officer, Office of the General Counsel, National Science Foundation, Arlington, VA 22230.

References

1. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#toc
2. http://www.nsf.gov/cif21
3. mailto:DIBBsQueries@nsf.gov
4. mailto:DIBBsQueries@nsf.gov
5. mailto:DIBBsQueries@nsf.gov
6. mailto:DIBBsQueries@nsf.gov
7. mailto:DIBBsQueries@nsf.gov
8. mailto:DIBBsQueries@nsf.gov
9. mailto:DIBBsQueries@nsf.gov
10. mailto:DIBBsQueries@nsf.gov
11. mailto:DIBBsQueries@nsf.gov
12. mailto:DIBBsQueries@nsf.gov
13. mailto:DIBBsQueries@nsf.gov
14. mailto:DIBBsQueries@nsf.gov
15. mailto:DIBBsQueries@nsf.gov
16. http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg
17. http://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide
18. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#summary
19. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#pgm_intr_txt
20. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#pgm_desc_txt
21. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#awd_info
22. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#elig
23. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#prep
24. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#prep
25. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#budg_cst_shr_txt
26. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#dates
27. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#fastlane
28. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#review
29. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#reviewcrit
30. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#reviewprot
31. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#awardadmin
32. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#awardnotify
33. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#grantcond
34. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#reportreq
35. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#cont
36. http://www.nsf.gov/pubs/2016/nsf16530/nsf16530.htm#othpgm
37. http://www.nsf.gov/pubs/2010/nsf10015/nsf10015.jsp
38. https://www.nsf.gov/publications/pub_summ.jsp?WT.z_pims_id=505161&ods_key=nsf15563
39. http://www.nsf.gov/sbe/sbe_2020/
40. http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=504705
41. http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=505168&org=SES&from=home
42. http://www.nsf.gov/funding/pgm_summ.jsp?pims_id=504767
43. http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg
44. mailto:nsfpubs@nsf.gov
45. http://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide
46. mailto:nsfpubs@nsf.gov
47. http://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg
48. http://www.nsf.gov/cise/cise_dmp.jsp
49. https://www.fastlane.nsf.gov/a1/newstan.htm
50. mailto:fastlane@nsf.gov
51. http://www.grants.gov/web/grants/applicants.html
52. mailto:support@grants.gov
53. http://www.nsf.gov/pubs/policydocs/pappguide/nsf15001/gpg_3ex1.pdf
54. http://www.nsf.gov/bfa/dias/policy/merit_review/
55. http://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf14043
56. http://www.nsf.gov/pubs/policydocs/pappguide/nsf15001/gpg_2.jsp#IIC2di
57. http://www.nsf.gov/pubs/policydocs/pappguide/nsf15001/gpg_2.jsp#IIC2di
58. http://www.nsf.gov/awards/managing/award_conditions.jsp?org=NSF
59. mailto:nsfpubs@nsf.gov
60. http://www.nsf.gov/publications/pub_summ.jsp?ods_key=aag
61. http://www.nsf.gov/publications/pub_summ.jsp?ods_key=aag
62. mailto:DIBBsQueries@nsf.gov
63. mailto:DIBBsQueries@nsf.gov
64. mailto:DIBBsQueries@nsf.gov
65. mailto:DIBBsQueries@nsf.gov
66. mailto:DIBBsQueries@nsf.gov
67. mailto:DIBBsQueries@nsf.gov
68. mailto:DIBBsQueries@nsf.gov
69. mailto:DIBBsQueries@nsf.gov
70. mailto:DIBBsQueries@nsf.gov
71. mailto:DIBBsQueries@nsf.gov
72. mailto:DIBBsQueries@nsf.gov
73. mailto:DIBBsQueries@nsf.gov
74. mailto:DIBBsQueries@nsf.gov
75. mailto:fastlane@nsf.gov
76. mailto:support@grants.gov
77. http://www.nsf.gov/bfa/dias/policy/outreach.jsp
78. https://public.govdelivery.com/accounts/USNSF/subscriber/new?topic_id=USNSF_179
79. http://www.grants.gov/
80. http://www.nsf.gov/
81. mailto:nsfpubs@nsf.gov
82. http://www.nsf.gov/policies/SOR_PA_NSF-50_Principal_Investigator_Proposal_File.pdf
83. http://www.nsf.gov/policies/SOR_PA_NSF-51_Reviewer_Proposal_File.pdf
84. https://www.nsf.gov/policies
85. https://www.nsf.gov/policies/privacy.jsp
86. https://www.nsf.gov/policies/foia.jsp
87. https://www.nsf.gov/help/
88. https://www.nsf.gov/help/contact.jsp
89. mailto:webmaster@nsf.gov
90. https://www.nsf.gov/help/sitemap.jsp
91. https://assistive.usablenet.com/tt/referrer