Archived funding opportunity

This document has been archived. The latest version is NSF 23-624.

NSF 11-587: Cyberlearning: Transforming Education

Program Solicitation NSF 11-587

National Science Foundation

Directorate for Computer & Information Science & Engineering

Directorate for Education & Human Resources

Directorate for Social, Behavioral & Economic Sciences

Office of Cyberinfrastructure


Letter of Intent Due Date(s) (required) (due by 5 p.m. proposer's local time): 

     May 14, 2012

      for Integration and Deployment Projects (INDPs) only

     May 14, 2013

      for Integration and Deployment Projects (INDPs) only

Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):

     December 15, 2011

      Exploration Projects (EXPs)

     January 18, 2012

      Design and Implementation Projects (DIPs)

     February 15, 2012

      Cyberlearning Resource Center (CRC)

     July 16, 2012

      Integration and Deployment Projects (INDPs)

     December 17, 2012

      Exploration Projects (EXPs)

     January 16, 2013

      Design and Implementation Projects (DIPs)

     July 15, 2013

      Integration and Deployment Projects (INDPs)

Full Proposal Target Date(s): 

     March 16, 2012

      Capacity-Building Projects (CAPs)

     October 15, 2012

      Capacity-Building Projects (CAPs)

     March 15, 2013

      Capacity-Building Projects (CAPs)

Important Information And Revision Notes

A revised version of the NSF Proposal & Award Policies & Procedures Guide (PAPPG), NSF 13-1, was issued on October 4, 2012 and is effective for proposals submitted, or due, on or after January 14, 2013. Please be advised that the guidelines contained in NSF 13-1 apply to proposals submitted in response to this funding opportunity. Proposers who opt to submit prior to January 14, 2013, must also follow the guidelines contained in NSF 13-1.

Please be aware that significant changes have been made to the PAPPG to implement revised merit review criteria based on the National Science Board (NSB) report, National Science Foundation's Merit Review Criteria: Review and Revisions. While the two merit review criteria remain unchanged (Intellectual Merit and Broader Impacts), guidance has been provided to clarify and improve the function of the criteria. Changes will affect the project summary and project description sections of proposals. Annual and final reports also will be affected.

A by-chapter summary of this and other significant changes is provided at the beginning of both the Grant Proposal Guide and the Award & Administration Guide.

Please note that this program solicitation may contain supplemental proposal preparation guidance and/or guidance that deviates from the guidelines established in the Grant Proposal Guide.

Revision Summary

This solicitation replaces NSF 10-620. The solicitation has been revised in the following ways. Additional details about each can be found in the body of the solicitation.

Types of Awards: Two new types of awards are solicited: Capacity-Building Projects (CAPs) and a Cyberlearning Resource Center (CRC). See the sub-section entitled "PROJECT CATEGORIES" in Section II. Program Description for descriptions of all types of awards.

Change of Full-Proposal Deadline: Exploration Projects (EXPs) are now due in mid-December.

Clarifications: The following clarifications have been made in the solicitation document.

  • Research expectations are more clearly presented. For general expectations, see the bullet "Research" in the Section II. Project Description. For details about what is expected for each project type, see the sub-section "PROJECT CATEGORIES" in Section II. Project Description.
  • Guidelines about iterative refinement of the technological innovation are better specified. General guidelines can be found under the bullet "Technological innovation and plan for its iterative refinement" in Section II. Project Description. For details about what is expected for each project type, see the sub-section "PROJECT CATEGORIES" in Section II. Project Description.
  • Guidelines about measuring effectiveness of the technological innovation have been added. General guidelines can be found under the bullet "Measurement of the effectiveness of the technological innovation" in Section II. Project Description. For details about what is expected for each project type, see the sub-section "PROJECT CATEGORIES" in Section II. Project Description.
  • The expertise required on Interdisciplinary project teams has been spelled out in greater detail. See the bullet "Interdisciplinary project teams" in Section II. Project Description for general guidelines. For specific guidelines for each project type, see the sub-section "PROJECT CATEGORIES" in Section II. Project Description.
  • Requirements for reports of prior support have been modified. Reports of prior support should include only prior support directly related to the proposed activities. See the sub-section "Supplementary Documents" in Section V., Sub-section A. Proposal Preparation Instructions.
  • The additional solicitation-specific review criteria have been revised. See the sub-section "Additional Solicitation Specific Review Criteria" in Section VI. NSF Proposal Processing and Review Procedures.

Screen shots: Up to five diagrams or screen shots are allowed in the supplementary materials to give readers a chance to understand how learners will experience the proposed technology. See the subsection "Supplementary Documents" in Section V., Sub-section A. Proposal Preparation Instructions.

Collaboration and Management Plan: A Collaboration and Management Plan is required in all proposals. It should detail how the collaborative team will interact to ensure that issues of learning, technology, and context are considered from the beginning. For details, see Section V., Sub-section A. Proposal Preparation Instructions.

Additional References: Additional references related to the solicitation are cited. See the sub-section "REFERENCES" in Section II. Project Description.

Summary Of Program Requirements

General Information

Program Title:

Cyberlearning: Transforming Education (Cyberlearning)

Synopsis of Program:

Through the Cyberlearning: Transforming Education program, NSF seeks to integrate advances in technology with advances in what is known about how people learn to

  • better understand how people learn with technology and how technology can be used productively to help people learn, through individual use and/or through collaborations mediated by technology;
  • better use technology for collecting, analyzing, sharing, and managing data to shed light on learning, promoting learning, and designing learning environments; and
  • design new technologies for these purposes, and advance understanding of how to use those technologies and integrate them into learning environments so that their potential is fulfilled.

Of particular interest are technological advances that allow more personalized learning experiences, draw in and promote learning among those in populations not served well by current educational practices, allow access to learning resources anytime and anywhere, and provide new ways of assessing capabilities. It is expected that Cyberlearning research will shed light on how technology can enable new forms of educational practice and that broad implementation of its findings will result in a more actively engaged and productive citizenry and workforce.

Cyberlearning awards will be made in three research categories, each focusing on a different stage of research and development: Exploration (EXP), Design and Implementation (DIP), and Integration and Deployment (INDP). The Cyberlearning program will also support small Capacity-Building Projects (CAP) and a Cyberlearning Resource Center (CRC).

Cognizant Program Officer(s):

Please note that the following information is current at the time of publishing. See program website for any updates to the points of contact.

  • Janet Kolodner, Program Officer, CISE/IIS and EHR/DRL, 1125, telephone: 703-292-8930, email: jkolodne@nsf.gov

  • Lee L. Zia, Program Officer, EHR/DUE, 835N, telephone: 703-292-5140, email: lzia@nsf.gov

  • Sharon Tettegah, Program Officer, EHR/DRL, 885 S, telephone: 703-292-5092, email: stettega@nsf.gov

  • Mimi McClure, Program Officer, OD/OCI, 1145 S, telephone: 703-292-5197, email: mmcclure@nsf.gov

  • Soo-Siang Lim, Program Officer, SBE/OAD, 905 N, telephone: 703-292-7878, email: slim@nsf.gov

Applicable Catalog of Federal Domestic Assistance (CFDA) Number(s):

  • 47.070 --- Computer and Information Science and Engineering
  • 47.075 --- Social, Behavioral and Economic Sciences
  • 47.076 --- Education and Human Resources
  • 47.080 --- Office of Cyberinfrastructure

Award Information

Anticipated Type of Award: Standard Grant or Continuing Grant or Cooperative Agreement

Estimated Number of Awards: 28 to 49 awards will be made, contingent on the availability of funds.

Anticipated Funding Amount: $36,000,000. Contingent upon availability of funds, up to $36 million will be available in FYs 2012 and 2013 combined to fund proposals submitted in response to this solicitation. The intention is to fund 12 to 18 EXPs, 6 to 12 DIPs, 2 to 4 INDPs, 7 to 14 CAPs, and 1 CRC over that 2-year period.

Eligibility Information

Organization Limit:

The categories of proposers eligible to submit proposals to the National Science Foundation are identified in the Grant Proposal Guide, Chapter I, Section E.

PI Limit:

None Specified

Limit on Number of Proposals per Organization:

None Specified

Limit on Number of Proposals per PI: 3

An individual may participate as PI or Co-PI in no more than three (3) EXP, DIP, and INDP proposals in any fiscal year (October to September): at most, two (2) proposals in the Exploration (EXP) and Design and Implementation (DIP) categories combined, and at most, one (1) proposal in the Integration and Deployment Project category. These eligibility conditions will be strictly enforced in order to treat everyone fairly and consistently. In the event that an individual exceeds this limit, proposals will be accepted based on earliest date and time of proposal submission. Proposals that exceed the limit will be returned without review. No exceptions will be made.

It is expected that PIs will participate in no more than one CAP at a time; PIs should talk to a Program Officer for permission to participate in more than one CAP.

Proposal Preparation and Submission Instructions

A. Proposal Preparation Instructions

  • Letters of Intent: Submission of Letters of Intent is required for Integration and Deployment Projects ONLY. Please see the full text of this solicitation for further information.
  • Preliminary Proposal Submission: Not Applicable
  • Full Proposals:
    • Full Proposals submitted via FastLane: NSF Proposal and Award Policies and Procedures Guide, Part I: Grant Proposal Guide (GPG) Guidelines apply. The complete text of the GPG is available electronically on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg.
    • Full Proposals submitted via Grants.gov: NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov Guidelines apply (Note: The NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide)

B. Budgetary Information

  • Cost Sharing Requirements: Inclusion of voluntary committed cost sharing is prohibited.
  • Indirect Cost (F&A) Limitations: Not Applicable
  • Other Budgetary Limitations: Not Applicable

C. Due Dates

  • Letter of Intent Due Date(s) (required) (due by 5 p.m. proposer's local time):

    May 14, 2012

    for Integration and Deployment Projects (INDPs) only

    May 14, 2013

    for Integration and Deployment Projects (INDPs) only

  • Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):

    December 15, 2011

    Exploration Projects (EXPs)

    January 18, 2012

    Design and Implementation Projects (DIPs)

    February 15, 2012

    Cyberlearning Resource Center (CRC)

    July 16, 2012

    Integration and Deployment Projects (INDPs)

    December 17, 2012

    Exploration Projects (EXPs)

    January 16, 2013

    Design and Implementation Projects (DIPs)

    July 15, 2013

    Integration and Deployment Projects (INDPs)

  • Full Proposal Target Date(s):

    March 16, 2012

    Capacity-Building Projects (CAPs)

    October 15, 2012

    Capacity-Building Projects (CAPs)

    March 15, 2013

    Capacity-Building Projects (CAPs)

Proposal Review Information Criteria

Merit Review Criteria: National Science Board approved criteria. Additional merit review considerations apply. Please see the full text of this solicitation for further information.

Award Administration Information

Award Conditions: Standard NSF award conditions apply.

Reporting Requirements: Standard NSF reporting requirements apply.

    I. Introduction

    Among society's central challenges are amplifying, expanding, and transforming opportunities people have for learning and more effectively drawing in, motivating, and engaging young learners. Engaging actively as a citizen and productively in the workforce requires understanding a broad variety of concepts and possessing the ability to collaborate, learn, solve problems, and make decisions. Whether learning is facilitated in school or out of school, and whether learners are youngsters or adults, to develop such knowledge and capabilities, learners must be motivated to learn, actively engage over the long term in learning activities, and put forth sustained cognitive and social effort.

    Research supported by the Cyberlearning program will therefore explore the opportunities for promoting and assessing learning made possible by new technologies, ways to help learners capitalize on those opportunities, new practices that are made possible by learning technologies, and ways of using technology to promote deep and lasting learning of content, practices, skills, attitudes, and/or dispositions needed for engaged and productive citizenship. Cyberlearning research will marry what is known about the processes by which people learn with advances in information and communications technologies to advance understanding of how to cultivate a citizenry that engages productively in learning both in and out of school and throughout a lifetime; and that possesses the knowledge and capabilities to make informed decisions and judgments about problems ranging from their immediate lives to ethics, privacy, and security concerns to global issues such as war and peace, economics, health and well being, and the environment.

    II. Program Description

    The goals of the Cyberlearning program are:

    • To better understand how people learn with technology and how technology can be used productively to help people learn, through individual use and/or through collaborations mediated by technology;
    • To better use technology for collecting, analyzing, sharing, and managing data to shed light on learning, promoting learning, and designing learning environments; and
    • To design new technologies for these purposes, and advance understanding of how to use those technologies and integrate them into learning environments so that their potential is fulfilled.

    The program will fund projects that explore opportunities for promoting and assessing learning made possible by new technologies, ways to help learners capitalize on those opportunities, new practices that are made possible by learning technologies, and ways of using technology to promote deep and lasting learning of content, practices, skills, attitudes, and/or dispositions needed for engaged and productive citizenship. Every project should therefore seek to advance understanding of how to better promote learning, how to promote better learning, or how learning happens in technology-rich environments (including relationships between people and technology that result in productive learning, access provided via technology to learning resources, such as data and scientific information, and opportunities for promoting learning through better linking of assessment to learning). Each project should also focus, concurrently, on furthering some technological innovation. The technological innovation may be targeted at advancing some innovative technology design or exploring new ways of using technologies for learning or assessment, coherently integrating such technologies with each other, and/or integrating such technologies into targeted learning environments. Especially sought are projects in which technology allows the tailoring of learning experiences to special needs and interests of groups or individuals or allows expanding formal education beyond classroom settings. Targeted learning environments may be formal or informal, traditional or non-traditional, collaborative or individual, or may seek to combine or bridge several different types of learning venues. Proposed research and innovations must be grounded in theories of and literatures on learning and learning with technology.

    Cyberlearning innovations will not effect transformations unless they are substantively integrated into authentic learning environments, taking into account the affordances (opportunities offered) and constraints of the environment, including the capabilities, needs, and goals of agents in the environment, the resources that are available, and the physical space. At the same time, integration of technologies into learning environments may change those environments, prompting a need to understand, predict, and design for those changes. Indeed, it is expected that some technology designs and some ways of integrating technology into learning environments may challenge conventional educational practices.

    Cyberlearning projects must therefore include both research and development components. A significant amount of effort in all projects should go into iterative refinement of the design, implementation, or use of a technological innovation based on systematic analysis of formative data. Except in the case of some exploratory projects, formative analysis of the technological innovation should be carried out in one or more of the real-world contexts for which the technology is targeted. The research component of each project should be carried out in the context of using the technology and should advance understanding of learning with technology or learning in technology-rich environments. Projects should take into account both theoretical and practical issues, focusing on new directions while, at the same time, taking into account a future in which research outcomes inform implementations on broader and larger scales.

    It is important for all projects to be grounded in the latest research on how people learn and to aim to maximize the affordances of chosen technologies. Therefore, every project team, even those for exploratory projects, should include people with expertise in how people learn, the targeted technology, the targeted learners, practices of educating in the targeted learning environment, the targeted content and/or practices, and learning of the targeted content and/or practices.

    Cyberlearning awards will be made in three research categories, each focusing on a different stage of research and development: Exploration (EXP), Design and Implementation (DIP), and Integration and Deployment (INDP). The Cyberlearning program will also support Capacity-Building Projects (CAP) and a Cyberlearning Resource Center (CRC).

    All EXP, DIP, and INDP proposals should include the following components. Additional details about what is expected for each of the types of proposals are described below in the next subsection.

    • Research

    All Cyberlearning projects should advance understanding about how people learn with technology, how to use technology to help people learn, and/or how to use technology to enhance assessment or education practices. Hence, each project should endeavor to answer or shed light on the answers to one or a set of fundamental research questions about learning or promoting learning. Research should aim to advance understanding about why, how, to what extent, or under what circumstances learning phenomena happen.

    Note that research is defined here differently from evaluation. While evaluation efforts typically judge the quality of a particular implementation and the reasons for its outcomes, the research component of Cyberlearning projects must contribute new understandings that endure beyond the implementation being proposed and beyond the particular technology being used. Research questions should be articulated as "why," "to what extent," "how," and/or "under what circumstances" questions. Proposals should make clear the fundamental research question(s) being addressed and the data collection and analysis plans that support that.

    • Technological innovation and plan for its iterative refinement

    Proposed technological innovations must improve significantly on the status quo and have potential to be significantly scaled. The innovation may be a new technology, a new use of a technology, a new way of combining technologies, or an innovation in the way an advanced technology is used to promote or assess learning. Proposed innovations should also have the potential to significantly advance opportunities for learning -- by amplifying, expanding, or transforming opportunities for learning, or by better drawing in, motivating, or engaging learners.

    Proposed technological innovations must be based on or supported by the literature on processes involved in learning -- cognitive, social, cultural, developmental, neural, and/or volitional. Plans for iteratively analyzing and refining innovations should be supported by this body of literature as well. Examples of such literature can be found in the citations listed in the solicitation. Proposals should make clear the works that inform their innovation.

    Iterative refinement of the technological innovation over the years of the proposed project should be aimed at uncovering affordances of the technology for affecting productive learning or assessment, fruitful directions for further research or development, or the conditions under which the innovation could fulfill its transformative potential. It is expected that most projects will take the form of design studies (see, e.g., chapters by Confrey and Barab in Sawyer, 2006; the special issue of Journal of the Learning Sciences on Design-Based Research (Volume 13, No. 1, 2004); and the special issue of Educational Researcher on Design-Based Research (Volume 39, No. 4, 2004)) or design experiments (Brown, 1992) in which an initial innovation (detailed in the proposal) is deployed in a real-world learning environment and formative data are collected to both inform refinement of the innovation and to identify the opportunities it offers (affordances) for promoting or assessing learning and/or guidelines for its effective use. Proposals should make clear how they will focus their iterative refinements, the data they will collect to inform refinements, and the literature that informs that focus.

    • Measurement of the effectiveness of the technological innovation

    One can judge whether an innovation will fulfill its transformative potential only by collecting data that speak to its effectiveness. Thus, data should be collected and analyzed during each iteration of the technology to produce evidence of effectiveness.

    It is expected that Cyberlearning innovations will be aimed towards two types of outcomes: a long-term potential outcome and the shorter-term outcome of the proposed implementation. Proposals should make clear both the potential long-term and shorter-term learning-related outcomes they are targeting and justify the significance of both. Proposals should also make clear which previous results from the literatures on learning they are drawing on in the design and iterative refinement of their innovations. Finally, they should make clear what data they are collecting and how they are analyzing the data in order to (i) judge effectiveness of their implementation (short term), (ii) judge potential effectiveness of their innovation (long-term), and (iii) identify affordances of their innovation, constraints on its use, and/or guidelines for its potential effective use.

    • Interdisciplinary project teams

    It is expected that all EXP, DIP, INDP, and CAP projects funded through Cyberlearning will have interdisciplinary expertise. The project team (including PIs, senior personnel and supporting investigators, post-docs, advisory-board members, and others) should be appropriate for addressing proposed technological and research goals. Each team is expected to carry out the data collection and analysis necessary to evaluate and refine their innovation and answer their research questions. Teams should be formed accordingly.

    Composition of teams will necessarily vary with the targeted outcomes of projects. However, whatever the project category and the proposed outcomes, every team must be multi-disciplinary as described above. More detail can be found below regarding the specifics of these requirements for each class of project.

    Teams should work together to develop their proposal, and it should be clear from the proposal that the team is already an operational entity. Proposers should make clear the challenges they anticipate with respect to assessment and evaluation, robustness, and broader usability, and identify the team members who will help address each of these.

    Proposals should make clear the roles of all team members, why the proposed team is an appropriate one, what expertise each team member brings, and how the team will work together. The proposal should make clear how the integrated contributions of the members of each proposal team are greater than the sum of the contributions of each individual member of the team. Since successful collaborative research depends on thoughtful coordination mechanisms, a Collaboration Plan is required for all proposals. The length of and level of detail provided in the Collaboration Plan should be commensurate with the complexity of the collaboration. Please see Proposal Preparation Instructions Section V.A for additional submission guidelines.

    For DIP and INDP proposals, project proposers should also include on their teams people who can help them plan towards fulfilling the transformational potential of their work, including, as appropriate, those who can help them transition their technology to broad use and those from stakeholder groups who will need to be integrated into the project as innovations move towards scalability, broad dissemination, and continuation over time. As appropriate to the proposed work, project teams should include members who will help in building bridges between communities, helping to make sure the proposed work is appropriate for targeted stakeholders, helping stakeholders and researchers participate in design together, and helping stakeholders understand and come to enthusiastically embrace proposed innovations. It will be appropriate for some projects to include representatives of private-sector or non-profit companies who might be involved with technology transfer.

    PROJECT CATEGORIES

    As stated above, Cyberlearning awards will be made in three research categories, each focusing on a different stage of research and development: Exploration (EXP), Design and Implementation (DIP), and Integration and Deployment (INDP). The Cyberlearning program will also support Capacity-Building Projects (CAP) and a Cyberlearning Resource Center (CRC). The table below summarizes the purposes and prerequisites of each project category.

    Exploration (EXP)
    Due Dates: December 15, 2011 and December 17, 2012
    Budget and Duration: up to $550,000 over 2 to 3 years
    Purpose: to explore the feasibility of a technological innovation and to shed light on the answers to fundamental research questions related to learning with technology
    Prerequisites: a team with a shared vision that takes into account what is known about how people learn, learning in the targeted domain, use of technology for such learning, and challenges to technology use

    Design and Implementation (DIP)
    Due Dates: January 18, 2012 and January 16, 2013
    Budget and Duration: up to $1,350,000 over 4 or 5 years
    Purpose: to ascertain the potential of ideas, develop guidelines for use of an innovation, and answer research questions about learning with technology
    Prerequisites: same as EXP, plus completed work equivalent to one or more Cyberlearning EXP projects

    Integration and Deployment (INDP)
    Due Dates: July 16, 2012 and July 15, 2013
    Budget and Duration: up to $2,500,000 over 4 or 5 years
    Purpose: to integrate or extend the use of one or more technologically sophisticated efforts that have already shown promise and answer a variety of research questions related to learning with technology
    Prerequisites: same as EXP, plus completed work equivalent to one or more Cyberlearning DIP projects

    Capacity Building (CAP)
    Due Dates: March 16, 2012; October 15, 2012; and March 15, 2013
    Budget and Duration: varies
    Purpose: partnership building and community building, including conferences, workshops, and short courses

    Cyberlearning Resource Center (CRC)
    Due Date: February 15, 2012
    Budget and Duration: up to $500,000 in the first year and up to $1,000,000 in subsequent years, to be awarded for up to 5 years
    Purpose: to support Cyberlearning projects and programmatic efforts
    Prerequisites: the lead institution should have cyberlearning expertise and demonstrated capacity to plan, develop, and manage a national center that provides technical support for a diverse portfolio of projects
    Note: to be awarded as a cooperative agreement

    The paragraphs that follow include additional detail about requirements for projects in each category.

    Exploration Projects (EXP projects) explore the proof of concept or feasibility of a novel or innovative technology, or the use of such technology for assessment or to promote learning. EXP projects are for the purpose of trying out new ideas. EXP projects might explore how existing technologies can be used for assessment or to promote learning, or explore the opportunities that a new or existing technology offers for assessment, promotion of learning, or engagement in learning.

    • Prerequisites: The proposed project team should have a shared vision that takes into account from the outset what is known about how people learn, learning in the targeted domain, the use of technology for such learning, and challenges to such use.
    • Project characteristics: EXP projects should take into account what is known about processes involved in learning, characteristics of the targeted learner population, and affordances (opportunities offered) of the technology being investigated.
    • Research: EXP projects should aim to shed light on the answers to foundational questions related to learning, learning with technology, linking learning and assessment, and/or learning in technology-rich environments. Proposals should make clear the research question(s) they propose to shed light on, the extent to which they expect to be able to shed light, and the data collection and analysis plans that support that.
    • Technological innovations, iterative refinements and formative analyses: EXP projects should include two or more cycles of iterative refinement. At a minimum, formative analyses should focus on the usability of the technology, effective ways of using the technology for learning or assessment, and challenges to effective use. Projects focused on technologies for promoting learning should also explore pathways towards engaging learners in the technology's sustained use.
    • Effectiveness: It is not expected that Exploration projects will include summative evaluations or efficacy studies. The proposal must, however, measure effectiveness in some way relevant to the project goals, at a minimum for the purposes of iterative refinement. Proposals should make clear the targeted outcomes and how effectiveness of the innovation will be measured.
    • Project team: Project teams should include, at a minimum, partners with the expertise listed above. In choosing experts on how people learn, PIs should consider the range of cognitive, engagement, social, volitional, and other learning issues that need to be addressed to achieve the transformative potential of their technological innovation and should include in their advisory committee researchers who can help them consider these issues from the beginning. Since iterative refinement of the technological innovation will focus on identifying its affordances and challenges to its effectiveness, the team should include experts on collecting and analyzing data that can inform about usability and effectiveness. Expertise may reside in a single PI and his/her advisory committee or may be distributed across co-PIs and an advisory committee.
    • Duration and funding: EXP awards will be funded over a 2- or 3-year period for up to $550,000 total.

    Design and Implementation Projects (DIP projects) are for ascertaining the potential of ideas, developing guidelines for use of technology to support assessment, learning, and/or engagement, and answering research questions about learning with technology. These projects might advance understanding about how to more broadly or productively use technology that holds promise or how to coherently integrate several technological innovations that hold promise. DIP research and development should be carried out in the everyday environments in which people spend their lives, e.g., schools, homes, museums, parks, and the workplace.

    • Prerequisites: Work equivalent to one or more Cyberlearning EXP projects should already be completed prior to applying for a DIP. The proposal should make clear the results of such previous efforts - (i) the technological innovations that resulted from those projects, (ii) the knowledge gained about affordances of the innovation for assessment or promoting learning or engagement, and challenges to effective use, and (iii) the answers to research questions pertaining to assessment, engagement, learning, learning with technology, linking learning and assessment, and/or learning in technology-rich environments derived from those projects.
    • Project characteristics: Innovations should take into account not only what is known about processes involved in learning but also how to sustain engagement over long periods of time, and proposers should make clear how their innovation addresses the needs and capabilities of targeted learners (or users). Innovations should also be designed taking into account real-world affordances (opportunities offered) and constraints of the targeted learning environment, including the people and resources that might be available. By later years of the project, leadership roles should be assigned to persons employed to implement such innovations in the chosen learning environment.
    • Research: DIP projects should aim to answer foundational questions related to learning, learning with technology, linking learning and assessment, and/or learning in technology-rich environments. This is in addition to uncovering guidelines for design or productive use of the technological innovation and identifying the effects of the innovation on learning.
    • Technological innovations, iterative refinements and formative analyses: DIP projects should include three or more cycles of iterative refinement. Formative analyses of DIP innovations should focus, at a minimum, on the usability of the technology, its effects on learning and/or engagement, and effective ways of integrating use of the technology into activities in the learning environment, including good practices for promoting learning and means of engaging learners in the technology's sustained and effective use. Data collection and analysis should answer questions about the design and efficacy of the proposed innovation as well as questions about learning with technology and the practicality and sustainability of using the technology within the targeted environment.
    • Effectiveness: Effectiveness of the innovation for promoting learning must be measured in all DIP projects, both for purposes of iterative refinement and to judge potential of the innovation. Proposals should make clear near-term and potential long-term targeted outcomes, how effectiveness of the innovation will be measured, and why selected measures and approaches are appropriate. Measurement may be qualitative or quantitative, as appropriate to the targeted outcome goals and maturity of the innovation.
    • Project team: The project team should include the types of partners required for EXP projects. In addition, as appropriate, the team should include representatives of stakeholder groups, to help the team plan towards broader use and deployment, and/or organizations that will help with technology transfer. The team should also include teachers and/or mentors who would normally take on leadership responsibilities in targeted environments.
    • Duration and funding: DIP projects will be funded over a 4- or 5-year period for up to $1,350,000 total.

    Integration and Deployment Projects (INDP projects) should build on one or more technologically sophisticated efforts that have already demonstrated measures of success beyond proof of concept. Research and development should be carried out in the everyday environments in which people spend their lives, and, like other types of projects, INDP projects will answer questions about learning and about design or use of technology for learning. These projects build on research that has already shown the promise of some technology or set of technologies for promoting learning or advancing our understanding of learning. They might advance understanding of how to more broadly or productively use technology that holds promise or how to coherently integrate several technological innovations that hold promise.

    • Prerequisites: INDP projects are the largest Cyberlearning awards, and work equivalent to one or more Cyberlearning DIP projects should already be completed prior to applying for an INDP. The proposal should make clear the results of such previous efforts - (i) the technological innovations that resulted from those projects and measures of their effectiveness, (ii) the knowledge gained about making such innovations successful, and (iii) the answers to research questions pertaining to assessment, engagement, learning, learning with technology, linking learning and assessment, and/or learning in technology-rich environments derived from those projects.
    • Project characteristics: Proposed innovations should take into account the broad range of issues important to successful learning and deployment, including what is known about processes involved in learning, how to engage and sustain engagement among learners, characteristics of the learner population and the targeted learning environments, and the preparation of those who will introduce and take on leadership responsibilities in promoting learning with the technology (e.g., teachers and mentors). These projects may be of several different types:
      • They may advance understanding of how to productively integrate and use a variety of established technologies to better promote learning or promote better learning in a target population and environment.
      • They may provide guidelines on extending the use of some promising technology or technologies over a larger variety of learner populations, advancing understanding of how to better address learning needs of different populations.
      • They may provide guidelines on extending the use of some promising technology or technologies over a larger variety of learning contexts, advancing understanding of learning processes that underlie disciplinary areas or the constraints and affordances (opportunities offered) of different environments for learning.
      • They may combine advances in two or more of these areas.
    • Research: It is expected that INDPs will address a wide variety of foundational research questions related to learning, learning with technology, linking learning and assessment, and/or learning in technology-rich environments. This is in addition to uncovering guidelines for design or productive use of the technological innovation and identifying the effects of the innovation.
    • Technological innovations, iterative refinement and formative analyses: INDP projects should include multiple cycles of iterative refinement, as appropriate to the project. As for DIP projects, formative analyses should answer questions about usability, learning, effective use, and sustained use. In addition, formative analyses should address, as appropriate, issues associated with scale-up, sustainability, workforce development, and/or long-term efficacy. It is expected that technologies will be deployed and evaluated in a large variety of learning environments, that by the end of the project, the technology will be ready for technology transfer, and that the guidelines proposed will be broadly applicable beyond the particular technology being deployed. Throughout the project's duration, facilitation of technology use should be done by those who would naturally be the facilitators in the chosen learning environment (e.g., teachers, scout leaders, parents, peers).
    • Effectiveness: Effectiveness of the innovation for promoting learning must be measured in all INDP projects, both for purposes of iterative refinement and to judge long-term potential of the innovation. Proposals should make clear both the near-term and potential long-term targeted outcomes, how effectiveness of the innovation will be measured, and why that means of measurement is appropriate.
    • Project team: INDP projects are expected to be wide-reaching enough that they require highly interdisciplinary and highly collaborative teams from across organizations. While it may be possible that a competitive INDP can be done within a single organization, it is envisioned that any project at a stage of maturity appropriate for an INDP will require a range of experts drawn from across multiple disciplines and multiple organizations, including the types of collaborators listed above and also collaborators who can advise about scale-up and sustainability issues. The team should also include collaborators who can provide guidance in helping teachers or other facilitators learn to integrate the technology into learning activities. Planning toward scale-up will require, for many projects, partnerships with school systems and other potential stakeholder groups. The project team should include the full range of partners needed to consider issues in all relevant areas. These teams should include representatives of stakeholder groups, and it is expected that PIs will negotiate formal collaboration relationships with school districts, museums, or other organizations that would potentially deploy the technology. It will usually be appropriate for these teams to include representatives of organizations that will aid technology transfer.
    • Duration and funding: INDP awards will be for up to 5 years and up to $2,500,000 total.

    Capacity-Building Projects (CAPs) may be submitted as proposals or as supplements to funded projects. These projects are for the purpose of partnership building, expanding the Cyberlearning community and strengthening the capabilities of those new members, strengthening the ties between the several different Cyberlearning communities, moving new ideas to the fore, and enhancing capabilities and/or vision of the Cyberlearning community. CAP proposals will be considered twice during the year - in October and March. Proposers should contact a program officer before submitting CAP proposals. CAPs may take any of several forms, including the following. Other forms may be proposed.

    • Conferences, workshops, and short courses: Budgets are expected to be consistent with the duration of the event and the number of participants, but the cost will normally not exceed a total of $100,000 for up to two years. Proposed events should be well focused and related to the goals of the program. See the Proposal and Award Policies and Procedures Guide/Grant Proposal Guide Section II. D. for additional information about conference and workshop proposals. All conference, workshop, and short-course proposals should provide for an evaluation of the impact of the event to be conducted at least 12 months after the conference is completed.
    • Partnership-building activities: Funds should support developing and consolidating partnerships that take advantage of the complementary strengths and expertise of investigators and facilitate the preliminary work needed to develop a long-term Cyberlearning project. Partnerships will be funded for up to 1 year. Funding may be used for travel and for materials and supplies needed for joint exploration, must be appropriate to the proposed exploration, and may not exceed $50,000.

    Cyberlearning Resource Center (CRC): One Cyberlearning Resource Center will be funded as a cooperative agreement to support Cyberlearning projects and programmatic efforts. The Cyberlearning Resource Center (CRC) will have responsibility for promoting collaboration among grantees; national dissemination of program findings, technologies, models, materials, and best practices; providing collaborative assessment, evaluation, and technical assistance to Cyberlearning projects; helping to bridge the gap between research and practice; creating a national presence for Cyberlearning; helping the disparate Cyberlearning research and development communities coordinate their efforts in a way that builds capacity; and providing infrastructure (technological and social) for supporting these efforts. The Resource Center will also conduct comprehensive evaluation of program effectiveness. Because projects in the Cyberlearning portfolio cover a broad range of technologies and learner populations, and because Cyberlearning projects have been awarded across NSF programs, the Center should have capacity to support diverse needs of both grantees and the program.

    • Project Characteristics: Proposals for the Cyberlearning Resource Center should strike a balance between support for grantees, program evaluation, capacity building, and dissemination, and between the Center's public and private faces. It is also anticipated that proposals will reflect exemplary use of cyberinfrastructure in the functioning of the Center itself.
    • Lead Institution: It is anticipated that the lead institution for the Cyberlearning Resource Center will be a service-oriented educational organization or institution with demonstrated capacity to plan, develop, and manage a national center that provides technical support for a diverse portfolio of projects across the United States. It should have recognized expertise in the targeted program areas. Finally, the lead institution is expected to be well known for its foundational and cyberlearning expertise.
    • Technical Support: The Resource Center is expected to provide technical support for Cyberlearning projects in different stages of implementation. This may include, but is not limited to, organizing and holding meetings, and identifying resources -- including print and electronic -- and professionals in the field that may augment or enhance projects in meeting their goals. In addition, the Resource Center is expected to support discussions, provide supporting materials to projects, and disseminate ideas and materials from the projects to the field.
    • Evaluation: The Resource Center is expected to carry out evaluation of the Cyberlearning program. While each project will have its own individual evaluation plan, the Resource Center is tasked with developing a plan to collect data across projects and to address overall impact, success in meeting Cyberlearning goals, and practices for moving results from research to practice. Proposals must also include evaluation of the impacts of the CRC by an external evaluator.
    • Dissemination: The primary responsibility for the dissemination of project findings to the field rests with the Resource Center. In addition to submitting a comprehensive report to NSF, the Center should include a plan for dissemination of findings to education professionals.
    • Collaboration: The CRC is intended to be synergistic with existing activities of professional associations and other resources offered through organizations and institutions engaged in cyberlearning research, development, and dissemination.
    • Duration and funding: The CRC will be funded for up to 5 years as a cooperative agreement. A first-year budget of $500,000 is anticipated, with budgets between $500,000 and $1,000,000 in subsequent years, depending on availability of funds and scope of work.

    IMPORTANT PROJECT CHARACTERISTICS

    The Cyberlearning program will fund a portfolio of projects representing exciting, potentially transformative research with potential for high impact and significant advancement of the state of the art. Proposals should demonstrate that their innovation will offer rich learning experiences for a diverse population of learners. It will be appropriate for many proposals to include the development of innovative curricula or educational materials in addition to proposing technological innovations. Interdisciplinary (including collaborators from the arts and humanities), international, and/or academic-industry collaborations that promise to result in major science or engineering advances are welcome. The program seeks proposals from investigators at a broad range of learning institutions, including faculty at minority-serving and predominantly undergraduate institutions.

    A successful research project should be potentially transformative; be grounded in existing learning and education research; seek to answer questions about learning with technology; measure learning gains, taking into account appropriate elements of the learning ecology in designing its innovation, evaluating its innovation, and answering research questions; include team members with all necessary expertise, including expertise for outreach and dissemination; take into account potential scalability and sustainability issues; and use appropriate methodologies to evaluate innovations and measure learning gains. Our expectation is that many grants made by this program will seed long-term research enterprises. The transformative potential of proposed projects may take many years to realize, so proposers should make clear what that potential is and the predicted time horizon.

    COOPERATION WITH THE CYBERLEARNING RESOURCE CENTER

    A Cyberlearning Resource Center (CRC) will provide assessment, technology transfer, dissemination, and evaluation aid to PIs. The CRC will help Cyberlearning PIs collaborate to synthesize findings across the Cyberlearning portfolio, will provide technical assistance to Cyberlearning projects, will promote national awareness of research contributions from the Cyberlearning portfolio, and will build the Cyberlearning community through PI and special-interest meetings. All Cyberlearning projects will be required to share their proposals and findings with the CRC and other Cyberlearning PIs, to participate in annual PI meetings and synthesis activities, and to be responsive to requests for information from other Cyberlearning PIs and from the CRC.

    REFERENCES

    Bell, Phillip, Bruce Lewenstein, Andrew W. Shouse, and Michael A. Feder (Eds.) (2009). Learning Science in Informal Environments: People, Places, and Pursuits. Washington, DC: National Academies Press.

    Bransford, John D., Ann L. Brown, and Rodney R. Cocking (2000). How People Learn: Brain, Mind, Experience, and School. Washington, DC: National Academies Press.

    Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions in classroom settings. The Journal of the Learning Sciences, 2(2), 141-178.

    Dede, Chris, Honan, James P., and Peters, Laurence C. (Eds.) (2005). Scaling Up Success: Lessons Learned from Technology-Based Educational Improvement. New York: Jossey-Bass.

    Donovan, Suzanne and John D. Bransford (2005). How Students Learn: History, Science, and Mathematics in the Classroom. Washington, DC: National Academies Press.

    Duschl, Richard A., Schweingruber, Heidi A., and Shouse, Andrew W. (Eds.) (2007). Taking Science to School: Learning and Teaching Science in Grades K-8. Washington, DC: The National Academies Press.

    Educational Researcher (2004). Special issue on Design-Based Research, 39(4).

    Greeno, J. G., Collins, A. M., and Resnick, L. (1996). Cognition and Learning. In D. Berliner and R. Calfee (Eds.), Handbook of Educational Psychology (pp. 15-46). New York: Macmillan.

    Honey, Margaret A. and Hilton, Margaret (Eds.) (2011). Learning Science Through Computer Games and Simulations. Washington, DC: The National Academies Press.

    Journal of the Learning Sciences (2004). Special issue on Design-Based Research, 13(1).

    NSF Task Force on Cyberlearning (2008). Fostering Learning in the Networked World: The Cyberlearning Opportunity and Challenge. National Science Foundation. https://www.nsf.gov/publications/pub_summ.jsp?ods_key=nsf08204.

    Sawyer, Keith (Ed.) (2006). The Cambridge Handbook of the Learning Sciences. New York: Cambridge University Press.

    III. Award Information

    Contingent upon availability of funds, up to $36 million will be available in FYs 2012 and 2013 combined to fund proposals submitted in response to this solicitation. The intention is to fund 12 to 18 EXPs, 6 to 12 DIPs, 2 to 4 INDPs, 7 to 14 CAPs, and 1 CRC over that 2-year period.

    IV. Eligibility Information

    Organization Limit:

    The categories of proposers eligible to submit proposals to the National Science Foundation are identified in the Grant Proposal Guide, Chapter I, Section E.

    PI Limit:

    None Specified

    Limit on Number of Proposals per Organization:

    None Specified

    Limit on Number of Proposals per PI: 3

    An individual may participate as PI or Co-PI in no more than three (3) EXP, DIP, and INDP proposals in any fiscal year (October to September): at most, two (2) proposals in the Exploration (EXP) and Design and Implementation (DIP) categories combined, and at most, one (1) proposal in the Integration and Deployment Project category. These eligibility conditions will be strictly enforced in order to treat everyone fairly and consistently. In the event that an individual exceeds this limit, proposals will be accepted based on earliest date and time of proposal submission. Proposals that exceed the limit will be returned without review. No exceptions will be made.

    It is expected that PIs will participate in no more than one CAP at a time; PIs should talk to a Program Officer for permission to participate in more than one CAP.

    V. Proposal Preparation And Submission Instructions

    A. Proposal Preparation Instructions

    Letters of Intent (required): A Letter of Intent (LOI) is required for Integration and Deployment Project (INDP) proposals. LOIs are due on or before May 14 of the year the proposal will be submitted. The LOI must contain (1) a proposed title; (2) the names of Principal Investigators and Co-Principal Investigators, including organizational affiliations and departments; (3) a list of the partnering institutions; and (4) a brief synopsis (limited to 250 words) describing the proposed project in sufficient detail to permit selection of reviewers. LOIs will not be used to encourage or discourage the submission of full proposals. They will be used only to help NSF plan for the merit review process, and they are nonbinding. Thus, changes may be made between the submission of the LOI and submission of the full proposal.

    Letter of Intent Preparation Instructions:

    When submitting a Letter of Intent for an INDP through FastLane in response to this Program Solicitation, please note the conditions outlined below:

    • Sponsored Projects Office (SPO) Submission is not required when submitting Letters of Intent
    • Submission of multiple Letters of Intent is not allowed

    Full Proposal Preparation Instructions: Proposers may opt to submit proposals in response to this Program Solicitation via Grants.gov or via the NSF FastLane system.

    • Full proposals submitted via FastLane: Proposals submitted in response to this program solicitation should be prepared and submitted in accordance with the general guidelines contained in the NSF Grant Proposal Guide (GPG). The complete text of the GPG is available electronically on the NSF website at: https://www.nsf.gov/publications/pub_summ.jsp?ods_key=gpg. Paper copies of the GPG may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov. Proposers are reminded to identify this program solicitation number in the program solicitation block on the NSF Cover Sheet For Proposal to the National Science Foundation. Compliance with this requirement is critical to determining the relevant proposal processing guidelines. Failure to submit this information may delay processing.
    • Full proposals submitted via Grants.gov: Proposals submitted in response to this program solicitation via Grants.gov should be prepared and submitted in accordance with the NSF Grants.gov Application Guide: A Guide for the Preparation and Submission of NSF Applications via Grants.gov. The complete text of the NSF Grants.gov Application Guide is available on the Grants.gov website and on the NSF website at: (https://www.nsf.gov/publications/pub_summ.jsp?ods_key=grantsgovguide). To obtain copies of the Application Guide and Application Forms Package, click on the Apply tab on the Grants.gov site, then click on the Apply Step 1: Download a Grant Application Package and Application Instructions link, enter the funding opportunity number (the program solicitation number without the NSF prefix), and press the Download Package button. Paper copies of the Grants.gov Application Guide also may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov.

    In determining which method to utilize in the electronic preparation and submission of the proposal, please note the following:

    Collaborative Proposals. All collaborative proposals submitted as separate submissions from multiple organizations must be submitted via the NSF FastLane system. Chapter II, Section D.4 of the Grant Proposal Guide provides additional information on collaborative proposals.

    Important Proposal Preparation Information: FastLane will check for required sections of the proposal, in accordance with Grant Proposal Guide (GPG) instructions described in Chapter II.C.2. The GPG requires submission of: Project Summary; Project Description; References Cited; Biographical Sketch(es); Budget; Budget Justification; Current and Pending Support; Facilities, Equipment & Other Resources; Data Management Plan; and Postdoctoral Mentoring Plan, if applicable. If a required section is missing, FastLane will not accept the proposal.

    Please note that the proposal preparation instructions provided in this program solicitation may deviate from the GPG instructions. If the solicitation instructions do not require a GPG-required section to be included in the proposal, insert text or upload a document in that section of the proposal that states, "Not Applicable for this Program Solicitation." Doing so will enable FastLane to accept your proposal.

    The following information SUPPLEMENTS (not replaces) the guidelines provided in the NSF Grant Proposal Guide (GPG) and the NSF Grants.gov Application Guide.

    Proposal Titles: Proposal titles must begin with an acronym that indicates the category in which the proposal is being submitted, as follows:

    • Exploration Projects - EXP
    • Design and Implementation Projects - DIP
    • Integration and Deployment Projects - INDP
    • Capacity-Building Projects - CAP
    • Cyberlearning Resource Center - CRC

    The acronym should be followed by a colon, then the title of the proposed project. If you submit a proposal as one in a set of collaborative proposals, the title of your proposal should begin with the acronym that indicates the project category, followed by a colon, then "Collaborative Research" followed by a colon, and then the project title. For example, if you are submitting an Exploration Project, the title of each collaborative proposal would be EXP: Collaborative Research: Project Title.

    Project Summary: The Project Summary consists of an overview, a statement on the intellectual merit of the proposed activity, and a statement on the broader impacts of the proposed activity. Proposals that do not contain a Project Summary with an overview and separate statements on intellectual merit and broader impacts will not be accepted by FastLane or will be returned without review.

    Project Description: Project Descriptions should include the following sections:

    • Vision and Goals. For EXP, DIP, and INDP proposals, describe the following. For CAP and CRC proposals, include the following as appropriate.
      • The national need the investigators are addressing.
      • The investigators' big-picture vision for addressing that need.
      • The theories of learning and technological possibilities the investigators are drawing from in that vision.
      • The proposed technological or socio-technological innovation and its role in the proposed vision.
      • Learning objectives: what learners are expected to learn and how the proposed innovation or its integration into the learning environment is expected to promote that learning.
      • The population of learners, including any needs, abilities, or interests relevant to achieving the learning objectives.
      • How the proposed innovation is matched to the needs, abilities, and interests of the targeted learners.
      • Because deep understanding and facile capabilities emerge only over long periods of time, how the proposed innovation or its integration into the learning environment is expected to sustain engagement.
      • The foundational research questions that arise from the national need and that will be answered in the context of the proposed innovation.
    • Research Plan (for EXP, DIP, INDP, and partnership-building CAP proposals)

    With appropriate references to the literature, support the significance of and need for answering the proposed research questions, and provide a comprehensive research plan for answering them. Distinguish between what is already known and what you will add to the literature. Describe the data to be gathered and the analytic approaches that will be used to analyze them.

    • Technological Innovation Plan and Expected Outcomes (for EXP, DIP, INDP, and partnership-building CAP proposals)

    Describe how the proposed innovations and ways of integrating them into the learning environment take into account the environmental and human factors important to learner success (e.g., the cognitive, developmental, affective, and social needs of learners, the cultural milieu in which the learning technologies will be used, and the capabilities and expectations of human agents in the environment). Make clear the learning domain to be explored (e.g., content, subject matter, topics, skills, practices), and make a research-based case for the promise of the particular technological innovation for promoting targeted learning. All claims about the appropriateness of the proposed innovation should be supported with evidence from the literature.

    In describing the technological innovation, make clear your vision of the experiences of learners and others interacting with the proposed technology. Include up to five diagrams and/or screen shots in the supplementary materials to help readers get a feel for those experiences.

    It is anticipated that technological innovations will be iteratively refined over the course of the project based on analysis of formative data. Describe the plan for iterative refinement, including the data that will be collected and analyzed in support of formative evaluation and the means of assessing learning and engagement. Describe the project outcomes you expect to generate, including products. Discuss how you will collect and analyze data to supply evidence of learning outcomes.

    DIP and INDP projects should include efficacy studies. Describe your vision of the products that will emerge from iterative refinements. Discuss how you will judge the efficacy of the innovation, the data you will collect, and analysis plans.

    • Prior Support.

    Only prior support directly related to the proposed activities should be included.

    Please note that per guidance in the GPG, the Project Description must contain, as a separate section within the narrative, a discussion of the broader impacts of the proposed activities. You can decide where to include this section within the Project Description.

    • Collaboration and Management Plan.

    A Collaboration and Management Plan is required for all Cyberlearning proposals. The length and degree of detail of the Collaboration and Management Plan should be commensurate with the complexity of the proposed project. Collaboration and Management Plans should be included at the end of the Project Description in a section entitled "Collaboration and Management Plan". Up to 3 additional pages are allowed for these plans. The Collaboration and Management Plan should describe:

    • the specific roles of the project participants in all organizations involved;
    • information on how the project will be managed across all the investigators, institutions, and/or disciplines;
    • identification of the specific coordination mechanisms that will enable cross-investigator, cross-institution, and/or cross-discipline scientific integration (e.g., yearly workshops, graduate student exchange, project meetings at conferences, use of videoconferencing resources or social media technologies, software repositories, etc.); and
    • specific references to budget line items that support collaboration and coordination mechanisms.

    Supplementary Documents: The following supplementary documents are required and should be uploaded into the Supplementary Documents Section. No other supplementary materials are allowed.

    1. List of Project Personnel and Partner Institutions (Note - In collaborative proposals, only the lead institution should provide this information): Provide current, accurate information for all personnel and institutions involved in the project. NSF staff will use this information in the merit review process to manage conflicts of interest. The list should include all PIs, Co-PIs, Senior Personnel, paid/unpaid Consultants or Collaborators, Subawardees, Postdocs, and project-level advisory committee members. The list should be numbered and in alphabetical order by last name, and each entry should include (in this order) full name, organization(s), and role in the project, with each item separated by a semicolon. Each person listed should start a new numbered line. For example:

    1. Mary Adams; XYZ University; PI

    2. John Brown; University of PQR; Senior Personnel

    3. Jane Green; XYZ University; Postdoc

    4. Bob Jones; ABC Inc.; Paid Consultant

    5. Mary Smith; Welldone Institution; Unpaid Collaborator

    6. Tim White; ZZZ University; Subawardee

    2. Letters of commitment from participating personnel and institutions (no other letters are allowed)

    3. Diagrams and/or screen shots (for EXP, DIP, and INDP proposals): Up to five (5) diagrams or screen shots that will help readers grasp the envisioned experiences of learners interacting with the proposed technological innovation. Short captions that name the diagram or screen shot and point to its essential elements are allowed; additional textual material is not allowed with the diagrams.

    4. Postdoctoral Researcher Mentoring Plan: Proposals that include funding to support postdoctoral researchers must include a Postdoctoral Researcher Mentoring Plan as a supplementary document. The plan should describe the mentoring activities that will be provided for such individuals. Please be advised that a proposal that requires a Postdoctoral Researcher Mentoring Plan but does not include one cannot be funded. See Chapter II.C.2.j of the GPG for further information about the implementation of this requirement.

    5. Data Management Plan: All proposals must include a Data Management Plan or assert the absence of the need for such a plan. A Data Management Plan specifies the procedures you will use for keeping, storing, and sharing your data; it should include the method for anonymizing the data. FastLane will not permit submission of a proposal that is missing a Data Management Plan. The Data Management Plan will be reviewed as part of the intellectual merit or broader impacts of the proposal, or both, as appropriate. See Chapter II.C.2.j of the GPG for further information about the implementation of this requirement.

    B. Budgetary Information

    Cost Sharing: Inclusion of voluntary committed cost sharing is prohibited.

    Budget Preparation Instructions:

    The budget must include funds to support travel to annual PI meetings.

    C. Due Dates

    • Letter of Intent Due Date(s) (required) (due by 5 p.m. proposer's local time):

      May 14, 2012

      for Integration and Deployment Projects (INDPs) only

      May 14, 2013

      for Integration and Deployment Projects (INDPs) only

    • Full Proposal Deadline(s) (due by 5 p.m. proposer's local time):

      December 15, 2011

      Exploration Projects (EXPs)

      January 18, 2012

      Design and Implementation Projects (DIPs)

      February 15, 2012

      Cyberlearning Resource Center (CRC)

      July 16, 2012

      Integration and Deployment Projects (INDPs)

      December 17, 2012

      Exploration Projects (EXPs)

      January 16, 2013

      Design and Implementation Projects (DIPs)

      July 15, 2013

      Integration and Deployment Projects (INDPs)

    • Full Proposal Target Date(s):

      March 16, 2012

      Capacity-Building Projects (CAPs)

      October 15, 2012

      Capacity-Building Projects (CAPs)

      March 15, 2013

      Capacity-Building Projects (CAPs)

    D. FastLane/Grants.gov Requirements

    • For Proposals Submitted Via FastLane:

      Detailed technical instructions regarding the technical aspects of preparation and submission via FastLane are available at: https://www.fastlane.nsf.gov/a1/newstan.htm. For FastLane user support, call the FastLane Help Desk at 1-800-673-6188 or e-mail fastlane@nsf.gov. The FastLane Help Desk answers general technical questions related to the use of the FastLane system. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this funding opportunity.

      Submission of Electronically Signed Cover Sheets. The Authorized Organizational Representative (AOR) must electronically sign the proposal Cover Sheet to submit the required proposal certifications (see Chapter II, Section C of the Grant Proposal Guide for a listing of the certifications). The AOR must provide the required electronic certifications within five working days following the electronic submission of the proposal. Further instructions regarding this process are available on the FastLane Website at: https://www.fastlane.nsf.gov/fastlane.jsp.

    • For Proposals Submitted Via Grants.gov:

      Before using Grants.gov for the first time, each organization must register to create an institutional profile. Once registered, the applicant's organization can then apply for any federal grant on the Grants.gov website. Comprehensive information about using Grants.gov is available on the Grants.gov Applicant Resources webpage: http://www07.grants.gov/applicants/app_help_reso.jsp. In addition, the NSF Grants.gov Application Guide provides additional technical guidance regarding preparation of proposals via Grants.gov. For Grants.gov user support, contact the Grants.gov Contact Center at 1-800-518-4726 or by email: support@grants.gov. The Grants.gov Contact Center answers general technical questions related to the use of Grants.gov. Specific questions related to this program solicitation should be referred to the NSF program staff contact(s) listed in Section VIII of this solicitation.

      Submitting the Proposal: Once all documents have been completed, the Authorized Organizational Representative (AOR) must submit the application to Grants.gov and verify the desired funding opportunity and agency to which the application is submitted. The AOR must then sign and submit the application to Grants.gov. The completed application will be transferred to the NSF FastLane system for further processing.

    VI. NSF Proposal Processing And Review Procedures

    Proposals received by NSF are assigned to the appropriate NSF program for acknowledgement and, if they meet NSF requirements, for review. All proposals are carefully reviewed by a scientist, engineer, or educator serving as an NSF Program Officer, and usually by three to ten other persons outside NSF either as ad hoc reviewers, panelists, or both, who are experts in the particular fields represented by the proposal. These reviewers are selected by Program Officers charged with oversight of the review process. Proposers are invited to suggest names of persons they believe are especially well qualified to review the proposal and/or persons they would prefer not review the proposal. These suggestions may serve as one source in the reviewer selection process at the Program Officer's discretion. Submission of such names, however, is optional. Care is taken to ensure that reviewers have no conflicts of interest with the proposal. In addition, Program Officers may obtain comments from site visits before recommending final action on proposals. Senior NSF staff further review recommendations for awards. A flowchart that depicts the entire NSF proposal and award process (and associated timeline) is included in the GPG as Exhibit III-1.

    A comprehensive description of the Foundation's merit review process is available on the NSF website at: https://www.nsf.gov/bfa/dias/policy/meritreview/.

    Proposers should also be aware of core strategies that are essential to the fulfillment of NSF's mission, as articulated in Empowering the Nation Through Discovery and Innovation: NSF Strategic Plan for Fiscal Years (FY) 2011-2016. These strategies are integrated in the program planning and implementation process, of which proposal review is one part. NSF's mission is particularly well-implemented through the integration of research and education and broadening participation in NSF programs, projects, and activities.

    One of the core strategies in support of NSF's mission is to foster integration of research and education through the programs, projects and activities it supports at academic and research institutions. These institutions provide abundant opportunities where individuals may concurrently assume responsibilities as researchers, educators, and students, and where all can engage in joint efforts that infuse education with the excitement of discovery and enrich research through the variety of learning perspectives.

    Another core strategy in support of NSF's mission is broadening opportunities and expanding participation of groups, institutions, and geographic regions that are underrepresented in STEM disciplines, which is essential to the health and vitality of science and engineering. NSF is committed to this principle of diversity and deems it central to the programs, projects, and activities it considers and supports.

    A. Merit Review Principles and Criteria

    The National Science Foundation strives to invest in a robust and diverse portfolio of projects that creates new knowledge and enables breakthroughs in understanding across all areas of science and engineering research and education. To identify which projects to support, NSF relies on a merit review process that incorporates consideration of both the technical aspects of a proposed project and its potential to contribute more broadly to advancing NSF's mission "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense; and for other purposes." NSF makes every effort to conduct a fair, competitive, transparent merit review process for the selection of projects.

    1. Merit Review Principles

    These principles are to be given due diligence by PIs and organizations when preparing proposals and managing projects, by reviewers when reading and evaluating proposals, and by NSF program staff when determining whether or not to recommend proposals for funding and while overseeing awards. Given that NSF is the primary federal agency charged with nurturing and supporting excellence in basic research and education, the following three principles apply:

    • All NSF projects should be of the highest quality and have the potential to advance, if not transform, the frontiers of knowledge.
    • NSF projects, in the aggregate, should contribute more broadly to achieving societal goals. These "Broader Impacts" may be accomplished through the research itself, through activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. The project activities may be based on previously established and/or innovative methods and approaches, but in either case must be well justified.
    • Meaningful assessment and evaluation of NSF funded projects should be based on appropriate metrics, keeping in mind the likely correlation between the effect of broader impacts and the resources provided to implement projects. If the size of the activity is limited, evaluation of that activity in isolation is not likely to be meaningful. Thus, assessing the effectiveness of these activities may best be done at a higher, more aggregated, level than the individual project.

    With respect to the third principle, even if assessment of Broader Impacts outcomes for particular projects is done at an aggregated level, PIs are expected to be accountable for carrying out the activities described in the funded project. Thus, individual projects should include clearly stated goals, specific descriptions of the activities that the PI intends to do, and a plan in place to document the outputs of those activities.

    These three merit review principles provide the basis for the merit review criteria, as well as a context within which the users of the criteria can better understand their intent.

    2. Merit Review Criteria

    All NSF proposals are evaluated through use of the two National Science Board approved merit review criteria. In some instances, however, NSF will employ additional criteria as required to highlight the specific objectives of certain programs and activities.

    The two merit review criteria are listed below. Both criteria are to be given full consideration during the review and decision-making processes; each criterion is necessary but neither, by itself, is sufficient. Therefore, proposers must fully address both criteria. (GPG Chapter II.C.2.d.i. contains additional information for use by proposers in development of the Project Description section of the proposal.) Reviewers are strongly encouraged to review the criteria, including GPG Chapter II.C.2.d.i., prior to the review of a proposal.

    When evaluating NSF proposals, reviewers will be asked to consider what the proposers want to do, why they want to do it, how they plan to do it, how they will know if they succeed, and what benefits could accrue if the project is successful. These issues apply both to the technical aspects of the proposal and the way in which the project may make broader contributions. To that end, reviewers will be asked to evaluate all proposals against two criteria:

    • Intellectual Merit: The Intellectual Merit criterion encompasses the potential to advance knowledge; and
    • Broader Impacts: The Broader Impacts criterion encompasses the potential to benefit society and contribute to the achievement of specific, desired societal outcomes.

    The following elements should be considered in the review for both criteria:

    1. What is the potential for the proposed activity to
      1. Advance knowledge and understanding within its own field or across different fields (Intellectual Merit); and
      2. Benefit society or advance desired societal outcomes (Broader Impacts)?
    2. To what extent do the proposed activities suggest and explore creative, original, or potentially transformative concepts?
    3. Is the plan for carrying out the proposed activities well-reasoned, well-organized, and based on a sound rationale? Does the plan incorporate a mechanism to assess success?
    4. How well qualified is the individual, team, or organization to conduct the proposed activities?
    5. Are there adequate resources available to the PI (either at the home organization or through collaborations) to carry out the proposed activities?

    Broader impacts may be accomplished through the research itself, through the activities that are directly related to specific research projects, or through activities that are supported by, but are complementary to, the project. NSF values the advancement of scientific knowledge and activities that contribute to achievement of societally relevant outcomes. Such outcomes include, but are not limited to: full participation of women, persons with disabilities, and underrepresented minorities in science, technology, engineering, and mathematics (STEM); improved STEM education and educator development at any level; increased public scientific literacy and public engagement with science and technology; improved well-being of individuals in society; development of a diverse, globally competitive STEM workforce; increased partnerships between academia, industry, and others; improved national security; increased economic competitiveness of the United States; and enhanced infrastructure for research and education.

    Proposers are reminded that reviewers will also be asked to review the Data Management Plan and the Postdoctoral Researcher Mentoring Plan, as appropriate.

    Additional Solicitation Specific Review Criteria

    All EXP, DIP, INDP, and CAP projects will be judged according to the following additional criteria:

    • The proposed technological innovation, the research questions to be addressed, and the plans for research and development will all be evaluated for intellectual merit and potential broader impacts.
    • The transformative potential of the proposed project.
    • The degree to which the Collaboration and Management Plan adequately demonstrates that participating investigators and advisors will work synergistically to accomplish the program objectives.

    For Design and Implementation Projects (DIP) and Integration and Deployment Projects (INDP), reviewers will be asked to comment on the extent to which the project scope justifies the level of investment requested.

    CRC proposals will be judged according to the criteria laid out in the description of the CRC requirements in Section II. Program Description.

    B. Review and Selection Process

    Proposals submitted in response to this program solicitation will be reviewed by Ad hoc Review and/or Panel Review.

    Reviewers will be asked to formulate a recommendation to either support or decline each proposal. The Program Officer assigned to manage the proposal's review will consider the advice of reviewers and will formulate a recommendation.

    After scientific, technical and programmatic review and consideration of appropriate factors, the NSF Program Officer recommends to the cognizant Division Director whether the proposal should be declined or recommended for award. NSF is striving to be able to tell applicants whether their proposals have been declined or recommended for funding within six months. The time interval begins on the deadline or target date, or receipt date, whichever is later. The interval ends when the Division Director accepts the Program Officer's recommendation.

    A summary rating and accompanying narrative will be completed and submitted by each reviewer. In all cases, reviews are treated as confidential documents. Verbatim copies of reviews, excluding the names of the reviewers, are sent to the Principal Investigator/Project Director by the Program Officer. In addition, the proposer will receive an explanation of the decision to award or decline funding.

    In all cases, after programmatic approval has been obtained, the proposals recommended for funding will be forwarded to the Division of Grants and Agreements for review of business, financial, and policy implications and the processing and issuance of a grant or other agreement. Proposers are cautioned that only a Grants and Agreements Officer may make commitments, obligations or awards on behalf of NSF or authorize the expenditure of funds. No commitment on the part of NSF should be inferred from technical or budgetary discussions with a NSF Program Officer. A Principal Investigator or organization that makes financial or personnel commitments in the absence of a grant or cooperative agreement signed by the NSF Grants and Agreements Officer does so at their own risk.

    VII. Award Administration Information

    A. Notification of the Award

    Notification of the award is made to the submitting organization by a Grants Officer in the Division of Grants and Agreements. Organizations whose proposals are declined will be advised as promptly as possible by the cognizant NSF Program administering the program. Verbatim copies of reviews, not including the identity of the reviewer, will be provided automatically to the Principal Investigator. (See Section VI.B. for additional information on the review process.)

    B. Award Conditions

    An NSF award consists of: (1) the award letter, which includes any special provisions applicable to the award and any numbered amendments thereto; (2) the budget, which indicates the amounts, by categories of expense, on which NSF has based its support (or otherwise communicates any specific approvals or disapprovals of proposed expenditures); (3) the proposal referenced in the award letter; (4) the applicable award conditions, such as Grant General Conditions (GC-1)* or Research Terms and Conditions*; and (5) any announcement or other NSF issuance that may be incorporated by reference in the award letter. Cooperative agreements also are administered in accordance with NSF Cooperative Agreement Financial and Administrative Terms and Conditions (CA-FATC) and the applicable Programmatic Terms and Conditions. NSF awards are electronically signed by an NSF Grants and Agreements Officer and transmitted electronically to the organization via e-mail.

    *These documents may be accessed electronically on NSF's Website at https://www.nsf.gov/awards/managing/award_conditions.jsp?org=NSF. Paper copies may be obtained from the NSF Publications Clearinghouse, telephone (703) 292-7827 or by e-mail from nsfpubs@nsf.gov.

    More comprehensive information on NSF Award Conditions and other important information on the administration of NSF awards is contained in the NSF Award & Administration Guide (AAG) Chapter II, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=aag.

    C. Reporting Requirements

    For all multi-year grants (including both standard and continuing grants), the Principal Investigator must submit an annual project report to the cognizant Program Officer at least 90 days prior to the end of the current budget period. (Some programs or awards require submission of more frequent project reports). Within 90 days following expiration of a grant, the PI also is required to submit a final project report, and a project outcomes report for the general public.

    Failure to provide the required annual or final project reports, or the project outcomes report, will delay NSF review and processing of any future funding increments as well as any pending proposals for all identified PIs and co-PIs on a given award. PIs should examine the formats of the required reports in advance to assure availability of required data.

    PIs are required to use NSF's electronic project-reporting system, available through Research.gov, for preparation and submission of annual and final project reports. Such reports provide information on accomplishments, project participants (individual and organizational), publications, and other specific products and impacts of the project. Submission of the report via Research.gov constitutes certification by the PI that the contents of the report are accurate and complete. The project outcomes report also must be prepared and submitted using Research.gov. This report serves as a brief summary, prepared specifically for the public, of the nature and outcomes of the project. This report will be posted on the NSF website exactly as it is submitted by the PI.

    More comprehensive information on NSF Reporting Requirements and other important information on the administration of NSF awards is contained in the NSF Award & Administration Guide (AAG) Chapter II, available electronically on the NSF Website at https://www.nsf.gov/publications/pub_summ.jsp?ods_key=aag.

    VIII. Agency Contacts

    Please note that the program contact information is current at the time of publishing. See program website for any updates to the points of contact.

    General inquiries regarding this program should be made to:

    • Janet Kolodner, Program Officer, CISE/IIS and EHR/DRL, 1125, telephone: 703-292-8930, email: jkolodne@nsf.gov

    • Lee L. Zia, Program Officer, EHR/DUE, 835N, telephone: 703-292-5140, email: lzia@nsf.gov

    • Sharon Tettegah, Program Officer, EHR/DRL, 885 S, telephone: 703-292-5092, email: stettega@nsf.gov

    • Mimi McClure, Program Officer, OD/OCI, 1145 S, telephone: 703-292-5197, email: mmcclure@nsf.gov

    • Soo-Siang Lim, Program Officer, SBE/OAD, 905 N, telephone: 703-292-7878, email: slim@nsf.gov

    For questions related to the use of FastLane, contact:

    • FastLane Help Desk: telephone: 1-800-673-6188; e-mail: fastlane@nsf.gov.

    For questions relating to Grants.gov, contact:

    • Grants.gov Contact Center: If the Authorized Organizational Representative (AOR) has not received a confirmation message from Grants.gov within 48 hours of submission of the application, please contact via telephone: 1-800-518-4726; e-mail: support@grants.gov.

    IX. Other Information

    The NSF Website provides the most comprehensive source of information on NSF Directorates (including contact information), programs and funding opportunities. Use of this Website by potential proposers is strongly encouraged. In addition, National Science Foundation Update is a free e-mail subscription service designed to keep potential proposers and other interested parties apprised of new NSF funding opportunities and publications, important changes in proposal and award policies and procedures, and upcoming NSF Regional Grants Conferences. Subscribers are informed through e-mail when new publications are issued that match their identified interests. Users can subscribe to this service by clicking the "Get NSF Updates by Email" link on the NSF web site.

    Grants.gov provides an additional electronic capability to search for Federal government-wide grant opportunities. NSF funding opportunities may be accessed via this new mechanism. Further information on Grants.gov may be obtained at http://www.grants.gov.

    About The National Science Foundation

    The National Science Foundation (NSF) is an independent Federal agency created by the National Science Foundation Act of 1950, as amended (42 USC 1861-75). The Act states the purpose of the NSF is "to promote the progress of science; [and] to advance the national health, prosperity, and welfare by supporting research and education in all fields of science and engineering."

    NSF funds research and education in most fields of science and engineering. It does this through grants and cooperative agreements to more than 2,000 colleges, universities, K-12 school systems, businesses, informal science organizations and other research organizations throughout the US. The Foundation accounts for about one-fourth of Federal support to academic institutions for basic research.

    NSF receives approximately 55,000 proposals each year for research, education and training projects, of which approximately 11,000 are funded. In addition, the Foundation receives several thousand applications for graduate and postdoctoral fellowships. The agency operates no laboratories itself but does support National Research Centers, user facilities, certain oceanographic vessels and Arctic and Antarctic research stations. The Foundation also supports cooperative research between universities and industry, US participation in international scientific and engineering efforts, and educational activities at every academic level.

    Facilitation Awards for Scientists and Engineers with Disabilities provide funding for special assistance or equipment to enable persons with disabilities to work on NSF-supported projects. See Grant Proposal Guide Chapter II, Section D.2 for instructions regarding preparation of these types of proposals.

    The National Science Foundation has Telephonic Device for the Deaf (TDD) and Federal Information Relay Service (FIRS) capabilities that enable individuals with hearing impairments to communicate with the Foundation about NSF programs, employment or general information. TDD may be accessed at (703) 292-5090 and (800) 281-8749, FIRS at (800) 877-8339.

    The National Science Foundation Information Center may be reached at (703) 292-5111.

    The National Science Foundation promotes and advances scientific progress in the United States by competitively awarding grants and cooperative agreements for research and education in the sciences, mathematics, and engineering.

    To get the latest information about program deadlines, to download copies of NSF publications, and to access abstracts of awards, visit the NSF Website at https://www.nsf.gov

    • Location:

    4201 Wilson Blvd. Arlington, VA 22230

    • For General Information
      (NSF Information Center):

    (703) 292-5111



     
