Award Abstract # 1939115
Workshop: Increasing Reviewer Risk Tolerance Through Awareness

NSF Org: CMMI
Division of Civil, Mechanical, and Manufacturing Innovation
Recipient: GPRA STRATEGIC MANAGEMENT, INC.
Initial Amendment Date: August 19, 2019
Latest Amendment Date: April 2, 2020
Award Number: 1939115
Award Instrument: Standard Grant
Program Manager: Kathryn Jablokow
CMMI
 Division of Civil, Mechanical, and Manufacturing Innovation
ENG
 Directorate for Engineering
Start Date: September 1, 2019
End Date: August 31, 2020 (Estimated)
Total Intended Award Amount: $99,840.00
Total Awarded Amount to Date: $119,591.00
Funds Obligated to Date: FY 2019 = $99,840.00
FY 2020 = $19,751.00
History of Investigator:
  • Carmen Rivera (Principal Investigator)
    crivera@gprasm.com
Recipient Sponsored Research Office: GPRA Strategic Management, Inc.
3929 SHAFTSBURY CT
WHITE PLAINS
MD  US  20695-4428
(240)518-8667
Sponsor Congressional District: 05
Primary Place of Performance: GPRA Strategic Management, Inc.
White Plains
MD  US  20695-4428
Primary Place of Performance Congressional District: 05
Unique Entity Identifier (UEI): NJ2KHL3H5H81
Parent UEI: NJ2KHL3H5H81
NSF Program(s): EDSE-Engineering Design and Sy
Primary Program Source: 01001920DB NSF RESEARCH & RELATED ACTIVIT
01002021DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 7556, 9102
Program Element Code(s): 072Y00
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.041

ABSTRACT

This award assists the Civil, Mechanical, and Manufacturing Innovation Division at NSF in achieving NSF's mission "to promote the progress of science" by improving the capacity of the peer review process to recognize and appropriately evaluate proposals that advance discovery. The goals of this project are to evaluate methods based on rigorous social science research that (1) ameliorate bias and mitigate its effects in evaluating non-traditional scholars and scholarship by addressing cultural norms, group dynamics, and the efficacy of awareness training; and (2) contribute to a more robust agenda for rewarding innovation in engineering research.

The project consists of two phases. Phase One (Key Concepts Phase) will address how cognitive bias, groupthink, cultural norms, panel dynamics and leadership, the dynamics of conflict and conflict resolution, and the psychological and peer-evaluation pressures of academia impact merit review processes. Phase Two (Experiential Phase) will apply the Phase One learning outcomes to a mock panel review experience. The study will identify leading indicators of impact to inform future development of the peer review process.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The major goal of this project was to design and implement a novel professional development program for potential NSF reviewers of three programs within the NSF Engineering Directorate. The Increasing Reviewer Risk Tolerance Through Awareness (IRRTTA) initiative aimed to:

  1. ameliorate bias and its role in marginalizing principal investigators from underrepresented backgrounds, less resourced institutions, or whose scholarship incorporates non-traditional approaches to STEM;
  2. identify leading indicators of impact to inform the further development of NSF peer reviewer professional development, ultimately serving as a model for change; and
  3. contribute to a more robust agenda for rewarding innovation in Engineering.

To achieve these objectives, IRRTTA provided training to both experienced and inexperienced panel reviewers, aimed at increasing their skill, agency, and sense of responsibility for productive, interactive, and inclusive panel discussions.

IRRTTA consisted of an inquiry and design phase that led to the implementation of three training experiences. Participants who completed these trainings are known as Panel Fellows. The trainings are grounded in a comprehensive, research-based description of the dynamics of merit-review discussions and their vulnerabilities, combined with practical advice and strategies for avoiding, navigating, or mitigating these pitfalls. This design is solidly based in social science research on group dynamics, cultural norms, cognitive biases, and the efficacy of awareness training.

This project was motivated by the increasing concern that the funding of transformative, or "high risk/high reward," proposals can be limited by the proposal review process itself. While the NSF grant review process provides a gold standard for merit-based peer review and funding decisions, both research and experience demonstrate that panel review discussions can be vulnerable to unresolved conflict, conscious and unconscious bias, and predictable small group discussion dysfunctions. Fortunately, social science research and best-practice discussion facilitation tools offer an already developed and effective protection against these vulnerabilities. IRRTTA harnessed and translated these resources to craft a powerful training tool that can be scaled to larger audiences. As a result, there is now an initial cohort of 26 engineers, or IRRTTA Panel Fellows, who are empowered to effectively review proposals that are submitted to NSF's CMMI Division. This core group will contribute positively through their own participation as panelists and promulgate the knowledge and perspective they gained.

To accomplish this training, IRRTTA synthesized social science research into specific behavioral and group dynamics patterns relevant to panel reviews. This curriculum highlights how the definition of a "bad panelist" is highly dependent on how individuals function in groups and the ability of the group to mitigate inappropriate behavior. This responsibility is oftentimes jointly held by the Program Officer and panelists, making the peer-review process highly collaborative and thus highly dependent upon the fundamental principles of deliberative democracy. IRRTTA, as a professional development tool for NSF panelists, offers a means of exercising a more democratic review process aimed at yielding investments in more transformative research, particularly in the engineering disciplines. Finally, IRRTTA also demonstrates the necessity of balancing competition and collaboration as conflict styles that are important in fully recognizing and funding transformative research.

While NSF seeks to fund more high risk/high reward proposals, it is oftentimes the case that "risk" and "reward" are too loosely defined to create the very opportunities needed for transformative research to be conducted. Through the course of this training, IRRTTA fostered a new understanding of the meaning of "high risk/high reward" in an engineering context and provided new strategies for recognizing and promoting potentially transformative proposals. More specifically, Panel Fellows developed strategies for calibrating shared metrics for high risk/high reward proposals during panel discussions. For example, they considered how the timeline of a given project impacts the concept of both risk and reward and articulated a need for more discussion of reasonable expectations about timelines (i.e., projects whose impact registers in less than 5 years compared with those on a 5-10 year horizon or whose scope is greater than 10 years). Until now, discussion of project timeline has only rarely been associated with risk or reward. Panel Fellows also identified the need for more specific examples of high risk/high reward lines of inquiry to prime panel review discussions.


Last Modified: 11/30/2020
Modified by: Carmen A Rivera
