NSB/MR-97-05

National Science Board and
National Science Foundation Staff
Task Force on Merit Review
 

Final Recommendations

March 1997


NATIONAL SCIENCE BOARD

 
  • DR. F. ALBERT COTTON, Distinguished Professor, Department of Chemistry, Texas A&M University
  • DR. CHARLES E. HESS,* Director of International Programs, University of California-Davis
  • DR. JOHN E. HOPCROFT,* Joseph Silbert Dean of Engineering, Cornell University
  • DR. SHIRLEY M. MALCOM,* Head, Directorate for Education and Human Resources Programs, American Association for the Advancement of Science
  • DR. JAMES L. POWELL,* President & Director, Los Angeles Museum of Natural History
  • DR. FRANK H.T. RHODES, President Emeritus, Cornell University
  • DR. IAN M. ROSS, President-Emeritus, AT&T Bell Laboratories
  • DR. RICHARD N. ZARE* (Chairman), Professor, Department of Chemistry, Stanford University
  • DR. SANFORD D. GREENBERG, Chairman & CEO of TEI Industries, Inc.
  • DR. EVE L. MENGER, Director, Characterization Science & Services, Corning Incorporated
  • DR. CLAUDIA I. MITCHELL-KERNAN, Vice Chancellor, Academic Affairs and Dean, Graduate Division, University of California
  • DR. DIANA S. NATALICIO* (Vice Chairman), President, The University of Texas at El Paso
  • DR. ROBERT M. SOLOW, Institute Professor Emeritus, Massachusetts Institute of Technology
  • DR. WARREN M. WASHINGTON, Senior Scientist and Head, Climate Change Research Section, National Center for Atmospheric Research
  • DR. JOHN A. WHITE, JR., Regents' Professor and Dean of Engineering, Georgia Institute of Technology
  • DR. JOHN A. ARMSTRONG,** IBM Vice President for Science & Technology (Retired)
  • DR. MARY K. GAILLARD,** Professor of Physics, University of California, Berkeley
  • DR. M.R.C. GREENWOOD,** Chancellor, University of California, Santa Cruz
  • DR. STANLEY V. JASKOLSKI,** Vice President, Eaton Corporation
  • DR. EAMON M. KELLY,** President, Tulane University
  • DR. JANE LUBCHENCO,** Wayne and Gladys Valley Professor of Marine Biology and Distinguished Professor of Zoology, Oregon State University
  • DR. VERA RUBIN,** Staff Member (Astronomy), Department of Terrestrial Magnetism, Carnegie Institution of Washington
  • DR. BOB H. SUZUKI,** President, California State Polytechnic University
  • DR. RICHARD TAPIA,** Professor, Department of Computational & Applied Mathematics, Rice University
  • DR. NEAL F. LANE* (Chairman, Executive Committee), Director, NSF
  • DR. MARTA CEHELSKY, Executive Officer

*Member, Executive Committee
**NSB nominee pending U.S. Senate confirmation

Members of the Task Force

National Science Board Members

Dr. Warren M. Washington, Chair

Dr. Mary K. Gaillard

Dr. Shirley M. Malcom

Dr. Eamon M. Kelly

National Science Foundation Staff

Dr. Mary E. Clutter

Dr. John B. Hunt

Mr. Paul J. Herer, Executive Secretary


NSB/MR-97-05
March 18, 1997

 

NSB-NSF Staff Merit Review Task Force
Final Recommendations


I. Introduction


Merit review is the cornerstone of the NSF's work. Through the use of merit review, NSF seeks to maintain the high standards of excellence and accountability for which it is known around the world. NSF's current criteria were adopted by the National Science Board in 1981.

At the November 1996 meeting of the National Science Board, the NSB-NSF Staff Merit Review Task Force recommended that the current merit review criteria be simplified and that the language be harmonized with the NSF strategic plan. These recommendations are contained in the report: NSB/MR-96-15, NSB-NSF Staff Merit Review Task Force Discussion Report, November 20, 1996 (Appendix I).

The Board received this report and asked the NSF Director to share the proposed revisions to the merit review criteria with the science and engineering community in order to solicit its input. To encourage the broadest possible comment and discussion, NSF solicited comments through press coverage and through direct contacts between NSF staff and universities and professional associations. The proposed criteria were also posted on the World Wide Web, with a response form to facilitate suggestions and reactions.

Via the feedback mechanisms provided on these Web pages, NSF received over 300 responses, most from tenured faculty who had experience with the NSF merit review process. A number of comments also arrived in the form of written letters. A majority of respondents who expressed an overall positive or negative opinion favored the change, although many offered suggestions for further improvement or clarification. Overall, the community responses, although not a representative sample of the community, were informative and useful in helping the Task Force to draft improved criteria.

Final Recommendations

The Task Force recommends that the following two criteria be adopted in place of the four criteria that are currently used.

  1. What is the intellectual merit of the proposed activity?
    The following are suggested questions to consider in assessing how well the proposal meets this criterion: How important is the proposed activity to advancing knowledge and understanding within its own field and across different fields? How well qualified is the proposer (individual or team) to conduct the project? (If appropriate, please comment on the quality of prior work.) To what extent does the proposed activity suggest and explore creative and original concepts? How well conceived and organized is the proposed activity? Is there sufficient access to resources?
  2. What are the broader impacts of the proposed activity?
    The following are suggested questions to consider in assessing how well the proposal meets this criterion: How well does the activity advance discovery and understanding while promoting teaching, training, and learning? How well does the proposed activity broaden the participation of underrepresented groups (e.g., gender, ethnicity, geographic, etc.)? To what extent will it enhance the infrastructure for research and education, such as facilities, instrumentation, networks, and partnerships? Will the results be disseminated broadly to enhance scientific and technological understanding? What may be the benefits of the proposed activity to society?

The Task Force further recommends that a cover sheet presenting the context for using the criteria be attached to the proposal review form. The suggested language for this cover sheet is as follows:
Important! Please Read Before Beginning Your Review!

In evaluating this proposal, you are requested to provide detailed comments for each of the two NSF Merit Review Criteria described below. Following each criterion is a set of suggested questions to consider in assessing how well the proposal meets the criterion. Please respond with substantive comments addressing the proposal's strengths and weaknesses. In addition to the suggested questions, you may consider other relevant questions that address the NSF criteria (but you should make this explicit in your review). Further, you are asked to address only those questions that you consider relevant to the proposal and feel qualified to make judgments on.

When assigning your summary rating, remember that the two criteria need not be weighted equally. Emphasis should depend upon either (1) additional guidance you have received from NSF or (2) your own judgment of the relative importance of the criteria to the proposed work. Finally, you are requested to write a summary statement that explains the rating that you assigned to the proposal. This statement should address the relative importance of the criteria and the extent to which the proposal actually meets both criteria.


Regarding the "ratings" issue, which was highlighted in the Discussion Report, the Task Force recommends that the NSF "generic" proposal review form provide for the following:

  • separate comments for each criterion
  • a single composite rating
  • a summary recommendation (narrative) that addresses both criteria

Note: The Task Force's recommendations are exemplified in the attached sample NSF Proposal Review Form (Appendix II).

In implementing the new criteria, the Task Force believes that NSF should address such issues as: (1) designing proposal review forms (both paper and electronic) that are clear and easy to use, (2) training NSF staff, and (3) revising NSF's proposal preparation guidelines. The Task Force recommends that NSF proceed without delay to full implementation of the proposed changes.

Analysis of Public Comment and Rationale for Task Force Recommendations

A brief analysis of public responses to the Task Force recommendations was prepared by the NSF Office of Policy Support (OPS). This report (Appendix III), which proved very useful to the Task Force, attempts to characterize the individuals who responded and summarize their views about the changes in the proposed criteria.

The Task Force members also read each of the individual responses received by NSF and prepared an analysis of the issues that were raised. The Task Force then met on February 19, 1997, to discuss and resolve these issues and prepare its final recommendations. Its analysis of these issues is presented below.

#1 A central issue is the "weighting or threshold" issue, which was raised by perhaps a third of the respondents. Many respondents expressed concern that adopting the new criteria will lead to a decline in NSF's standards of excellence; i.e., "excellent research with OK relevance" will be equated with "OK research with excellent relevance." Others stated that, for research proposals, Criterion #1 is much more important than Criterion #2 and should be weighted accordingly (some suggested 90/10). Still others criticized Criterion #2 as irrelevant, ambiguous, or poorly worded.

Several options for responding to this issue were identified and discussed:

    a) In the introductory wording that "presents" the criteria to the reviewer, emphatically state that the criteria need not be weighted equally; rather, the relative weighting depends upon the nature of the proposed activity.

    b) For most, if not all, proposals, NSF should present the first criterion as a "threshold" criterion. In other words, NSF will not fund anything that does not pass muster on Criterion #1. Criterion #2 should be used to select among those proposals that exceed the Criterion #1 threshold.

    c) Differentiate the criteria for basic research, applied research, and education proposals. This can be done with language introducing and explaining the criteria, as is done for the current four criteria. The extreme implementation of this approach would be to have different sets of criteria for these different categories of activities.

    d) Resolve the imbalance between the two criteria by having Criterion #2 address BOTH the intellectual impact and the "broader" impacts, which it currently does not do. This would be accomplished by adding something like the following question to Criterion #2: "How important is the proposed activity to advancing knowledge and learning within its own field and across different fields?"

Recommendation: The Task Force believes that option (a) is the best one because it does not polarize the research and education communities and can be applied very flexibly. For example, when the reviewer assigns an overall rating, the two criteria need not be weighted equally but should depend upon either (1) additional guidance received from NSF and/or (2) the reviewer's judgment of the relative importance of the criteria to the proposed work.

#2 Another issue raised by the community is the "presentation" issue; i.e., how is NSF going to get reviewers to pay attention to the new criteria, which will be printed on the back of a form?

Recommendation: The Task Force recommends that NSF prepare a sample reviewer form, both for regular mailing and for e-mail reviews. A half-page tear-off cover sheet can be attached to the review form to present the context for using the criteria. For example, at the top of the page, it could state: PLEASE READ THIS BEFORE BEGINNING YOUR REVIEW!

#3 A third issue, raised by perhaps 20% of the respondents, can be summed up as follows: in reviewing the quality of the proposed research, NSF should give greater prominence to the competence of the research performer. Many individuals who made this recommendation suggested that NSF have a separate criterion for this.

Recommendation: The Task Force believes this could be handled with some editing rather than by creating a third criterion. It also believes that, in giving prominence to proposer competence, it is important not to create a bias against new investigators entering the field. Hence, the relevant question in Criterion #1 should be revised to read: "How well qualified is the proposer (individual or team) to conduct the project?" This question should also be moved up from third to second position.

#4 A substantial fraction of the respondents indicated that the question under Criterion #2 dealing with "diversity" was ambiguous.

Recommendation: This can be addressed with some revised wording.

From: "How well does the proposed activity broaden the diversity of participants?"

To: "How well does the proposed activity broaden the participation of underrepresented groups (e.g., gender, ethnicity, geographic, etc.)?"

The Task Force also felt that some language should be included to address the need "to avoid undue concentration of resources." However, it is difficult to see how reviewers could respond to this issue.

#5 A number of respondents pointed out that the criteria need to encourage greater innovation, risk, and creativity in NSF-supported activities.

Recommendation: The Task Force believes that creativity and originality are among the most important characteristics of an NSF-supported activity; hence it recommends the following revised wording.

From: "Does the proposed activity suggest and explore new lines of inquiry?"

To: "To what extent does the proposed activity suggest and explore creative and original concepts?"

#6 Some respondents stated that, for much of basic research, it is not possible to make a meaningful statement about the potential usefulness of the research.

Recommendation: The Task Force believes that respondents may be interpreting this question too narrowly. While it may not be possible to predict specific potential applications for one's research, one should be able to discuss the value or applicability of the line of inquiry or research area. The following revised wording was recommended:

From: "And, what is the potential impact on meeting societal needs?"

To: "And, what may be the benefits of the proposed activity to society?"

#7 A number of respondents suggested that a question under Criterion #1 should have reviewers take into account the benefits of the proposed research activity in relation to its risks and costs. This might involve asking a question such as: "Is the project well designed, with a reasonable budget and achievable timetables?"

Recommendation: The Task Force believes that reviewers would not be in a very good position to assess budgets and timetables. However, it recommended the following revision in order to clarify the issue:

From: "Is the plan for organizing and managing the project credible and well-conceived?"

To: "How well-conceived and organized is the proposed activity?"

#8 The current wording of the questions under each criterion tends to encourage "yes-no" responses instead of explanations from reviewers.

Recommendation: The Task Force recommends that questions under each criterion be reworded, using such phrases as: "To what degree does ---?" or "What is the potential ---?" or "How well does ---?"

#9 There were a number of suggestions for placing greater emphasis on dissemination of results. Also, respondents had problems interpreting the question concerning scientific literacy.

Recommendation: The Task Force recommends the following revised wording.

From: "Does the activity enhance scientific and technological literacy?"

To: "Will the results be disseminated broadly to enhance scientific and technological understanding?"

#10 The "ratings" issue, which was highlighted in the Task Force Discussion Report, remains very difficult to resolve. The community is divided in its preference for a single composite rating, and separate ratings for each of the two criteria. Also, a number of people have suggested that NSF discontinue basing its ratings on hypothetical distributions, e.g., "among the top 5%."

Recommendation: The Task Force recommends that the NSF "generic" reviewer form provide for the following:

  • separate comments for each criterion
  • a single composite rating
  • a summary recommendation (narrative) that addresses both criteria.

It is also recommended that NSF discontinue describing its ratings by referring to hypothetical distributions.

Conclusion

The Task Force believes that the proposed new criteria are flexible enough, both in their design and proposed implementation, to be useful and relevant across NSF's many different programs. Furthermore, it is expected that NSF will continue to employ special criteria to respond to the specific objectives of certain programs and activities. Hence, the Task Force recommends that NSF proceed without delay to full implementation of the proposed changes. Adoption of the new criteria will facilitate, clarify, and simplify the proposal evaluation process. Excellence will continue to be the hallmark of all NSF-sponsored activities.


NSB/MR-97-05

National Science Board and
National Science Foundation Staff
Task Force on Merit Review




APPENDICES



I. NSB/MR-96-15, Discussion Report, NSB and NSF Staff Task Force on Merit Review, November 20, 1996

II. Sample NSF Proposal Review Form

III. Analysis of Responses to the NSB/NSF Report on Merit Review Criteria, Office of Policy Support, March 6, 1997.
