This Dear Colleague Letter has been archived.
NSF 13-096
Dear Colleague Letter: Information to Principal Investigators (PIs) Planning to Submit Proposals to the Sensors and Sensing Systems (SSS) Program, October 1, 2013, Deadline
Date: May 30, 2013
The Sensors and Sensing Systems (SSS) program will conduct a pilot test of a modified proposal review process using proposals submitted to the SSS program for the October 1, 2013, proposal submission deadline. Submission of a proposal to this program for this deadline will imply your willingness to participate in the process. The purpose of this pilot is to seek new approaches to proposal review that can lower the cost of the review process, improve the quality of reviews, and reduce the workload on the reviewer community, while not discouraging the submission of collaborative or highly innovative proposals. Briefly, the review process will consist of the following:
- Proposals will be subject to ad hoc review only. There will be no panel review of these proposals.
- All proposals submitted to the SSS program will be organized into groups consisting of approximately 25 to 40 proposals.
- Each PI whose proposal is assigned to a group will be assigned to review and rank seven other proposals also in that group. Review assignments will be made so as to avoid organizational or individual conflicts-of-interest.
- All PIs must complete their review and ranking of the seven assigned proposals within 30 days of the date of their assignment. Failure to complete this review and ranking within the allotted time will result in the disqualification of the PI’s own proposal.
- A composite ranking of all proposals in each group will be determined, and each PI’s proposal ranking will be adjusted based on a measure of the “quality” of the reviews provided by the PI. The adjustment is designed to provide an incentive to all PIs to do an honest and thorough job of reviewing the proposals to which they are assigned.
- Final aggregation of proposals across the groups and award/declination decision making will be done by the Program Director as currently done.
Anonymity of reviewers will be preserved, as PIs will not know which of the other PIs review their proposals. A detailed description of the pilot test process is provided below.
NOTE: This is a pilot test of an alternative approach to proposal review. It applies only to the SSS program and only for proposals submitted to the October 1, 2013, deadline. If you do not wish to have your proposal reviewed by the approach described above, please do not submit a proposal to the SSS program for the October 1, 2013, deadline. Alternatively, you may wait until the next submission deadline, February 15, 2014.
For those PIs who do wish to participate in this pilot test, CMMI will conduct a webinar on August 20 from 2:00 to 4:00 p.m. to describe the approach in detail and to answer questions.
Please direct any questions to George A. Hazelrigg, ghazelri@nsf.gov, 703-292-7068.
The Mechanism Design Proposal Review Process
Motivation
Over the past decade, the National Science Foundation has experienced a substantial increase in the number of proposals received while proposal processing resources have remained largely constant. In response to this increase and a mandate for timely proposal processing, the Foundation has greatly increased the number of proposal review panels it holds annually. This has placed an overwhelming burden on both the NSF staff and the reviewer community, and it has dramatically increased the overall cost of proposal review. This pilot is an attempt to find an alternative proposal review process that preserves the ability of investigators to submit multiple proposals at more than one opportunity per year while encouraging high-quality and collaborative research; places the burden of proposal review on the reviewer community in proportion to the burden each individual imposes on the system; simplifies the internal NSF review process; ameliorates concerns of conflict of interest; maintains high quality in the review process; and substantially reduces proposal review costs.
Theoretical Basis
The theoretical basis for the proposed review process lies in an area of mathematics referred to as mechanism design or, alternatively, reverse game theory. In mathematics, a game is defined as any interaction among two or more people. The purpose of mechanism design is to enable one to “design” the “mechanism,” namely the game, to obtain the desired result, in this case to efficiently obtain high-quality proposal review while providing the advantages noted above. In mechanism design, this is done by formulating a set of incentives that drive behavior in the desired direction. The mechanism presented here was devised by Michael Merrifield and Donald Saari [1].
The Process
The proposed pilot review process is as follows:
- Upon receipt of the proposals in the Sensors and Sensing Systems (SSS) program, the program director will organize the proposals into sets, each consisting of proposals in a specific sub-field. Each such set of n proposals will comprise a “group.” A typical group will contain 25-40 proposals.
- The program director will then assign to each principal investigator (PI) in each group a subset of m proposals to be reviewed by that PI. For this pilot, m=7. The approach to this proposal assignment is key to the success of this method, and is detailed below. In the event that a PI submits multiple proposals, he/she will be assigned to review 7 proposals for each proposal submitted.1
- PIs will be asked to declare their conflicts-of-interest and will be assigned only to proposals with which they do not have an institutional or individual conflict.
- Each PI will then review the assigned subset of m proposals, providing a detailed written review and score (Poor to Excellent) for each, and will rank order the proposals in his/her subset, placing the proposals in the order in which he/she thinks the group as a whole will rank them, not in the order of his/her personal preference. PIs will be given 30 days to complete their review and ranking of the proposals to which they are assigned; failure to provide both written reviews and a ranking by the specified date will automatically disqualify a PI’s own proposal from further consideration. The PIs are not permitted to communicate with each other regarding this process or a proposal’s content, and they are not informed of who is reviewing their proposals.
- PIs who have not completed their reviews within the allotted time will have their proposals returned as not in compliance with the program announcement, and they will not receive reviews if any have been completed for their proposal.
- The individual rankings provided by the PIs will be combined to produce a global ranking for the group.
- Each individual PI’s ranking will be compared to the global ranking, and the ranking of the PI’s own proposal will be adjusted in accordance with the degree to which his/her ranking matches the global ranking. This adjustment provides an incentive to each PI to make an honest and thorough assessment of the assigned proposals, since failure to do so places the PI at a disadvantage compared to others in the group.
- The program director then merges the results from the various groups, uses them as advisory to his/her award/declination recommendation, and documents his/her recommendations in accordance with current NSF practice.
The Reviewer Assignment Process
The first issue to address in the reviewer assignment process is the selection of m, namely the number of proposals to be reviewed by each PI. Although m is somewhat arbitrary, it needs to be large enough to provide a meaningful rank ordering of the assigned proposals. The value of m also serves as a disincentive against frivolous multiple proposal submissions: for example, if m=10, a PI who intends to submit 3 proposals to the program would be committing to the review of 30 proposals. On the other hand, m should not be so large as to discourage PIs from submitting worthy proposals. Given these considerations, m=7 for this pilot. Seven proposals can be ranked in 7!=5040 different orderings, and this diversity provides a good basis for the incentive score adjustment.
Second, the assignment of proposals to groups should be such that the ratio m/n, namely the fraction of proposals in the group reviewed by each PI, remains relatively small. This condition makes it difficult for PIs to associate specific reviews with reviewers, and thus reviewers remain relatively anonymous.
Third, the assignment of proposals to reviewers must be such that the group is not divided into sub-groups, each with a separate set of reviewers. This is necessary to enable the global ranking of proposals within the group.2
The specific assignment algorithm to be used will be the following:
- If n proposals are received and each PI reviews m proposals, then nm reviews will be obtained.
- PIs will be sent the full list of PIs and institutions for the group and asked to declare their conflicts.
- Based on both the declared conflicts and reviewer expertise, a list of excluded proposals will be generated for each PI. This is the list of proposals not to be reviewed by that PI. Obviously, a PI’s own proposal will be on the excluded list.
- The list of PIs will be randomly ordered, i.e., a random number will be assigned to each PI and an ordered list created based on these numbers.
- Let the proposals be identified as 1, 2, 3, ..., n. Beginning with PI #1 on the list, proposals will be assigned randomly as follows:
- Proposal #1 is assigned to reviewer #1 unless this proposal is on the reviewer’s excluded list. If it is on the excluded list, move to proposal #2, and so on until a proposal is assigned.
- Pick a random number from 1 to n-e(2), where e(2) is the number of excluded proposals for reviewer #2. Move through the proposal list by the chosen random number, not counting excluded proposals. Assign this proposal to reviewer #2. Continue this process, using a random number from 1 to n-e(i) for the ith reviewer.
- When a proposal has been assigned to m reviewers, it is added to the excluded lists of all reviewers.
- The process continues until all proposals are excluded, and all PIs are assigned as reviewers to m proposals.
- If necessary, adjustments will be made manually to handle the case in which, near the end of the process, the only proposals still needing reviews are on the excluded lists of all remaining PIs.
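To make the procedure above concrete, the following is a minimal Python sketch, not part of the pilot itself. It assumes each PI submits exactly one proposal (so PIs and proposals share indices 0 to n-1), and it adopts one plausible reading of the stepping rule: the proposal list is treated as circular, steps are counted from the most recently assigned proposal, and the same random-step rule is used for every reviewer, including the first. All function and variable names are illustrative.

```python
import random

def assign_reviews(n, m, excluded, seed=0):
    """Sketch of the round-robin assignment described above.

    n        -- number of proposals in the group (one per PI)
    m        -- proposals each PI must review (m = 7 in the pilot)
    excluded -- excluded[i]: set of proposal indices PI i may not review
                (own proposal plus conflicts); updated in place as proposals fill up
    Returns assignments[i]: list of proposals assigned to PI i.
    """
    rng = random.Random(seed)
    order = list(range(n))
    rng.shuffle(order)                      # random ordering of the PIs
    assignments = {i: [] for i in range(n)}
    review_count = [0] * n                  # reviews assigned to each proposal so far
    cursor = 0                              # position in the circular proposal list

    while any(len(assignments[i]) < m for i in range(n)):
        progressed = False
        for i in order:                     # one pass over the randomly ordered PIs
            if len(assignments[i]) >= m:
                continue
            banned = excluded[i] | set(assignments[i])
            eligible = [p for p in range(n) if p not in banned]
            if not eligible:                # nothing assignable to this PI right now
                continue
            # Step through the list by a random amount, skipping excluded proposals.
            step = rng.randint(1, len(eligible))
            eligible.sort(key=lambda p: (p - cursor) % n)
            p = eligible[step - 1]
            assignments[i].append(p)
            review_count[p] += 1
            cursor = p
            progressed = True
            if review_count[p] == m:        # proposal now has m reviews:
                for j in range(n):          # exclude it for everyone
                    excluded[j].add(p)
        if not progressed:                  # stalled; corresponds to the manual fix-up
            break
    return assignments
```

As noted in the last bullet above, the final few assignments can stall when every remaining proposal is on the excluded list of every remaining PI; the sketch simply stops in that case, mirroring the manual adjustment.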
Creating the Global Ordered Ranking
The global ordered ranking will be obtained using a modified Borda count, which works as follows. Suppose a reviewer ranks five proposals A, B, C, D, and E from best to worst in that order. Scores are then assigned from 4 down to 0: A is given 4 points, B three, C two, D one, and E zero. In general, each reviewer ranks m proposals, so each review awards between 0 and m-1 points; since each proposal receives m reviews, its total score must lie between 0 and m(m-1). The modified Borda count score is the total score divided by m(m-1). Proposals are then ranked in accordance with their modified Borda count scores.
Reviewers are discouraged from assigning tied rankings. However, noting that each reviewer has a fixed and determined total of points to assign (e.g., for m=7, each reviewer has 0+1+2+3+4+5+6=21 points), ties can be accommodated by assigning an equal number of points to each of the tied proposals while maintaining a constant total number of points. For example, suppose a reviewer feels that proposals 2 and 3 are tied. Then 6 points would be assigned to proposal number 1, 4.5 points to each of proposals 2 and 3, 3 points to proposal number 4, and so on.
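As an illustration only, the following sketch computes modified Borda count scores for a group under the tie convention just described; the function name and input format are hypothetical, not part of the pilot.

```python
from collections import defaultdict

def modified_borda(rankings, m):
    """rankings -- one ranking per reviewer: a list of proposal IDs ordered best
                   to worst, where a tie is written as a set (or tuple) of IDs
                   occupying the same position.
       m        -- number of proposals each reviewer ranks (7 in the pilot).
       Returns {proposal: total points / (m * (m - 1))}."""
    totals = defaultdict(float)
    for ranking in rankings:
        position = 0                                   # 0 = best
        for entry in ranking:
            tied = list(entry) if isinstance(entry, (set, tuple, list)) else [entry]
            # Points the tied slots would have received individually (m-1 down to 0)...
            points = [m - 1 - (position + k) for k in range(len(tied))]
            share = sum(points) / len(tied)            # ...shared equally, total unchanged
            for proposal in tied:
                totals[proposal] += share
            position += len(tied)
    return {p: total / (m * (m - 1)) for p, total in totals.items()}

# Worked example from the text: proposals 2 and 3 tied -> 6, 4.5, 4.5, 3, 2, 1, 0 points.
scores = modified_borda([["P1", {"P2", "P3"}, "P4", "P5", "P6", "P7"]], m=7)
```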
Incentivizing Good Reviewing
To promote diligence and honesty in the ranking process, PIs are given a bonus for doing a good job. The bonus consists of moving their proposals up in the ranking in accordance with the degree to which their ranking agrees with the global ranking. This movement will be large enough to provide a strong incentive to reviewers to do a good job, but not so large as to severely distort the ranking merely as a result of the review process. Note that if all reviewers do an excellent job of ranking the proposals they review, all PIs’ proposals will be moved up equally, so the ranking itself will not change. The maximum incentive bonus will be a movement of two positions; that is, a proposal could be moved up in the ranking to a position above the next two higher-ranked proposals. The process by which this will be done is as follows.
To begin, a measure of accuracy must be derived. The measure used is the sum of the absolute deviations between each proposal’s position in the reviewer’s ranking and its position in the global ranking. For example, suppose the global ranking is A, B, C, D, E, F, G, and suppose reviewer N provides the ranking D, A, E, G, F, B, C. Taking the proposals in order A through G, the position deviations are 1, 4, 4, 3, 2, 1, and 3, so the quality index for this ranking is Q = 1+4+4+3+2+1+3 = 18. Perfect agreement would yield Q = 0; thus, lower scores are more desirable. A ranking that is precisely the opposite of the global ranking, namely G, F, E, D, C, B, A, would yield the maximum value Q_max = 24.
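For concreteness, the quality index under this reading can be sketched as follows; the assertions reproduce the worked example above.

```python
def quality_index(reviewer_ranking, global_ranking):
    """Sum of absolute position differences between a reviewer's ranking and the
    global ranking (both given best to worst); 0 means perfect agreement."""
    global_pos = {p: i for i, p in enumerate(global_ranking)}
    return sum(abs(i - global_pos[p]) for i, p in enumerate(reviewer_ranking))

# Worked example: reviewer N's ranking yields Q = 18; the exact reversal yields Q_max = 24.
assert quality_index(list("DAEGFBC"), list("ABCDEFG")) == 18
assert quality_index(list("GFEDCBA"), list("ABCDEFG")) == 24
```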
The incentivized ranking is then obtained as follows. Each proposal is first given a score based on its rank, with a higher score representing a higher rank. For the example above, A, B, C, D, E, F, G, the scores would satisfy S_A > S_B > S_C > S_D > S_E > S_F > S_G. Given these scores, the average difference in score between adjacently ranked proposals is taken to be a = (S_max - S_min)/n. To each score is added a bonus computed as B(N) = 2a(Q_max - Q(N))/Q_max, where Q(N) is the quality index of the PI who submitted the proposal. Thus, if reviewer N submitted proposal C, the resulting score for proposal C would be S_C + B(N). The final ranking of proposals is then based on these incentivized scores.
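Finally, a minimal sketch of the score adjustment, following the formulas exactly as stated above; it assumes the pre-bonus scores are the modified Borda count scores from the previous step (a choice the text leaves open), and all names are illustrative.

```python
def incentivized_ranking(scores, reviewer_q, submitted_by, q_max):
    """scores       -- {proposal: pre-bonus score}, higher score = higher rank
       reviewer_q   -- {PI: quality index Q of that PI's reviews}
       submitted_by -- {proposal: PI who submitted it}
       q_max        -- worst possible quality index (24 when m = 7)
       Returns the proposals ordered best to worst after the bonus is applied."""
    n = len(scores)
    # Average score gap between adjacently ranked proposals, per the formula above.
    a = (max(scores.values()) - min(scores.values())) / n
    adjusted = {}
    for proposal, s in scores.items():
        q = reviewer_q[submitted_by[proposal]]
        bonus = 2 * a * (q_max - q) / q_max   # worth at most about two positions
        adjusted[proposal] = s + bonus
    return sorted(adjusted, key=adjusted.get, reverse=True)
```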
Reference
[1] M. Merrifield and D. Saari, “Telescope Time Without Tears: A Distributed Approach to Peer Review,” Astronomy and Geophysics, Vol. 50, Issue 4, July 20, 2009, pp. 4.2-4.6.
Footnotes
1 In the case of a proposal with multiple PIs or a collaborative proposal, the team will be asked to designate one person — a PI or co-PI — who will represent the team in the review process. This person is hereafter referred to as the “PI.” Only this person from each team will participate in the proposal review process.
2 Note that this is a mathematical condition required to enable the global ranking. As the proposals in the group comprise a relatively homogeneous set, this condition should not significantly affect the ability to obtain appropriate expertise for the review process.