


The National Science Foundation's FastLane System Baseline Data Collection

CROSS CASE REPORT

 

November 1996

Robert K. Yin, Ph.D.
Jill G. Hensley

Prepared by


COSMOS Corporation for the National Science Foundation under
Prime Contract No. DIS-9103603 to Compuware Corporation

7475 Wisconsin Avenue • Suite 900 • Bethesda, MD 20814 • (301) 215-9100



Preface

This report contains cross-case data from the data collection effort targeting practices prior to the full implementation of "FastLane," the National Science Foundation's (NSF) electronic research administration system. Background information and data for each participating institution are reported in individual "database" reports. These reports contain information on five main topics, as listed in Exhibit 1.3 of this report. Each report is about 15-20 pages in length, not including appendices, which vary in length across reports.

COSMOS Corporation conducted this cross-case analysis for NSF as part of its subcontract with Compuware Corporation (Contract No. DIS-9103603). NSF provided the majority of the funding for the study, which targeted 15 universities that had collaborated with NSF in the development of the FastLane system. Additional funding from other federal agencies enabled the study to include five universities that were involved in the planning of another electronic transmission system, Electronic Data Interchange (EDI), which is sponsored by several federal agencies, including the Department of Energy (DOE) and the National Institutes of Health (NIH). Altogether, the funding permitted the collection of information about the processing of NSF, NIH, and DOE proposals.

At the request of NSF, several briefings of the data contained in this report have been presented to various groups. The briefing materials, which graphically summarize the key findings of the cross-case analysis, are contained in Appendix C of this report.

Mr. William Kirby serves as the NSF point of contact for the study team. Robert K. Yin, Ph.D. and Jill G. Hensley prepared this document. Other contributing COSMOS team members include: Dana Edwards; Ann Landy, Ph.D., project director; and Cheryl Sattler, Ph.D. Tawania McFadden was the production assistant for this document.


Contents

Preface

Section

1. Data Collection and Reporting Procedures
2. Proposal Preparation and Management Processes
3. Proposal Preparation Burden Estimates
4. Post-submission Activities

Exhibits

1.1. FastLane Sites and Data Collection Schedule
1.2. FastLane Data Collection and Report Tracking
1.3. Contents of University Database Reports
2.1. University Groups, by Processing Time
3.1. Number of Proposals, by University (1994-1995 Academic Year)
3.2. Proposal Dollar Volume, by University (1994-1995 Academic Year)
3.3. Administrative Costs Associated With Proposal Process (Based On Most Recent Indirect Cost Proposal)
3.4. Unit Costs of Proposals, by Number and Dollar Volume of Proposals Submitted (1994-1995 Academic Year)
3.5. Cost Per Proposal, by Number of Proposals Submitted (N=15 Universities)
3.6. Proposal Volume and Costs, by Processing Time
3.7. Proposal Administration Staff Effort, by SRO, College, and Department (1994-1995 Academic Year)
3.8. Time Spent on Proposal Process, by SRO, College, Department, and Principal Investigator (1994-1995 Academic Year)
4.1. Amount of Time Spent on Post-submission Activities, by University (1994-1995 Academic Year)

Appendix

A. FastLane Data Collection Protocol
B. Proposal Preparation Flow Diagrams by Grouping
C. Briefing Summary


1. Data Collection and Reporting Procedures

The information contained in this cross-case summary is based on data collected from 20 universities about their 1994-1995 proposal processes. The purpose of this study was to collect baseline data to permit later comparisons following the full implementation of "FastLane," the National Science Foundation's (NSF) electronic research administration system. Of the 20 universities targeted, 12 had research administrators who were collaborating in the development of the FastLane system; 5 participated in the planning of another electronic transmission system, Electronic Data Interchange (EDI); and 3 participated in both FastLane and EDI planning. Seven of the 20 sites were visited by members of the study team; 13 sites were involved through telephone interviews and document reviews. Exhibit 1.1 lists the universities and the data collection schedule for the study.
 

Exhibit 1.1
FastLane Sites and Data Collection Schedule

University                                 Point of Contact       FL or EDI  P/S    Data Collection Dates
Arizona State University                   Jacqueline Krones      FL         S      November 28-29, 1995
Baylor University                          David Pinter           EDI        P      N/A
Delaware State University                  Mildred Ofosu          FL         P      February 8-9, 1996
Duke University                            Renee deGuehery,       EDI        S      December 5-6, 1995
                                           Susan Alberts
Florida A&M University                     Gina Kinchlow          EDI        P      N/A
Fred Hutchinson Cancer Research Center     Joann Cahill           EDI        P      June 18-19, 1996
Massachusetts Institute of Technology      Julie Norris,          FL & EDI   S      December 18-19, 1995
                                           Tom Duff
Ohio State University                      James Ball             FL         P      February 12-13, 1996
Pennsylvania State University              Robert Killoren        FL & EDI   P      November 14-15, 1995
Purdue University                          Lou Pellegrino         FL         P      February 5-6, 1996
Santa Rosa Community College               Walter Chesbro         FL         P      February 23 & 27, 1996
Southern Illinois University               Stephen Hansen         FL         P      January 18-19, 1996
Texas A&M University                       Jo Ann Treat           FL         S      January 23 & 25, 1996
University of California at Berkeley       Neil Maxwell           FL         S      Nov. 30 & Dec. 1, 1995
University of California at Los Angeles    Pamela Webb            FL & EDI   S      January 22-24, 1996
University of Chicago                      Mary Ellen Sheridan    FL         P      January 10-11, 1996
University of Notre Dame                   Doug Franson           EDI        P      February 22-23, 1996
University of South Carolina               Ardis Savory           FL         P      January 22-23, 1996
University of Washington                   Don Allen,             FL         S      January 25-26, 1996
                                           Sinh Simmons
Virginia Polytechnic Institute             Tom Hurd               FL         P      January 16-17, 1996
FL = FastLane; P = telephone interview; S = site visit
Exhibit 1.2 shows the authors and completion dates for each site visit. Data collection at two universities--Baylor and Florida A&M--could not be completed due to scheduling conflicts within the universities that eventually extended beyond this project's study period and funding. Because data collection at each university involved interviews with three different Principal Investigators of NSF proposals and one Principal Investigator of an NIH proposal, as well as staff involved in those proposal processes at the Department, College, and University levels, it was not uncommon to interview at least 15 individuals at each university. Only about a third of these interviews were completed at Baylor and Florida A&M, resulting in incomplete data sets for these two universities.

Data were collected about general NSF, NIH, and DOE proposal processing, within which three specific NSF proposals and one NIH proposal also were examined. The specific NSF and NIH proposals were selected by the research administrators at each institution, and the proposals varied by discipline, college, and department, where possible. Funding decisions were reached during academic year 1994-1995 for all proposals. All proposals were intended to represent traditional mainstream investigator initiatives, not large institutional competitions.

Within each institution, data were collected at the following organizational levels: the sponsored research office (for the general process); and the college, department, and principal investigator (for the specific NSF and NIH proposals). Study questions focused on five major themes: proposal preparation and tracking processes; proposal volume; financial burden and staff level of effort; electronic systems involved in the proposal preparation and submission process; and individuals' perceptions about proposal processing. Finally, interviews with two or three individuals who had served as NSF reviewers also were conducted at each site. The FastLane data collection protocol used for both on-site and telephone interviews is included in Appendix A. Exhibit 1.3 contains an illustrative, expanded table of contents from an individual site report. In addition to the on-site and telephone interviews, the study team also collected data regarding the post-submission proposal process, as well as overall expenditures at the Sponsored Research Office and Departmental levels (obtained from their most recent indirect cost proposals). The expenditures data are used to perform the cost analyses discussed later in this report.

Upon the completion of the data collection, individual database notebooks for each university were compiled, containing: numeric data on categories that could be quantified; tabular data on the preparation and submission process; flow diagrams that illustrate the tabular data; and experiences and perceptions of university personnel and NSF reviewers. The notebooks also contained information about the electronic technology available to support these processes.

Several sites also provided background and supporting documentation about the proposal process at their institution. Samples of this type of documentation are included in the appendices to the individual notebooks.

The sections that follow illustrate and analyze the data received across the universities in the study. To ensure confidentiality, dual coding schemes were developed for the data tables and flow charts that appear in this report for each university. For all data tables, data are listed by institution code in descending order of the quantity displayed (e.g., number of proposals, proposal dollar volume, budget, or full-time equivalent positions).
 

Exhibit 1.3
Contents of University Database Reports

Introduction     
    Presentation of Data     
    Data Collection Period  
Proposal Submission Levels, Amount of Staff Effort, and Electronic Technology (Numerical Tables)     
    Number of Proposals     
    Proposal Dollar Volume     
    Proposal Administration Budget Allocation     
    Proposal Administration Staff Effort     
    Electronic Technology Capabilities Used in the Proposal Preparation and Submission Process 
Proposal Preparation and Submission Process     
    Word table of tasks, type of task, level completing task, passage of calendar time, and estimated level of effort     
    Flowchart depicting word table  
Perceptions and Experiences of University Personnel and NSF Reviewers  
    Perceptions of University Personnel about the Proposal Preparation and Submission Process  
    Experiences and Perceptions of NSF Reviewers  
FastLane Related Items  
    Monitoring Projects  
    Cash Request  
    Funding Sources  
    Reporting Procedures  
Appendices  
    Field Visit Contact Matrix  
    Primary Documents


2. Proposal Preparation and Management Processes

2.1 Methodology

To understand the typical proposal preparation and submission process as it occurred during academic year 1994-1995, the study team developed a word table for each institution that identified: 1) each major task in the process; 2) the type of task (technical, administrative, or cost-related); 3) the level at which the task was conducted (university, college, department, or principal investigator); and 4) the amount of time required for each task (calendar and level of effort). The tables were based on interviews about the whole process as well as specific processes that occurred for three NSF proposals and one NIH proposal, deliberately selected to reflect the general process within different colleges and departments within the university. These word tables are included in the individual university reports. A flow diagram was then developed from the word table information to create a visual representation of this process. Appendix B contains the flow diagrams for each university, grouped according to the analysis discussed in Section 2.2.2.
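The structure of these word tables lends itself to a simple machine-readable representation. The following is a minimal sketch in Python, illustrative only: the class and field names (`ProposalTask`, `calendar_weeks`, `effort_hours`) are hypothetical, not the actual format used in the university reports, and the two entries are invented examples rather than study data.

```python
from dataclasses import dataclass

@dataclass
class ProposalTask:
    description: str       # e.g., "Prepare budget"
    task_type: str         # "technical", "administrative", or "cost-related"
    level: str             # "university", "college", "department", or "pi"
    calendar_weeks: float  # calendar time the task consumed
    effort_hours: float    # estimated level of effort

# Two illustrative entries (not actual study data)
tasks = [
    ProposalTask("Draft technical narrative", "technical", "pi", 6.0, 120.0),
    ProposalTask("Assemble and route budget", "cost-related", "department", 1.0, 8.0),
]

# Total calendar time across the flow, of the kind used to compare
# universities' processing times in Section 2.2.2
total_weeks = sum(t.calendar_weeks for t in tasks)
print(f"Processing time: {total_weeks:.0f} weeks")
```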

2.2 Cross-case Findings

2.2.1 Brief Comments by Universities

As mentioned earlier, each university was asked about proposal involvement at four levels: Sponsored Research Office (SRO); college; department; and Principal Investigator (PI). The study team found, however, that most of the involvement occurred within three levels: SRO, department, and PI. In several cases, the support provided by the SRO was reported to vary according to the level of expertise of the PIs and the resources available at the department.

Nearly every university has a variety of hardware and software on campus, making inter-office communication and the sharing of electronic materials difficult. Other formats used in proposal preparation (budgets, certifications, etc.) are accessible outside of the university system.

Specific comments regarding proposal preparation and management processes, which may provide insight into those processes, are as follows (frequent and standard comments about the desirability of more time or reduced burden were omitted):

Learning about Proposal Opportunities

  1. When PIs serve on review panels, they become knowledgeable about the proposal process. (Most universities)
  2. Offices scan for program announcements, on hardcopy or electronic networks. (Most universities)
  3. PIs maintain interactions with the agency's project officers. (Most universities)

Access to Electronic Formats of Proposal Materials (some items are after FastLane began)

  1. Universities can download formats developed by other universities (e.g., University of Texas; Rice University). Once downloaded, they can (inadvertently) change the application form, but they also can then create routine responses to common questions, both within and across proposals. (Universities A and Q)
  2. FastLane forms are produced in "hard" formats (Word) and are not in manipulable formats for calculating budget data (e.g., Excel). (University J)
  3. Universities cannot download from FastLane but must work on the NSF server. (Most universities)
  4. The security of electronic information is not adequately protected. (University M)

2.2.2 Four Types of Proposal Processes

To analyze the proposal preparation and management process across universities, the level of involvement at each organizational level was examined. In this overall analysis, the college level appeared less significant than the other three levels: SRO, department, and principal investigator. Hence, the analysis of the flow diagrams focused primarily on these three levels.

Four main patterns emerged--universities where: 1) the SRO is involved (actively--not just alerting investigators about the opportunity to submit proposals) early in proposal preparation, and the PI submits the final proposal; 2) the SRO is involved early, and the SRO submits; 3) all levels are involved early, and the SRO submits; and 4) the Departments are involved early, and the SRO submits. Appendix B contains the flow diagrams grouped in these categories. The patterns may be considered as shifting from centralized to decentralized arrangements.

The four groups were then analyzed according to their average length of proposal preparation and submission time. The results of the analysis appear in Exhibit 2.1 and show that, overall, the average processing time of the groups decreases as the process decentralizes. The groupings were then used to analyze other findings, which appear in Section 3 of this report.
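As an illustration of how the group averages in Exhibit 2.1 are derived, the following minimal sketch (in Python; not part of the original study) recomputes the average processing time for each group from the per-university figures in the exhibit:

```python
# Processing times in weeks, keyed by university flow-chart code (Exhibit 2.1)
groups = {
    "Group I (SRO early, PI submits)": {"D": 17},
    "Group II (SRO early, SRO submits)": {"K": 14, "E": 14, "H": 11, "F": 10, "N": 11},
    "Group III (all levels early, SRO submits)": {"B": 11, "I": 10, "O": 7, "Q": 9, "J": 6},
    "Group IV (department early, SRO submits)": {"M": 5, "P": 13, "C": 7, "A": 5, "G": 5, "L": 5},
}

for name, times in groups.items():
    avg = sum(times.values()) / len(times)
    print(f"{name}: average {avg:.0f} weeks")  # 17, 12, 9, and 7 weeks, respectively
```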
 
 
Exhibit 2.1
University Groups, by Processing Time

Group Number and Characterization                      University Code   Processing Time (in weeks)
Group I: SRO involved early and PI submits             D                 17
                                                       Average Time:     17
Group II: SRO involved early and SRO submits           K                 14
                                                       E                 14
                                                       H                 11
                                                       F                 10
                                                       N                 11
                                                       Average Time:     12
Group III: All levels involved early and SRO submits   B                 11
                                                       I                 10
                                                       O                 7
                                                       Q                 9
                                                       J                 6
                                                       Average Time:     9
Group IV: Department involved early and SRO submits    M                 5
                                                       P                 13
                                                       C                 7
                                                       A                 5
                                                       G                 5
                                                       L                 5
                                                       Average Time:     7



3. Proposal Preparation Burden Estimates

3.1 Methodology

At each university in the data collection effort, numeric information was collected about the proposal preparation process as it had been experienced during academic year 1994-1995. The following categories were covered: 1) the number of proposals submitted; 2) proposal dollar volume; 3) proposal budget allocation; 4) university administrative costs, broken into two components: SRO and Department (which includes schools, colleges, and any other component beneath the SRO), along with the estimated proportion of these costs that each component devoted to proposal development; 5) SRO proposal administration staff effort; and 6) time spent on the proposal process. No attempt was made in this inquiry to determine the "win" rate for proposals.

Because some universities were not able to provide all of the requested information--and were particularly unable to provide reliable budget information across all levels--data used in the subsequent costs analyses were taken from the data set of university administrative costs devoted to the proposal process (derived, as mentioned earlier, from their most recent indirect cost proposals).

3.2 Cross-case Findings

3.2.1 Brief Comments by Universities

A variety of perceptions regarding the burden of the proposal preparation process were expressed by respondents from the universities in the study, as follows:
  1. A university can have different versions of NSF's guidelines, and the PI might not work from the latest version (University N); further, the general guidelines may differ from the specific program guidelines, causing confusion (University L).
  2. NSF requires a single point of contact at a university, and at least one university has two major components, both needing direct NSF contact, making difficult the implementation of a single point of contact. (University E)
  3. Universities have to develop their own electronic spreadsheets as the prelude to completing NSF's budget forms. Further, data from the spreadsheets cannot automatically transfer to the final budget form. (Most universities)
  4. Negotiating cost-sharing is time-consuming (most universities). In part, the time is consumed because the desired amount of cost-sharing varies and is unknown. (University J)
  5. Colored graphics and scientific diagrams are currently submitted in hard copy with confidence that they will not be distorted. (Universities Q and L)
  6. Signatures for proposals must be made on hardcopy; therefore, hardcopy must be physically transported across different parts of the campus, consuming time and energy. (University L)
  7. Proposals have to be mass copied and then sent by costly overnight mail. (University L)
  8. Postcard acknowledgment of proposal receipt can take a long time and get lost with the PI (Universities Q and M); during this period, universities do not receive the reference number for tracking the progress of the proposal. (University Q)
  9. A potential advantage of FastLane is that it can eventually provide the routine responses (about tracking) and also provide additional information about previous or current proposals that cannot be obtained elsewhere. (Most universities)
  10. Nearly every university has a variety of hardware and software on campus, making the inter-office sharing of electronic materials difficult. (Most universities)
 
The respondents also provided some feedback on the differences in burden between the NSF and NIH requirements:
  1. NSF's forms are more confusing than NIH's forms. (University E)
  2. Students collaborate more with principal investigators to write technical proposals for NIH. (Universities L and Q)
  3. NIH proposals require more interdepartmental collaboration. (Most universities)
  4. Revisions and background preparation for NIH proposals take more time than for NSF proposals. (Most universities)
  5. In proposal review, NIH's numeric ratings are preferred to NSF's categorical ratings. (University G)

3.2.2 Conditions Related to Proposal Preparation Costs and Burden

The main purpose of the cross-case analysis was to define conditions prior to the implementation of FastLane. This subsection will examine the findings of those conditions related to cost.

First, the study team calculated two cost indicators of interest: dollar cost per proposal submitted; and dollar cost per million dollars of proposal volume submitted. To determine these indicators, the team used the administrative cost data associated with the proposal process (see Exhibit 3.3) as the constant numerator; the denominators were the total number of proposals and the total dollar volume of proposals submitted, the raw data for which are given in Exhibits 3.1 and 3.2, respectively. These data were then used to determine the final indicators, as illustrated in Exhibit 3.4. Once the indicators had been determined, the next step was to explore items potentially related to the cost indicators. This step was accomplished by arraying the indicators against other variables, such as the volume of submissions. The scattergram in Exhibit 3.5 shows a direct relationship between cost and number of proposals submitted: universities with higher volumes of proposals also have higher unit costs (dollars per proposal).
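For concreteness, the arithmetic behind the two indicators can be reproduced from the exhibits. The sketch below is illustrative only, using University A's published figures (data-table code) from Exhibits 3.1, 3.2, and 3.3 to recover both unit costs as they appear in Exhibit 3.4:

```python
# University A (data-table code), AY 1994-1995
proposal_costs = 20_574_176     # SRO + Department costs attributable to the proposal process (Exhibit 3.3)
n_proposals    = 3_131          # proposals submitted (Exhibit 3.1)
dollar_volume  = 1_105_367_674  # total dollars requested (Exhibit 3.2)

cost_per_proposal = proposal_costs / n_proposals                  # ~ $6,571 (Exhibit 3.4)
cost_per_million = proposal_costs / (dollar_volume / 1_000_000)   # ~ $18,613 per $million (Exhibit 3.4)

print(f"${cost_per_proposal:,.0f} per proposal; ${cost_per_million:,.0f} per $million requested")
```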

Finally, the analysis explored the indicators in relationship to the level of involvement in the proposal preparation process, as defined in Section 2. Exhibit 3.6 shows the relationship of unit costs to processing time, and to proposal volume (both number of proposals and dollar volume submitted): The most decentralized arrangements are associated with higher unit costs (as previously noted), shorter processing times, and higher proposal volumes. Such relationships can be explained by the following scenario: The higher unit costs (in the decentralized arrangements) appear to result from having numerous departments participate in the proposal process; at the same time, the expanded participation also means shorter processing times and higher proposal volume (across the entire university). In turn, the need for numerous departments to participate may reflect more diverse portfolios at high proposal volume universities, including complicated, interdepartmental and interdisciplinary proposals. Under these circumstances, departments are forced to play a greater role. Whether these scenarios are correct, or whether there are mechanical artifacts in the data, needs to be the subject of future analyses.
 
 

Exhibit 3.1

Number of Proposals, by University
(1994-1995 Academic Year) 

                           Proposals to Federal Agencies
Univ.   Total        All Federal      NSF            NIH            DOE
Code    Proposals    No.      %       No.      %     No.      %     No.     %
Q 4,250 2,250 52.9 489 11.5 991 23.3 68 1.6
D 3,235 1,551 47.9 438 13.5 491 15.2 46 1.4
A 3,131 1,466 46.8 298 9.5 661 21.1 28 0.9
M 3,054 1,552 50.8 512 16.8 436 14.3 20 0.7
E 2,850 1,411 49.5 240 8.4 809 28.4 -- --
N 2,566 1,531 59.7 475 18.5 254 9.9 77 3.0
O 2,224 1,327 59.7 382 17.2 298 13.4 112 5.0
F 2,101 1,078 51.3 365 17.4 199 9.5 100 4.8
L 2,097 840 40.1 401 19.1 95 4.5 -- --
H 2,028 1,063 52.4 269 13.3 -- -- 3 0.1
J 1,339 696 52.0 295 22.0 71 5.3 21 1.6
K 1,277 743 58.2 159 12.5 92 7.2 64 5.0
G 1,184 859 72.6 164 13.9 119 10.1 47 4.0
T 635 314 49.4 7 1.1 296 46.6 2 0.3
B 436 297 68.1 138 31.7 41 9.4 19 4.4
I 318 96 30.2 41 12.9 4 1.3 1 0.3
P 96 59 61.5 6 6.3 6 6.3 3 3.1
C 2 2 100.0 1 50.0 0 0 0 0.0
R -- -- -- -- -- -- -- -- --
S -- -- -- -- -- -- -- -- --
Mean 1,824 952 55.7 260 16.4 286 13.3 38 2.3
Median 2,063 961 52.2 282 13.7 199 9.9 25 1.6
-- Not Available

 

Exhibit 3.2

Proposal Dollar Volume, by University 
(1994-1995 Academic Year) 

Univ.   Total Volume     Federal Volume       NSF Volume          NIH Volume          DOE Volume
Code    Dollars          Dollars      %       Dollars      %      Dollars      %      Dollars      %
O 1,224,004,668 929,823,007 76.0 181,317,732 14.8 180,506,448 14.7 185,409,000  15.1
A 1,105,367,674 909,418,068 82.3 88,019,400 8.0 516,371,671 46.7 17,278,305  1.6
M 983,874,839 644,234,683 65.5 107,215,137 10.9 282,504,251 28.7 8,423,112  0.9
D 732,636,790 564,743,605 77.1 103,407,294 14.1 284,186,885 38.8 13,788,217  1.9
Q 582,146,000 475,357,000 81.7 60,544,000 10.4 279,208,000 48.0 34,134,320  5.9
E 461,639,989 167,276,810 36.2 21,195,289 4.6 155,500,979 33.7 --  -- 
F 402,900,000 308,500,000 76.6 108,000,000 26.8 39,000,000 9.7 35,062,370  8.7
G 400,071,787 324,286,511 81.1 38,447,970 9.6 44,482,895 11.1 31,529,113  7.9
H 297,191,823 230,258,308 77.5 62,712,625 21.1 -- -- 7,140,000  2.4
L 270,107,629 199,879,645 74.0 135,078,948 50.0 63,257,595 23.4 --  -- 
T 137,698,881 115,155,165 83.6 962,578 0.7 101,474,710 73.7 416,000  0.3
J 134,176,180 94,305,212 70.3 26,173,831 19.5 12,042,736 9.0 4,140,982  3.1
K 122,408,806 107,966,425 88.2 20,929,042 17.1 14,824,039 12.1 9,523,381  7.8
N 105,570,071 75,332,160 71.4 9,771,056 9.3 21,280,730 20.2 35,748,630  33.9
B 81,341,805 71,306,951 87.7 33,899,186 41.7 593,228 0.7 5,113,338  6.3
I 44,983,744 37,076,803 82.4 13,613,951 30.3 169,375 0.4 1,734  0.0
P 13,579,628 10,917,129 80.4 1,124,924 8.3 964,826 7.1 145,995  1.1
C 1,621,418 649,532 40.1 130,637 8.1 0 0.0 0  0.0
R -- -- -- -- -- -- -- --  -- 
S -- -- -- -- -- -- -- --  -- 
Mean 394,517,874 292,582,612 74.0 56,252,422 17.0 117,433,433 22.2 24,240,906  6.0
Median 283,649,726 183,578,228 77.3 36,173,578 12.5 44,482,895 14.7 8,973,247  2.7
-- Not Available
 

Exhibit 3.3

Administrative Costs Associated with Proposal Process 
(Based on Most Recent Indirect Cost Proposal)

Univ.   SRO                                          Department*                                  Total University
Code    Total Admin.      Costs Attributable to      Total Admin.      Costs Attributable to      Total Admin.      Total Proposal
        Costs (Dollars)   Proposal Process           Costs (Dollars)   Proposal Process           Costs (Dollars)   Costs (Dollars)
                          Dollars         %                            Dollars         %
A 3,520,391 1,584,176 45 42,200,000 18,990,000 45 45,720,391 20,574,176
Q 5,200,000 3,120,000 60.0 39,000,000 9,750,000 25.0 44,200,000 12,870,000
O 6,104,000 408,968 6.7 21,055,000 1,524,382 7.2 27,159,000 1,933,350
M 3,600,000 900,000 25 20,182,000 5,045,500 25.0 23,782,000 5,945,500
N 7,779,000 3,111,600 40.0 14,647,000 1,464,700 10.0 22,426,000 4,576,300
F 2,500,000 750,000 30 15,629,000 1,562,900 10.0 18,129,000 2,312,900
H 1,500,000 375,000 25 14,330,000 3,582,500 25.0 15,830,000 3,957,500
D 5,000,000 750,000 15 8,900,000 890,000 10.0 13,900,000 1,640,000
J 1,600,000 400,000 25.0 8,900,000 89,000 1.0 10,500,000 489,000
G 3,197,000 1,534,560 48.0 4,610,000 1,613,500 35 7,807,000 3,148,060
K 990,000 445,500 45.0 3,347,000 502,050 15.0 4,337,000 947,550
T 750,000 90,000 12 1,500,000 45,000 3 2,250,000 135,000
B 523,920 0 0 1,216,952 608,476 50 1,740,872 608,476
P 388,371 19,419 5.0 223,222 11,161 5.0 611,593 30,580
I 200,000 200,000 100.0 0 0 0.0 200,000 200,000
E -- -- -- -- -- -- -- --
L -- -- -- -- -- -- -- --
R -- -- -- -- -- -- -- --
S -- -- -- -- -- -- -- --
Mean 4,000,039 1,293,430 32.3 18,945,300 4,451,248 23.5 22,945,339  5,744,679
Median 3,560,196 825,000 27.5 15,138,000 1,588,200 17.5 20,277,500  3,552,780
SRO = Sponsored Research Office
-- Not Available
* Includes all Departments and Schools
 

Exhibit 3.4

Unit Costs of Proposals, by Number and Dollar Volume of Proposals Submitted 
(1994-1995 Academic Year) 

Univ.   Proposal Costs           Proposal Volume                     Unit Cost
Code    Total (SRO and Dept.)    Number Submitted    $$ Volume       Per Proposal    Per $Million
A 20,574,176 3,131  1,105,367,674  6,571 18,613
Q 12,870,000 4,250  582,146,000  3,028 22,108
M 5,945,500 3,054  983,874,839  1,947 6,043
N 4,576,300 2,566  105,570,071  1,783 43,348
H 3,957,500 2,028  297,191,823  1,951 13,316
G 3,148,060 1,184  400,071,787  2,659 7,869
F 2,312,900 2,101  402,900,000  1,101 5,741
O 1,933,350 2,224  1,224,004,668  869 1,580
D 1,640,000 3,235  732,636,790  507 2,238
K 947,550 1,277  122,408,806  742 7,741
B 608,476 436  81,341,805  1,396 7,480
J 489,000 1,339  134,176,180  365 3,644
I 200,000 318  44,983,744  629 4,446
T 135,000 635  137,698,881  213 980
P 30,580 96  13,579,628  319 2,252
E -- 2,850  461,639,989  -- --
L -- 2,097  270,107,629  -- --
C -- 2  1,621,418  -- --
R -- --  --  -- --
S -- --  --  -- --
Mean 3,957,893 1,824  394,517,874  $1,605  9,827
Median 1,933,350 2,063  283,649,726  $1,101  6,043
 


Exhibit 3.5
Cost Per Proposal, By Number of Proposals Submitted
(N=15 Universities)

Exhibit 3.5 - Cost per Proposal by Number of Proposals Submitted

Exhibit 3.6

Proposal Volume and Costs, by Processing Time 

                      Processing    Proposal Volume                     Proposal Costs
           Univ.      Time          Number                              Per #         Per $Million
Group      Code       (in weeks)    Submitted    $$ Volume              Proposal      Proposal
Group I    D          17            2            1,621,418              --            --
Group II   K          14            1,339        134,176,180            365           3,644
           E          14            318          44,983,744             629           4,446
           H          11            2,224        1,224,004,668          869           1,580
           F          10            96           13,579,628             319           2,252
           N          11            436          81,341,805             1,396         7,480
           Average    12            883          299,617,205            716           3,880
Group III  B          11            1,184        400,071,787            2,659         7,869
           I          10            3,235        732,636,790            507           2,238
           O          7             1,277        122,408,806            742           7,741
           Q          9             2,097        270,107,629            --            --
           J          6             2,566        105,570,071            1,783         43,348
           Average    9             2,072        326,159,017            1,423         15,299
Group IV   M          5             3,131        1,105,367,674          6,571         18,613
           P          13            4,250        582,146,000            3,028         22,108
           C          7             2,101        402,900,000            1,101         5,741
           A          5             2,028        297,191,823            1,951         13,316
           G          5             2,850        461,639,989            --            --
           L          5             3,054        983,874,839            1,947         6,043
           Average    7             2,902        638,853,388            2,920         13,164
-- Not Available
 
 

An important caveat to any interpretations based on proposal volume is that the present inquiry made no attempt to examine award volume. The possibility exists that the high proposal volume universities have better "win" rates than the low proposal volume universities. If so, the unit costs per award dollar might very well be lower at the high proposal volume universities, compared to the low proposal volume universities. Such a finding would change the interpretation based solely on unit costs per proposals submitted. Thus, this matter deserves investigation in any further inquiry.

Exhibits 3.7 and 3.8 examine two other variables related to burden: proposal staff effort (in full-time equivalents); and time spent on the proposal process at each level (SRO, College, Department, and Principal Investigator), respectively. Exhibit 3.7 illustrates proposal staff effort, which was calculated at three levels: SRO, College, and Department. These data could not be summed across levels to produce a university-wide FTE figure, however, because averages had to be calculated at the College and Department levels. Overall, SROs supported the highest level of FTEs allocated to the proposal process, at an average of 9.35 FTEs, compared to an average of 2.05 at the College level and 2.05 at the Department level. Exhibit 3.8 illustrates the actual time spent on the proposal process as reported by the universities. These figures represent estimates of time spent on an average, or typical, proposal at each of the four levels: SRO, College, Department, and Principal Investigator. The PIs spent the greatest amount of time on the proposal process, at an average of 150 hours per proposal. This figure includes technical time on the proposal, which explains why it far exceeds the time spent at the other three levels, which is devoted to the administrative preparation of the proposal. Of the other three levels, an average of 17 hours was spent at the SRO, 11 hours at the Department, and 4 hours at the College on a typical proposal.

 

Exhibit 3.7

Proposal Administration Staff Effort, by SRO, College, and Department

(1994-1995 Academic Year)

Univ.   SRO                                   College                                     Department
Code    Total Admin.   FTEs Allocated to      Avg. Total      Avg. FTEs Allocated to      Avg. Total      Avg. FTEs Allocated to
        FTEs           Proposal Process       Admin. FTEs     Proposal Process            Admin. FTEs     Proposal Process
        No.            No.          %         No.             No.           %             No.             No.           %
G 120.10 17.75 14.8 5.50 1.50 27.3 -- -- --
D 52.00 11.00 21.2 6.00 3.75 62.5 4.00 0.03 0.8
F 52.00 16.00 30.8 15.00 3.20 21.3 5.50 1.47 26.7
L 42.34 12.62 29.8 19.00 1.00 5.3 54.75 -- --
A 33.80 15.00 44.4 -- -- -- 4.00 0.82 20.5
O 31.00 6.20 20.0 6.00 0.07 1.1 9.50 0.45 4.7
Q 27.00 16.00 59.3 20.50 0.80 3.9 10.20 0.57 5.6
J 25.00 15.00 60.0 23.13 1.83 7.9 15.67 2.23 14.2
M 21.00 -- -- 16.00 1.50 9.4 3.50 0.90 25.7
K 19.50 9.50 48.7 8.50 2.50 29.4 8.00 1.00 12.5
E 18.00 9.00 50.0 9.00 5.00 55.6 82.67 12.73 15.4
B 12.00 -- -- -- -- -- -- -- --
H 10.75 -- -- -- -- -- -- -- --
N 10.50 3.50 33.3 25.00 3.00 12.0 15.00 0.90 6.0
I 6.00 6.00 100.0 43.50 -- -- 7.00 -- --
P 3.50 0.40 11.4 1.40 0.40 28.6 3.60 1.50 41.7
C 1.00 0.50 50.0 -- -- -- -- -- --
R -- -- -- -- -- -- -- -- --
S -- -- -- -- -- -- -- -- --
T 12.75 1.75 13.7 -- -- -- -- -- --
Mean 27.68 9.35 33.8 15.27 2.05 13.4 17.18 2.05 12.0
Median 20.25 9.50 33.3 15.00 1.66 16.7 8.00 0.90 14.2
SRO = Sponsored Research Office
-- Not Available
 

Exhibit 3.8

Time Spent on Proposal Process, by SRO, College, Department, and Principal Investigator  
(1994-1995 Academic Year)

Univ.   Total Proposal   Time Associated with the Proposal Process
Code    Time (Hours)     SRO              College           Dept.             PI
                         Hours     %      Hours     %       Hours     %       Hours     %
G 179.85 13.60 7.6 7.75 4.3 4.75 2.6 153.75 85.5
E 105.12 0.79 0.8 0.33 0.3 14.29 13.6 89.45 85.1
O 151.59 4.39 2.9 0.20 0.1 7.00 4.6 140.00 92.4
L 114.75 4.12 3.6 5.42 4.7 23.05 20.1 82.30 71.7
J 141.25 29.75 21.1 9.50 6.7 12.50 8.8 89.50 63.4
Q 294.39 13.30 4.5 0.08 0.0 47.74 16.2 233.24 79.2
K 126.92 4.76 3.8 0.04 0.0 9.83 7.7 112.25 88.4
N 199.75 5.50 2.8 16.59 8.3 9.09 4.6 168.59 84.4
I 273.75 62.50 22.8 0.00 0.0 3.00 1.1 208.25 76.1
P 168.70 18.70 11.1 0.25 0.1 2.50 1.5 147.25 87.3
F 314.49 3.26 1.0 8.00 2.5 22.73 7.2 280.50 89.2
C 99.75 5.50 5.5 6.67 6.7 2.08 2.1 85.50 85.7
D 202.95 11.83 5.8 0.50 0.2 4.50 2.2 186.12 91.7
M -- -- -- -- -- -- -- -- --
H 107.28 7.20 6.7 0.00 0.0 9.08 8.5 91.00 84.8
B 161.50 6.50 4.0 0.25 0.2 2.25 1.4 152.50 94.4
A -- -- -- -- -- -- -- -- --
R -- -- -- -- -- -- -- -- --
S -- -- -- -- -- -- -- -- --
T 280.00 80.00 28.6 5.75 2.05 7.75 2.77 186.50 66.6
Mean 182.63 16.98 9.3 3.83 2.1 11.38 6.2 150.42 82.4
Median 165.10 6.85 4.1 0.42 0.3 8.42 5.1 149.88 90.8
SRO = Sponsored Research Office
-- Not Available

 


4. Post-submission Activities

4.1 Methodology

The study team investigated several areas involving post-submission activities: proposal tracking after submission; cash draws; grant reporting; and serving on NSF review panels. Several questions were identified by the study team and disseminated to each university by e-mail for response. The same PIs who were interviewed about specific NSF proposals were included in the data collection on post-submission activities. Below are the findings across all universities in the study.

4.2 Cross-case Findings

Perceptions about post-submission activities are reported under 4.2.1. The amount of time spent on these activities is reported under the quantitative summary (subsection 4.2.2).

4.2.1 Brief Comments by Universities

The following information illustrates the qualitative perceptions of respondents across the universities in the study:

Tracking after Submission:

  1. SROs have developed their own automated tracking tool. (Most universities)
  2. PIs track the process through agency program officers, usually knowing the disposition of the award before the SRO. (University J)
  3. SRO does not monitor pending proposals (Universities P and I); pending proposals are purged from the list if no notification is received after two years. (University B)
  4. A monthly newsletter lists all awarded projects. (University M)
  5. The SRO makes monthly requests of PIs, for any information they have on the progress of proposal review. (Universities E and O)

Cash Draws:

  1. SROs do not know when projects actually start (prior to first draw-down). The main understanding is that any risk (of drawing down for a project that later fails to receive an award; or for drawing down for costs that cannot be reimbursed by the grant) is assumed by the academic department (Universities B, D, G, J, and O); or by the college (University N); or both (University Q).
  2. The SRO authorizes pre-award expenditures for 90 days prior to award. (Universities M and K)
  3. The University does not permit premature draw downs. (University P)

Grant Reporting:

  1. Reports are spliced from existing publications. (University M)
  2. Administrative managers remind PIs of report deadlines, and PIs produce the technical portions of the report, which is then completed and mailed by administrative managers. (University A)

Proposal Review:

  1. Reviewers receive no advance notice that they will be asked to review proposals. (University E)
  2. Reviewers do not use NSF's paper form for the review (Universities A and L); an electronic form would be helpful. (University O)
  3. The due date for a review is not highlighted to stand out. (University E)
  4. Late reviews are finally conducted only when the NSF program officer makes a reminder phone call or sends an e-mail message. (Most universities)
  5. Copies of reviews are not returned to the original proposal submitters in time for revisions prior to the next round of proposal submission deadlines (University B); the same is true of NIH reviews (University J).

4.2.2 Time Spent on Post-Submission Activities

Exhibit 4.1 illustrates the amount of time spent on the following post-submission activities: tracking a proposal; requesting advanced funds; reviewing funded projects under the proposal area; and preparing grant reports. Preparing grant reports (progress and final), which occurs at the PI level, took the greatest amount of time of all the post-submission activities. Preparing progress reports took an average of 6 hours; final reports took an average of 21 hours. Requesting advanced funds (cash draws) also required substantial time, especially at the SRO level, where it exceeded the Department and PI levels combined: the average SRO spent about 9 hours on cash draws, the average Department spent 0.1 hours, and the average PI spent 1.5 hours. Tracking a proposal took less time: an average of 16 minutes per proposal at the SRO level. PI time on this activity was smaller still, at an average of 3 minutes per proposal. Finally, review of funded projects under the proposal area required only about 2 minutes for the average SRO and about 3 minutes for the average PI.
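Because several universities could not report every activity, the means in Exhibit 4.1 are computed over the available entries only. The following is a minimal sketch of that computation (an assumed approach, not taken from the report), applied to the SRO proposal-tracking column:

```python
def mean_available(values):
    """Average the reported values, skipping universities marked '--' (None)."""
    nums = [v for v in values if v is not None]
    return sum(nums) / len(nums)

# SRO time spent tracking a proposal, in minutes (Exhibit 4.1); None = not available
sro_tracking = [10, 10, 15, 30, None, 10, 20, 10, None, 30, 10, 25, 30, 10, 15, 5, None]
print(f"{mean_available(sro_tracking):.1f} minutes")  # 16.4, matching the exhibit's mean
```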
 

Exhibit 4.1

Amount of Time Spent on Post-Submission Activities, by University 
(1994-1995 Academic Year)

Univ.   Tracking a Proposal   Requesting Advanced       Review of Funded Projects       Preparing a Grant Report
Code    (in minutes)          Funds (in hours)          Under Proposal Area (minutes)   (in hours)
        SRO       PI*         SRO     Dept     PI*      SRO       PI*                   Progress     Final
Q 10 0 -- -- -- 0 0 5 33
A 10 0 8 0 8 0 0 0.33 --
G 15 3 2 0 0 0 1.5 4 14
I 30 0 48 0 0 0.33 5 10 --
C -- -- -- -- -- 0 0.58 -- --
N 10 3 28 0 0 0 5 6 10
D 20 0 1 1 -- 0 0.75 1.75 20.25
B 10 0 0.5 0 0 1.5 2 8 24
O -- -- -- -- -- 0 0 3 --
E 30 0 9 0 0 0 3 0.16 --
P 10 10 Not Allowed Not Allowed Not Allowed 2 0 6 9
M 25 0 0.5 0 0 0 0.33 4 8
J 30 15 0.5 0 0 0.42 0 14 14
K 10 0 1.5 0 0 -- -- 3.5 3.5
L 15 15 0.33 0.25 0.5 0 0 24 72
F 5 0 8 0 8 0 8 -- --
H -- -- -- -- -- -- -- -- --
R -- -- -- -- -- -- -- -- --
S -- -- -- -- -- -- -- -- --
T -- -- -- -- -- 30 15 4 24
Mean 16.4 3.3 8.9 0.1 1.5 2.1 2.6 6.2 21.1
Median 12.5 0.0 1.8 0.0 0.0 0.0 0.7 4.0 14.0
SRO = Sponsored Research Office
-- Not Available
*PI = average of 1-2 principal investigators interviewed at each university

 


Appendix A
FastLane Data Collection Protocol[1]

This protocol is a guide for capturing information about the proposal and award process at a university. The information will be used to develop a flow chart that represents the institutional effort required to obtain extramural funds and, in particular, funds from the National Science Foundation (NSF) and the National Institutes of Health (NIH).

Section I of the protocol is used to obtain information about all proposals submitted during 1994-95. Sources of information include reports of staff members who have managed the functions connected with proposal preparation, as well as records and documentation pertaining to the institution's proposal administration operation.

The information obtained in Section II applies to three specific proposals that were submitted to NSF and one that was submitted to NIH. Information will be obtained from staff who participated in the preparation of each proposal at the college (or research unit) and at the department(s) of the principal investigator(s). Forms and papers used in the process also will be examined.

Section III will be used to obtain information from faculty members or researchers who have served as reviewers for NSF proposals.

[1] The site visit protocol guides the evaluation team in collecting on-site data (archives, documents, interviews, observations). The protocol is not an interview questionnaire (the "respondent" to the protocol is the evaluation team itself).

I. All Proposals
[bracketed items in bold can be retrieved ahead of time by the office of sponsored research]

The purpose of this portion of the case study is to characterize the university's pre-award research administration process for the academic year 1994-1995. The complete process is assumed to involve four levels: the office of sponsored research (university level), a dean's office (school or college level), an academic department within the school or college (department level), and a research investigator or team (investigator level). Although some proposals may skip one or more of these levels, the case study description should cover all four levels as they related during 1994-1995.

The information to be collected will come from all levels. Thus, prior to the site visit, arrangements must be made with the office of sponsored research, which in turn should help to identify at least one school or college (from which NSF and/or NIH proposals were submitted), two or more departments within each school or college, and one or two investigators within each department. On the basis of interview and documentary evidence [staffing charts, archival data, instructions or manual issuances, and copies of forms that might be used in the process], the case study team is asked to complete the following protocol.

* Critical items for telephone data collection.

A. Proposal Preparation and Tracking Process

  1. For both NSF and NIH proposals, describe the flow of new, standard proposals (not continuations), from the decision to develop to a submission to an external funding agency. Trace this flow three times, once for each of three different components of proposals: the technical section, the administrative section, and the cost section (if the flow is the same for two or more of these  components, indicate so). Describe the flow as it works most of the time (for most proposals). Note: the information that we gather for the NSF and NIH proposals is identical.
    1. *The flow is defined as the movement of the proposal from one person to another, keeping track of the offices within which the persons work. Start the flow by assuming that the investigator has decided to submit an application for a particular grant. What is the first task completed? Who completes it? What is next? Where does the product go next? After that, where does it then go? What person and office handle the final step--mailing or submitting the proposal to the external funding agency?
    2. For each point in the flow, establish how much time passes for most proposals. The time should be expressed by a range (the shortest time experienced to the longest time, with an indication of the "average" time in between).
    3. *Within the flow, define the points at which sign-offs (including internal approvals) occur. Ascertain the exact dates.
    4. *On the basis of 1a-1c, prepare a flow diagram, indicating the identities of specific offices and the time spent between and within offices. (FORMAT TO BE PROVIDED LATER)
  2. In a similar manner as 1a-1c, describe the flow of papers or information if the external funding agency raises questions or there is (technical or cost) negotiation after the proposal has been submitted but before an award is made.
  3. *Is the status of pending proposals monitored? If so, what is the mechanism? How much time is spent? What mechanisms are currently in place for monitoring the portfolio of pending NSF and NIH projects? Estimate the amount of time spent keeping track of those projects.

B. Proposal Volume (See tables I.B.1 and I.B.2)

  1. *[Define the number and dollar volume of proposals submitted by the university during academic year 1994-1995. Of this volume, define the proportion (number and dollar amount) that went to federal agencies, as well as the proportion that went to NSF and NIH.]
  2. *[Obtain similar data for the school or college and for the departments that were involved in the site visit.]

C. Burden (See tables I.C.1 and I.C.2)

  1. For the three levels in the flow (university, school or college, and department), [identify the direct operating budget for the academic year 1994-1995 and estimate the proportion of that budget that was allocated to pre-award research administration]. If possible, indicate the portion allocated to non-personnel costs, such as telephone, postage and courier, photocopying, and all other.
  2. *Similarly, for each level, [identify the number of staff (full-time equivalents) for the academic year 1994- 1995 and estimate the proportion of the staff time that was devoted to proposal preparation and tracking]. Staffing charts or other documents may provide information about how these figures were determined.

D. Technology

  1. Describe the electronic systems involved in the proposal preparation and award process. These systems may be both within an office and between offices. For each system, briefly list the computer and peripheral hardware, the relevant software, and the relevant communications hardware and software.
  2. Estimate the proportion of proposals for which these systems were used during the academic year 1994-1995.

E. Perceptions

  1. At each level, obtain perceptions of the proposal and award process. Identify three types of perceptions: a) strengths of the process (ask, "in your estimation, what are the good points in the process?"), b) weaknesses in the process (ask, "in your estimation, what are the weak points in the process?"), and c) areas for improvement (ask, "in your estimation, what can be improved in the process?"). Ask respondents to list strengths, weaknesses, and areas for which improvement is needed, and then to rank their lists in order of importance.

II. Individual Proposals

The purpose of this portion of the case study is to define the specific process used for three individual, new NSF proposals and one NIH proposal. (The proposals will have been identified ahead of time and will represent those for which a decision was made in academic year 1994-1995 even though the proposal may have been submitted earlier.) The process may mimic completely the four levels defined in Section I: the office of sponsored research (university level), a dean's office (school or college level), an academic department within the school or college (department level), and a research investigator or team (investigator level).

The information to be collected will again come from all levels, referencing each of the four proposals singly. Thus, prior to the site visit, arrangements must be made with the office of sponsored research, which in turn should help to identify the key offices and persons (including the research investigators) that were involved in each of the proposals.

For this section, the case study team should repeat Parts A, C2, D, and E from Section I, for each of the proposals. For A1c, ascertain the exact date for each sign-off.

III. Review of NSF Proposals

For two to three of the research investigators who were interviewed for either Section I or Section II, also ascertain whether they conducted reviews of NSF proposals during academic year 1994-1995 (if none did, identify two to three other investigators who did). Where a single investigator reviewed two or more NSF proposals, ask the investigator to define the collective experience across all the NSF proposals reviewed during the year.

A. Review Process and Burden

  1. *Identify the number of proposals that were reviewed.
  2. Define the basic steps in the process, starting with the receipt of the request for a review and concluding with the submission of the review to NSF. At each step, define what technologies were used, if any.
  3. Define the amount of time taken at each step.

B. Reviewers' Perceptions

  1. Ask the reviewer to indicate the strengths and weaknesses of the review process (focusing on the handling of the paperwork and the information flow, not any technical peer review issues).
  2. Ask the reviewer to suggest any improvements in the paperwork and information flow process.

Field Visit Contact Matrix

The matrix below represents the sources of information about the process for all proposals and for the three NSF proposals. Prior to the site visit, the goal is to fill each cell with the name and telephone number of a contact person (the same person may appear more than once). Most desirably, the cell also would include the time and day the person will be contacted.
 
Institutional Level    Type of Proposals to be Discussed
                       All Proposals    NSF (1)   NSF (2)   NSF (3)   NIH
Research Office        -                1         2         3         -
College (1)            -                -         -         -         -
College (2)            -                -         -         -         -
Department (1)         -                -         -         -         -
Department (2)         -                -         -         -         -
Investigator (1)       -                -         -         -         -
Investigator (2)       -                -         -         -         -
Investigator (3)       -                -         -         -         -
* When there is a foundation, make sure that there is no other university-level office that is involved
in the proposal process; otherwise, this office has to be included as a fifth level.
 

Table I.B.1

Number of Proposals
AY 1994-1995

Institutional Level   Total number      Total number (or proportion)   Total number (or       Total number (or
                      of proposals      to federal agencies            proportion) to NSF     proportion) to NIH
University            -                 -                              -                      -
College               -                 -                              -                      -
Department            -                 -                              -                      -
 

Table I.B.2

Proposal Dollar Volume
AY 1994-1995

Institutional Level   Total dollars     Total (or proportion)       Total (or proportion)     Total (or proportion)
                      requested         of federal dollars          of NSF dollars            of NIH dollars
                                        requested                   requested                 requested
University            -                 -                           -                         -
College               -                 -                           -                         -
Department            -                 -                           -                         -
 

Table I.C.1

Proposal Administration Budget Allocation
AY 1994-1995

                      (1)                (2)                       Total Non-Personnel Costs (or Proportion*)
                      Total Budget of    Total (or Proportion)     Associated with the Proposal Process
Institutional Level   Organizational     Attributable to
                      Unit               Proposal Process          Telephone  Postage  Courier  Photocopying  Other
University            -                  -                         -          -        -        -             -
College               -                  -                         -          -        -        -             -
Department            -                  -                         -          -        -        -             -
 

Table I.C.2

Proposal Administration Staff Effort
AY 1994-1995

Institutional Level   Total FTEs of organizational unit   Total FTEs associated with proposal process
University            -                                   -
College               -                                   -
Department            -                                   -
 


Appendix B
Proposal Preparation Flow Diagrams By Grouping

I = SRO Involved Early and PI Submits

II = SRO Involved Early and SRO Submits

III = All Levels Involved Early and SRO Submits

IV = Department Involved Early and SRO Submits

    

Group I:
SRO Involved Early and PI Submits

Flow Diagram for University "D"


Group II:
SRO Involved Early and SRO Submits

Flow Diagram for University "K"

University "K" Annotations to Flowchart
1 Additional PIs will confer with the lead PI throughout the entire proposal preparation process.

Flow Diagram for University "E"

Flow Diagram for University "H"

Flow Diagram for University "F"

Flow Diagram for University "N"


Group III:
All Levels Involved Early and SRO Submits

Flow Diagram for University "B"

Flow Diagram for University "I"

Flow Diagram for University "O"

Flow Diagram for University "Q"

Flow Diagram for University "J"


Group IV:
Department Involved Early and SRO Submits

Flow Diagram for University "M"

Flow Diagram for University "P"

Flow Diagram for University "C"

Flow Diagram for University "A"

Flow Diagram for University "G"

Flow Diagram for University "L"


Appendix C
Briefing Summary














