Award Abstract # 1238120
Technical Evaluation Assistance in Mathematics and Science (TEAMS)

NSF Org: DRL
Division of Research on Learning in Formal and Informal Settings (DRL)
Recipient: RMC RESEARCH CORP
Initial Amendment Date: May 16, 2013
Latest Amendment Date: January 20, 2016
Award Number: 1238120
Award Instrument: Cooperative Agreement
Program Manager: Rebecca A. Kruse
DRL Division of Research on Learning in Formal and Informal Settings (DRL)
EDU Directorate for STEM Education
Start Date: May 15, 2013
End Date: April 30, 2016 (Estimated)
Total Intended Award Amount: $1,199,997.00
Total Awarded Amount to Date: $799,997.00
Funds Obligated to Date: FY 2013 = $399,999.00
FY 2014 = $399,998.00
History of Investigator:
  • John Sutton (Principal Investigator)
    john@result-ed.com
  • David Weaver (Former Co-Principal Investigator)
Recipient Sponsored Research Office: RMC Research Corporation, OR
1000 MARKET ST UNIT 2
PORTSMOUTH
NH  US  03801
(603)422-8888
Sponsor Congressional District: 01
Primary Place of Performance: RMC Research Corporation
633 17th St., Suite 2100
Denver
CO  US  80202-3600
Primary Place of Performance Congressional District: 01
Unique Entity Identifier (UEI): K2GWCKB1JCE4
Parent UEI:
NSF Program(s): MSP-OTHER AWARDS
Primary Program Source: 04001314DB NSF Education & Human Resource
04001415DB NSF Education & Human Resource
04001516DB NSF Education & Human Resource
Program Reference Code(s): 9150
Program Element Code(s): 179300
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.076

ABSTRACT

This Research, Evaluation and Technical Assistance project awarded to RMC Research Corporation supports the needs of the evaluation community in the Math and Science Partnership (MSP) program, improving the quality of the evaluations of the MSP awards and building community capacity in evaluation design, methodology, analysis and reporting. This project conducts on-going needs assessments to ascertain barriers to high quality evaluations, provides increased access to instruments to measure a wide variety of important STEM education outcomes, and facilitates ongoing networking among MSP evaluators and investigators to improve evaluation. Evaluation design and implementation technical assistance includes direct help in determining how projects can best utilize rigorous experimental and strong quasi-experimental designs, how investigators and evaluators can best interact with schools and districts in order to ensure their cooperation, and how the findings of evaluations can best be communicated to a variety of audiences. The technical assistance will address new methodology in evaluation that includes the use of large scale, longitudinal databases of student outcomes, social network designs to examine collaboration and capacity building, and other methodologies coming from NSF programs such as Promoting Research and Innovation in Methodologies in Evaluation (PRIME).

A multi-tiered technical assistance program provides different levels of help to projects. A website, available to the general STEM education public, provides information relevant to a wide range of topics for MSP evaluation. An online Help Desk provides more detailed assistance for NSF MSP projects. An on-going analysis of evaluation plans, reports and event services that are provided through interactions with MSP projects provides examples of ways that other projects have identified and addressed barriers to robust evaluation. A series of webinars that are archived for anytime viewing address common evaluation issues that have arisen in projects.

Evaluation of funded projects is an important issue that goes well beyond the target population of the NSF and Title IIB (of the U.S. Department of Education) MSP funded projects. Increasingly, research and development projects in STEM education are required to show evidence of the impact or effectiveness of their efforts. The tools and designs needed to ensure the quality of the information provided about those impacts are distributed across a variety of reports, journal articles, and publications. This project, integrated with the ongoing MSP Learning Network, expands the support available to STEM educators who engage in professional development and research in STEM education.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The Technical Evaluation Assistance in Mathematics and Science (TEAMS) project, funded (DRL 1238120) through the Mathematics and Science Partnership (MSP) Research, Evaluation, and Technical Assistance (RETA) program, was in operation between May 15, 2013 and April 30, 2016. The long-term goal of the TEAMS project was to strengthen the quality of MSP project evaluation and build the capacity of evaluators by strengthening their skills in evaluation design, methodology, analysis, and reporting. The primary means of building capacity were responding to Help Desk inquiries; providing intensive and focused technical assistance to projects, such as reviewing instruments and evaluation plans; providing "just in time" consultation and targeted technical assistance to individual evaluators and/or projects; presenting webinars on focused evaluation topics identified through a needs assessment and feedback; and producing products, publications, and materials on topics of interest to MSP evaluators based on emerging trends and identified needs.
While in operation over the full duration of the project:

TEAMS staff provided literature and research-based responses to 38 Help Desk queries; prepared and provided services and intensive technical assistance to 31 individuals representing projects; and provided consultation and targeted technical assistance to 82 individuals and/or projects.
TEAMS staff prepared, presented, and facilitated 27 webinars on a variety of topics relevant to MSP project evaluators, Principal Investigators, and others. All webinars were recorded and placed on the teams.mspnet.org website. According to the TEAMS external evaluation report, “the total number of viewings [of TEAMS webinars] exceeds 1800 (1149 live participants plus 660 recorded views as of 3/29/2016). This is a substantial number of viewings for webinars designed to build evaluation capacity! When averaging actual attendance in the webinars and the number of views of webinar recordings through March, 2016, the average number of “participants” is 72 people per webinar—quite a large reach. The webinars reached people in 6 countries, nearly 200 universities or university departments, 9 state offices of education as well as many regional offices of education and school districts/schools, well over 100 organizations specializing in evaluation that were not listed as MSP evaluators in the list provided by TEAMS, 25 federal or state agencies, and numerous non-profit organizations and for-profit businesses.”
TEAMS staff provided or facilitated more than 15 presentations at relevant conferences, including the annual NSF MSP Learning Network meeting, the USED MSP conference, and the American Evaluation Association (AEA) annual conference. According to the TEAMS external evaluation report [regarding presentations in 2015-2016 only], "sessions drew approximately 200 people (not including the poster session, for which no data were collected), engaged enthusiastic participants in meaningful discussions and/or activities, and resulted in changed practices for five of the seven (over 70%) of those who were interviewed—a result that surprised this evaluator who often attends conference sessions, leaves enthused, but returns home to implement few new concepts!" All TEAMS conference presentation materials are archived on the teams.mspnet.org website under the TEAMS Resources or Showcase sections.
TEAMS staff created and posted on the teams.mspnet.org website eight publications over the course of the project that address a variety of topics and have the potential to contribute to evaluation capacity building over time. These publications, able to be downloaded as pdf files, a...