Award Abstract # 1621210
MOSART HSPS: Misconceptions Oriented Standards-Based Assessment Resource for Teachers of High School Physical Sciences

NSF Org: DRL
Division of Research on Learning in Formal and Informal Settings (DRL)
Recipient: PRESIDENT AND FELLOWS OF HARVARD COLLEGE
Initial Amendment Date: July 18, 2016
Latest Amendment Date: September 6, 2018
Award Number: 1621210
Award Instrument: Continuing Grant
Program Manager: Robert Ochsendorf
rochsend@nsf.gov
 (703)292-2760
DRL
 Division of Research on Learning in Formal and Informal Settings (DRL)
EDU
 Directorate for STEM Education
Start Date: September 1, 2016
End Date: August 31, 2022 (Estimated)
Total Intended Award Amount: $2,648,234.00
Total Awarded Amount to Date: $2,648,234.00
Funds Obligated to Date: FY 2016 = $1,525,482.00
FY 2018 = $1,122,752.00
History of Investigator:
  • Philip Sadler (Principal Investigator)
    psadler@cfa.harvard.edu
Recipient Sponsored Research Office: Harvard University
1033 MASSACHUSETTS AVE STE 3
CAMBRIDGE
MA  US  02138-5366
(617)495-5501
Sponsor Congressional District: 05
Primary Place of Performance: Harvard University
60 Garden Street
Cambridge
MA  US  02138-1516
Primary Place of Performance Congressional District: 05
Unique Entity Identifier (UEI): LN53LCFJFL45
Parent UEI:
NSF Program(s): Discovery Research K-12
Primary Program Source: 04001617DB NSF Education & Human Resources
04001819DB NSF Education & Human Resources
Program Reference Code(s):
Program Element Code(s): 764500
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.076

ABSTRACT

The Discovery Research K-12 program (DRK-12) seeks to significantly enhance the learning and teaching of science, technology, engineering and mathematics (STEM) by preK-12 students and teachers, through research and development of innovative resources, models and tools (RMTs). Projects in the DRK-12 program build on fundamental research in STEM education and prior research and development efforts that provide theoretical and empirical justification for proposed projects.

Researchers at the Harvard-Smithsonian Center for Astrophysics at Harvard University are developing and validating assessment instruments intended to measure chemistry and physics concepts for students and teachers in grades 9 through 12. This project builds upon the widely used K-12 Misconceptions Oriented Standards-Based Assessment Resource for Teachers (MOSART) developed by this research team. The project is developing 500 new test items intended to assess disciplinary core ideas in chemistry and physics aligned to the Next Generation Science Standards. The new measures will be used to gauge the knowledge acquired over a year of study by 10,000 students and 200 teachers of chemistry and physics. The new assessment items and instruments will be made available to other researchers and practitioners through the project website and the online MOSART assessment system.

The assessment development process is based on prior research conducted to develop similar MOSART items and instruments, which included the design efforts of assessment specialists, content experts, and research scientists. Pilot items are tested with a national sample of approximately 20,000 high school students and their teachers. Data will be analyzed using item response theory to model student responses. Outcomes consist of item parameters, test and sub-test characteristics, and predictive linkages among items. Descriptive statistics are generated to establish the state of student knowledge, pre- and post-test performance by item and by standard, and teacher knowledge. Descriptive analyses are followed by hierarchical linear modeling (HLM) to examine the relationships between teacher-level and program-level variables.
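The item-calibration step described above can be illustrated with a minimal sketch of a two-parameter logistic (2PL) item response theory model. The award does not specify which IRT variant the team used, so the function and parameter values below are illustrative assumptions only:

```python
import numpy as np

def irt_2pl(theta, a, b):
    """2PL IRT model: probability that a student of ability theta
    answers an item with discrimination a and difficulty b correctly."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# A highly discriminating item (a=2) separates students near its
# difficulty (b=0) more sharply than a weakly discriminating one (a=0.5).
abilities = np.array([-1.0, 0.0, 1.0])
p_strong = irt_2pl(abilities, a=2.0, b=0.0)
p_weak = irt_2pl(abilities, a=0.5, b=0.0)
```

In a 2PL fit, each item's discrimination (a) and difficulty (b) are the "item parameters" estimated from response data; instruments are then assembled from items whose parameters jointly cover the intended difficulty range.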

The MOSART instruments have been widely used and are based on a model of cognition with a strong research base in misconceptions in science education. These additional Grade 9-12 chemistry and physics instruments will address gaps in the current MOSART system of assessments. The new instruments focused on chemistry and physics disciplinary core ideas provide a much needed set of assessments for researchers and practitioners, particularly teacher professional development providers.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Chen, C., Sonnert, G., Sadler, P., & Sunbury, S. (2020). "Simulation and plotting are fully mediated by prediction making to promote conceptual gain in high school chemistry." International Conference on Science and Technology Education (STE 2020).

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The primary goal of MOSART-HSPS was to develop rigorous assessment tools that generate evidence-based measures of teacher and student understanding of high school-level physical science concepts in chemistry and physics. The project achieved this goal by first developing close to 1,000 distractor-driven multiple-choice items based on the grades 9-12 Next Generation Science Standards (NGSS) for Physical Science (divided into the domains of chemistry and physics), drawing on the peer-reviewed research on student misconceptions. Items passed through several development stages, including review by scientists, pilot-testing with crowd-sourced subjects, and national field-testing with high school students. Multiple 30-item assessment instruments were constructed to span a range of difficulties while maintaining high discrimination, low gender bias, and high misconception strength, and to cover all NGSS disciplinary core ideas. Chemistry and physics assessment instruments are available on the newly revised MOSART Self-Service website at no cost to teachers, professional development providers, researchers, and other science education stakeholders (https://waps.cfa.harvard.edu/mosart/).

Secure versions were developed for use by researchers and employed by our team to study the effectiveness of teaching science concepts with a national sample of 6,000 chemistry and physics students. The instruments were administered as pre/post-tests at the beginning and end of the school year. Teachers answered the same content questions as their students, to determine their subject matter knowledge (SMK), and also predicted the most common wrong answer among their students for each item, to identify their knowledge of student misconceptions (KOSM). Additionally, students identified the classroom pedagogies their teachers chose to employ.

The relationship between teachers’ subject matter knowledge and student gains
Student learning has long been thought to depend on the extent of teacher knowledge. We found this to be the case: students whose teachers had high SMK achieved more. By building a model that predicted students’ likelihood of answering a posttest item correctly as a function of the misconception strength of that item, teachers’ knowledge (SMK & KOSM, SMK-Only, No-SMK), and students’ pretest correctness on the item, we found student learning to be highly dependent on teachers’ knowledge of both subject matter and student misconceptions. This was particularly evident for items with “strong misconceptions,” those that are conceptually most difficult for students. The general pattern is that students’ correct rate is low when teachers’ SMK is low and declines drastically for items of mid-range misconception strength. As misconception strength increases, students whose teachers lack the SMK for a specific item find the item more difficult on the posttest (controlling for the pretest); students whose teachers have SMK but not KOSM are minimally affected by the item’s misconception strength; and, most interestingly, students whose teachers have both SMK and KOSM find the item easier. When misconception strength is low, teachers in the SMK & KOSM and SMK-Only groups yielded similar student correctness on the posttest.
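The interaction pattern described above can be sketched as a toy logistic model. The coefficients below are invented for illustration, not the project's estimates; they merely reproduce the reported qualitative pattern (a negative slope on misconception strength when the teacher lacks SMK, a roughly flat slope for SMK-Only, and a positive slope for SMK & KOSM):

```python
import math

# Hypothetical coefficients chosen only to illustrate the reported
# interaction between teacher-knowledge group and misconception strength.
SLOPES = {"SMK_KOSM": +0.4, "SMK_only": 0.0, "no_SMK": -0.8}
INTERCEPT = 0.2
PRETEST_COEF = 1.5

def p_correct(group, strength, pretest_correct):
    """Modeled probability that a student answers a posttest item
    correctly, given the teacher-knowledge group, the item's
    misconception strength (0-1), and pretest correctness (0 or 1)."""
    z = INTERCEPT + SLOPES[group] * strength + PRETEST_COEF * pretest_correct
    return 1.0 / (1.0 + math.exp(-z))
```

For a strong-misconception item (strength near 1), this toy model gives the lowest posttest probability under a No-SMK teacher and the highest under an SMK & KOSM teacher, while at low strength the groups converge, matching the pattern described in the text.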

Pedagogical practices associated with student gains
We also examined the association between a variety of classroom pedagogies and student gains in both chemistry and physics. An example of such an activity is “asking students to make predictions.” Comparing the gains of students whose teachers asked them to make predictions with those of students whose teachers did not indicates that, while both groups’ pretest scores were almost identical, the group that made predictions had higher posttest scores.
Following this example, we estimated the effect of each of 31 pedagogies on students’ gains by running a series of regression analyses that predicted students’ posttest scores as a function of each pedagogy, while controlling for background characteristics such as gender and race/ethnicity. There are both similarities and differences in pedagogical effects between high school chemistry and physics. An interesting similarity is that field trips and guest speakers had negative effects, while giving lectures, making connections to life or other disciplines, having students ask or answer questions, and making predictions had positive effects in both disciplines. The most noteworthy difference was that hands-on activities, collecting data, doing homework, and memorizing facts had a positive effect on students’ gains in high school chemistry, but not in physics.
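A regression of the kind described above can be sketched on simulated data. The sample size, effect sizes, and control variables below are invented for illustration; the point is the structure of the analysis (posttest score regressed on a pedagogy indicator plus pretest and demographic controls), not the project's actual estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Simulated student-level data (all values invented for illustration).
pretest = rng.normal(50, 10, n)
made_predictions = rng.integers(0, 2, n)   # 1 = teacher asked for predictions
female = rng.integers(0, 2, n)             # stand-in demographic control
true_effect = 3.0                          # simulated pedagogy effect, in points
posttest = (5 + 0.9 * pretest + true_effect * made_predictions
            + 0.5 * female + rng.normal(0, 5, n))

# One OLS regression per pedagogy: posttest ~ pedagogy + pretest + controls.
X = np.column_stack([np.ones(n), made_predictions, pretest, female])
coef, *_ = np.linalg.lstsq(X, posttest, rcond=None)
pedagogy_effect = coef[1]  # estimated effect of "asking for predictions"
```

Repeating this fit once per pedagogy indicator, as the text describes for all 31 pedagogies, yields a per-pedagogy effect estimate net of pretest and demographic differences.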

 


Last Modified: 12/30/2022
Modified by: Philip M Sadler
