Award Abstract # 1742011
Formative Assessments for Computer Science in NYC

NSF Org: DRL
Division of Research on Learning in Formal and Informal Settings (DRL)
Recipient: TEACHERS COLLEGE COLUMBIA UNIVERSITY
Initial Amendment Date: August 18, 2017
Latest Amendment Date: June 22, 2018
Award Number: 1742011
Award Instrument: Standard Grant
Program Manager: Amy Baylor
abaylor@nsf.gov
(703)292-5126
DRL, Division of Research on Learning in Formal and Informal Settings (DRL)
EDU, Directorate for STEM Education
Start Date: September 1, 2017
End Date: August 31, 2022 (Estimated)
Total Intended Award Amount: $1,401,950.00
Total Awarded Amount to Date: $1,401,950.00
Funds Obligated to Date: FY 2017 = $1,401,950.00
History of Investigator:
  • Nathan Holbert (Principal Investigator)
    holbert@tc.columbia.edu
  • Matthew Berland (Co-Principal Investigator)
  • Elizabeth DiSalvo (Co-Principal Investigator)
  • Leigh DeLyser (Former Co-Principal Investigator)
Recipient Sponsored Research Office: Teachers College, Columbia University
525 W 120TH ST
NEW YORK
NY  US  10027-6605
(212)678-3000
Sponsor Congressional District: 13
Primary Place of Performance: Teachers College, Columbia University
525 West 120th Street
New York
NY  US  10027-6696
Primary Place of Performance Congressional District: 13
Unique Entity Identifier (UEI): DBM1C8MDJ5L3
Parent UEI:
NSF Program(s): STEM + Computing (STEM+C) Part
Primary Program Source: 04001718DB NSF Education & Human Resource
Program Reference Code(s): 023Z
Program Element Code(s): 005Y00
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.076

ABSTRACT

The project is funded by the STEM+Computing program, which seeks to address emerging challenges in computational science, technology, engineering, and mathematics (STEM) areas through the applied integration of computational thinking and computing activities within disciplinary STEM teaching and learning from early childhood education through high school (preK-12). Integrating computer science into United States schools is one of the most significant changes for K-12 public education in decades. The New York City (NYC) CS4All initiative is a leader in this educational reform and serves as a model for other cities and states. One aspect of the NYC effort is to integrate computer science concepts across disciplines. A challenge with this approach is helping non-computer-science teachers, and their students, assess whether they have met computer science learning goals. This project seeks to create a method to measure this learning through an interactive, technology-enhanced assessment system that provides formative feedback to students and teachers. The project will address core research questions about how to provide ongoing assessment of student computer science (CS) learning that: (a) provides useful feedback to teachers and students; (b) is appealing for students to engage with; and (c) can be used with a wide diversity of curricula that integrate CS and STEM domains. The tool will be open-sourced and made available to approximately 600 students during the project, with an eye toward serving over 100,000 diverse and low-income NYC middle school students, and eventually other districts across the U.S. In addition, the project will provide insight into research and design in areas as diverse as assessment, CS education curricula, and educational game design.

This project addresses the nationwide need for a validated instrument that provides formative feedback on student progress with diverse computer science curricula. It is a three-year project to design, develop, and conduct research on a validated system of formative assessment for middle school computational thinking that would meet NYC's needs and create opportunities for broader impacts nationwide. Building on research-based approaches, including stealth assessment and evidence-centered design (ECD) of assessments, the project incorporates a range of performance tasks on the topic of Data and Analysis as outlined in the K-12 CS Framework. Students will engage in a construction environment, and formative feedback on student knowledge and performance will be generated from their interactions. Using participatory design methods, the project will engage stakeholders in designing tasks and reports that best meet teachers' and students' needs. The system will be piloted and then implemented in schools using varied CS curricula throughout NYC. The research will provide empirical data on how integrating formative assessment with the adoption of varied CS curricula can enable districts to meet the needs of a diverse student population. Furthermore, adopting an iterative design process steeped in participatory methods will allow for the generation of theory-driven design principles around how students and teachers from diverse schools and neighborhoods encounter core CS constructs from a variety of CS curricula.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Zheng, Yipu, Paulo Blikstein, and Nathan Holbert. "Combining Learning Analytics and Qualitative Analysis for the Exploration of Open-Ended Learning Environments." Proceedings of the 2020 Constructionism Conference, 2020.
Basu, Satabdi, Betsy DiSalvo, Daisy Rutstein, Yuning Xu, Jeremy Roschelle, and Nathan Holbert. "The Role of Evidence Centered Design and Participatory Design in a Playful Assessment for Computational Thinking About Data." SIGCSE '20: Proceedings of the 51st ACM Technical Symposium on Computer Science Education, 2020. doi:10.1145/3328778.3366881

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The Formative Assessment for Computer Science (PFACS) project is designed to support middle school teachers' formative assessment practices around students' ability to understand and analyze data as it relates to the domain of computer science. To address this goal, the project used a combination of evidence-centered and participatory design approaches to develop a formative assessment game, Beats Empire. In the game, students take on the role of running a music studio and make decisions about which artists to hire and what types of songs to record, using data about listeners' interest in various music genres, topics, and song moods. The game focuses on CS concepts and practices related to data collection, storage, visualization, and interpretation. In addition, the PFACS project developed a dashboard that provides teachers with actionable information about how students are playing the game, as well as a set of unplugged formative activities and teacher guides for those activities.

A key design outcome of Beats Empire was the development of a game system that allowed students to make personally meaningful choices throughout the game, choices that could also be assessed for their connection to target concepts. Rather than standardize the game experience, or require all players to "correctly" answer questions or perform tasks, Beats Empire invites players to make predictions about music trends using in-game data. Players then indicate which data they used to make these predictions, allowing the teacher and the game system to evaluate whether those decisions were data driven. This system ensures that assessment moments make sense in the game's narrative and that these in-game actions are useful and meaningful to students' play.
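To make the logging-and-evaluation idea concrete, the hypothetical Python sketch below shows one minimal way such prediction events could be recorded and summarized for a teacher-facing view. The names (PredictionEvent, cited_sources, summarize) and the scoring rule are illustrative assumptions, not the project's actual telemetry schema or assessment code.

# Minimal sketch only: all field names and thresholds are hypothetical.
from dataclasses import dataclass, field


@dataclass
class PredictionEvent:
    """One in-game prediction plus the data sources the player says they consulted."""
    player_id: str
    predicted_trend: str                               # e.g. "pop interest rising"
    cited_sources: list = field(default_factory=list)  # e.g. ["line_graph:pop"]
    outcome_correct: bool = False                      # filled in once the trend resolves


def is_data_driven(event: PredictionEvent) -> bool:
    # Simple rule: the prediction counts as data driven if at least one source was cited.
    return len(event.cited_sources) > 0


def summarize(events: list) -> dict:
    # Aggregate one player's events into the kind of signal a dashboard might surface.
    total = len(events)
    data_driven = sum(is_data_driven(e) for e in events)
    return {
        "predictions": total,
        "data_driven": data_driven,
        "data_driven_rate": data_driven / total if total else 0.0,
    }


if __name__ == "__main__":
    events = [
        PredictionEvent("s01", "pop rising", ["line_graph:pop"], True),
        PredictionEvent("s01", "rap falling", [], False),
    ]
    print(summarize(events))  # {'predictions': 2, 'data_driven': 1, 'data_driven_rate': 0.5}

Under this toy rule a prediction is data driven whenever any in-game source was cited; a real assessment model could instead weigh which sources were consulted and whether the inference matched the observed trend.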

To obtain feedback on the game, the team conducted cognitive interviews with students during game play, interviews with teachers after they had used the game in their classrooms, and several small pilot studies. The formative feedback sessions with students confirmed many of our foundational assumptions about middle school students' interest in music and games. We found students' game literacy was very high, with little direction or help needed to understand how to play this game about data and the music industry. We also found that students had a strong interest in music and the music industry, and that music was a core part of their social experience. This finding highlighted the importance of developing games that align with the experiences of young people and made explicit the need to ensure the game's representation of the music industry was authentic.

Results from the pilot studies showed overall high engagement of students during gameplay. Students overwhelmingly enjoyed the game and were enthusiastic about engaging meaningfully with its mechanics. By collecting clickstream data on how players progressed through the game, and on how they used data to make predictions, we were also able to differentiate among students who used data differently in the game. This log data revealed that students ranged from those who rarely used data to those who consistently drew meaningful inferences from it. The game was also able to detect students who changed their use of data over time. We found that students were less likely to engage with aspects of collection and storage in the game, but this may be because students were able to progress without that level of engagement. These findings indicate that play in Beats Empire can help distinguish between students with varying capabilities in using and drawing inferences from data, and that this information can be communicated clearly to teachers through the included dashboard.
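As an illustration of that kind of clickstream analysis, the hypothetical Python sketch below bins players by how often their logged predictions referenced data and whether that rate shifted between the first and second halves of their play. The log fields, labels, and thresholds are assumptions made for the example, not the project's actual analysis pipeline.

# Minimal sketch only: log schema and cutoffs are hypothetical.
from collections import defaultdict


def data_use_rate(events):
    # Fraction of prediction events that cited at least one in-game data source.
    if not events:
        return 0.0
    return sum(1 for e in events if e.get("cited_sources")) / len(events)


def classify_players(log):
    # Group raw clickstream events by player, in time order.
    by_player = defaultdict(list)
    for event in sorted(log, key=lambda e: e["timestamp"]):
        by_player[event["player_id"]].append(event)

    profiles = {}
    for player, events in by_player.items():
        # Compare the first and second halves of play to see if data use changed.
        mid = len(events) // 2
        early, late = data_use_rate(events[:mid]), data_use_rate(events[mid:])
        overall = data_use_rate(events)
        if overall < 0.25:
            label = "rarely used data"
        elif overall > 0.75:
            label = "consistently data driven"
        else:
            label = "mixed data use"
        profiles[player] = {
            "overall_rate": round(overall, 2),
            "change": round(late - early, 2),  # positive = more data use later in play
            "label": label,
        }
    return profiles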

When examining the relationship between the developed "unplugged" activities and game play, we found that the use of the classroom activities had some effect on how students played the game. For instance, after students engaged with an activity in class that had them compare information from line graphs and bar graphs, we found that students were more likely to use the line graph in the game. We saw similar behavior when students engaged with activities around data storage and collection.

Results from the teacher interviews indicated that teachers found the game to be a useful tool. Teachers felt that Beats Empire is a good way to engage students with the concepts of data and analysis. Teachers also indicated that the unplugged activities have value; they saw the connections between the activities and the game and how they could use both in future instruction.

Overall, we believe this project demonstrated the potential of formative assessment games to measure students' engagement with data science practices. We found that designing a formative assessment game required balancing ways to keep students engaged with the game against collecting information that would allow teachers to make inferences about students' evolving knowledge and skills.

 


Last Modified: 11/07/2022
Modified by: Nathan Holbert
