Award Abstract # 2017042
An Embodied, Augmented Reality Coding Platform for Pair Programming

NSF Org: IIS
Division of Information & Intelligent Systems
Recipient: UNIVERSITY OF CALIFORNIA, SAN DIEGO
Initial Amendment Date: September 1, 2020
Latest Amendment Date: June 12, 2023
Award Number: 2017042
Award Instrument: Standard Grant
Program Manager: Paul Tymann
IIS: Division of Information & Intelligent Systems
CSE: Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2020
End Date: September 30, 2024 (Estimated)
Total Intended Award Amount: $749,979.00
Total Awarded Amount to Date: $774,414.00
Funds Obligated to Date: FY 2020 = $749,979.00
FY 2022 = $8,264.00
FY 2023 = $16,171.00
History of Investigator:
  • Ying Choon Wu (Principal Investigator)
    yingchoon@gmail.com
  • Robert Twomey (Co-Principal Investigator)
Recipient Sponsored Research Office: University of California-San Diego
9500 GILMAN DR
LA JOLLA
CA  US  92093-0021
(858)534-4896
Sponsor Congressional District: 50
Primary Place of Performance: University of California-San Diego
La Jolla
CA  US  92093-0934
Primary Place of Performance Congressional District: 50
Unique Entity Identifier (UEI): UYTTZT6G9DT1
Parent UEI:
NSF Program(s): IUSE,
ITEST-Inov Tech Exp Stu & Teac,
Cyberlearn & Future Learn Tech
Primary Program Source: 01002223DB NSF RESEARCH & RELATED ACTIVIT
04002324DB NSF STEM Education
04002021DB NSF Education & Human Resource
1300XXXXDB H-1B FUND, EDU, NSF
Program Reference Code(s): 093Z, 8045, 9251
Program Element Code(s): 199800, 722700, 802000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070, 47.076

ABSTRACT

Augmented reality (AR) allows the real world to be enhanced, or augmented, by computer-generated objects that are "added" to the real world. For example, a clothing store may use AR to let a customer "see" how clothes would look on them before they are purchased. This project at the University of California San Diego will use AR to create an environment in which students can practice pair programming. Pair programming is a software development technique in which two programmers work together at one workstation on the same piece of code. Wearing AR headsets, students will be able to manipulate a piece of code in three-dimensional space using familiar gestures such as pointing and grabbing. An AR programming environment that immerses students in their code will result in a more intuitive and collaborative computing experience, one that engages learners with low confidence in programming while supporting growth in their computational thinking abilities.

A human-centered AR coding platform will be developed for the creation of three-dimensional assets, artwork, and computational logic. This platform will be a merged digital/physical workspace where spatial representations of code, interactive outputs, and user editing activities are simultaneously located. While wearing AR headsets, learners will manipulate virtual code blocks in real space to assemble programs, and they will debug their code by evaluating the representations that they create. The approach will build on the accessibility and sense of play of successful visual learning technologies (e.g., Scratch), but will leverage regular patterns of perception, action, and social interaction in the three-dimensional physical world. The goal is to increase participation and interest among groups traditionally underrepresented in the educational and career pathways of computer science, including females and some minority students, who often exhibit lower confidence in STEM-related abilities relative to other students.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH



Lay, Ryan and Bhutada, Rhea and Lobo, Alejandro and Twomey, Robert and Eguchi, Amy and Wu, Ying Choon "Embodied Code: Creative Coding in Virtual Reality" , 2024 https://doi.org/10.1145/3626253.3635428 Citation Details
Lobo, Alejandro and Eguchi, Amy and Twomey, Robert and Wu, Ying Choon "Balanced Creative Coding for Motivation and Learning Transfer" , 2025 https://doi.org/10.1145/3641555.3705134 Citation Details
Sharkey, T. "Need Finding for an Embodied Coding Platform: Educators' Practices and Perspectives" CSEDU , v.1 , 2022 https://doi.org/10.5220/0011000200003182 Citation Details
Twomey, Robert and Sharkey, Tommy and Wood, Timothy and Eguchi, Amy and Sweet, Monica and Wu, Ying "An Immersive Environment for Embodied Code" CHI 22 Extended Abstracts , 2022 https://doi.org/10.1145/3491101.3519896 Citation Details
Wu, Ying_Choon and Eguchi, Amy and Twomey, Robert and Otsuki, Mayumi "Making Computer Science Concepts Visible and Virtually Tangible through Creative Coding in Virtual Reality" , 2025 Citation Details

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Embodied Code is an immersive creative coding toolkit implemented in virtual reality (VR). It introduces novices to fundamental computing concepts and game engine skills through a visual coding interface in 3D space (Figure 1). Unlike traditional creative coding toolkits, this system gives coders considerable flexibility in placing, rearranging, and manipulating elements of code (nodes and connectors) and their output, so that space and movement can be leveraged as organizational and conceptual scaffolds. Further, the assembly of nodes and connectors is structured by two primary principles: input versus output, and events versus data. These design principles were adopted to support the emergence of analogy-based understandings that connect the experience of working with code elements in 3D space to physical experiences of engaging with the real world. These principles also supported exploration of different levels of coding abstraction in classroom use (Figure 2).
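The node-and-connector model described above can be sketched in ordinary code. The following is a minimal, hypothetical illustration (not the project's actual implementation) of the two stated design principles: nodes expose input and output ports, and connectors carry either events (triggers) or data (values). All class and port names here are invented for illustration.

```python
class Node:
    """A visual code block with input and output ports."""
    def __init__(self, name):
        self.name = name
        self.inputs = {}    # input port name -> latest received value
        self.outputs = []   # (output port, downstream node, downstream port)

    def connect(self, out_port, other, in_port):
        # A connector links one node's output port to another node's input port.
        self.outputs.append((out_port, other, in_port))

    def emit(self, out_port, value):
        # Propagate a value along every connector attached to this output.
        for port, other, in_port in self.outputs:
            if port == out_port:
                other.receive(in_port, value)

    def receive(self, in_port, value):
        self.inputs[in_port] = value
        self.on_input(in_port, value)

    def on_input(self, in_port, value):
        pass  # subclasses react to incoming events or data


class SpawnCube(Node):
    """Hypothetical node: a 'spawn' event triggers creation of an object id."""
    def __init__(self):
        super().__init__("SpawnCube")
        self.count = 0

    def on_input(self, in_port, value):
        if in_port == "spawn":                          # event input
            self.count += 1
            self.emit("object", f"cube-{self.count}")   # data output


class SetColor(Node):
    """Hypothetical node: pairs an incoming object id with a stored color."""
    def __init__(self):
        super().__init__("SetColor")
        self.applied = []

    def on_input(self, in_port, value):
        if in_port == "object":
            self.applied.append((value, self.inputs.get("color", "white")))


spawner = SpawnCube()
colorer = SetColor()
spawner.connect("object", colorer, "object")

colorer.receive("color", "red")   # data flows into an input port
spawner.receive("spawn", True)    # an event triggers the spawner
print(colorer.applied)            # [('cube-1', 'red')]
```

In a VR interface like the one described, each `Node` would correspond to a graspable block in 3D space and each entry in `outputs` to a visible connector, so rearranging the graph spatially leaves the event/data semantics unchanged.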

Findings of our studies indicate the potential benefits of Embodied Code. On average, after a two-day workshop using Embodied Code, high school seniors' performance on a computational concepts test improved by 11% relative to pre-test, versus a 7% improvement by a control group of peers who engaged in different coding activities. However, this outcome only approached significance due to the small sample size. In a second study involving high school freshmen, we found modest gains from pre-test to post-test on a computational concepts test among individuals with positive CS attitudes. However, analysis of students' final projects revealed that most had mastered fundamental skills such as spawning objects; changing their color, texture, and size; and adding game engine physics. We are currently conducting more targeted studies aimed at understanding how learners' grasp of coding concepts changes over the course of engagement with the platform.

Last Modified: 02/27/2025
Modified by: Ying Choon J Wu


