Award Abstract # 1551590
INT: Collaborative Research: Detecting, Predicting and Remediating Student Affect and Grit Using Computer Vision

NSF Org: IIS
Division of Information & Intelligent Systems
Recipient: TRUSTEES OF CLARK UNIVERSITY
Initial Amendment Date: August 26, 2016
Latest Amendment Date: May 16, 2019
Award Number: 1551590
Award Instrument: Standard Grant
Program Manager: Amy Baylor
abaylor@nsf.gov
 (703)292-5126
IIS
 Division of Information & Intelligent Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: September 1, 2016
End Date: August 31, 2022 (Estimated)
Total Intended Award Amount: $134,960.00
Total Awarded Amount to Date: $166,720.00
Funds Obligated to Date: FY 2016 = $134,960.00
FY 2019 = $31,760.00
History of Investigator:
  • John Magee (Principal Investigator)
    jmagee@clarku.edu
Recipient Sponsored Research Office: Clark University
950 MAIN ST
WORCESTER
MA  US  01610-1400
(508)421-3835
Sponsor Congressional District: 02
Primary Place of Performance: CLARK UNIVERSITY
950 MAIN ST
Worcester
MA  US  01610-1400
Primary Place of Performance Congressional District: 02
Unique Entity Identifier (UEI): LD3WUVEUK2N5
Parent UEI:
NSF Program(s): Cyberlearn & Future Learn Tech
Primary Program Source: 01001617DB NSF RESEARCH & RELATED ACTIVITIES
01001920DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 063Z, 8045, 8233, 9251
Program Element Code(s): 802000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

The Cyberlearning and Future Learning Technologies Program funds efforts that support envisioning the future of learning technologies and advance what we know about how people learn in technology-rich environments. Integration (INT) projects refine and study emerging genres of learning technologies that have already undergone several years of iterative refinement in the context of rigorous research on how people learn with such technologies; INT projects contribute to our understanding of how the prototype tools might generalize to a larger category of learning technologies. This INT project integrates prior work from two well-developed NSF-sponsored projects on (i) advanced computer vision and (ii) affect detection in intelligent tutoring systems. The latter work in particular developed instruments to detect student emotion (interest, confusion, frustration and boredom) and showed that when a computer tutor responded to negative student affect, learning performance improved. The current project will expand this focus beyond emotion to attempt to also detect persistence, self-efficacy, and the trait called 'grit.' The project will measure the impact of these constructs on student learning and explore whether the grit trait (a persistent tendency towards sustained initiative and interest) can be improved and whether and how it depends on other recently experienced emotions. The technological innovation enabling this research into the genre of broadly affectively aware instruction is Smartutors, a tool that uses advanced computer vision techniques to view a student's gaze, hand gestures, head, and face to increase the "bandwidth" for automatically detecting their affect. One goal is to reorient students to more productive attitudes once waning attention is recognized.

This research team brings together a unique blend of leading interdisciplinary researchers in computer vision; adaptive education technology and computer science; mathematics education; learning companions; and meta-cognition, emotion, self-efficacy and motivation. Nine experiments will provide valuable data to extend and validate existing models of grit and emotion. In particular, the team will gather fine-grained data on grit, assess the impact of tutor interventions in real-time, and contribute thereby to a theory of grit. Visual data of student behavior will be integrated with advanced analytics of log data of students' actions based on the behavior of over 10,000 prior students (e.g., hint requests, topic mastery) to provide individualized guidance and tutor responses in a timely fashion. This will allow the researchers to measure the impact of interventions on student performance and attitude, and it will uncover how grit levels relate to emotion and what impact emotions and grit combined have on overall student initiative. By identifying interventions that are sensitive to individual differences, this research will refine theories of motivation and emotion and will reveal principles about how to respond to student grit and affect, especially when attention and persistence begin to wane. To ensure classroom success, the PIs will evaluate Smartutors with 1,600 students and explore its transferability by testing it in a more difficult mathematics domain with older students.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

(Showing: 1 - 10 of 16)
Ajjen Joshi, Camille Monnier, Margrit Betke, Stan Sclaroff "Comparing random forest approaches to segmenting and classifying gestures" Image and Vision Computing , v.58 , 2017 10.1016
Ajjen Joshi, Danielle Allessio, John J. Magee, Jacob Whitehill, Ivon Arroyo, Beverly Park Woolf, Stan Sclaroff, Margrit Betke "Affect-driven Learning Outcomes Prediction in Intelligent Tutoring Systems" 2019 14th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2019) , 2019 , p.1 https://doi.org/10.1109/FG.2019.8756624
Arroyo, I., Wixon, N., Allessio, D., Woolf, B., Muldner, K., Burleson, W. "Collaboration Improves Student Interest in Online Tutoring" Eighteenth International Conference on Artificial Intelligence in Education , 2017
Breanna Desrochers, Ella Tuson, John Magee "Evaluation of Why Individuals with ADHD Struggle to Find Effective Digital Time Management Tools" Proceedings of the 21st International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '19), , 2019
Breanna Desrochers, Ella Tuson, Syed Asad Rizvi, and John Magee "Breaking Down the "Wall of Text" - Software tool to address complex assignments for students with attention disorders" In: Antona M., Stephanidis C. (eds) Universal Access in Human-Computer Interaction. UAHCI 2019. Lecture Notes in Computer Science , v.11573 , 2019 , p.77 https://doi.org/10.1007/978-3-030-23563-5_7
Hao Yu, Ankit Gupta, Will Lee, Ivon Arroyo, Margrit Betke, Danielle Allessio, Tom Murray, John Magee, Beverly Woolf "Measuring and Integrating Facial Expressions and Head Pose as Indicators of Engagement and Affect in Tutoring Systems" Adaptive Instructional Systems. Adaptation Strategies and Methods. HCII 2021. Lecture Notes in Computer Science , v.12793 , 2021 , p.219 https://doi.org/10.1007/978-3-030-77873-6_16
Karumbaiah, S., Lizarralde, R., Allessio, D., Woolf, B., Arroyo, I. "Addressing Student Behavior and Affect with Empathy and Growth Mindset" 10th International Conference on Educational Data Mining , 2017
Kevin Delgado, Juan Manuel Origgi, Tania Hasanpoor, Hao Yu, Danielle Allessio, Ivon Arroyo, William Lee, Margrit Betke, Beverly Woolf, Sarah Adel Bargal "Student Engagement Dataset" ICCV 2021: 2nd Workshop and Competition on Affective Behavior Analysis in-the-wild (ABAW) , 2021
Linh Pham, Skye Whitlow, Emily Rosenbaum and John Magee "Input Accessibility: Effect of Input Device on Interaction Time and Accuracy - An Expanded Analysis of a Large Dataset" HCI International 2022 Posters. HCII 2022. Communications in Computer and Information Science, vol 1580. , v.1580 , 2022 , p.576 https://doi.org/10.1007/978-3-031-06417-3_77
Lyle Pierson Stachecki and John Magee "Predictive Link Following Plug-In For Web Browsers" Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS '17) , 2017

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

We showed that detection of affect and "state of grit" in intelligent tutors is a promising avenue for improving online learning and education. The state of grit entails working to overcome challenges and maintaining effort over time despite failures. We used computer vision techniques to analyze students' faces and gestures and to develop methods to respond to their perceived engagement. The facial expression detection mechanisms worked in real time, so the online system both recognized grit and responded to it.

Predicting when students are having trouble with problems enables online systems to provide interventions, such as hints or encouragement. We detected students' grit state in real time; investigated which interventions were effective when students' initiative began to fade; and contributed to theories of affect and motivation. These studies were conducted in the context of MathSpring.org, an intelligent tutor that supports online practice with Common Core mathematics problems.

To measure grit, we first recorded videos of students engaging with the tutor online. While working online, students demonstrated various levels of engagement and emotion (e.g., confusion, boredom, excitement). Making such information automatically accessible to teachers helps them understand students' progress and suggests when, and for whom, further assistance is needed.

We classified facial expressivity to predict student behavior. Specifically, we developed automated facial expression detectors that differentiated between grit and its absence relatively early (e.g., a few seconds into the learning session). Accuracy in detecting grit grew with the number of problems attempted by each student. The most accurate detectors of grit combined machine learning models with human-engineered factors. The system could then personalize its interventions, such as hints and encouragement, as illustrated in the sketch below.
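
As an illustration of this hybrid approach, the sketch below concatenates automatically learned facial features with hand-engineered tutor-log factors and trains a standard classifier. This is a minimal sketch, not the project's code: the feature names, dimensions, and randomly generated data are all hypothetical stand-ins.

```python
# Minimal sketch of a hybrid grit detector: learned facial features are
# concatenated with human-engineered log factors and fed to a classifier.
# All features and labels below are random placeholders (assumptions).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_students = 200

# Hypothetical inputs: per-student facial embeddings (e.g., pooled network
# activations over the first seconds of video) and engineered log factors
# (e.g., hint requests, time on task, problems attempted).
facial_features = rng.normal(size=(n_students, 128))
log_features = rng.normal(size=(n_students, 6))
grit_labels = rng.integers(0, 2, size=n_students)  # 1 = grit, 0 = its absence

X = np.hstack([facial_features, log_features])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, grit_labels, cv=5).mean())
```

In a real pipeline, the predicted grit state would then drive the choice of intervention (e.g., a hint versus an encouraging message).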

The computer vision system predicted student behavior from students' facial expressions seconds before they responded or submitted answers. It made predictions with relatively high accuracy using only several seconds of video footage, well before students had solved the mathematics problems.

We also developed Affective Teacher Tools, including a report card that presents measures of students' engagement and a live affective-states dashboard that senses students' affect and performance. We designed several prototypes of these tools. We produced both a student "Emotion Chart" and an "Effort Chart" that visualized students' subjective self-reports of their frustration, excitement, interest, and confidence while they solved mathematics problems.
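
For concreteness, the toy example below plots the kind of per-student "Emotion Chart" described above from hypothetical Likert-style self-reports across a sequence of problems; the project's actual Teacher Tools are more elaborate, and the values here are invented.

```python
# Illustrative "Emotion Chart": self-reported affect plotted per problem.
# The self-report values are invented for demonstration purposes.
import matplotlib.pyplot as plt

problems = list(range(1, 11))
self_reports = {  # hypothetical 1-5 self-reports
    "frustration": [2, 2, 3, 4, 3, 2, 2, 1, 2, 2],
    "excitement":  [3, 3, 3, 2, 3, 4, 4, 4, 3, 4],
    "interest":    [4, 4, 3, 3, 3, 4, 4, 5, 4, 4],
    "confidence":  [3, 3, 2, 2, 3, 3, 4, 4, 4, 5],
}
for emotion, values in self_reports.items():
    plt.plot(problems, values, marker="o", label=emotion)
plt.xlabel("Problem number")
plt.ylabel("Self-report (1-5)")
plt.title("Emotion Chart (illustrative)")
plt.legend()
plt.show()
```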

Our facial expression recognition software detected students' engagement and emotional expressions and inferred student disengagement. The detector achieved 97.94% accuracy compared with human annotations and earlier system predictions. By analyzing the output of the Head Pose Detector on MathSpring videos, we found valuable information in the juxtaposition of head position and the logs of problem-solving activity. For example, a "head tilt" appears to be a sign of concentration and cognitive engagement. We showed that fine-tuning the affect network with age-appropriate images and video further improved performance in this scenario.
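
One common way to adapt a pretrained network to a new population, in the spirit of the fine-tuning described above, is to freeze the backbone and retrain only a new classification head. The sketch below uses a generic torchvision backbone and random tensors as stand-ins; the project's actual affect network and age-appropriate data are not reproduced here.

```python
# Fine-tuning sketch: a generic pretrained backbone stands in for the affect
# network; only the new affect head is trained. Data are random stand-ins.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():  # freeze the pretrained features
    p.requires_grad = False
num_affect_classes = 4  # e.g., interest, confusion, frustration, boredom
model.fc = nn.Linear(model.fc.in_features, num_affect_classes)  # trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a dummy batch standing in for
# age-appropriate student video frames.
frames = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_affect_classes, (8,))
optimizer.zero_grad()
loss = criterion(model(frames), labels)
loss.backward()
optimizer.step()
```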

We manually created our own "ground truth" for recognizing student engagement by examining multiple videos and determining the angles of each student's head position while they solved problems or were distracted. A machine learning model then automated this process.
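
For illustration, one standard way to recover head-pose angles (and hence a "head tilt") from 2D facial landmarks is OpenCV's solvePnP fitted against a generic 3D face model; the project's own Head Pose Detector may work differently, and the landmark coordinates below are placeholders.

```python
# Head-pose sketch: fit 2D facial landmarks to a generic 3D face model with
# solvePnP, then read pitch/yaw/roll (the "tilt") off the rotation matrix.
import cv2
import numpy as np

# Generic 3D reference points (nose tip, chin, eye corners, mouth corners).
model_points = np.array([
    (0.0, 0.0, 0.0), (0.0, -330.0, -65.0),
    (-225.0, 170.0, -135.0), (225.0, 170.0, -135.0),
    (-150.0, -150.0, -125.0), (150.0, -150.0, -125.0),
])

# 2D landmarks detected in a 640x480 video frame (placeholder values).
image_points = np.array([
    (320.0, 240.0), (325.0, 345.0), (250.0, 200.0),
    (390.0, 200.0), (275.0, 300.0), (365.0, 300.0),
])

w, h = 640, 480
camera_matrix = np.array([[w, 0, w / 2], [0, w, h / 2], [0, 0, 1]], dtype=float)
ok, rvec, tvec = cv2.solvePnP(model_points, image_points, camera_matrix, None)
rot, _ = cv2.Rodrigues(rvec)
pitch = np.degrees(np.arctan2(rot[2, 1], rot[2, 2]))   # nod up/down
yaw = np.degrees(np.arctan2(-rot[2, 0], np.hypot(rot[2, 1], rot[2, 2])))
roll = np.degrees(np.arctan2(rot[1, 0], rot[0, 0]))    # in-plane tilt
print(f"pitch={pitch:.1f} yaw={yaw:.1f} roll={roll:.1f} degrees")
```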

Finally, we collected and made public a video dataset of nearly three thousand frames of college students solving math problems (https://www.cs.bu.edu/faculty/betke/research/learning/). 

We also worked in Latin America with a version of the mathematics system translated into Spanish, collecting a second dataset: over 35 hours of facial expressions of 11-year-old children using MathSpring to practice mathematics problem solving as part of their regular mathematics classes in Spanish-speaking or bilingual schools.

We also collected an experimental dataset with middle-school-aged participants from Worcester and Amherst, Massachusetts. This dataset will be used to evaluate the affective tutor and identify future directions for this work.

In sum, we showed that an intelligent tutor can estimate students' gaze in real time and respond to loss of grit and the resulting disengagement. We produced and tested Teacher Tools that identified students' affective states and provided enhanced information to teachers. Previous work has shown that real-time signals from students can be used to improve learning. Our publications describe the report card and the affective dashboard, the research studies and their results, the implications, and planned future experiments. These results pave the way for future improvements on this task, and future tutoring systems may use our outcome-prediction models to deliver real-time interventions that improve students' learning.

Last Modified: 12/30/2022
Modified by: John Magee
