
NSF Org: | DUE Division Of Undergraduate Education |
Initial Amendment Date: | August 29, 2018 |
Latest Amendment Date: | June 23, 2021 |
Award Number: | 1821594 |
Award Instrument: | Standard Grant |
Program Manager: | R. Corby Hovis, chovis@nsf.gov, (703) 292-4625, DUE Division Of Undergraduate Education, EDU Directorate for STEM Education |
Start Date: | October 1, 2018 |
End Date: | September 30, 2023 (Estimated) |
Total Intended Award Amount: | $1,006,103.00 |
Total Awarded Amount to Date: | $1,118,368.00 |
Funds Obligated to Date: | FY 2021 = $112,265.00 |
Recipient Sponsored Research Office: | 104 AIRPORT DR STE 2200, CHAPEL HILL, NC US 27599-5023, (919) 966-3411 |
Primary Place of Performance: | 113 Peabody Hall, CB#3500, Chapel Hill, NC US 27599-3500 |
NSF Program(s): | IUSE |
Primary Program Source: | 04002122DB NSF Education & Human Resource |
Award Agency Code: | 4900 |
Fund Agency Code: | 4900 |
Assistance Listing Number(s): | 47.076 |
ABSTRACT
This collaborative project includes investigators at the University of North Carolina at Chapel Hill (Award DUE-1821594), the University of Nevada at Las Vegas (UNLV; Award DUE-1821601), and the College of Southern Nevada. The United States has an ongoing need for more STEM professionals. College students who initially major in STEM cite the coursework as a major reason for leaving STEM to pursue other interests. Instructors who move away from lectures to more engaging kinds of instruction find that their students are more likely to stay in STEM majors, but only when the students know how to learn in these new environments. Unfortunately, many students have simply not experienced engaging instruction and therefore have not developed the knowledge and skills to take full advantage of it. This project will develop models to identify struggling students in introductory STEM courses (especially biology and anatomy and physiology) and will test interventions to help these students gain the knowledge and skills they need to benefit from active-learning course formats. This work should provide knowledge that can be used to increase students' success and retention in the courses under study, and should inform similar interventions in other STEM courses.
In this project, the investigators will combine (1) early identification of struggling students, using an existing data-driven, web-based approach, with (2) support for struggling college students, through a robust initiative focused on retaining students who traditionally have not persisted in STEM fields. Specifically, a previous project at UNLV, Learning Theory and Analytics as Guides to Improve Undergraduate STEM Education (LearningTAGs), developed a data-driven approach for identifying and directly intervening with struggling students. In addition, the Finish Line Project at UNC-Chapel Hill found that first-generation college students benefit most from early intervention, accessible academic coaches, and active-learning STEM classrooms. The LearningTAGs methods can be expanded to better serve struggling students by integrating findings from the Finish Line Project. The researchers will (1) develop and test UNLV's LearningTAGs prediction modeling and digital intervention at UNC (another highly selective, public institution) as well as at the College of Southern Nevada (an open-enrollment two-year college); (2) leverage Finish Line Project findings about academic coaching to test various support interventions (i.e., online self-regulated learning instructional modules, academic coaching, and supplemental instruction); and (3) identify whether support efficacy varies across different groups of students, including students from groups that are underrepresented in STEM and first-generation college students. The campus data infrastructure and student support platform tested and refined in this project should provide a model that can be replicated at other colleges and universities, using those institutions' existing data from learning management systems.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
Over the five years of this IUSE Engaged Student Learning project, our agenda had three foci: (1) developing, testing, and elaborating on previous successful prediction-modeling research by transferring it to a new institutional context; (2) investigating the efficacy of a variety of science-of-learning interventions to determine the optimal match of intervention design to student needs; and (3) examining whether the efficacy of these interventions varied depending on whether a student was a member of an underrepresented minority or first-generation college student group. Building on previous NSF-funded research at another institution, we applied and refined methods of gathering trace data from students’ engagement with course digital tools (e.g., learning management systems, video hosting software, assessment tools) and then using theory-aligned coding schemes to categorize and understand that trace data, separating the student learning signal from the digital noise. We then used those theory-aligned digital trace data reflecting student learning to predict, with a high degree of accuracy, which students would successfully obtain the course grade they needed to proceed in their major, and which would not. Using these prediction models, we were able to test a variety of interventions, grounded in empirical work on the science of learning, designed to help those students predicted to struggle in the course. We were particularly interested in examining which interventions worked, for whom, and under what conditions.
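The pipeline described above (theory-aligned features derived from course digital-tool trace data, feeding an early-warning classifier) can be sketched in minimal form as follows. The feature names, toy data, and use of logistic regression are illustrative assumptions for exposition; the report does not specify the project's actual model or features.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=500):
    """Fit a tiny logistic-regression model by per-sample gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Hypothetical theory-aligned trace features per student after the first
# three class periods: [fraction of lecture videos watched,
#                       practice quizzes attempted, LMS logins]
X = [
    [0.9, 3, 5], [0.8, 2, 4], [1.0, 3, 6],   # engaged students
    [0.1, 0, 1], [0.2, 1, 1], [0.0, 0, 0],   # disengaged students
]
y = [1, 1, 1, 0, 0, 0]  # 1 = earned the grade needed to proceed in the major

w, b = train_logistic(X, y)
risk = [1 - sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b) for xi in X]
flagged = [i for i, r in enumerate(risk) if r > 0.5]  # candidates for early support
```

In practice, the early flag is what creates the window to intervene: students identified this way can be routed to supports (advice pages, modules, coaching) weeks before a first poor grade would otherwise trigger help-seeking.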
Through the efforts of the research team over the five years of this project, we produced numerous findings that enhanced the scholarly literature and demonstrated, and showed promise of, broader impact. First, we showed that previous prediction-modeling efforts could be successfully transferred to a new context and then elaborated to include new forms of data that informed prediction through multimodal learning analytics methods. We developed highly accurate prediction models that could be reapplied across numerous semesters. These models can identify students likely to struggle in a course after only three class periods' worth of digital trace data. The value of a prediction model that accurately identifies struggling students early in the semester is that there is still ample time to intervene and help those students succeed. In essence, these prediction models identify students before their academic difficulties threaten their STEM persistence and careers. Further, because we co-designed courses and data models with instructors, our prediction models are an example of explainable artificial intelligence: they include intuitive predictors that derive from instructional design and can thus provide insight into ways to support learning through redesign and responsive types of learner support. Second, we tested a variety of learner supports by redesigning and testing new versions of the Science of Learning to Learn interventions. These included highly scalable flyer-like digital advice pages, 15-minute trainings on essential course design features and strategies for navigating them, and 90-minute interactive video-based modules providing more comprehensive learning-skill development. These can be delivered at low to no cost.
Prediction models can also be used to fast-track referrals to one-on-one, in-person coaching interventions that are less scalable but more adaptable to individual student needs, and that can begin earlier in the semester than when students typically seek out these services (i.e., after a first poor grade). Across eight semesters of intervention deployment and evaluation, with 3,321 consented students, we found that the success of the interventions depended critically on students’ motivations for learning. Students with positive motivations, students who adhered to their instructors’ design for the course, and students who fully engaged with the interventions performed better in the course. Our results showed complex interactions among motivation, student learning activities, and course success, all indicating the need for further research with larger samples to investigate how these interactions might vary for students who are members of an underrepresented minority or first-generation college student group. Finally, our efforts elucidated how to conduct effective, ethical, and transformative learning analytics research.
To date, our scholarly efforts have been shared via 29 presentations at national and international conferences, six presentations to practitioner and technical audiences, and five peer-reviewed journal publications. This work has involved four doctoral students and two postdoctoral research associates, both of whom have secured tenure-line academic positions. We have also developed productive relationships with numerous industry partners, each of whom has benefited our project with infrastructure and support while also benefiting from our findings and insights. Finally, our findings have informed scholarly, educational, and industry partners on how to understand and promote undergraduate STEM student success via learning analytics and theory- and empirically-supported efforts to develop successful courses and provide scalable, effective support to students who need assistance maximizing the opportunities those courses provide.
Last Modified: 01/27/2024
Modified by: Jeffrey A Greene