NSF Org: DUE Division Of Undergraduate Education
Initial Amendment Date: June 17, 2019
Latest Amendment Date: August 23, 2021
Award Number: 1903304
Award Instrument: Standard Grant
Program Manager: Paul Tymann, ptymann@nsf.gov, (703) 292-2832, DUE Division Of Undergraduate Education, EDU Directorate for STEM Education
Start Date: July 1, 2019
End Date: June 30, 2023 (Estimated)
Total Intended Award Amount: $299,059.00
Total Awarded Amount to Date: $299,059.00
Recipient Sponsored Research Office: 100 INSTITUTE RD, WORCESTER, MA, US 01609-2280, (508) 831-5000
Primary Place of Performance: 100 Institute Rd, Worcester, MA, US 01609-2280
NSF Program(s): IUSE
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.076
ABSTRACT
With support from the NSF Improving Undergraduate STEM Education Program: Education and Human Resources (IUSE: EHR), this project aims to serve the national interest by leveraging artificial intelligence (AI) techniques to improve student learning and instructor productivity in computing courses. An important aspect of an instructor's job is to provide feedback on student work. Research in cognitive science demonstrates that students learn concepts more effectively if they explain answers to cognitively demanding questions. Learning improves further when instructor feedback is provided. In large classes, it is difficult for instructors to provide all students with the feedback and help they need. This project will develop a tool that uses machine learning to automatically generate feedback on student solutions. The tool is designed to reduce the instructor's grading burden while ensuring that students receive frequent and worthwhile feedback. The project aims to generate knowledge about how AI can help to engage students without requiring significant instructor time. The tool will impact thousands of students as it is incorporated into ASSISTments, a free service of Worcester Polytechnic Institute (WPI) used by 50,000 students.
The goal of this project is to implement and study a tool that uses state-of-the-art machine learning and natural language processing methods to automate instructor feedback. In the first phase of the project, students in the Algorithms course at WPI will be assigned open-ended questions to complete prior to each lecture. Teaching staff for the course will manually grade student responses. The data set generated in this phase will be used to develop and train the algorithms for the automated tool. The tool will apply methods of varying complexity, including regression and tree-based modeling methods as well as deep learning methods such as long short-term memory (LSTM) networks. It will analyze student responses and suggest feedback and a grade. Instructors may choose to accept the tool's suggestions or override them to provide different feedback. A randomized controlled trial will assign students to receive either manual feedback or feedback generated with the help of the tool. The usability of the tool will be evaluated through interviews with the instructors. Data on how often the tool's suggestions are overridden and how long it takes the instructor to grade each solution will also be collected. Student experience with the tool will be evaluated through online surveys, and student learning through weekly posttests. The project is expected to improve student learning and instructor productivity in the WPI Algorithms course. It also has the potential to contribute to the broader fields of machine learning, natural language processing, and education through the study, generation, and deployment of effective feedback. The NSF IUSE: EHR Program supports research and development projects to improve the effectiveness of STEM education for all students. Through the Engaged Student Learning track, the program supports the creation, exploration, and implementation of promising practices and tools.
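The abstract describes training models of varying complexity on manually graded responses so the tool can suggest a grade for a new answer. As a minimal illustrative sketch (not the project's actual code; the answers, grades, and model choice below are invented for illustration), the simplest end of that spectrum — a regression model over TF-IDF text features — might look like:

```python
# Hypothetical sketch: predict a grade for an open-ended answer from its text
# using TF-IDF features and ridge regression (one of the simpler model
# families mentioned in the abstract). All data here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import Ridge
from sklearn.pipeline import make_pipeline

# Invented, manually graded student responses (grades on a 0-4 scale)
answers = [
    "merge sort splits the list in half and merges sorted halves",
    "it sorts by splitting and merging, running in O(n log n) time",
    "you just loop over the list once",
    "no idea",
]
grades = [4.0, 4.0, 1.0, 0.0]

# Pipeline: text -> TF-IDF feature vector -> ridge regression
model = make_pipeline(TfidfVectorizer(), Ridge(alpha=1.0))
model.fit(answers, grades)

# Suggest a grade for a new response; in the project's design an instructor
# would review the suggestion and accept or override it.
suggested = model.predict(["split the input in half, sort each half, then merge"])[0]
print(round(float(suggested), 2))
```

In the study design described above, such a suggestion is only a starting point: the instructor reviews it and can substitute a different grade and feedback, and how often that happens is itself a measured outcome.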
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
Project Objective:
The goal of our project was to create a tool called ADVISER, aimed at helping teachers give better feedback on students’ work in an undergraduate Algorithms course. Inspired by technologies like Google’s SmartReply, we wanted to build a tool that uses natural language processing and machine learning to improve the way teachers evaluate open-ended student answers, making their feedback more meaningful and helpful.
Intellectual Merit:
We successfully developed ADVISER using state-of-the-art methods of natural language processing (NLP) and machine learning (ML). These technologies allowed our tool to understand and analyze students' text and image answers more deeply, making the feedback process easier and more effective for teachers. While a larger scale evaluation of ADVISER has yet to be conducted, pilot testing of the tool has supported the feasibility of its usage in practice.
We made significant advancements by using powerful language models to help the tool generate feedback that is relevant and useful. Through our research and development efforts, we improved not only the tool’s ability to understand textual answers but also its ability to understand answers that include mathematical terms or are given in image form, both of which are common in computer science courses.
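The SmartReply-inspired idea mentioned earlier — suggesting a response rather than composing one from scratch — can be illustrated by retrieving the instructor feedback attached to the most similar previously graded answer. This is a hypothetical sketch, not ADVISER's actual implementation; the answer/feedback pairs and the TF-IDF similarity measure are assumptions made for illustration:

```python
# Hypothetical sketch of SmartReply-style feedback suggestion: given a new
# student answer, find the most similar past answer (by TF-IDF cosine
# similarity) and suggest the feedback the instructor gave for it.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented bank of (student answer, instructor feedback) pairs
history = [
    ("merge sort divides the array and merges sorted halves",
     "Good. Also mention the O(n log n) running time."),
    ("I would just compare every pair of elements",
     "That works but is O(n^2); think about divide and conquer."),
]

vec = TfidfVectorizer().fit([answer for answer, _ in history])
bank = vec.transform([answer for answer, _ in history])

def suggest_feedback(answer: str) -> str:
    """Return the feedback attached to the most similar past answer."""
    sims = cosine_similarity(vec.transform([answer]), bank)[0]
    return history[int(sims.argmax())][1]

print(suggest_feedback("split the array, sort each half, then merge the halves"))
```

As with grade suggestions, a retrieved feedback message would be a draft for the instructor to accept, edit, or replace, not a final response sent to the student automatically.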
In comparing across domains, we find that the developed models underperform in the intended application area of undergraduate algorithms relative to other contexts such as mathematics, suggesting cross-context challenges for these types of tools. Despite this, performance in pilot testing still shows promise for the tool's application in this space.
Broader Impacts:
Our project has made important contributions that go beyond the classroom. We shared our findings and progress through various educational and technological conferences, reaching other researchers and experts in fields like education, machine learning, and more.
The ADVISER tool has the potential to change the way teachers provide feedback, making education in technical subjects like Algorithms more engaging and effective. By doing this, we hope that more students will succeed in these courses, paving the way for a stronger foundation in important technical skills.
In summary, the ADVISER tool signifies a step forward in educational tools, promoting enhanced teaching and learning experiences, and paving the way for improved educational outcomes in technical disciplines.
Last Modified: 10/24/2023
Modified by: Neil T Heffernan