
NSF Org: DRL, Division of Research on Learning in Formal and Informal Settings
Recipient:
Initial Amendment Date: June 25, 2016
Latest Amendment Date: June 25, 2016
Award Number: 1644430
Award Instrument: Standard Grant
Program Manager: Finbarr Sloane, DRL Division of Research on Learning in Formal and Informal Settings (EDU Directorate for STEM Education)
Start Date: September 1, 2015
End Date: July 31, 2017 (Estimated)
Total Intended Award Amount: $295,579.00
Total Awarded Amount to Date: $311,579.00
Funds Obligated to Date: FY 2015 = $16,000.00
History of Investigator: Noboru Matsuda (Principal Investigator)
Recipient Sponsored Research Office: 400 Harvey Mitchell Pkwy S, Ste 300, College Station, TX 77845-4375, US; (979) 862-6777
Sponsor Congressional District:
Primary Place of Performance: 400 Harvey Mitchell Pkwy South, College Station, TX 77845-4357, US
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): REAL, ECR-EDU Core Research
Primary Program Source: 04001516DB NSF Education & Human Resources
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.076
ABSTRACT
This research aims to improve student learning effectiveness, efficiency, and enjoyment in online courses by using online learning data to provide constructive feedback to course developers and instructors. The feedback assists with the design and improvement of course activities and interactive instruction. Online course development is often guided primarily by the intuition of the instructor. This research seeks to enable course improvements using a data-driven approach by developing methods to measure the effects of course redesign.
This project integrates diagnostic feedback on course content and activities, analytic methods and course-inspection tools for discovering barriers to student learning and opportunities for course improvement, and authoring tools for translating discoveries into course improvements. The project will develop an Integrated Development Environment with Analytics (IDEA) that allows course developers and instructors to validate courseware content before actual use. Once the courseware is available online and in use by students, IDEA will support the use of logged data and analytics to discover barriers to learning, to hypothesize modifications to the underlying cognitive model of learning, and to evaluate and select the modifications that best predict the data. IDEA will provide performance profiling to developers and instructors that summarizes students' learning and identifies issues for improvement by analyzing learning curves and modifying knowledge component models. IDEA will also provide developers with a tight connection between performance profiling and courseware content so that developers can review and modify troublesome content.
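As a rough illustration of the model-evaluation step described above, the sketch below compares two candidate knowledge component (KC) models by how well an Additive Factors Model (AFM)-style logistic regression predicts logged performance data. The column names, toy data, and function names are illustrative assumptions, not the project's actual schema or code.

```python
# Minimal sketch of comparing candidate knowledge component (KC) models by
# fitting an AFM-style logistic regression (student + KC + practice-opportunity
# effects) to a transaction log. Toy data and column names are assumptions.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

# Hypothetical transaction log: one row per student attempt on an item.
txns = pd.DataFrame({
    "student": ["s1", "s1", "s1", "s1", "s2", "s2", "s2", "s2"],
    "item":    ["i1", "i2", "i1", "i2", "i1", "i2", "i1", "i2"],
    "correct": [0, 0, 1, 1, 0, 1, 1, 1],
})

def afm_fit_loss(txns, kc_of_item):
    """Fit an AFM-style model under a given item-to-KC mapping and return
    its log loss on the training data (lower = better fit)."""
    df = txns.copy()
    df["kc"] = df["item"].map(kc_of_item)
    # Opportunity count: number of prior attempts by this student on this KC.
    df["opp"] = df.groupby(["student", "kc"]).cumcount()
    X = pd.get_dummies(df[["student", "kc"]]).astype(float)
    X["opp"] = df["opp"]
    model = LogisticRegression(max_iter=1000).fit(X, df["correct"])
    return log_loss(df["correct"], model.predict_proba(X)[:, 1])

# Two candidate KC models: one lumped skill vs. two separate skills.
lumped = {"i1": "skillA", "i2": "skillA"}
split  = {"i1": "skillA", "i2": "skillB"}
print("lumped model loss:", afm_fit_loss(txns, lumped))
print("split model loss: ", afm_fit_loss(txns, split))
```

In practice, model comparison of this kind penalizes complexity (for example, with AIC/BIC or cross-validation) rather than comparing raw training fit as in this sketch.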
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
A goal of this project is to develop a novel methodology for analyzing online courseware content and learning data to provide constructive feedback to courseware developers and instructors. In the past, online course development was largely guided by instructor intuition. To encourage the transition from that intuitive approach to an evidence-based one, we previously developed an online learning data repository, DataShop, that provides learning analytics, including semi-automated discovery of knowledge components (i.e., the latent skills, knowledge, and concepts that students ought to learn) and learning curve analysis, and we used it to provide data-driven feedback for course improvement. The current project extends that effort to advance evidence-based online course development by addressing a major issue we observed in past practice: the lack of a courseware development environment that gives course developers the ability to interpret learning-data visualizations for course improvement.
We have developed an online course development environment called IDEA (Integrated Development Environment with Analytics) that provides course developers and instructors with data-analytics feedback to validate courseware content as a basis for course improvement. IDEA provides three specific types of feedback: (1) feedback on the coherence of the courseware content, (2) feedback on barriers to students' learning, and (3) feedback on the validity of the current knowledge component model. The first type of feedback is provided while courseware developers are creating an online course; for example, the system checks whether the placement (both amount and coverage) of assessments is appropriate. The second and third types are provided once students actually use the online course and their learning log data are collected. For example, barriers to learning are identified by computing a performance profile for each assessment item to see whether any items show a notably high error rate. The validity of the knowledge component model is assessed by computing learning curves for each knowledge component from actual student learning data.
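To make these two run-time analyses concrete, here is a minimal sketch, under assumed column names ("student", "item", "kc", "correct"), of a performance profile that flags items with notably high error rates and a per-KC learning curve. It illustrates the general technique, not IDEA's actual implementation.

```python
# Minimal sketch of the two run-time feedback computations described above:
# a per-item performance profile and a per-KC learning curve. Column names
# and the error-rate threshold are illustrative assumptions.
import pandas as pd

def performance_profile(txns, threshold=0.5):
    """Error rate per assessment item, keeping items at or above threshold."""
    error_rate = 1.0 - txns.groupby("item")["correct"].mean()
    return error_rate[error_rate >= threshold].sort_values(ascending=False)

def learning_curve(txns, kc):
    """Mean error rate at each practice opportunity for a given KC; a curve
    that fails to decline with practice suggests the KC model needs revision."""
    df = txns[txns["kc"] == kc].copy()
    df["opp"] = df.groupby("student").cumcount() + 1
    return 1.0 - df.groupby("opp")["correct"].mean()
```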
The IDEA system has been applied to existing online courses, including a gateway discrete math course (Discrete Math Primer, or DMP for short) and an introductory course on campus computing resources (Computing@Carnegie Mellon, or C@CM for short), both hosted by the Open Learning Initiative at Carnegie Mellon University. The results show sizable gains in student performance in DMP after applying IDEA, relative to previous years, as measured by the course's summative assessments (students performed at or above the 80% level). Students' satisfaction with the DMP course (measured by their responses to the Faculty Course Evaluation) also grew significantly over successive rounds of IDEA-led improvement.
Intellectual merit includes advancing traditional cognitive task analysis to make it scalable and to convey genuine knowledge about the structure and contents of online courses to course developers; that is, IDEA generates actionable knowledge for course improvement. We have also demonstrated that IDEA helps learning science researchers advance the theory of how people learn by allowing them to identify barriers to students' learning and the instructional strategies that can reduce those barriers. Lessons learned from the project have been published in the following papers:
Matsuda, N., Furukawa, T., Bier, N., & Faloutsos, C. (2015). Machine beats experts: Automatic discovery of skill models for data-driven online course refinement. In J. G. Boticario, O. C. Santos, C. Romero, M. Pechenizkiy, A. Merceron, P. Mitros, J. M. Luna, C. Michaescu, P. Moreno, A. Hershkovitz, S. Ventura, & M. C. Desmarais (Eds.), Proceedings of the International Conference on Educational Data Mining (pp. 101-108). Madrid, Spain.
Matsuda, N., Van Velsen, M., Barbalios, N., Lin, L., Vasa, H., Hosseini, R., Sutner, K., & Bier, N. (2016). Cognitive tutors produce adaptive online course: Inaugural field trial. In A. Micarelli, J. Stamper, & K. Panourgia (Eds.), Proceedings of the International Conference on Intelligent Tutoring Systems (pp. 327-333). Switzerland: Springer.
Bier, N. (2016). "The Open Learning Initiative at 15: Successes, Lessons Learned and the Road Ahead." The 13th Open Education Conference, Richmond, VA, November 2-4, 2016.
Bier, N. (2017). "OER and Learning Research." The 14th Open Education Conference, Anaheim, CA, October 10-13, 2017.
Sutner, K. (2016). "Results from the OLI Discrete Mathematics Primer." 2016 Teaching & Learning Summit at Carnegie Mellon University, Pittsburgh, PA, October 14, 2016.
Broader impacts include the cross-institutional, rapid dissemination of scalable methods for data-driven online course improvement. Because the IDEA methods are platform-agnostic, they apply to a wide variety of existing online course platforms and providers. We have already begun to demonstrate this through the use of these methods on the OLI, Open edX, and Lumen Waymaker platforms.
Last Modified: 01/28/2018
Modified by: Noboru Matsuda