Award Abstract # 1248991
SBIR Phase I: Emotionally Immersive Tele-Learning

NSF Org: TI (Translational Impacts)
Recipient: The Spirituality Network, Inc.
Initial Amendment Date: December 2, 2012
Latest Amendment Date: May 5, 2013
Award Number: 1248991
Award Instrument: Standard Grant
Program Manager: Glenn H. Larsen
TI (Translational Impacts)
TIP (Directorate for Technology, Innovation, and Partnerships)
Start Date: January 1, 2013
End Date: June 30, 2013 (Estimated)
Total Intended Award Amount: $150,000.00
Total Awarded Amount to Date: $155,000.00
Funds Obligated to Date: FY 2013 = $155,000.00
History of Investigator:
  • Ian Bennett (Principal Investigator)
    ianmbennettc2y@gmail.com
Recipient Sponsored Research Office: The Spirituality Network, Inc.
2275 East Bayshore Road
Palo Alto
CA  US  94303-3222
(650)796-9517
Sponsor Congressional District: 16
Primary Place of Performance: The Spirituality Network, Inc.
211 Cleveland Ct.
Mill Valley
CA  US  94941-3515
Primary Place of Performance Congressional District: 02
Unique Entity Identifier (UEI): H7JCNFSGDGW3
Parent UEI:
NSF Program(s): SBIR Phase I
Primary Program Source: 01001314DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 5371, 8031, 8033, 8039
Program Element Code(s): 537100
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.084

ABSTRACT

This Small Business Innovation Research (SBIR) Phase I project aims to incorporate novel machine vision functionality and innovative social networking capabilities into the technology of distance learning and online webinars. The primary objective is to make online training and virtual collaboration more engaging and compelling by replicating non-verbal feedback on the rate and acceptance of information delivery in lectures, making the experience of distance learning more emotionally immersive. The project contributes four significant innovations: 1) a machine-vision recognition system for head position, gaze direction, and facial expressions of interest or comprehension which, when averaged across participants, will provide simple feedback on the rate and acceptance of information delivery; 2) a machine-vision-based hand detection system that uses motion and shape to detect hand raising and other gestures; 3) a framework for hot-deployable third-party pedagogical applications, aka "side apps"; and 4) integrated social functionalities that replicate pre- and post-lecture socialization, including pair-sharing, breakout groups, team teaching, and support for teaching assistants. In anticipation of support for this project, TSN (The Spirituality Network) has already built an evaluation test bed for tele-lectures and virtual classrooms.
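The abstract does not publish implementation details for these innovations. As a rough illustration of the kind of per-participant signal described in innovation 1, the sketch below uses OpenCV's stock Haar cascades as a hypothetical stand-in for the project's actual head-position, gaze, and expression models; the function name and scoring scheme are assumptions for illustration only.

    # Hypothetical sketch: a crude per-participant attention proxy from one
    # webcam frame, using OpenCV's bundled Haar cascades as stand-ins for the
    # project's (unpublished) head-pose / gaze / expression models.
    import cv2

    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    eye_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_eye.xml")

    def attention_score(frame) -> float:
        """Score one BGR frame: 1.0 if a frontal face with two visible eyes
        is found, 0.5 if only a face, 0.0 otherwise."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1,
                                              minNeighbors=5)
        if len(faces) == 0:
            return 0.0  # participant not visible or looking well away
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
        eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
        return 1.0 if len(eyes) >= 2 else 0.5

Averaging such per-frame scores across all participants would yield the simple class-level feedback signal the abstract describes.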

The broader impact/commercial potential of this project is to significantly transform online education; online training has the potential to radically alter the delivery of education. By 2018, the cost of a four-year public university education is projected to rise to $151,000, and the cost at private colleges to more than $300,000. To address this cost crisis, several colleges and startup companies have announced an increased use of online training. However, existing streaming-video lecture systems and virtual group learning environments have not advanced to the point where distance learning is no longer treated as a second-class citizen in the educational world. The proposed system can transform the fundamental efficacy of online training and spur new research.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The objective of this SBIR Phase I project is to incorporate novel machine vision functionality and innovative social networking capabilities into the technology of distance learning and online webinars. The primary objective is to make online training and virtual collaboration more engaging and compelling by replicating non-verbal, meta-linguistic feedback on the rate and acceptance of information delivery in lectures, and to make the experience of distance learning more emotionally immersive. This system, the ‘Revolution e-Learning Platform’, is referred to as ‘Revolution’ throughout this report.

Revolution implemented and integrated the following four innovations:

• Machine-vision recognition system for head position, gaze direction, and facial expressions of interest or comprehension which, when averaged across participants, will provide simple feedback on the rate and acceptance of information delivery;

• Machine-vision-based hand detection system that uses motion and shape to detect hand raising and other gestures;

• Ability to hot-deploy third-party pedagogical applications, aka ‘side apps’, within the framework;

• Integration of social functionalities that replicate pre- and post-lecture socialization, including pair-sharing, breakout groups, team teaching, and support for teaching assistants.

The above innovations were made possible by the following technical achievements: the creation of a machine vision system to gauge head position, gaze direction, and facial expressions of interest, paired with an analysis system for estimating comprehension; the development of a server application that aggregates the data coming in from many students to provide the lecturer with simple feedback about the rate of information delivery; the integration of novel social networking and media/content functionality; and the integration of these capabilities within a core webinar/collaboration system for the prototype pilot, which provided a crisp and responsive user experience.
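The report does not describe the aggregation server's design. A minimal sketch of the aggregation step, assuming per-student engagement estimates normalized to [0, 1], follows; the class name, thresholds, and pacing labels are illustrative assumptions, not details from the project.

    # Hypothetical sketch: fold per-student engagement estimates into one
    # lecturer-facing pacing hint. Thresholds are illustrative assumptions.
    from dataclasses import dataclass, field
    from statistics import mean

    @dataclass
    class LectureFeedback:
        scores: dict = field(default_factory=dict)  # student_id -> score in [0, 1]

        def update(self, student_id: str, score: float) -> None:
            self.scores[student_id] = max(0.0, min(1.0, score))

        def pacing_hint(self) -> str:
            if not self.scores:
                return "no data"
            avg = mean(self.scores.values())
            if avg < 0.4:
                return "slow down"   # class appears lost or disengaged
            if avg > 0.8:
                return "speed up"    # class is comfortably following
            return "steady"

    feedback = LectureFeedback()
    feedback.update("alice", 0.9)
    feedback.update("bob", 0.3)
    print(feedback.pacing_hint())  # "steady" (mean is 0.6)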

This project not only achieved the key objectives described above but also provided the framework for further dissemination of the research results to the community via publications and discussions at relevant upcoming professional conferences. The Revolution prototype demonstrates the key functionalities in machine vision, machine learning, intelligent data aggregation, and user experience. The objectives were achieved as follows: first, we developed a complete specification for the framework and implemented a prototype of the Revolution system; we then developed and integrated the subsystems; and finally we deployed a pilot of the prototype within a live test bed so that actual users could provide us with deeper customer insights and market-validation data. Although the engineering project proved significantly more technically challenging than initially estimated, we made agile adjustments to the critical path and delivered the proposed prototype feature set on budget and on time.

A functional and operationally meaningful administrative system was also completed. It provides control and administrative functionality for lecturers and teaching assistants, along with customized interfaces for students, lecturers, and teaching assistants. We also completed a first pass at media/content and social-network data integration. Most importantly, through this prototype and pilot we learned more about the real and unarticulated needs of students, lecturers, and teaching assistants, and of the entire ecosystem of online learning including managers and school superintendents, informing our design and validating marketability.
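The report does not detail the administrative system's internals. One plausible realization of the role-customized interfaces it describes is a role-to-capabilities mapping; the role names and capabilities below are illustrative assumptions.

    # Hypothetical sketch: map each classroom role to the controls its
    # interface exposes. Names and capabilities are illustrative assumptions.
    from enum import Enum

    class Role(Enum):
        STUDENT = "student"
        TEACHING_ASSISTANT = "teaching_assistant"
        LECTURER = "lecturer"

    CAPABILITIES = {
        Role.STUDENT: {"raise_hand", "join_breakout", "pair_share"},
        Role.TEACHING_ASSISTANT: {"raise_hand", "join_breakout", "pair_share",
                                  "moderate_chat", "assign_breakouts"},
        Role.LECTURER: {"raise_hand", "join_breakout", "pair_share",
                        "moderate_chat", "assign_breakouts",
                        "view_class_feedback", "deploy_side_app"},
    }

    def can(role: Role, action: str) -> bool:
        """Check whether a role's interface should expose a given control."""
        return action in CAPABILITIES[role]

    assert can(Role.LECTURER, "deploy_side_app")
    assert not can(Role.STUDENT, "view_class_feedback")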

 
