
NSF Org: CNS Division Of Computer and Network Systems
Initial Amendment Date: September 16, 2016
Latest Amendment Date: September 16, 2016
Award Number: 1647213
Award Instrument: Standard Grant
Program Manager: Deepankar Medhi, dmedhi@nsf.gov, (703) 292-2935, CNS Division Of Computer and Network Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: February 1, 2017
End Date: January 31, 2021 (Estimated)
Total Intended Award Amount: $599,160.00
Total Awarded Amount to Date: $599,160.00
Recipient Sponsored Research Office: 121 UNIVERSITY HALL, COLUMBIA, MO 65211-3020, US, (573) 882-7560
Primary Place of Performance: Columbia, MO 65211-0001, US
NSF Program(s): CISE Research Resources
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
This project explores a high-speed network-enabled, immersive, and smart virtual reality application, called vSocial, to connect children with autism spectrum disorder (ASD) from different geographical regions for online social training. The 2015 National Health Interview Survey (NHIS) suggests that 1 in 45 children (>2%) has been diagnosed with ASD. Children with ASD are characterized by impairments in social skills, which can result in low quality of life and bring emotional, financial, and physical stress and burden to the children, their families, schools, and society. This project builds on the project team's work over the past five years, during which we successfully developed and evaluated a social competence intervention (SCI) curriculum and a computer-based virtual learning application called iSocial. The iSocial application makes the face-to-face SCI curriculum available online to youth with ASD and to public schools that would otherwise, due to geographical and personal limitations, have no access to such programs provided by experts.
vSocial will use an immersive virtual reality (VR) medium for application delivery over high-speed networking infrastructures at schools, as well as cloud technologies available within Global Environment for Network Innovations (GENI) Racks. Such a transformation will allow us to study how end-to-end network performance tuning needs to be orchestrated across multi-provider paths and how to troubleshoot last-mile network bottlenecks (e.g., at schools or homes) for field deployment of demanding gigabit applications such as vSocial. Specifically, (a) it will bridge the knowledge generalization gap between online social training and real-world social skills for students with ASD through the use of a networked immersive VR system, and (b) it will provide smart sensing capabilities for effective monitoring and tracking of the cognitive-affective states of remote student learners, enabling early individualized pedagogical interventions and outcome assessment. Through vSocial application prototype experimentation in the field (at actual schools, with teacher and student involvement), this project will investigate the use of cognitive-affective sensing for online social training in the education of students with ASD, the use of VR glasses, and the Unity3D VR content creation platform to assess the immersive learning experience.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
In this project, we developed a high-speed network-enabled, immersive, and smart virtual reality application, called vSocial, to connect children with autism spectrum disorder (ASD) from different geographical regions for online social training. We successfully leveraged a social competence intervention (SCI) curriculum, making the face-to-face SCI curriculum available online to youth with ASD and to public schools that would otherwise, due to geographical and personal limitations, have no access to such programs provided by experts.
Details of the key outcomes of the project are as follows:
1) Building on the High Fidelity platform, we successfully developed four curriculum units, including orientation, group coordination, and facial recognition, for online VR social training of children with ASD.
2) We have developed a set of high-speed networking, security, management, computer vision, and VR-based human-computer interaction methods and tools to support our online VR training course development. These tools include:
(a) devices and deep learning-based algorithms to capture eye images inside the VR headset and recognize facial expressions from faces covered by the headset. For five facial expressions (neutral, happy, angry, disgusted, and surprised), we achieved an average classification accuracy of 88%;
(b) machine learning algorithms and hardware-software interfaces to support natural human-VR interaction, which capture body pose and motion and map poses and gestures into commands in the VR world, such as walking, running, turning, and approval (OK);
(c) tools and in-world gadgets for student-instructor interaction, such as a virtual iPad for students to search and browse information, an audio ball for students to retrieve audio messages while walking, a strike system for instructors to reward or penalize students, and audio attenuation to mimic audio fading over distance, through walls, or when walking away;
(d) network security methods and tools using a novel risk assessment framework that utilizes attack trees to calculate a risk score for various VR learning threats, with the rate and duration of threats as inputs.
3) We conducted eight usability tests and surveys to gather feedback on user experience, as well as on the comfort level and cybersickness of wearing the VR headset, and used the results to improve the system.
4) The team entered the Pose Challenge competition, developing deep learning methods for accurate human pose estimation, and ranked 4th in the world.
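The attack-tree risk scoring in item 2(d) above can be illustrated with a minimal sketch. The tree layout, threat names, leaf scoring rule (rate multiplied by duration), and gate semantics below are illustrative assumptions for exposition, not the project's published framework.

```python
# Hedged sketch of attack-tree risk scoring for a VR learning environment (VRLE).
# Assumptions: leaves are scored as rate * duration; an OR gate takes the worst
# (maximum) child score; an AND gate takes the weakest link (minimum), since
# every step of the attack chain must succeed. These choices are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Node:
    name: str
    gate: str = "LEAF"           # "LEAF", "OR", or "AND"
    rate: float = 0.0            # threat occurrences per session (leaf only)
    duration: float = 0.0        # average duration of one occurrence, seconds
    children: List["Node"] = field(default_factory=list)

def risk(node: Node) -> float:
    """Return a non-negative risk score for the subtree rooted at node."""
    if node.gate == "LEAF":
        return node.rate * node.duration
    scores = [risk(child) for child in node.children]
    return max(scores) if node.gate == "OR" else min(scores)

# Hypothetical threat tree for disrupting a VRLE training session.
tree = Node("disrupt VRLE session", "OR", children=[
    Node("packet flood on voice channel", rate=2.0, duration=30.0),
    Node("hijack instructor account", "AND", children=[
        Node("phish credentials", rate=0.5, duration=10.0),
        Node("bypass session token check", rate=0.2, duration=5.0),
    ]),
])

print(risk(tree))  # 60.0: the flood leaf (2.0 * 30.0) dominates the OR gate
```

A single score like this lets an instructor or administrator rank mitigation priorities; richer frameworks would also weight impact and attacker capability at each node.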
The project work received news coverage at both the University of Missouri college level (https://engineering.missouri.edu/2018/08/mu-researchers-working-on-groundbreaking-vr-classroom) and the campus level (https://news.missouri.edu/2018/creating-a-virtual-reality).
We have released various datasets, code, and testbed scripts for “OnTimeSocial”, a social network portal, at https://github.com/mizzou-viman-lab/ontimesocial-sgc. The OnTimeSocial web application can be used by vSocial instructors to track the progress of remote students logging into a VR learning environment (VRLE).
Last Modified: 03/24/2021
Modified by: Zhihai He