Award Abstract # 1647213
US Ignite: Focus Area 1: A Networked Virtual Reality Platform for Immersive Online Social Learning of Youth with Autism Spectrum Disorders

NSF Org: CNS
Division Of Computer and Network Systems
Recipient: UNIVERSITY OF MISSOURI SYSTEM
Initial Amendment Date: September 16, 2016
Latest Amendment Date: September 16, 2016
Award Number: 1647213
Award Instrument: Standard Grant
Program Manager: Deepankar Medhi
dmedhi@nsf.gov
 (703)292-2935
CNS
 Division Of Computer and Network Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: February 1, 2017
End Date: January 31, 2021 (Estimated)
Total Intended Award Amount: $599,160.00
Total Awarded Amount to Date: $599,160.00
Funds Obligated to Date: FY 2016 = $599,160.00
History of Investigator:
  • Zhihai He (Principal Investigator)
    hezhi@missouri.edu
  • Janine Stichter (Co-Principal Investigator)
  • Prasad Calyam (Co-Principal Investigator)
Recipient Sponsored Research Office: University of Missouri-Columbia
121 UNIVERSITY HALL
COLUMBIA
MO  US  65211-3020
(573)882-7560
Sponsor Congressional District: 03
Primary Place of Performance: University of Missouri-Columbia
Columbia
MO  US  65211-0001
Primary Place of Performance Congressional District: 03
Unique Entity Identifier (UEI): SZPJL5ZRCLF4
Parent UEI:
NSF Program(s): CISE Research Resources
Primary Program Source: 01001617DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 015Z, 9150
Program Element Code(s): 289000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

This project explores a high-speed network-enabled, immersive, and smart virtual reality application, called vSocial, to connect children with autism spectrum disorder (ASD) from different geographical regions for online social training. The 2015 National Health Interview Survey (NHIS) suggests that 1 in 45 children (>2%) have been diagnosed with ASD. Children with ASD are characterized by impairments in social skills, which can result in low quality of life and bring emotional, financial, and physical stress on the children, their families, schools, and society. This project builds on the project team's work over the past five years, during which the team developed and evaluated a social competence intervention (SCI) curriculum and a computer-based virtual learning application called iSocial. The iSocial application makes the face-to-face SCI curriculum available online to youth with ASD and to public schools that would otherwise, due to geographical and personal limitations, have no access to such expert-provided programs.

vSocial will use an immersive Virtual Reality (VR) medium for application delivery over high-speed networking infrastructures at schools as well as cloud technologies available within Global Environment for Network Innovation (GENI) Racks. Such a transformation will allow us to study how end-to-end network performance tuning needs to be orchestrated across multi-provider paths and how to troubleshoot last-mile network bottlenecks (e.g., at schools or homes) for field deployment of demanding gigabit applications such as vSocial. Specifically, (a) it will bridge the knowledge generalization gap between online social training and real-world social skills for students with ASD through use of a networked immersive VR system. (b) It will provide smart sensing capabilities for effective monitoring and tracking of the cognitive-affective states of student learners at remote ends for early individualized pedagogical interventions and outcome assessment. Through vSocial application prototype experimentation in the field (at actual schools, with teacher and student involvement), this project will investigate the use of cognitive-affective sensing for online social training in the education of students with ASD, the use of VR headsets, and the Unity3D VR content-creation platform to assess the immersive learning experience.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Gulhane, A. and Vyas, A. "Security, Privacy and Safety Risk Assessment for Virtual Reality Learning Environment Applications" IEEE Consumer Communications & Networking Conference (CCNC) , 2019 Citation Details
Ning, Guanghan and Zhang, Zhi and He, Zhiquan "Knowledge-Guided Deep Fractal Neural Networks for Human Pose Estimation" IEEE Transactions on Multimedia , v.20 , 2018 10.1109/TMM.2017.2762010 Citation Details
Nuguri, Sai Shreya and Calyam, Prasad "vSocial: a cloud-based system for social virtual reality learning environment applications in special education" Multimedia Tools and Applications , 2020 https://doi.org/10.1007/s11042-020-09051-w Citation Details
Valluripally, Samaikya and Gulhane, Aniket and Mitra, Reshmi and Hoque, Khaza Anuarul and Calyam, Prasad "Attack Trees for Security and Privacy in Social Virtual Reality Learning Environments" 2020 IEEE 17th Annual Consumer Communications & Networking Conference (CCNC) , 2020 https://doi.org/10.1109/CCNC46108.2020.9045724 Citation Details
Wang, Songjie "Cost-Performance Trade-Offs in Fog Computing for IoT Data Processing of Social Virtual Reality" 2019 IEEE International Conference on Fog Computing (ICFC) , 2019 Citation Details
Zizza, Chiara Shreya and Starr, Adam and Hudson, Devin and Nuguri, Sai and Calyam, Prasad and He, Zhihai "Towards a social virtual reality learning environment in high fidelity" IEEE Consumer Communications & Networking Conference , 2018 10.1109/CCNC.2018.8319187 Citation Details

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

In this project, we developed a high-speed network-enabled, immersive, and smart virtual reality application, called vSocial, to connect children with autism spectrum disorder (ASD) from different geographical regions for online social training. We successfully leveraged our social competence intervention (SCI) curriculum, making the face-to-face curriculum available online to youth with ASD and to public schools that would otherwise, due to geographical and personal limitations, have no access to such expert-provided programs.

Details of the key outcomes of the project are as follows:

1) Based on the High Fidelity platform, we successfully developed four curriculum units, including orientation, group coordination, and facial recognition, for online VR social training of children with ASD.

2) We have developed a set of high-speed networking, security, management, computer vision, and VR-based human-computer interaction methods and tools to support our online VR training course development. These tools include:

(a) devices and deep-learning-based algorithms that capture eye images inside the VR headset and recognize facial expressions from faces partially covered by the headset. For five facial expressions (neutral, happy, angry, disgusted, and surprised), we achieved an average classification accuracy of 88%;
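As a minimal illustration only (the project used trained deep CNNs; the toy architecture, input size, and random weights below are assumptions, with only the five expression classes taken from the summary above), expression classification from an eye-region crop can be sketched as a forward pass through a small classifier:

```python
import numpy as np

# Five expression classes reported in the outcomes summary.
CLASSES = ["neutral", "happy", "angry", "disgusted", "surprised"]

rng = np.random.default_rng(0)

def init_params(in_dim=32 * 32, hidden=64, out_dim=len(CLASSES)):
    """Randomly initialized weights; a real system would learn these
    by training (the project used deep CNNs, not this toy MLP)."""
    return {
        "W1": rng.normal(0, 0.01, (in_dim, hidden)),
        "b1": np.zeros(hidden),
        "W2": rng.normal(0, 0.01, (hidden, out_dim)),
        "b2": np.zeros(out_dim),
    }

def predict(params, eye_image):
    """Flatten a grayscale eye-region crop, apply one hidden layer with
    ReLU, then softmax over the five expression classes."""
    x = eye_image.reshape(-1)
    h = np.maximum(0, x @ params["W1"] + params["b1"])
    logits = h @ params["W2"] + params["b2"]
    p = np.exp(logits - logits.max())
    p /= p.sum()
    return CLASSES[int(np.argmax(p))], p

params = init_params()
label, probs = predict(params, rng.random((32, 32)))
```

The same interface (image in, class probabilities out) applies whether the classifier is this toy MLP or a trained CNN.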

(b) machine learning algorithms and hardware-software interfaces that support natural human-VR interaction by capturing body pose and motion and mapping poses and gestures into commands and controls in the VR world, such as walking, running, turning, and approving (OK);
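The pose-to-command mapping can be sketched with simple rules over tracked pose features; the feature names, thresholds, and rule set below are illustrative assumptions (the project used learned models rather than fixed thresholds):

```python
from enum import Enum

class Command(Enum):
    """VR world commands of the kind named above."""
    IDLE = 0
    WALK = 1
    RUN = 2
    TURN = 3
    OK = 4

def pose_to_command(leg_speed, torso_yaw_rate, thumb_up):
    """Map simple pose features to a VR command.

    leg_speed: stride speed in m/s; torso_yaw_rate: deg/s;
    thumb_up: whether a thumbs-up gesture is detected.
    All threshold values are illustrative, not from the project.
    """
    if thumb_up:
        return Command.OK       # thumbs-up gesture -> approve (OK)
    if abs(torso_yaw_rate) > 30.0:
        return Command.TURN     # fast torso rotation -> turn
    if leg_speed > 2.0:
        return Command.RUN
    if leg_speed > 0.3:
        return Command.WALK
    return Command.IDLE
```

In a learned version, the rule body would be replaced by a classifier over the same pose features, keeping the command interface unchanged.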

(c) tools and in-world gadgets for student-instructor interaction, such as a virtual iPad for students to search and browse information, an audio ball that lets students retrieve audio messages while walking, a strike system for instructors to award prizes or penalties to students, and audio attenuation that mimics sound fading with distance, through walls, or as a speaker walks away;
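Distance-based audio attenuation of the kind described can be sketched with an inverse-distance rolloff plus a per-wall damping factor; the constants and the simple wall model are illustrative assumptions, not the project's audio engine:

```python
def attenuated_gain(distance_m, walls_between=0,
                    ref_distance=1.0, wall_loss=0.5):
    """Return a gain in [0, 1] for a sound source.

    Gain falls off as ref_distance / distance (inverse-distance law,
    clamped to full volume inside the reference distance), and is
    further damped by wall_loss for each wall between source and
    listener. Both constants are illustrative.
    """
    d = max(distance_m, ref_distance)
    gain = ref_distance / d
    gain *= wall_loss ** walls_between
    return gain
```

As a speaker walks away, repeatedly evaluating this gain with the updated distance produces the fading effect described above.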

(d) network security methods and tools based on a novel risk assessment framework that uses attack trees to calculate a risk score for varied VR learning threats, taking the rate and duration of threats as inputs.
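The attack-tree style of risk scoring can be sketched as follows: each leaf threat gets an occurrence probability from its rate and exposure duration (here via a Poisson-arrival model, an assumption), and AND/OR gates combine child probabilities up the tree. The example tree, threat names, and numbers are illustrative, not the project's actual assessment:

```python
import math

def leaf_prob(rate_per_hour, duration_hours):
    """P(at least one occurrence) under a Poisson arrival assumption."""
    return 1.0 - math.exp(-rate_per_hour * duration_hours)

def or_gate(probs):
    """Attack succeeds if any child succeeds (independence assumed)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(probs):
    """Attack succeeds only if all children succeed."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Illustrative tree: root = OR(eavesdropping,
#                              AND(session hijack, weak authentication))
eavesdrop = leaf_prob(rate_per_hour=0.05, duration_hours=2)
hijack = leaf_prob(rate_per_hour=0.02, duration_hours=2)
weak_auth = 0.3  # assumed static probability for the example
risk_score = or_gate([eavesdrop, and_gate([hijack, weak_auth])])
```

The resulting root probability can then be scaled or bucketed into the kind of risk score the framework reports.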

3) We conducted eight usability tests and surveys to gather feedback on the system's user experience, including the comfort level and cybersickness associated with wearing the VR headset, and used the results to improve the system.

4) The team entered the Pose Challenge competition with its deep learning methods for accurate human pose estimation and ranked 4th in the world.

The project work received news coverage both at the University of Missouri college level (https://engineering.missouri.edu/2018/08/mu-researchers-working-on-groundbreaking-vr-classroom) and at the campus level (https://news.missouri.edu/2018/creating-a-virtual-reality).

We have released various datasets, code, and testbed scripts for “OnTimeSocial”, a social network portal, at https://github.com/mizzou-viman-lab/ontimesocial-sgc. The OnTimeSocial web application can be used by vSocial instructors to track the progress of remote students logging into a VRLE.

 


Last Modified: 03/24/2021
Modified by: Zhihai He
