Award Abstract # 2112778
CNS Core: Small: A Split Software Architecture for Enabling High-Quality Mixed Reality on Commodity Mobile Devices

NSF Org: CNS
Division Of Computer and Network Systems
Recipient: PURDUE UNIVERSITY
Initial Amendment Date: August 5, 2021
Latest Amendment Date: August 5, 2021
Award Number: 2112778
Award Instrument: Standard Grant
Program Manager: Marilyn McClure (mmcclure@nsf.gov, (703) 292-5197)
CNS Division Of Computer and Network Systems
CSE Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2021
End Date: September 30, 2025 (Estimated)
Total Intended Award Amount: $424,434.00
Total Awarded Amount to Date: $424,434.00
Funds Obligated to Date: FY 2021 = $424,434.00
History of Investigator:
  • Charlie Hu (Principal Investigator)
Recipient Sponsored Research Office: Purdue University
2550 NORTHWESTERN AVE # 1100
WEST LAFAYETTE
IN  US  47906-1332
(765)494-1055
Sponsor Congressional District: 04
Primary Place of Performance: Purdue University
465 Northwestern Avenue
West Lafayette
IN  US  47907-2114
Primary Place of Performance Congressional District: 04
Unique Entity Identifier (UEI): YRXVL4JYCEF5
Parent UEI: YRXVL4JYCEF5
NSF Program(s): CSR-Computer Systems Research
Primary Program Source: 01002122DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 7923, 7354
Program Element Code(s): 735400
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

By blending the physical and digital worlds into a programmed experience, Mixed Reality (MR) allows users to visualize and interact with digital information such as 3D overlays and real-time data, with important applications in many societal domains including education, remote work, military training, and health care such as tele-medicine. Despite the tremendous potential of MR technology, the MR solutions on today's market are either enterprise-grade, which are costly, or consumer-grade, which can only support low-quality MR content and thus deliver a poor user experience. The high cost and/or low quality of current MR solutions leads to a fundamental "content-adoption" dilemma for the MR industry: the lack of MR content has limited the market penetration of custom-made MR headsets, and the low market penetration of MR headsets has in turn hindered the development of MR content.

This NSF CSR project will develop key technologies to enable high-quality MR on commodity mobile devices such as smartphones, viewed through a simple see-through head-mounted device (HMD) with a high-resolution camera for input and a projector for output, such as Nreal Light glasses. These technologies will transform millions of smartphones (paired with such inexpensive HMDs) into ubiquitous MR devices, and in doing so help the MR industry overcome the "content-adoption" dilemma and pave the way for wide adoption of MR technology and its many important applications.

This project aims to create: the first split software architecture that enables high-quality MR applications to run on commodity mobile devices; the capability to jointly optimize the offloading of the multiple Deep Neural Network (DNN)-based tasks that constitute a complex, resource-intensive application such as MR over a bandwidth-limited, time-varying wireless network; the capability to jointly schedule multiple DNN-based tasks of such resource-intensive applications to efficiently share all local resources, such as the CPU, GPU, and other processors (e.g., the NPU) on emerging mobile devices; and the capability to support high-quality multi-player MR on commodity mobile devices by scaling the split software architecture across multiple devices to efficiently share limited global resources such as the wireless network and the edge cloud.
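To make the joint multi-task offloading idea concrete, the following is a minimal illustrative sketch (not the project's actual algorithm; all task names, accuracy numbers, and the staleness model are hypothetical): each frame, a greedy scheduler picks the DNN task whose offload to the edge yields the largest estimated accuracy gain over on-device inference, after discounting for the staleness introduced by uplink latency and skipping tasks whose results would miss the frame deadline.

```python
# Illustrative sketch of accuracy-centric multi-task offloading.
# All parameters below are hypothetical, for exposition only.
from dataclasses import dataclass

@dataclass
class DnnTask:
    name: str
    local_acc: float          # estimated accuracy when run on-device
    edge_acc: float           # estimated accuracy when offloaded to the edge
    upload_bytes: int         # per-frame upload size (e.g., an encoded frame)
    staleness_penalty: float  # accuracy lost per ms of result delay

def pick_offload_task(tasks, uplink_bytes_per_ms, deadline_ms):
    """Greedily choose the one task to offload this frame, or None.

    Gain = (edge accuracy - local accuracy) discounted by staleness
    from the upload latency; tasks that would miss the deadline run
    on-device instead.
    """
    best, best_gain = None, 0.0
    for t in tasks:
        latency_ms = t.upload_bytes / uplink_bytes_per_ms
        if latency_ms > deadline_ms:
            continue  # result would arrive too late to be useful
        gain = (t.edge_acc - t.local_acc) - t.staleness_penalty * latency_ms
        if gain > best_gain:
            best, best_gain = t, gain
    return best

# Hypothetical per-frame decision under a 50 KB/ms uplink and 20 ms budget.
tasks = [
    DnnTask("object_detection", 0.55, 0.80, 300_000, 0.001),
    DnnTask("depth_estimation", 0.60, 0.75, 600_000, 0.002),
]
chosen = pick_offload_task(tasks, uplink_bytes_per_ms=50_000, deadline_ms=20)
```

In this toy instance, object detection wins because its larger edge-vs-local accuracy gap outweighs its staleness discount; a real scheduler would additionally adapt the estimates to the time-varying network, which is exactly what makes the joint optimization nontrivial.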

The proposed research will have a lasting impact on knowledge discovery, the computer industry, and society. Technically, this work anticipates far-reaching impact beyond supporting AR/VR/MR on commodity smartphones, by developing general edge-assisted software architectures for enabling the class of latency-sensitive 5G/6G applications on current and future mobile computing platforms such as smart glasses. The proposed MR technologies have the potential to fundamentally overcome the "content-adoption" dilemma faced by the industry and to foster the proliferation and wide adoption of MR technologies and their many societal applications. The importance of this work is further heightened by making smartphones an important enabler of access to information and new technologies like AR/VR/MR for people in both developed and developing countries, and hence an important tool in overcoming the "digital divide".

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Moinak Ghoshal, Z. Jonny "An In-Depth Study of Uplink Performance of 5G mmWave Networks" Proceedings of The 2nd ACM SIGCOMM Workshop on 5G and Beyond Network Measurements, Modeling, and Use Cases (5G-MeMU), 2022. https://doi.org/10.1145/3538394.3546042
Meng, Jiayi and Kong, Z. Jonny and Hu, Y. Charlie and Choi, Mun Gi and Lal, Dhananjay "Do we need sophisticated system design for edge-assisted augmented reality?" EdgeSys '22: Proceedings of the 5th International Workshop on Edge Systems, Analytics and Networking, 2022. https://doi.org/10.1145/3517206.3526267
Corbett, Matthew and David-John, Brendan and Shang, Jiacheng and Hu, Y. Charlie and Ji, Bo "BystandAR: Protecting Bystander Visual Data in Augmented Reality Systems" Proceedings of the 21st Annual International Conference on Mobile Systems, Applications and Services, 2023. https://doi.org/10.1145/3581791.3596830
Corbett, Matthew and David-John, Brendan and Shang, Jiacheng and Hu, Y. Charlie and Ji, Bo "Poster: BystandAR: Protecting Bystander Visual Data in Augmented Reality Systems" the 21st Annual International Conference on Mobile Systems, Applications and Services, 2023. https://doi.org/10.1145/3581791.3597377
Dash, Pranab and Kong, Z. Jonny and Hu, Y. Charlie and Turner, Chris and Wolfensparger, Dell and Choi, Mun Gi and Kshitij, Abhinav and McLandrich, Viviane E. "How to Pipeline Frame Transfer and Server Inference in Edge-assisted AR to Optimize AR Task Accuracy?" Proc. of the 6th International Workshop on Edge Systems, Analytics and Networking (EdgeSys), 2023. https://doi.org/10.1145/3578354.3592870
Kong, Z. Jonny and Xu, Qiang and Hu, Y. Charlie "ARISE: High-Capacity AR Offloading Inference Serving via Proactive Scheduling" 2024. https://doi.org/10.1145/3643832.3661894
Kong, Z. Jonny and Xu, Qiang and Meng, Jiayi and Hu, Y. Charlie "AccuMO: Accuracy-Centric Multitask Offloading in Edge-Assisted Mobile Augmented Reality" 2023. https://doi.org/10.1145/3570361.3592531
Meng, Jiayi and Kong, Zhaoning and Xu, Qiang and Hu, Y. Charlie "Do Larger (More Accurate) Deep Neural Network Models Help in Edge-assisted Augmented Reality?" NAI'21: Proceedings of the ACM SIGCOMM 2021 Workshop on Network-Application Integration, 2021. https://doi.org/10.1145/3472727.3472807
Moinak Ghoshal, Pranab Dash "Can 5G mmWave Support Multi-user AR?" In: Hohlfeld, O., Moura, G., Pelsser, C. (eds) Passive and Active Measurement. PAM 2022. Lecture Notes in Computer Science, vol 13210, 2022. https://doi.org/10.1007/978-3-030-98785-5_8
