Award Abstract # 1645463
EAGER: Wide Field of View Augmented Reality Display with Dynamic Focus

NSF Org: IIS
Division of Information & Intelligent Systems
Recipient: UNIVERSITY OF NORTH CAROLINA AT CHAPEL HILL
Initial Amendment Date: July 6, 2016
Latest Amendment Date: July 6, 2016
Award Number: 1645463
Award Instrument: Standard Grant
Program Manager: Ephraim Glinert
IIS: Division of Information & Intelligent Systems
CSE: Directorate for Computer and Information Science and Engineering
Start Date: August 1, 2016
End Date: July 31, 2018 (Estimated)
Total Intended Award Amount: $81,657.00
Total Awarded Amount to Date: $81,657.00
Funds Obligated to Date: FY 2016 = $81,657.00
History of Investigator:
  • Henry Fuchs (Principal Investigator)
    fuchs@cs.unc.edu
Recipient Sponsored Research Office: University of North Carolina at Chapel Hill
104 AIRPORT DR STE 2200
CHAPEL HILL
NC  US  27599-5023
(919)966-3411
Sponsor Congressional District: 04
Primary Place of Performance: University of North Carolina at Chapel Hill
201 S. Columbia St.
Chapel Hill
NC  US  27599-3175
Primary Place of Performance Congressional District: 04
Unique Entity Identifier (UEI): D3LHU66KBLD5
Parent UEI: D3LHU66KBLD5
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01001617DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 7367, 7916
Program Element Code(s): 736700
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Augmented reality (AR) has the potential to integrate computer graphics and the human visual system. The technology has been demonstrated to improve task performance in areas such as communication, accessibility, worker efficiency and medicine. However, use of AR technology is not yet widespread, and it is still viewed as a novelty. The PI believes this is partially due to the lack of adequate display technology to support compelling applications. The most advanced head-mounted devices available today present a flat image located at a fixed distance from the user's eyes, and optical see-through devices are often quite limited in coverage of the visual field. In this exploratory project, the PI and his team will design and implement a prototype of a novel see-through augmented reality display system that provides both wide field of view and variable focal depth. If successful, this device will lay the foundation for a new class of augmented reality display that enables virtual objects to have the same focal depth as real world objects at any location, creating better fusion of real and virtual, which will enable society to take advantage of the capabilities augmented reality can provide. Such a new class of display will open exciting research avenues; for example, by incorporating gaze tracking such devices could have even wider research applications by adapting to the user's gaze direction and fixation distance.

The PI's approach relies upon a deformable membrane optical combiner which can be shaped to change the optical depth of the virtual image. This single optical element solution also allows for the creation of virtual imagery that covers a large portion of the visual field. By employing a single reflective optical element as the vari-focal relay optics, this project will simplify the design of see-through vari-focal optical systems for near-eye displays. This technique also promises a large aperture size, leading to wide field of view near-eye display solutions for AR applications. In theory, this approach can provide a full field of view solution with a display and aperture in the proper configuration. Upon completion of a working prototype, many perceptual user studies which were previously difficult to perform will become easily available, enabling a deeper understanding of the human visual system.
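The varifocal behavior described above follows from basic mirror optics. As a hedged illustration (not code from the project), the sketch below assumes the deformable membrane approximates a spherical concave mirror with focal length f = R/2 and applies the mirror equation 1/f = 1/d_o + 1/d_i; the display distance and curvature values are hypothetical:

```python
# Illustrative sketch: how deforming a membrane mirror (changing its
# radius of curvature R) shifts the depth of the virtual image.
# Assumes a spherical mirror (f = R/2) and the thin-mirror equation
# 1/f = 1/d_o + 1/d_i. Distances in meters; negative d_i = virtual image.

def virtual_image_distance(radius_of_curvature: float, display_distance: float) -> float:
    """Image distance d_i from the mirror equation (negative => virtual image)."""
    f = radius_of_curvature / 2.0          # focal length of a spherical mirror
    return 1.0 / (1.0 / f - 1.0 / display_distance)

# Deforming the membrane moves the virtual image farther away:
for R in (0.5, 0.4, 0.3):                  # hypothetical curvature radii (m)
    d_i = virtual_image_distance(R, display_distance=0.1)
    print(f"R = {R:.1f} m -> virtual image at {d_i:.3f} m")
```

With the (assumed) display 0.1 m from the membrane, flattening the curvature from 0.5 m to 0.3 m pushes the virtual image from about 0.17 m to 0.30 m behind the combiner, which is the dynamic-focus effect the project targets.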

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Dunn, David and Chakravarthula, Praneeth and Dong, Qian and Akşit, Kaan and Fuchs, Henry "10-1: Towards Varifocal Augmented Reality Displays using Deformable Beamsplitter Membranes" SID Symposium Digest of Technical Papers , v.49 , 2018 10.1002/sdtp.12490 Citation Details
Dunn, David and Dong, Qian and Fuchs, Henry and Chakravarthula, Praneeth and Osten, Wolfgang and Stolle, Hagen and Kress, Bernard C. "Mitigating vergence-accommodation conflict for near-eye displays via deformable beamsplitters" Proceedings Volume 10676, Digital Optics for Immersive Displays , 2018 10.1117/12.2314664 Citation Details
Dunn, David and Tippets, Cary and Torell, Kent and Fuchs, Henry and Kellnhofer, Petr and Myszkowski, Karol and Didyk, Piotr and Akşit, Kaan and Luebke, David "Membrane AR: varifocal, wide field of view augmented reality display from deformable membranes" ACM SIGGRAPH 2017 Emerging Technologies , 2017 10.1145/3084822.3084846 Citation Details
Dunn, David and Tippets, Cary and Torell, Kent and Kellnhofer, Petr and Aksit, Kaan and Didyk, Piotr and Myszkowski, Karol and Luebke, David and Fuchs, Henry "Wide Field Of View Varifocal Near-Eye Display Using See-Through Deformable Membrane Mirrors" IEEE Transactions on Visualization and Computer Graphics , v.23 , 2017 10.1109/TVCG.2017.2657058 Citation Details

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

This project sought to develop improved head-mounted displays for augmented reality, displays that allow the wearer to observe computer-generated objects while also viewing the wearer's surroundings. Often these computer-generated objects are three-dimensional and are placed at fixed locations in the user's surroundings, for example a virtual sofa in the user's actual living room (or a virtual "zombie" next to the user's friend). The apparent distance of real objects is determined by a combination of vergence of the user's two eyes (the angle that the two eyes need to rotate for the two images of the object to match) and accommodation, the focus adjustment of each of the user's eyes. These two factors, vergence and accommodation, are naturally neurally coupled, and when they don't match, the user experiences discomfort, sometimes double vision, and occasionally nausea. One of the problems with most current augmented reality displays is that their optics put the internal, virtual image at a fixed distance from the user, for example 2 meters, even though the objects displayed may be much closer or much farther away. Since the user is simultaneously observing both virtual imagery and real objects, this mismatch between vergence and accommodation is especially problematic.
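The size of this mismatch is easy to quantify. The sketch below (illustrative only, not from the report) computes the vergence angle and the accommodation demand for an object straight ahead, assuming a hypothetical interpupillary distance of 64 mm; accommodation demand is conventionally measured in diopters (1/meters):

```python
import math

# Illustrative numbers: vergence angle and accommodation demand for an
# object centered in front of the viewer. Assumes IPD = 64 mm.

IPD = 0.064  # interpupillary distance in meters (assumed)

def vergence_angle_deg(distance_m: float) -> float:
    """Angle between the two eyes' lines of sight, in degrees."""
    return math.degrees(2.0 * math.atan((IPD / 2.0) / distance_m))

def accommodation_diopters(distance_m: float) -> float:
    """Focus demand of each eye, in diopters (1/m)."""
    return 1.0 / distance_m

# A display with a fixed 2 m focal plane showing an object rendered at 0.5 m:
# the eyes converge for 0.5 m but must focus at 2 m.
conflict = accommodation_diopters(0.5) - accommodation_diopters(2.0)
print(f"vergence: {vergence_angle_deg(0.5):.2f} deg, conflict: {conflict:.1f} D")
```

For the example in the paragraph above, an object rendered at 0.5 m on a 2 m fixed-focus display leaves a 1.5-diopter gap between where the eyes converge and where they must focus, which is the conflict a varifocal display removes.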

The research supported by this grant sought to ameliorate this problem by developing displays whose focus distance can be dynamically adjusted to match the distance of the virtual object at the user's current gaze.  

The prototype display, developed in collaboration with researchers from NVIDIA and MPI Informatics, proved to be quite effective. The research paper describing it and its tests received the best paper award at the 2017 IEEE Virtual Reality Conference. We also demonstrated this prototype at the ACM SIGGRAPH Emerging Technologies exhibition, where it won the Digital Content Expo Japan Prize.

Encouraged by the effectiveness of the prototype display, we developed a still more capable display, "FocusAR," that could not only dynamically adjust the focal depth of the internal (virtual) imagery but could also simultaneously and independently adjust the focus of the view of the user's "real world" surroundings. This dynamic adjustment is particularly important for users over 45 years old, who almost always need some adjustment for near objects, far objects, or both.

Results from this second display, developed in collaboration with Dr. Kaan Aksit, of NVIDIA Research, were also well received. The paper introducing this display received the Best Paper Award at IEEE ISMAR 2018, the International Symposium on Mixed and Augmented Reality.

In the future, these kinds of dynamic adjustments may find their way into commercial head-mounted displays, which will make them more comfortable to use and thus more effective for a wide variety of tasks.


Last Modified: 12/28/2018
Modified by: Henry Fuchs
