
NSF Org: | IIS Division of Information & Intelligent Systems |
Recipient: | |
Initial Amendment Date: | March 21, 2019 |
Latest Amendment Date: | March 20, 2024 |
Award Number: | 1846477 |
Award Instrument: | Continuing Grant |
Program Manager: | Han-Wei Shen, hshen@nsf.gov, (703) 292-2533, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering |
Start Date: | April 1, 2019 |
End Date: | March 31, 2026 (Estimated) |
Total Intended Award Amount: | $500,000.00 |
Total Awarded Amount to Date: | $552,000.00 |
Funds Obligated to Date: | FY 2020 = $115,614.00; FY 2021 = $110,110.00; FY 2022 = $144,392.00; FY 2023 = $74,761.00; FY 2024 = $20,000.00 |
History of Investigator: | |
Recipient Sponsored Research Office: | A-153 ASB, Provo, UT, US 84602-1128, (801) 422-3360 |
Sponsor Congressional District: | |
Primary Place of Performance: | A-285 ASB, Provo, UT, US 84602-1231 |
Primary Place of Performance Congressional District: | |
Unique Entity Identifier (UEI): | |
Parent UEI: | |
NSF Program(s): | HCC-Human-Centered Computing |
Primary Program Source: | 01001920DB NSF RESEARCH & RELATED ACTIVIT; 01002021DB NSF RESEARCH & RELATED ACTIVIT; 01002122DB NSF RESEARCH & RELATED ACTIVIT; 01002324DB NSF RESEARCH & RELATED ACTIVIT; 01002425DB NSF RESEARCH & RELATED ACTIVIT |
Program Reference Code(s): | |
Program Element Code(s): | |
Award Agency Code: | 4900 |
Fund Agency Code: | 4900 |
Assistance Listing Number(s): | 47.070 |
ABSTRACT
The goal of this project is to increase both interest and competence in STEM by developing a new type of 3D display as a tool for teaching spatial reasoning. Strong spatial reasoning skills have been important to many of humanity's greatest advances in science and engineering. To help teach these skills, this research will develop a display platform that is entirely different from all preceding 3D display technologies. The display uses light to trap multiple particles in air, then moves and illuminates these trapped particles to draw 3D images. We perceive these "spatial images" as physical objects because they are, in fact, physical objects in space. These full-color, high-definition images can be seen from every direction. The technology makes possible, for the first time, many of the 3D images portrayed in science fiction. In addition to the broader impacts for STEM education built directly into the primary research objective, a team of undergraduates under the supervision of the principal investigator (PI) will also design a 3D-centric Saturday academy called 'Spatial Forces' for middle and high school students from groups underrepresented in engineering, with a focus on low-income and rural students. These students will create interactive, revolving 3D content for a micro-museum aimed at exposing their communities to STEM topics of local relevance (e.g., preserving Native American artifacts, water conservation, the biology of native game fish). The programming will include learning from, and creating content for, demonstration installations to be incorporated into micro-museums. As part of the academy experience, students will be evaluated for both affect and spatial ability, and the results of the program will be assessed by an Education Advisory Board and disseminated to both local and out-of-state partners. Opportunities for program perpetuation and expansion will be provided through a collaboration with the Utah STEM Action Center.
The long-term goal of this research is to create glasses-free, interactive 3D environments as tools to expand human capacity and creativity. The PI's recent research has led to a novel, non-holographic method of screenless 3D display with the potential to change how we interact with our data by making it physically present. To this end, the objective of the current project is to create and evaluate parallel optical trap displays (OTDs) as tools for spatial thinking. Free-floating 3D displays have been the "holy grail" of 3D imaging for over a century. Such a display would be potentially transformative for many information visualization applications; however, the application space with perhaps the greatest potential scientific impact is spatial thinking, a skill that has been foundational to many of the greatest scientific achievements in history and is strongly associated with both interest and achievement in STEM. The PI's prototype OTD is a natively spatial 3D technology well suited to spatial thinking; it is currently capable of drawing full-color, video-rate images of small objects with a single trapped particle. Achieving larger image volumes will require a new "parallel" OTD approach that creates a large-volume OTD from multiple trapped particles illuminated independently and simultaneously. The central hypothesis of this work is that parallel OTDs will lead to better mastery of spatial concepts (as measured by standard spatial thinking tests) by making data physical and interactive. To test this hypothesis, the PI will optimize single-particle traps, develop at least two parallel display methods (e.g., point trap arrays and line trap arrays), and compare parallel OTDs against screen-based tools for spatial reasoning. Parallel OTDs will be capable of providing all of the 3D visual cues of holography (accommodation, parallax, and potentially even occlusion) without being subject to its limitations (aperture constraints and prohibitive computational complexity).
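To make the scaling argument concrete, the back-of-the-envelope sketch below estimates how much image path a persistence-of-vision display can trace per frame with one trapped particle versus a parallel array. The scan speed and frame rate used here are illustrative assumptions for this sketch, not figures reported by the project.

    # Back-of-the-envelope estimate for optical trap displays (OTDs).
    # Both constants below are illustrative assumptions, not project values.
    SCAN_SPEED_M_S = 1.8   # assumed peak particle scan speed, meters/second
    FRAME_RATE_HZ = 30     # video rate: the image must be redrawn 30x/second

    def path_length_per_frame(num_particles: int) -> float:
        """Total path (meters) the trapped particles can trace in one frame.

        One particle can draw SCAN_SPEED_M_S / FRAME_RATE_HZ meters before
        the frame must be redrawn; N independently illuminated particles
        scale that drawable path linearly, motivating the parallel approach.
        """
        return num_particles * SCAN_SPEED_M_S / FRAME_RATE_HZ

    for n in (1, 10, 100):
        print(f"{n:>3} particle(s): {100 * path_length_per_frame(n):.0f} cm of path per frame")

Under these assumed numbers, a single particle can trace only about 6 cm of image per frame, illustrating why a single-trap OTD is limited to small objects and why a parallel array of traps is the route to larger image volumes.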
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH