Award Abstract # 1714623
SaTC: CORE: Small: Eye Movement Biometrics in Virtual and Augmented Reality

NSF Org: CNS (Division of Computer and Network Systems)
Recipient: TEXAS STATE UNIVERSITY
Initial Amendment Date: August 16, 2017
Latest Amendment Date: February 10, 2020
Award Number: 1714623
Award Instrument: Standard Grant
Program Manager: Jeremy Epstein
CNS Division of Computer and Network Systems
CSE Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2017
End Date: September 30, 2022 (Estimated)
Total Intended Award Amount: $499,988.00
Total Awarded Amount to Date: $647,981.00
Funds Obligated to Date: FY 2017 = $499,988.00
FY 2018 = $16,000.00
FY 2019 = $115,993.00
FY 2020 = $16,000.00
History of Investigator:
  • Oleg Komogortsev (Principal Investigator)
    ok11@txstate.edu
Recipient Sponsored Research Office: Texas State University - San Marcos
601 UNIVERSITY DR
SAN MARCOS
TX  US  78666-4684
(512)245-2314
Sponsor Congressional District: 15
Primary Place of Performance: Texas State University - San Marcos
601 University Dr
San Marcos
TX  US  78666-4684
Primary Place of Performance Congressional District: 15
Unique Entity Identifier (UEI): HS5HWWK1AAU5
Parent UEI:
NSF Program(s): Special Projects - CNS, Secure & Trustworthy Cyberspace
Primary Program Source: 01001718DB NSF RESEARCH & RELATED ACTIVIT
01001819DB NSF RESEARCH & RELATED ACTIVIT
01001920DB NSF RESEARCH & RELATED ACTIVIT
01002021DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 025Z, 7434, 7923, 9178, 9251
Program Element Code(s): 171400, 806000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Virtual and augmented reality (VR/AR) applications are expected to play an increasingly important role in many aspects of everyday life; however, we do not yet have effective methods for protecting VR/AR systems from cybersecurity threats. The goal of this research is to make VR/AR systems more secure via the development of highly accurate and counterfeit-resistant biometric techniques based on eye movements. These techniques are based on the computational modeling of multiple characteristics of the way individuals move their eyes. The development of trustworthy solutions for performing biometric recognition in such systems is critical for the creation of a cybersecurity infrastructure that can adequately serve emerging applications of VR/AR for social networking, health monitoring, and economic transactions. Improved understanding of distinctive eye movement features could also facilitate their use for the detection of cyber-sickness, stress, fatigue, concussions and other states that manifest in abnormalities of human vision. The education component of the project will help recruit a greater number of diverse students to careers in computer science as well as interdisciplinary studies involving computer science, and it will better prepare students to be key players in the next generation of innovators.

The goal of this project is to advance the current state of security in VR/AR systems via the development of highly accurate and counterfeit-resistant biometric techniques based on eye movements. Eye movement-driven biometrics is significantly more challenging in VR/AR environments than in 2D spaces: the 3-D environment produces very complex eye movements that are hard to classify accurately, and the number of extracted eye movement-driven features is much larger. This project has two major thrusts: (1) biometric recognition: establishing the baseline for person recognition performance via eye movement characteristics in VR/AR environments; and (2) counterfeit-resistance: researching robustness against spoofing attacks (e.g., attempts to defeat a biometric system through the introduction of fake biometric samples). This research provides answers to important questions related to the uniqueness, variability, scalability, and longevity of eye movement characteristics in VR/AR environments. The outcome of this work will be a new method to address the biometric security vulnerabilities of current and future VR/AR systems.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


(Showing: 1 - 10 of 14)
Friedman, Lee and Lohr, Dillon James and Hanson, Timothy and Komogortsev, Oleg V. "Angular offset distributions during fixation are, more often than not, multimodal" Journal of Eye Movement Research, v.14, 2021. https://doi.org/10.16910/jemr.14.3.2
Friedman, Lee and Prokopenko, Vladyslav and Djanian, Shagen and Katrychuk, Dmytro and Komogortsev, Oleg V. "Factors affecting inter-rater agreement in human classification of eye movements: a comparison of three datasets" Behavior Research Methods, 2022. https://doi.org/10.3758/s13428-021-01782-4
Friedman, Lee and Stern, Hal and Prokopenko, Vladyslav and Djanian, Shagen and Griffith, Henry and Komogortsev, Oleg "Biometric Performance as a Function of Gallery Size" Applied Sciences, v.12, 2022. https://doi.org/10.3390/app122111144
Friedman, Lee and Stern, Hal S. and Price, Larry R. and Komogortsev, Oleg V. "Why Temporal Persistence of Biometric Features, as Assessed by the Intraclass Correlation Coefficient, Is So Valuable for Classification Performance" Sensors, v.20, 2020. https://doi.org/10.3390/s20164555
Griffith, Henry and Lohr, Dillon and Abdulin, Evgeny and Komogortsev, Oleg "GazeBase, a large-scale, multi-stimulus, longitudinal eye movement dataset" Scientific Data, v.8, 2021. https://doi.org/10.1038/s41597-021-00959-y
Griffith, Henry K. and Komogortsev, Oleg V. "Texture Feature Extraction From Free-Viewing Scan Paths Using Gabor Filters With Downsampling" ACM Symposium on Eye Tracking Research and Applications, 2020. https://doi.org/10.1145/3379157.3391423
Katrychuk, Dmytro and Griffith, Henry and Komogortsev, Oleg "A Calibration Framework for Photosensor-based Eye-Tracking System" ACM Symposium on Eye Tracking Research and Applications, 2020. https://doi.org/10.1145/3379156.3391370
Katrychuk, Dmytro and Griffith, Henry K. and Komogortsev, Oleg V. "Power-efficient and shift-robust eye-tracking sensor for portable VR headsets" Eye Tracking Research and Applications Symposium (ETRA 2019), 2019. https://doi.org/10.1145/3314111.3319821
Lohr, Dillon and Berndt, Samuel-Hunter and Komogortsev, Oleg "An implementation of eye movement-driven biometrics in virtual reality" ACM Symposium on Eye Tracking Research & Applications (ETRA 2018), 2018. https://doi.org/10.1145/3204493.3208333
Lohr, Dillon and Griffith, Henry and Aziz, Samantha and Komogortsev, Oleg "A Metric Learning Approach to Eye Movement Biometrics" IEEE International Joint Conference on Biometrics (IJCB), 2020. https://doi.org/10.1109/IJCB48548.2020.9304859
Lohr, Dillon and Griffith, Henry and Komogortsev, Oleg V. "Eye Know You: Metric Learning for End-to-End Biometric Authentication Using Eye Movements From a Longitudinal Dataset" IEEE Transactions on Biometrics, Behavior, and Identity Science, v.4, 2022. https://doi.org/10.1109/TBIOM.2022.3167633

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

This project concentrated on understanding the highly dynamic and individualistic traits of eye movements, including their relationship to internal, non-visible anatomical properties of the human eye and to the brain's strategies for guiding visual attention, in order to assess the utility of eye-movement biometrics: an accurate and spoof-resistant method of person authentication.

 

We found individual differences in the ways people move their eyes at both macro and micro scales. If eye motility is recorded with high enough quality, the eye-movement patterns of any two individuals differ enough that it is possible to distinguish between them with an accuracy comparable to fingerprints. This discovery validates eye movements as an authentication method for devices with eye-tracking capabilities, and it could enable accurate, spoof-resistant device-unlock features aided by eye-movement analysis.

 

We also found, via theoretical analysis and empirical work with the data, that eye-movement-driven biometrics cannot be employed for identification when the pool of users is large. This finding has significant positive privacy implications, because it allays fears that people could somehow be re-identified based on their eye movements. It is important to understand that users can be reliably authenticated, but not re-identified, when the pool of users is large. Put simply, the eye-movement signal contains enough information to indicate whether two recordings come from the same person, but not enough unique information to pick out one specific person from a large pool. This is very different from iris-based biometrics, where there is enough information to distinguish a person within a large group.
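The gallery-size effect can be illustrated with a toy simulation (synthetic Gaussian features, not the project's data; the function name and every parameter below are hypothetical). Rank-1 identification asks "which enrolled person is this?", and its accuracy falls as the enrolled pool grows, even though the features themselves do not change:

```python
import numpy as np

def rank1_accuracy(n_people, n_trials=20, n_features=5, noise=1.0, seed=0):
    """Fraction of noisy probes whose nearest enrolled template belongs to
    the correct person. Synthetic Gaussian features; illustrative only."""
    rng = np.random.default_rng(seed)
    correct = total = 0
    for _ in range(n_trials):
        # One template per enrolled person, one noisy probe per person.
        templates = rng.normal(size=(n_people, n_features))
        probes = templates + rng.normal(scale=noise, size=templates.shape)
        # Nearest-template decision by Euclidean distance.
        d = np.linalg.norm(probes[:, None, :] - templates[None, :, :], axis=2)
        correct += np.sum(np.argmin(d, axis=1) == np.arange(n_people))
        total += n_people
    return correct / total

for n in (10, 100, 1000):
    print(f"gallery size {n:5d}: rank-1 accuracy {rank1_accuracy(n):.3f}")
```

Verification ("is this the claimed person?") instead compares a probe against a single template and thresholds the distance, so its error rates do not depend on the gallery size; that asymmetry is the intuition behind the authenticate-but-not-identify finding.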

 

The finding that the eye-movement signal contains person-specific units of information was verified via statistical methods based on the idea of improving authentication accuracy by selecting uncorrelated eye-motility-derived features with a high degree of temporal persistence. The use of statistical methods in eye-movement-driven biometrics helps to understand and interpret why fingerprint-like authentication accuracy can be achieved from the eye-movement signal.
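Temporal persistence of a feature can be quantified with an intraclass correlation coefficient (ICC). The sketch below uses the one-way random-effects variant ICC(1,1) on synthetic data; the actual study used real eye-movement features and may have used a different ICC variant:

```python
import numpy as np

def icc1(x):
    """One-way random-effects ICC(1,1) for an (n_subjects, k_sessions)
    array: near 1 when a feature is stable within a person relative to
    the differences between people, near 0 when it is session noise."""
    n, k = x.shape
    # Between-subject and within-subject mean squares (one-way ANOVA).
    msb = k * np.sum((x.mean(axis=1) - x.mean()) ** 2) / (n - 1)
    msw = np.sum((x - x.mean(axis=1, keepdims=True)) ** 2) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

rng = np.random.default_rng(0)
subject_effect = rng.normal(size=(200, 1))
persistent = subject_effect + 0.1 * rng.normal(size=(200, 2))  # stable trait
transient = rng.normal(size=(200, 2))                          # session noise only
print(icc1(persistent), icc1(transient))
```

Selecting features with high ICC (and low mutual correlation) is what drives the authentication accuracy described above.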

 

We also created machine learning (ML) architectures that outperformed the statistical methods, providing higher authentication accuracy while requiring less data. This makes eye-movement-driven authentication comparable in performance to a 4-digit PIN, and thus a practical authentication method that can be considered for actual use.
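The comparison to a 4-digit PIN can be made concrete: a PIN admits a random guess with probability 1 in 10,000, so an equivalent biometric operating point sets the decision threshold where the false accept rate (FAR) is 1e-4 and then reports the false reject rate (FRR) there. A sketch with synthetic similarity scores standing in for embedding comparisons (both score distributions are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical similarity scores; a real system would compute these by
# comparing learned embeddings of enrollment and probe recordings.
genuine = rng.normal(6.0, 1.0, 100_000)     # same-person comparisons
imposter = rng.normal(0.0, 1.0, 1_000_000)  # different-person comparisons

# Operating point matched to a 4-digit PIN: FAR = 1/10,000.
threshold = np.quantile(imposter, 1 - 1e-4)
far = np.mean(imposter >= threshold)
frr = np.mean(genuine < threshold)
print(f"threshold={threshold:.2f}  FAR={far:.1e}  FRR={frr:.4f}")
```

With well-separated score distributions, the FRR at this threshold stays small, which is the sense in which the system is "comparable to a 4-digit PIN" while remaining convenient for the genuine user.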

 

Interestingly, we found that the features the ML architectures extract from the eye-movement signal follow the recommendations of the statistical theory that was developed by analyzing hand-crafted features: ML-based features are normally distributed, only weakly correlated with one another, and temporally persistent. Further research is required to better understand this phenomenon.

 

While ML methods achieve the best authentication performance, we feel that a conventional statistical pipeline is what allows us to understand why the information encoded in eye movements can answer questions such as: "How can the neurological health of a person be assessed using the same features that are employed for authentication?", "How do human states such as fatigue, emotion, and stress affect the eye-movement signal and the performance of an eye-movement biometrics system in authentication and health-assessment modes?", and "How does the quality of the captured eye-tracking signal affect the performance of an eye-movement-driven biometric system in those modes?". If such questions were answered with ML methods alone, the results would not provide meaningful insight into why a certain level of accuracy is achieved on a specific dataset or how that level of performance will transfer to a different set of data. Eye movements provide a beautiful and analytically tenable ecosystem in which it is possible to understand the neurological component that initiates a movement, the oculomotor plant that executes it, and the gaze-estimation pipeline that assesses the state of the eye. An ML black box can achieve a certain accuracy number, but it does not explain why exactly that level of performance is possible.

We also verified the efficacy of eye movements as a defense against print attacks: printed images of the human eye presented to a biometric system.

 

We have made publicly available the eye-movement datasets recorded during this work to facilitate future research in eye-movement biometrics.

 

Our findings of high authentication accuracy and spoofing resistance have positive implications for future virtual and augmented reality platforms, which are expected to incorporate eye-tracking hardware to improve display quality and enable other applications of eye tracking. Assuming the eye-motility signal can be captured on such platforms with high enough quality, eye movements can serve as one of the most secure ways to authenticate people on VR/AR devices, or even provide broader user understanding while following strict privacy guidelines.


Last Modified: 01/29/2023
Modified by: Oleg V Komogortsev
