Award Abstract # 1421407
CSR: Small: Behavior Based User Authentication for Mobile Devices

NSF Org: CNS
Division Of Computer and Network Systems
Recipient: MICHIGAN STATE UNIVERSITY
Initial Amendment Date: August 6, 2014
Latest Amendment Date: August 6, 2014
Award Number: 1421407
Award Instrument: Standard Grant
Program Manager: Marilyn McClure
mmcclure@nsf.gov
 (703)292-5197
CNS
 Division Of Computer and Network Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: August 15, 2014
End Date: July 31, 2018 (Estimated)
Total Intended Award Amount: $500,000.00
Total Awarded Amount to Date: $500,000.00
Funds Obligated to Date: FY 2014 = $500,000.00
History of Investigator:
  • Alex Liu (Principal Investigator)
    alexliu@cse.msu.edu
Recipient Sponsored Research Office: Michigan State University
426 AUDITORIUM RD RM 2
EAST LANSING
MI  US  48824-2600
(517)355-5040
Sponsor Congressional District: 07
Primary Place of Performance: Michigan State University
428 S. Shaw Lane, Room 2132
East Lansing
MI  US  48824-1226
Primary Place of Performance Congressional District: 07
Unique Entity Identifier (UEI): R28EKN92ZTZ9
Parent UEI: VJKZC4D1JN36
NSF Program(s): CSR-Computer Systems Research,
Secure & Trustworthy Cyberspace
Primary Program Source: 01001415DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 7434, 7923
Program Element Code(s): 735400, 806000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Mobile devices equipped with touch screens have increasingly rich functionality, enhanced computing power, and greater storage capacity. These devices often contain private information such as personal photos, emails, and even corporate data. Therefore, it is crucial to have secure yet convenient user authentication mechanisms for touch screen devices. However, the widely used password/PIN/pattern based solutions are susceptible to shoulder surfing (mobile devices are often used in public settings, where observation may be deliberate or inadvertent) and to smudge attacks (the oily residues fingers leave on touch screens can be recognized by impostors), and they are inconvenient for users to enter while walking or driving.

The goal of this project is to develop a behavior based user authentication approach for touch screen devices. Rather than authenticating users solely based on what they input (such as a password/PIN/pattern), behavior based authentication relies on how users provide that input. Specifically, a user is first asked to perform certain actions, such as gestures/signatures, on touch screens; behavior feature information (such as velocity magnitude and device acceleration) is then extracted from those actions and used to authenticate the user with machine learning techniques. The intuition behind the proposed approach is that people exhibit consistent and distinguishing behavior when performing gestures and signatures on touch screens. Compared with current user authentication schemes for touch screen devices, the proposed approach is significantly harder to compromise because it is nearly impossible for impostors to reproduce another person's behavior of doing gestures/signatures through shoulder surfing or smudge attacks - they can see it, but they cannot do it.
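The extract-features-then-classify pipeline described above can be sketched in miniature. This is an illustrative toy, not the project's implementation: the helper names, the single velocity-magnitude feature, and the nearest-profile threshold rule are assumptions of this sketch; the actual work extracts richer features (e.g., device acceleration) and applies machine learning classifiers.

```python
import math

def velocity_features(samples, n_bins=4):
    """samples: list of (x, y, t) touch points from one gesture.
    Returns the mean velocity magnitude over n_bins equal-length
    sub-strokes -- a simplified stand-in for richer behavior features."""
    assert len(samples) > n_bins, "need at least n_bins + 1 touch points"
    vels = []
    for (x0, y0, t0), (x1, y1, t1) in zip(samples, samples[1:]):
        dt = t1 - t0
        vels.append(math.hypot(x1 - x0, y1 - y0) / dt if dt > 0 else 0.0)
    size = max(1, len(vels) // n_bins)   # points per sub-stroke (tail dropped)
    return [sum(vels[i:i + size]) / size for i in range(0, size * n_bins, size)]

def enroll(training_gestures):
    """Build a user profile: the average feature vector over training gestures."""
    feats = [velocity_features(g) for g in training_gestures]
    return [sum(col) / len(col) for col in zip(*feats)]

def authenticate(profile, gesture, threshold=0.5):
    """Accept the gesture iff its features lie close to the enrolled profile.
    (A real system would use a trained classifier rather than a fixed radius.)"""
    return math.dist(profile, velocity_features(gesture)) <= threshold
```

An impostor who has watched the gesture can reproduce its shape, but a differing velocity profile shifts the feature vector away from the enrolled profile and the check fails.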

This project will advance the knowledge and understanding of behavior based user authentication on touch screen devices. This is potentially transformative, high-impact research. If successful, this project will not only yield a theoretical foundation for behavior based user authentication on touch screen devices but also invite future research along this direction.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Kamran Ali, Alex X. Liu, Wei Wang, and Muhammad Shahzad "Keystroke Recognition Using WiFi Signals" Proceedings of the 21st ACM Annual International Conference on Mobile Computing and Networking (MOBICOM), Paris, France, 2016
Kamran Ali, Alex X. Liu, Wei Wang, and Muhammad Shahzad "Keystroke Recognition Using WiFi Signals" IEEE Journal on Selected Areas in Communications, v.35, 2017
Kang Ling, Haipeng Dai, Yuntang Liu, and Alex X. Liu "UltraGesture: Fine-Grained Gesture Sensing and Recognition" Proceedings of the IEEE International Conference on Sensing, Communication and Networking (SECON), 2018
Ke Sun, Wei Wang, Alex X. Liu, and Haipeng Dai "Depth Aware Finger Tapping on Virtual Displays" Proceedings of the 16th ACM International Conference on Mobile Systems, Applications, and Services (MobiSys), 2018
Lei Wang, Ke Sun, Haipeng Dai, Alex X. Liu, and Xiaoyu Wang "WiTrace: Centimeter-Level Passive Gesture Tracking Using WiFi Signals" Proceedings of the IEEE International Conference on Sensing, Communication and Networking (SECON), 2018
Muhammad Shahzad, Alex X. Liu, and Arjmand Samuel "Behavior Based Human Authentication on Touch Screen Devices Using Gestures and Signatures" IEEE Transactions on Mobile Computing, v.16, 2017, p.2726
Nan Yu, Wei Wang, Alex X. Liu, and Lingtao Kong "QGesture: Quantifying Gesture Distance and Direction with WiFi Signals" Proceedings of the 20th ACM International Joint Conference on Pervasive and Ubiquitous Computing (UBICOMP), 2018
Wei Wang, Alex X. Liu, and Muhammad Shahzad "Gait Recognition Using WiFi Signals" Proceedings of the 18th ACM International Joint Conference on Pervasive and Ubiquitous Computing (UBICOMP), Heidelberg, Germany, September 2016
Wei Wang, Alex X. Liu, and Ke Sun "Device-Free Gesture Tracking Using Acoustic Signals" Proceedings of the 22nd ACM Annual International Conference on Mobile Computing and Networking (MOBICOM), New York City, New York, October 2016
Wei Wang, Alex X. Liu, Muhammad Shahzad, Kang Ling, and Sanglu Lu "Device-Free Human Activity Recognition Using Commercial WiFi Devices" IEEE Journal on Selected Areas in Communications, v.35, 2017

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Mobile devices equipped with touch screens have become prevalent in our lives, with increasingly rich functionalities, enhanced computing power, and more storage capacity. These devices often contain private information such as personal photos, emails, and even corporate data. Therefore, it is crucial to have secure yet convenient user authentication mechanisms for touch screen devices. However, the widely used password/PIN/pattern based solutions are susceptible to shoulder surfing (mobile devices are often used in public settings, where observation may be deliberate or inadvertent) and to smudge attacks (the oily residues fingers leave on touch screens can be recognized by impostors), and they are inconvenient for users to enter while walking or driving.

In this project, the PI developed BEAT, a behavior based user authentication approach for touch screen devices. Rather than authenticating users solely based on what they input (such as a password/PIN/pattern), which is inherently subject to shoulder surfing and smudge attacks, BEAT authenticates users based on how they provide that input. Specifically, BEAT first asks a user to perform certain actions, such as gestures/signatures, on touch screens and then uses the behavior feature information (such as velocity magnitude and device acceleration) extracted from those actions to authenticate the user with machine learning techniques. The intuition behind the approach is that people exhibit consistent and distinguishing behavior when performing gestures and signatures on touch screens. Compared with current user authentication schemes for touch screen devices, the proposed approach is significantly harder to compromise because it is nearly impossible for impostors to reproduce another person's behavior of doing gestures/signatures through shoulder surfing or smudge attacks - they can see it, but they cannot do it.
This project represents the first effort toward developing behavior based user authentication approaches built on machine learning techniques for touch screen devices. The PI revealed many new observations (for example, people often exhibit different behaviors when performing the same action under different postures, such as standing versus sitting) and proposed many new concepts.

This project has advanced the knowledge and understanding of behavior based user authentication on touch screen devices. The fundamental concepts developed in this project have led to a much deeper understanding of behavior based user authentication for touch screen devices. The PI's team successfully achieved its objectives of developing both gesture based and signature based user authentication schemes for touch screen devices. The team proposed, implemented, and evaluated a gesture and signature behavior based authentication scheme for touch screen devices. They identified a set of effective features that capture the behavioral information of performing gestures and signatures on touch screens. They developed algorithms that automatically segment each stroke into sub-strokes of varying duration such that, within each sub-stroke, the user exhibits consistent and distinguishing behavior. They developed methods to automatically identify combined strokes in signatures and split them at appropriate locations. They also developed algorithms to extract multiple behaviors from the training samples of a given action.
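The sub-stroke segmentation idea can be illustrated with a toy sketch. This is not the project's algorithm: the split criterion here (a simple jump in velocity magnitude between consecutive samples) and the function name are assumptions standing in for the project's behavior-consistency criterion.

```python
def segment_substrokes(velocities, jump=5.0):
    """Split a stroke's per-sample velocity series into sub-strokes at
    points where the velocity changes sharply -- a toy stand-in for
    segmenting a stroke into sub-strokes with internally consistent behavior."""
    segments, current = [], []
    prev = velocities[0]
    for v in velocities:
        if current and abs(v - prev) > jump:  # behavior change: new sub-stroke
            segments.append(current)
            current = []
        current.append(v)
        prev = v
    segments.append(current)
    return segments
```

For example, a stroke whose velocity series is `[10, 11, 10, 30, 31, 29, 10, 9]` splits into three sub-strokes, each internally consistent; features computed per sub-stroke then discriminate more sharply than one average over the whole stroke.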

The research results have been successfully published in top tier computer science conferences (such as MOBICOM, MobiSys, and UbiComp), and computer science journals (such as IEEE Transactions on Mobile Computing, IEEE/ACM Transactions on Networking, and IEEE Journal on Selected Areas in Communications). This project has supported a number of Ph.D. students and post-docs.

The results developed in this project have been incorporated into the graduate course CSE 825, "Computer and Network Security". The PI has designed course projects based on this research for students to work on. These course materials help educate the next generation of computer engineers for the nation. In addition, this project has provided new opportunities for Ph.D. students to develop their research skills as they pursue their doctoral degrees and eventual careers in academia and/or industry.


Last Modified: 08/10/2018
Modified by: Alex X Liu

