Award Abstract # 1925709
CCRI: Planning: Establishing A Hand-Gesture Research Platform for Behavior Biometrics and Cognitive Robotics (HGRP)

NSF Org: CNS
Division Of Computer and Network Systems
Recipient: ARIZONA STATE UNIVERSITY
Initial Amendment Date: August 16, 2019
Latest Amendment Date: August 16, 2019
Award Number: 1925709
Award Instrument: Standard Grant
Program Manager: Yuanyuan Yang
CNS
 Division Of Computer and Network Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2019
End Date: March 31, 2021 (Estimated)
Total Intended Award Amount: $100,000.00
Total Awarded Amount to Date: $100,000.00
Funds Obligated to Date: FY 2019 = $100,000.00
History of Investigator:
  • Dijiang Huang (Principal Investigator)
    dijiang@asu.edu
  • Yezhou Yang (Co-Principal Investigator)
  • Richard Gould (Co-Principal Investigator)
Recipient Sponsored Research Office: Arizona State University
660 S MILL AVENUE STE 204
TEMPE
AZ  US  85281-3670
(480)965-5479
Sponsor Congressional District: 04
Primary Place of Performance: Arizona State University
PO Box 876011
Tempe
AZ  US  85281-6011
Primary Place of Performance Congressional District: 04
Unique Entity Identifier (UEI): NTLHJXM55KZ6
Parent UEI:
NSF Program(s): CCRI-CISE Cmnty Rsrch Infrstrc
Primary Program Source: 01001920DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 7359
Program Element Code(s): 735900
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

The human hand is one of the most complex pieces of natural engineering in the body, exquisitely evolved to perform a wide range of tasks. Hand gestures are used in many areas, and modeling them has a broad and profound impact on the nation's priorities and societal needs, e.g., manipulators and co-robots in manufacturing, natural human-computer interaction for virtual reality, wearable platforms in consumer electronics, and hand-gesture biometrics in cybersecurity. One goal of this project is to build a Hand-Gesture Research Platform (HGRP). HGRP aims to enable researchers to easily access diverse hand-gesture data to validate their hand-gesture recognition models, benchmark the performance of newly developed algorithms, and compare results with other researchers' outcomes. Moreover, HGRP is used to gather the research community's feedback on existing cutting-edge hand-gesture research, in order to prioritize the need for a hand-gesture-focused computing research infrastructure.

The HGRP framework is built on a cloud computing platform and enables research capabilities in the following areas: (a) hand-gesture biometrics; (b) cognitive robotics; and (c) programmable interfaces for gesture-based data processing and visualization. HGRP comprises the following salient features:

*Data collection based on two major types of sensors: (a) motion-detection sensors, e.g., wearables such as watches, wrist bands, on-finger sensors, data gloves, and infrared motion-detection sensors, and (b) video sensors such as Leap Motion sensors and video recorders. The captured hand-gesture data is sent to data storage for processing and storing.

*Data are collected and stored in an object storage service, and frequently used data is kept in in-memory storage.

*A GPU-based private cloud is established to allow researchers to implement well-known hand-gesture data processing models and establish benchmarking models.
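The two-tier storage arrangement described above (an object storage service, with frequently used data held in memory) can be sketched as a read-through cache. All class and key names here are illustrative assumptions, not part of HGRP's actual implementation:

```python
# Sketch of the two-tier storage idea: hot gesture data is served from
# an in-memory cache, falling back to a (here simulated) object store.
class GestureStore:
    def __init__(self, object_store):
        self.object_store = object_store   # e.g. an S3/Swift-like backend
        self.cache = {}                    # in-memory tier for hot data

    def get(self, key):
        if key in self.cache:              # hot path: memory storage
            return self.cache[key]
        data = self.object_store[key]      # cold path: object storage
        self.cache[key] = data             # promote for future reads
        return data

backend = {"user1/sign-in.csv": b"t,x,y,z\n0,0.1,0.2,0.3\n"}
store = GestureStore(backend)
first = store.get("user1/sign-in.csv")    # fetched from object storage
second = store.get("user1/sign-in.csv")   # served from the memory tier
```

A real deployment would add eviction (e.g. LRU) so the memory tier holds only the most frequently used signals.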

HGRP also allows researchers to submit computation tasks for evaluations and comparative studies. HGRP provides web-based data collection, processing, sharing, and storage APIs that allow researchers to remotely access the hand-gesture repository through web services.
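A remote client of such web-service APIs might construct requests along the following lines. The base URL and endpoint names are purely hypothetical, chosen only to mirror the four operations (retrieve, process, share, store) named above; they are not HGRP's real API:

```python
from urllib.parse import urlencode

# Hypothetical base URL for illustration only.
BASE = "https://hgrp.example.edu/api/v1"

def build_request(action, **params):
    """Build a request URL for one of the four repository operations."""
    assert action in {"retrieve", "process", "share", "store"}
    query = urlencode(sorted(params.items()))  # deterministic parameter order
    return f"{BASE}/{action}?{query}"

url = build_request("retrieve", dataset="word-210", user="u01", fmt="csv")
```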

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Duo Lu, Linzhen Luo, "FMKit: An In-Air-Handwriting Analysis Library and Data Repository," CVPR Workshop on Computer Vision for Augmented and Virtual Reality, 2020.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The proposed HGRP framework is established on a cloud computing platform and enables research capabilities in the following areas: (a) hand-gesture biometrics; (b) cognitive robotics; and (c) programmable interfaces for gesture-based data processing and visualization. HGRP is composed of the following major components:

  • Data collection based on two major types of sensors: (a) motion-detection sensors, e.g., wearables such as watches, wrist bands, on-finger sensors, data gloves, and infrared motion-detection sensors, and (b) video sensors such as Leap Motion sensors and video recorders. The captured hand-gesture data is sent to data storage for processing and storing.
  • A GPU-based private cloud is established to allow researchers to implement well-known hand-gesture data processing models and establish benchmarking models. HGRP also allows researchers to submit computation tasks for evaluations and comparative studies.
  • HGRP provides a web-based data repository that allows researchers to remotely access the hand-gesture data for research.

We organized the outcomes of this project in a more user-friendly way. The generated datasets and toolkits are collectively called FMKit, a code library and data repository for finger-motion-based in-air-handwriting analysis. For details of the FMKit project, please refer to the project home page and a 2-minute video introduction:

FMKit contains five datasets of in-air-handwriting signals collected from over 200 participants with two different hand motion capture devices. All datasets are openly and freely available to researchers. For details of the datasets, please refer to:

To download the data repository for research and evaluation purposes, the project team provides a dataset access procedure: submit a data acquisition letter to fmkit@googlegroups.com to request access. The datasets are free to access and use for research and evaluation purposes.

A new dataset, called the word-210 dataset, can be downloaded directly without the data access application. The word-210 dataset includes 10 users' in-air handwriting of 210 English words and 210 Chinese words, each with 5 repetitions:
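The stated dimensions of the word-210 dataset can be sanity-checked in a few lines. The file-naming scheme below is a hypothetical illustration; only the counts (10 users, 210 + 210 words, 5 repetitions) come from the description above:

```python
# Enumerate one identifier per sample to confirm the dataset's size.
users = 10
words = 210 + 210          # 210 English + 210 Chinese words
repetitions = 5

samples = [
    f"user{u:02d}/word{w:03d}_rep{r}"   # naming is illustrative only
    for u in range(users)
    for w in range(words)
    for r in range(repetitions)
]
total = len(samples)       # 10 * 420 * 5 = 21,000 in-air-handwriting samples
```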

FMKit also provides a Python code library to process the signals and interface with other Python tools. You can download the FMKit code at the following links:
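As a rough illustration of the kind of signal processing such a library performs, the sketch below normalizes a sampled hand-position trace. This is not FMKit's actual API; the function and variable names are assumptions, and readers should consult the FMKit repository for the real interfaces:

```python
# Toy preprocessing step for an in-air-handwriting signal: center a 1-D
# position trace at zero mean and scale it to unit peak amplitude, so
# signals from different users/devices become comparable.
def normalize(signal):
    """Return the trace shifted to zero mean and scaled to peak 1.0."""
    mean = sum(signal) / len(signal)
    centered = [s - mean for s in signal]
    peak = max(abs(s) for s in centered) or 1.0  # avoid dividing by zero
    return [s / peak for s in centered]

trace = [0.0, 0.5, 1.0, 0.5, 0.0]   # toy x-coordinate samples over time
norm = normalize(trace)
```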

Hand-gesture research is an emerging interdisciplinary area, actively involving fields such as motion detection, visualization, robotics, ML/AI, and cybersecurity. Newly developed models and solutions, including behavioral modeling and ML/AI-based approaches, make hand-gesture applications more adaptive and easier to use. The developed education materials will significantly promote research in these directions.

 


Last Modified: 06/03/2021
Modified by: Dijiang Huang


