Award Abstract # 1521289
I-Corps: Automated Attendance Check by Using Smartphone Cameras

NSF Org: TI
Translational Impacts
Recipient: UNIVERSITY OF MISSOURI SYSTEM
Initial Amendment Date: January 2, 2015
Latest Amendment Date: January 2, 2015
Award Number: 1521289
Award Instrument: Standard Grant
Program Manager: Lydia McClure
TIP
 Directorate for Technology, Innovation, and Partnerships
Start Date: January 15, 2015
End Date: June 30, 2016 (Estimated)
Total Intended Award Amount: $50,000.00
Total Awarded Amount to Date: $50,000.00
Funds Obligated to Date: FY 2015 = $50,000.00
History of Investigator:
  • Zhaozheng Yin (Principal Investigator)
    zhaozheng.yin@stonybrook.edu
Recipient Sponsored Research Office: Missouri University of Science and Technology
300 W. 12TH STREET
ROLLA
MO  US  65409-1330
(573)341-4134
Sponsor Congressional District: 08
Primary Place of Performance: Missouri University of Science and Technology
300 W 12th Street
Rolla
MO  US  65409-6506
Primary Place of Performance Congressional District: 08
Unique Entity Identifier (UEI): Y6MGH342N169
Parent UEI:
NSF Program(s): I-Corps
Primary Program Source: 01001516DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 9150
Program Element Code(s): 802300
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.084

ABSTRACT

Checking attendance in settings such as classrooms typically requires an instructor to recognize each student one by one while reading names from a roster, or to ask students to sign an attendance sheet. This traditional approach has several problems: calling students' names can consume minutes of lecture time when the class is large; a sign-up sheet is prone to cheating, since students can sign both their own names and those of absent classmates; and manually tallying each student's total attendance across a semester's worth of sheets is an undesirable chore for instructors. This I-Corps team proposes an efficient and accurate alternative. By taking videos of student faces in classrooms with smartphone cameras, the team proposes a unified framework of visual face detection, tracking, and recognition algorithms that recognizes multiple faces in the video simultaneously.

The proposed system works as follows: instructors install the proposed app on their own smartphones; in the first class, they use the smartphone camera to take a short video of student faces in the classroom. The application automatically builds a face dataset for the course, and the instructor only needs to identify the faces once, in that first class; in the remaining classes, instructors take a video of each class and the application performs the attendance check automatically. The proposed smartphone app performs multi-object tracking to associate detected faces (including false positives) into face tracklets (each tracklet contains multiple instances of the same individual, with variations in pose, illumination, etc.); the face instances in each tracklet are then grouped into a small number of clusters, yielding a sparse face representation with less redundancy.
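The clustering step above can be sketched in code. The following is a minimal illustration, not the team's actual implementation: it assumes each face instance in a tracklet has already been embedded as a feature vector, and uses a simple k-means (farthest-point initialization plus Lloyd iterations) in NumPy to compress a tracklet's many instances into a few representative clusters.

```python
import numpy as np

def cluster_tracklet(embeddings, k=3, iters=20):
    """Reduce a face tracklet (many instances of one person) to k
    representative clusters: farthest-point init + Lloyd's k-means."""
    k = min(k, len(embeddings))
    # Farthest-point initialization keeps the initial centers spread out
    # across the tracklet's pose/illumination variations.
    centers = [embeddings[0]]
    for _ in range(k - 1):
        d = np.min([np.linalg.norm(embeddings - c, axis=1) for c in centers], axis=0)
        centers.append(embeddings[int(d.argmax())])
    centers = np.array(centers)
    for _ in range(iters):
        # Assign each face instance to its nearest center.
        d = np.linalg.norm(embeddings[:, None] - centers[None], axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned instances.
        centers = np.array([embeddings[labels == j].mean(axis=0)
                            if np.any(labels == j) else centers[j]
                            for j in range(k)])
    return centers, labels

# Toy tracklet: 30 synthetic 16-d embeddings around 3 pose/illumination modes.
rng = np.random.default_rng(1)
tracklet = np.vstack([rng.normal(m, 0.05, size=(10, 16)) for m in (0.0, 1.0, 2.0)])
centers, labels = cluster_tracklet(tracklet, k=3)
print(centers.shape)  # (3, 16): a sparse representation of the 30-instance tracklet
```

The three cluster centers stand in for the full tracklet, which is the "sparse face representation with less redundancy" described above.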

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Face recognition has promising applications in education, such as class attendance checking, identity verification, and exam proctoring during online education. We propose an efficient and accurate way to accomplish these tasks. By taking videos of student faces in classrooms using smartphone cameras (or webcams during online education), we propose a unified framework of visual face detection, tracking, and recognition algorithms that recognizes multiple faces in the video simultaneously. The algorithm can further enable educators to monitor students' attendance and learning behavior over time.

Intellectual Merits: Our proposed system has the following steps: 1. Instructors install our app on their own smartphones, which costs less than the commercial face recognition systems used in security applications; 2. In the first class, instructors use the smartphone camera to take a short video of student faces in the classroom. The application automatically builds a face dataset for the course, and the instructor only needs to identify the faces in that first class; 3. In the remaining classes, instructors take a video of each class and the application performs the attendance check automatically. During online education, the face videos are provided by webcams. Our algorithm performs multi-object tracking to associate detected faces (including false positives) into face tracklets (each tracklet contains multiple instances of the same individual, with variations in pose, illumination, etc.); the face instances in each tracklet are then grouped into a small number of clusters, yielding a sparse face representation with less redundancy. The algorithm solves a unified optimization problem to: (a) identify false-positive face tracklets; (b) link face tracklets that belong to the same person but were separated by long occlusions; and (c) recognize the group of faces simultaneously under spatial and temporal context constraints in the video.
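Step (b), linking tracklets split by occlusion, can be illustrated with a toy greedy procedure. This is a hypothetical sketch, not the report's unified optimization: it assumes each tracklet is summarized as a (start frame, end frame, mean embedding) triple, and merges temporally disjoint tracklets whose mean embeddings have high cosine similarity.

```python
import numpy as np

def link_tracklets(tracklets, sim_thresh=0.8, max_gap=50):
    """Greedy sketch of occlusion-gap linking: merge temporally disjoint
    tracklets whose mean face embeddings look like the same person.
    Each tracklet is a (start_frame, end_frame, mean_embedding) triple."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    tracklets = sorted(tracklets, key=lambda t: t[0])
    merged = []
    for start, end, emb in tracklets:
        for i, (s, e, m) in enumerate(merged):
            gap = start - e  # frames of occlusion between the two tracklets
            if 0 < gap <= max_gap and cos(m, emb) >= sim_thresh:
                # Same person re-appearing: extend the earlier tracklet.
                merged[i] = (s, end, (m + emb) / 2)
                break
        else:
            merged.append((start, end, emb))
    return merged

# Two tracklets of one person split by a 20-frame occlusion, plus one other person.
a1 = (0, 40, np.array([1.0, 0.0, 0.0]))
a2 = (60, 100, np.array([0.95, 0.05, 0.0]))  # similar appearance, later in time
b  = (0, 100, np.array([0.0, 1.0, 0.0]))
links = link_tracklets([a1, a2, b])
print(len(links))  # 2 identities remain after linking
```

The report's actual method solves (a), (b), and (c) jointly in one optimization with spatial and temporal constraints; this sketch only shows the intuition behind the linking term.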

Broader Impacts: The proposed project will advance scientific and technological understanding of scene-specific object detection, multi-object tracking under occlusion, and multi-face recognition in videos. It will help educators efficiently and accurately check course attendance, verify identities, and proctor exams. The video-based face recognition algorithm can also be extended to identity and access management in financial systems and medical institutions.


Last Modified: 07/05/2016
Modified by: Zhaozheng Yin
