Award Abstract # 1719031
PFI:BIC: iWork, a Modular Multi-Sensing Adaptive Robot-Based Service for Vocational Assessment, Personalized Worker Training and Rehabilitation.

NSF Org: TI
Translational Impacts
Recipient: UNIVERSITY OF TEXAS AT ARLINGTON
Initial Amendment Date: July 25, 2017
Latest Amendment Date: April 26, 2022
Award Number: 1719031
Award Instrument: Standard Grant
Program Manager: Jesus Soriano Molla
jsoriano@nsf.gov
(703) 292-7795
TI: Translational Impacts
TIP: Directorate for Technology, Innovation, and Partnerships
Start Date: September 1, 2017
End Date: August 31, 2023 (Estimated)
Total Intended Award Amount: $999,638.00
Total Awarded Amount to Date: $1,031,638.00
Funds Obligated to Date: FY 2017 = $999,638.00
FY 2019 = $32,000.00
History of Investigator:
  • Fillia Makedon (Principal Investigator)
    makedon@cse.uta.edu
  • Vassilis Athitsos (Co-Principal Investigator)
  • Morris Bell (Co-Principal Investigator)
  • Nicolette Hass (Former Co-Principal Investigator)
Recipient Sponsored Research Office: University of Texas at Arlington
701 S NEDDERMAN DR
ARLINGTON
TX  US  76019-9800
(817)272-2105
Sponsor Congressional District: 25
Primary Place of Performance: University of Texas at Arlington
Arlington
TX  US  76019-0145
Primary Place of Performance Congressional District: 25
Unique Entity Identifier (UEI): LMLUKUPJJ9N3
Parent UEI:
NSF Program(s): PFI-Partnerships for Innovation,
IIS Special Projects
Primary Program Source: 01001718DB NSF RESEARCH & RELATED ACTIVIT
01001920DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 9251, 1662, 116E
Program Element Code(s): 166200, 748400
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.084

ABSTRACT

Automation, foreign competition, and the increasing use of robots to replace human jobs stress the need for a major shift in vocational training practices toward training for intelligent manufacturing environments, so-called "Industry 4.0". In particular, vocational safety training using the latest robot and other technologies is imperative, as thousands of workers lose their jobs or die on the job each year due to accidents, unforeseen injuries, and lack of appropriate assessment and training. The objective of this Partnerships for Innovation: Building Innovation Capacity (PFI:BIC) project is to develop iWork, a smart robot-based vocational assessment and intervention system that assesses the physical, cognitive, and collaboration skills of an industry worker while he or she performs a manufacturing task in a simulated industry setting, collaborating with a robot to complete it. The aim is to transform traditional vocational training and rehabilitation practices into an evidence-based, personalized system that can be used to (re)train, retain, and prepare workers for the robotic factories of the future. Personalized vocational training, rehabilitation, and accurate job-matching are essential to ensuring a strong manufacturing sector, vital to America's economic development and ability to innovate. The iWork service is "smart" because it can adjust and adapt to the individual's abilities as it assesses him or her, helping decide the type of tasks needed for testing and training based on the job's complexity, difficulty, or familiarity to the worker. The iWork system integrates human expert knowledge to overcome or compensate for detected worker constraints.
Research has shown that robot trainers can increase motivation and sustain interest, increase compliance and learning, and provide training for specific individual needs. The iWork system aims to assess and train both the human and the work-assistive robot as they collaborate on a manufacturing job. The projected outcome is low-cost vocational training solutions that can have substantial economic and societal benefits across diverse economic sectors. Most importantly, if successful, projected outcomes could impact how millions of persons seeking a manufacturing job are trained, including those facing a learning, physical, or aging disability. The system's mobile, low-cost methods accelerate recognizing a worker's specific needs and improve the vocational expert's ability to correlate cognitive and physical assessments, thus empowering traditional practices with user-centric targeted training methods. In addition, the project's robot-based emphasis on safety and risk assessment can reduce liability costs and productivity setbacks faced by industry due to manufacturing accidents.
The iWork system uses computational methods from reinforcement (machine) learning, data mining, collaborative filtering, and human-robot interaction to collect and analyze multi-sensing worker data during a simulated manufacturing human-robot collaboration. The data collected and analyzed come from sensors, wearables, and explicit user feedback, measuring worker movements, eye gaze, errors made, performance delays, human-robot interactions, physiological metrics, and other task-dependent signals. The system has a closed-loop architecture composed of four phases: assessment, recommendation, intervention (or adjustment), and evaluation, with a human expert in the loop. The system generates recommendations for personalized interventions to the expert at different loop intervals. The latest developments in sensing technologies, robotics, and intelligent communications are used to assess whether a robot co-worker's intelligence can be enhanced with more human-like learning and collaboration abilities to support the human in achieving a task. The system is modular and customizable to a particular manufacturing task, domain, or robot. Two types of robots are used: socially assistive robots that provide non-contact user assistance through feedback, and physically assistive robots that provide cognitive, physical, and collaboration skill training. To predict risks of injury due to inattention, age, vision, or physical and mental issues, motion analysis and kinematics experiments are conducted to determine the type of safety training needed, to assess how well a human interacts with a collaborative robot, and to determine how best to train the robot to help the human overcome identified physical and other deficiencies in performing a given task.
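The four-phase closed loop described above (assessment, recommendation, intervention, evaluation, with a human expert in the loop) can be sketched in outline. This is a minimal illustrative sketch only; all function names, thresholds, and data structures here are hypothetical assumptions, not the actual iWork implementation.

```python
# Hypothetical sketch of a four-phase closed loop with a human expert
# approving each recommended intervention. Names and thresholds are
# illustrative, not taken from the iWork system.

def assess(sensor_readings):
    """Phase 1: summarize multi-sensing data into simple skill scores."""
    return {
        "errors": sum(r["errors"] for r in sensor_readings),
        "mean_delay": sum(r["delay"] for r in sensor_readings) / len(sensor_readings),
    }

def recommend(scores):
    """Phase 2: map assessment scores to a candidate intervention."""
    if scores["errors"] > 3:
        return "simplify_task"
    if scores["mean_delay"] > 2.0:
        return "slow_robot_pace"
    return "no_change"

def run_loop_iteration(sensor_readings, expert_approves):
    """One loop iteration; the expert stays in the loop before any
    intervention is applied (phase 3). Phase 4, evaluation, happens when
    the next cycle re-assesses the worker after the intervention."""
    scores = assess(sensor_readings)
    proposal = recommend(scores)
    applied = proposal if expert_approves(proposal) else "no_change"
    return scores, applied

# Example: two simulated task segments with observed errors and delays.
readings = [{"errors": 2, "delay": 1.5}, {"errors": 3, "delay": 2.8}]
scores, applied = run_loop_iteration(readings, expert_approves=lambda p: True)
```

In this sketch the expert-in-the-loop is modeled as a callback that can veto a recommendation, matching the abstract's description of recommendations being presented to the expert rather than applied automatically.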
The project integrates three main areas of expertise: engineered service system design, where assistive robots interact with and train each other to collaborate; computing, sensing, and information technologies, where machine learning, data mining, and recommender algorithms are used to identify behavioral patterns of interest and recommend targeted interventions; and human factors and cognitive engineering, which deploys methods from the team's expertise in workplace assessment, personalized psychiatric intervention, and evaluation of vocational satisfaction, work habits, work quality, and related factors as they bear on job preparation and retention.
The project has an interdisciplinary team of experts from two collaborating universities, the University of Texas at Arlington (UTA) and Yale University, representing several fields including human factors, psychology, computing, and industrial organization. The project has two primary industry partners: SoftBank Robotics (San Francisco, CA), a manufacturer of humanoid service robots, and InteraXon (Canada), a producer of mobile EEG devices; both provide hardware, software, and know-how to enhance iWork's functionality in cognitive activity monitoring. Broader-context partners include C8Sciences (USA), Assistive Technology Resources (USA), Barrett Technologies Inc. (USA), and the Dallas Veteran Affairs Research Corp. (USA).

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


(Showing: 1 - 10 of 25)
Abujelala, Maher; Gupta, Sanika; Makedon, Fillia. "A Collaborative Assembly Task to Assess Worker Skills in Robot Manufacturing Environments." Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference (PETRA '18), 2018. https://doi.org/10.1145/3197768.3203171
Abujelala, Maher; Kanal, Varun; Rajavenkatanarayanan, Akilesh; Makedon, Fillia. "9PM: A Novel Interactive 9-Peg Board for Cognitive and Physical Assessment." 2021. https://doi.org/10.1145/3453892.3453996
Babu, Ashwin Ramesh; Cloud, Joe; Theofanidis, Michail; Makedon, Fillia. "Facial Expressions as a Modality for Fatigue Detection in Robot-Based Rehabilitation." Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference (PETRA '18), 2018. https://doi.org/10.1145/3197768.3203168
Doolani, Sanika; Owens, Luke; Wessels, Callen; Makedon, Fillia. "vIS: An Immersive Virtual Storytelling System for Vocational Training." Applied Sciences, v.10, 2020. https://doi.org/10.3390/app10228143
Doolani, Sanika; Wessels, Callen; Kanal, Varun; Sevastopoulos, Christos; Jaiswal, Ashish; Nambiappan, Harish; Makedon, Fillia. "A Review of Extended Reality (XR) Technologies for Manufacturing Training." Technologies, v.8, 2020. https://doi.org/10.3390/technologies8040077
Doolani, Sanika; Wessels, Callen; Makedon, Fillia. "Designing a Vocational Immersive Storytelling Training and Support System to Evaluate Impact on Working and Episodic Memory." 2021. https://doi.org/10.1145/3453892.3462216
Doolani, Sanika; Wessels, Callen; Makedon, Fillia. "vIIIS: A Vocational Intelligent Interactive Immersive Storytelling Framework to Support Task Performance." 2021. https://doi.org/10.1145/3453892.3461631
Gattupalli, Srujana; Babu, Ashwin Ramesh; Brady, James Robert; Makedon, Fillia; Athitsos, Vassilis. "Towards Deep Learning Based Hand Keypoints Detection for Rapid Sequential Movements from RGB Images." Proceedings of the 11th PErvasive Technologies Related to Assistive Environments Conference (PETRA '18), 2018. https://doi.org/10.1145/3197768.3201538
Gupta, Sanika; Owens, Luke; Tsiakas, Konstantinos; Makedon, Fillia. "vIIS: A Vocational Interactive Immersive Storytelling Framework for Skill Training and Performance Assessment." 2019. https://doi.org/10.1145/3316782.3324016
Kanal, Varun; Brady, James; Nambiappan, Harish; Kyrarini, Maria; Wylie, Glenn; Makedon, Fillia. "Towards a Serious Game Based Human-Robot Framework for Fatigue Assessment." 2020. https://doi.org/10.1145/3389189.3398744
Kuanar, S.; Athitsos, V.; Pradhan, N.; Mishra, A.; Rao, K.R. "Cognitive Analysis of Working Memory Load from EEG, by a Deep Recurrent Neural Network." IEEE International Conference on Acoustics, Speech and Signal Processing, 2018.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Project Title: PFI:BIC: iWork, a Modular Multi-Sensing Adaptive Robot-Based Service for Vocational Assessment, Personalized Worker Training and Rehabilitation.

PI: Fillia Makedon

Awardee: University of Texas at Arlington

Award Number: 1719031

Award Ends: 08/31/2023

Motivation and objectives

The objective of this research is to develop iWork, a smart robot-based vocational assessment and intervention service system that assesses the physical, cognitive, and robot-collaboration skills of a worker while she or he performs simulated manufacturing tasks. To achieve this, multimodal human-robot interaction data are collected and analyzed. This is a modular closed-loop data flow system with four phases, assessment, recommendation, intervention, and evaluation, with a human-factors expert in the loop.

Summary of Activities

A testbed system was designed around multi-sensing data collected while the user performed certain tasks. Manufacturing simulation tasks were used to provide a vocational assessment and intervention service that can assess the physical, cognitive, and collaboration skills of a potential worker. Algorithms were designed to collect multi-sensing data while the user performs simulated workplace tasks, and advanced machine learning algorithms were designed and used to analyze and evaluate the collected human-robot collaboration data. Experiments were designed using assistive robots to train the user and to assess work habits such as work quality, cooperativeness, social skills, and personal presentation as they relate to job preparation, engagement, and retention. Different engineering methods were explored to provide workplace assessments, and new software products and user interfaces were developed.

Project Outcomes and Findings

New methods were developed to assess the cognitive and physical fatigue of users engaged in simulated vocational tasks. A multi-sensing system was developed to monitor and assess the performance of veterans with breathing/dyspnea issues. A smart wearable shirt called PNEUMON was developed to monitor persons with dyspnea issues that can cause fatigue and other problems relevant to vocational performance. An assistive robot called MINA was developed to explore the gait issues of persons in a clinical setting and to assess the ability of working nurses to use such an assistive robot effectively. The ATEC system (Activate Test for Embodied Cognition) was developed to assess the executive function capabilities of children and young adults through embodied cognition tasks, games, and exercises, using metrics that relate to performance in a vocational setting. The project resulted in the graduation of 5 Ph.D. students, 4 REU students, and one M.S. thesis student, who received competitive positions in industry and government (General Motors, Salesforce, Hewlett Packard, and FDA) based on the work they did on the project. Additionally, 6 REU students were trained and mentored. The project was also used to mentor a female postdoc for 2 years, who then became an assistant professor at Santa Clara University, CA. The PI's team received a Best Technical Paper award for a paper describing project results, as well as a Best Poster Paper award at the 2021 PETRA conference. The project team applied for a patent on the analysis of multi-sensing data related to vocational simulation experiments. Project researchers published many papers in peer-reviewed conferences and journals. User interfaces were developed for the PNEUMON dyspnea system, for the ATEC assessment system, and for monitoring human-robot interaction with the MINA robotic assistant.

Intellectual Merit: 

This is an interdisciplinary project that integrated three areas of expertise: (1) engineered service system design, where socially assistive and rehabilitation robots interact with and train the user to collaborate; (2) computing, sensing, and information technologies, where machine learning, data mining, and recommender algorithms are used to identify patterns of interest (e.g., errors, delays) and recommend targeted interventions for the user; and (3) human factors and cognitive engineering, which deploys methods from the team's Intelligent Interactive Learning and Adaptation Framework. Expertise in workplace assessment, personalized psychiatric intervention, and evaluation of vocational satisfaction guided experiments on work habits, work quality, cooperativeness, personal presentation, and social (interpersonal) skills as they relate to job preparation and retention. Innovative new methods were developed for human-robot collaboration and training.

 Broader Impacts:

The iWork vocational assessment system produces personalized vocational training solutions that have substantial economic and societal impact, affecting economic sectors such as energy, transportation, mining, electronics, robotics, and others. System outcomes affect traditional disciplines such as education, social sciences, psychology, computer science, rehabilitation, and robotics. Additionally, project outcomes can potentially impact millions of persons seeking a manufacturing job, including those facing a learning or aging disability. The system's mobile, low-cost methods can accelerate recognizing a worker's abilities, improve correlations between cognitive and physical assessments, and elevate traditional vocational rehabilitation practices to new levels of efficient, user-centric training. Furthermore, the project's robot-based training emphasis on safety and risk assessment can reduce liability costs and productivity setbacks due to manufacturing accidents. The project's educational plan engaged student programmers and interns throughout the system's development, organized industrial internships, and encouraged industry-university collaborations through the iPerform IUCRC NSF center.

 


Last Modified: 10/07/2023
Modified by: Fillia S Makedon
