Award Abstract # 2149607
REU Site: Research Experiences for Undergraduates in Disinformation Detection and Analytics

NSF Org: CNS, Division of Computer and Network Systems
Recipient: OLD DOMINION UNIVERSITY RESEARCH FOUNDATION
Initial Amendment Date: March 3, 2022
Latest Amendment Date: April 13, 2024
Award Number: 2149607
Award Instrument: Standard Grant
Program Manager: Vladimir Pavlovic
vpavlovi@nsf.gov
(703) 292-8318
CNS, Division of Computer and Network Systems
CSE, Directorate for Computer and Information Science and Engineering
Start Date: March 1, 2022
End Date: February 28, 2026 (Estimated)
Total Intended Award Amount: $324,000.00
Total Awarded Amount to Date: $332,000.00
Funds Obligated to Date: FY 2022 = $324,000.00
FY 2024 = $8,000.00
History of Investigator:
  • Sampath Jayarathna (Principal Investigator)
    sampath@cs.odu.edu
  • Jian Wu (Co-Principal Investigator)
Recipient Sponsored Research Office: Old Dominion University Research Foundation
4111 MONARCH WAY STE 204
NORFOLK
VA  US  23508-2561
(757)683-4293
Sponsor Congressional District: 03
Primary Place of Performance: Old Dominion University
5115 Hampton Blvd.
Norfolk
VA  US  23529-0001
Primary Place of Performance Congressional District: 03
Unique Entity Identifier (UEI): DSLXBD7UWRV6
Parent UEI: DSLXBD7UWRV6
NSF Program(s): RSCH EXPER FOR UNDERGRAD SITES, Secure & Trustworthy Cyberspace
Primary Program Source: 01002223DB NSF RESEARCH & RELATED ACTIVITIES
01002425DB NSF RESEARCH & RELATED ACTIVITIES
04002223DB NSF Education & Human Resources
Program Reference Code(s): 025Z, 9250
Program Element Code(s): 113900, 806000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070, 47.076

ABSTRACT

This Research Experiences for Undergraduates (REU) Site project will engage motivated students in the rapidly growing research area of disinformation detection and analytics. With its focus on identifying disinformation, the program will broaden students' views, providing them with a holistic and in-depth understanding of disinformation and its viral spread across the Web. Students will leverage the knowledge and skills they learn to discern and debunk disinformation, which could aid their families, friends, and social media contacts and ultimately help prevent disinformation from spreading. This knowledge and these skills will not only prepare students for disinformation-related jobs in research or industry but also encourage them to pursue graduate study and research more broadly. The program will also contribute to the development of a diverse, globally competitive workforce by providing research and education opportunities to an economically diverse group of students, including women, underrepresented minorities, first-generation college students, and students from undergraduate institutions who may not have had prior exposure to research or graduate school.

Students participating in the REU Site project will learn: (1) the concept of disinformation, its types, examples, and active research topics; (2) mainstream computational methods for detecting disinformation on social media, including how to access, clean, preprocess, and visualize data and how to build analytical and predictive models using open-source programming tools; (3) important metrics for evaluating research results, comparing against baseline models, and producing preliminary results for publication-quality research products; and (4) essential research skills, including brainstorming discussions, programming experiments, forensics, trial-and-error, technical writing, and research presentation. Students will conduct hands-on research on topics of their own choosing that fit within faculty members' existing broad disinformation research programs. The goal of this REU Site is to engage participating students in real-world projects studying disinformation from the perspectives of data analytics, information retrieval, applied machine learning, web archiving, and social computing. Participants will gain essential knowledge and skills to support future careers, whether research- or application-oriented. Co-funding for this project is provided by the Secure and Trustworthy Cyberspace (SaTC) program and the CyberCorps: Scholarship for Service (SFS) program in recognition of the alignment of this project with the goals of these two programs.
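
As a concrete illustration of points (2) and (3) above, the sketch below shows one minimal baseline of the kind REU students might build: a TF-IDF plus logistic-regression classifier over labeled claims, evaluated with precision, recall, and F1. This is an illustrative sketch rather than part of the award: the file claims.csv and its text/label columns are hypothetical assumptions, and the code relies on the open-source pandas and scikit-learn libraries.

# Minimal baseline sketch (illustrative only): TF-IDF features + logistic regression.
# Assumes a hypothetical claims.csv with columns "text" and "label"
# (e.g., 0 = reliable, 1 = disinformation).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report

# Load the labeled dataset and hold out a stratified test split.
df = pd.read_csv("claims.csv")
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42, stratify=df["label"]
)

# Convert raw text into TF-IDF features, dropping common English stop words.
vectorizer = TfidfVectorizer(stop_words="english", max_features=20000)
X_train_vec = vectorizer.fit_transform(X_train)
X_test_vec = vectorizer.transform(X_test)

# Train the baseline classifier and report precision, recall, and F1 on the test split.
model = LogisticRegression(max_iter=1000)
model.fit(X_train_vec, y_train)
print(classification_report(y_test, model.predict(X_test_vec)))

In practice, such a baseline would be compared against stronger models (for example, transformer-based classifiers such as BERT, as used in the publications listed below) on the same held-out split and with the same metrics.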

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Bragg, Haley and Jayanetti, Himarsha R. and Nelson, Michael L. and Weigle, Michele C. "Less than 4% of Archived Instagram Account Pages for the Disinformation Dozen are Replayable," 2023. https://doi.org/10.1109/JCDL57899.2023.00025
Pineda, Kayla and Perrotti, Anne M. and Poursardar, Faryaneh and Graber, Danielle and Jayarathna, Sampath "Using BERT to Understand TikTok Users ADHD Discussion," 2023. https://doi.org/10.1109/IRI58017.2023.00043
Perrotti, Anne Marie and Puwo, Isabelle and Jayarathna, Sampath "Exploring TikTok as an Educational Tool for Speech-Language Pathologists, Special Education, and General Education," 2023. https://doi.org/10.1145/3565287.3617636
Richards, Johnovon and Dabhi, Saumya and Poursardar, Faryaneh and Jayarathna, Sampath "Poster: Leveraging Data Analysis and Machine Learning to Authenticate Yelp Reviews through User Metadata Patterns," 2023. https://doi.org/10.1145/3565287.3617983
Evans, Michael and Soós, Dominik and Landers, Ethan and Wu, Jian "MSVEC: A Multidomain Testing Dataset for Scientific Claim Verification," 2023. https://doi.org/10.1145/3565287.3617630
