Award Abstract # 2120098
Collaborative Research: SaTC: CORE: Large: Rapid-Response Frameworks for Mitigating Online Disinformation

Administratively Terminated Award
NSF Org: CNS (Division Of Computer and Network Systems)
Recipient: THE LELAND STANFORD JUNIOR UNIVERSITY
Initial Amendment Date: July 25, 2021
Latest Amendment Date: July 30, 2024
Award Number: 2120098
Award Instrument: Continuing Grant
Program Manager: Sara Kiesler
skiesler@nsf.gov
(703)292-8643
CNS: Division Of Computer and Network Systems
CSE: Directorate for Computer and Information Science and Engineering
Start Date: October 1, 2021
End Date: September 30, 2026 (Estimated)
Total Intended Award Amount: $748,437.00
Total Awarded Amount to Date: $648,600.00
Funds Obligated to Date: FY 2021 = $309,705.00
FY 2023 = $166,910.00
FY 2024 = $171,985.00
History of Investigator:
  • Jeffrey Hancock (Principal Investigator)
    hancockj@stanford.edu
Recipient Sponsored Research Office: Stanford University
450 JANE STANFORD WAY
STANFORD
CA  US  94305-2004
(650)723-2300
Sponsor Congressional District: 16
Primary Place of Performance: Stanford University
450 Jane Stanford Way
Stanford
CA  US  94305-2004
Primary Place of Performance Congressional District: 16
Unique Entity Identifier (UEI): HJD6G4D6TJY5
Parent UEI:
NSF Program(s): Secure & Trustworthy Cyberspace
Primary Program Source: 01002122DB NSF RESEARCH & RELATED ACTIVIT
01002324DB NSF RESEARCH & RELATED ACTIVIT
01002425DB NSF RESEARCH & RELATED ACTIVIT
01002526DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 025Z, 065Z, 7434, 7925
Program Element Code(s): 806000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Disinformation is a critical, pressing challenge for society. It diminishes our ability to respond to crisis events, including acts of terrorism and pandemics. It makes us vulnerable, as individuals, groups, and a society, to manipulation by foreign governments, financial opportunists, and a range of other bad actors. This problem, exacerbated by the design and widespread use of social media platforms, is inherently a problem of trust: disinformation undermines trust in information, science, democratic institutions, journalism, and in each other. This research advances our understanding of online disinformation and applies innovative approaches and collaboration infrastructure to address this challenge with a sophistication and pace on par with the dynamic, interdisciplinary nature of the problem. Through the development and implementation of rapid-response frameworks, the research team rapidly identifies disinformation campaigns and communicates those findings to diverse stakeholders in government, industry, media, and the broader public, helping to build societal resilience to this kind of manipulation.


This research has three integrated components: 1) developing models and theories of how disinformation is seeded, cultivated, and spread that take into account the sociotechnical nature of the problem; 2) developing and applying innovative, rapid-analysis frameworks for responding to disinformation quickly; and 3) implementing and evaluating the impact of multi-stakeholder collaborations to address disinformation in real-time during real-world events. The work applies a mixed-method approach that integrates novel visualizations and network analysis to identify patterns and anomalies with qualitative analysis that reveals the meanings of those features. Extending from a rapid response approach, investigators are also developing and evaluating, using interviews and experiments, strategies for communicating these findings with diverse stakeholders. Conceptually, this research leverages theories of rumoring from sociology and social psychology and the growing body of literature related to online manipulation to shed light on the participatory dynamics of disinformation campaigns. In terms of impacts on scientific infrastructure, this effort builds collaboration frameworks that others can use to create their own systems for rapid response.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH


Dahlke, Ross and Kumar, Deepak and Durumeric, Zakir and Hancock, Jeffrey T. "Quantifying the Systematic Bias in the Accessibility and Inaccessibility of Web Scraping Content From URL-Logged Web-Browsing Digital Trace Data." Social Science Computer Review, 2023. https://doi.org/10.1177/08944393231218214
Feuerriegel, Stefan and DiResta, Renée and Goldstein, Josh A. and Kumar, Srijan and Lorenz-Spreen, Philipp and Tomz, Michael and Pröllochs, Nicolas. "Research can help to tackle AI-generated disinformation." Nature Human Behaviour, v.7, 2023. https://doi.org/10.1038/s41562-023-01726-2
Kennedy, Ian and Wack, Morgan and Beers, Andrew and Schafer, Joseph S. and Garcia-Camargo, Isabella and Spiro, Emma S. and Starbird, Kate. "Repeat Spreaders and Election Delegitimization: A Comprehensive Dataset of Misinformation Tweets from the 2020 U.S. Election." Journal of Quantitative Description: Digital Media, v.2, 2022. https://doi.org/10.51685/jqd.2022.013
Lee, Angela Y. and Moore, Ryan C. and Hancock, Jeffrey T. "Designing misinformation interventions for all: Perspectives from AAPI, Black, Latino, and Native American community leaders on misinformation educational efforts." Harvard Kennedy School Misinformation Review, 2023. https://doi.org/10.37016/mr-2020-111
Moore, Ryan C. and Dahlke, Ross and Hancock, Jeffrey T. "Exposure to untrustworthy websites in the 2020 US election." Nature Human Behaviour, v.7, 2023. https://doi.org/10.1038/s41562-023-01564-2
Prochaska, Stephen and Duskin, Kayla and Kharazian, Zarine and Minow, Carly and Blucker, Stephanie and Venuto, Sylvie and West, Jevin D. and Starbird, Kate. "Mobilizing Manufactured Reality: How Participatory Disinformation Shaped Deep Stories to Catalyze Action during the 2020 U.S. Presidential Election." Proceedings of the ACM on Human-Computer Interaction, v.7, 2023. https://doi.org/10.1145/3579616
Starbird, Kate and DiResta, Renée and DeButts, Matt. "Influence and Improvisation: Participatory Disinformation during the 2020 US Election." Social Media + Society, v.9, 2023. https://doi.org/10.1177/20563051231177943

