
NSF Org: IIS Division of Information & Intelligent Systems
Recipient:
Initial Amendment Date: March 26, 2018
Latest Amendment Date: June 13, 2022
Award Number: 1749815
Award Instrument: Continuing Grant
Program Manager: Dan Cosley, dcosley@nsf.gov, (703) 292-8832, IIS Division of Information & Intelligent Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: July 1, 2018
End Date: June 30, 2024 (Estimated)
Total Intended Award Amount: $550,000.00
Total Awarded Amount to Date: $550,000.00
Funds Obligated to Date: FY 2019 = $102,374.00; FY 2020 = $106,583.00; FY 2021 = $116,217.00; FY 2022 = $120,780.00
History of Investigator:
Recipient Sponsored Research Office: 4333 BROOKLYN AVE NE, SEATTLE, WA, US 98195-1016, (206) 543-4043
Sponsor Congressional District:
Primary Place of Performance: WA US 98195-2500
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): HCC-Human-Centered Computing
Primary Program Source: 01001920DB NSF RESEARCH & RELATED ACTIVIT; 01002021DB NSF RESEARCH & RELATED ACTIVIT; 01002122DB NSF RESEARCH & RELATED ACTIVIT; 01002223DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
This project will improve our understanding of the spread of disinformation in online environments. It will contribute to the field of human-computer interaction in the areas of social computing, crisis informatics, and human-centered data science. Conceptually, it explores relationships between technology, structure, and human action, applying the lens of structuration theory toward understanding how technological affordances shape online action, how online actions shape the underlying structure of the information space, and how those integrated structures shape information trajectories. Methodologically, it enables further development, articulation, and evaluation of an iterative, mixed-method approach for interpretative analysis of "big" social data. Finally, it aims to leverage these empirical, conceptual, and methodological contributions toward the development of innovative solutions for tracking disinformation trajectories.
The online spread of disinformation is a societal problem at the intersection of online systems and human behavior. This research program aims to enhance our understanding of how and why disinformation spreads and to develop tools and methods that people, including humanitarian responders and everyday analysts, can use to detect, understand, and communicate its spread. The research has three specific, interrelated objectives: (1) to better understand the generation, evolution, and propagation of disinformation; (2) to extend, support, and articulate an evolving methodological approach for analyzing "big" social media data for use in identifying and communicating "information provenance" related to disinformation flows; and (3) to adapt and transfer the tools and methods of this approach for use by diverse users in identifying disinformation and communicating its origins and trajectories. More broadly, it will contribute to the advancement of science through an enhanced understanding and conceptualization of the relationships between technological affordances, social network structure, human behavior, and intentional strategies of deception. The program includes an education plan that supports PhD student training and recruits diverse undergraduate students into research through multiple mechanisms, including for-credit research groups and an academic bridge program.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
This research program explored online disinformation — or the intentional manipulation of online discourse for political or financial gain — from the perspective of human-centered computing. The research achieved three main objectives: (1) to improve understandings of the generation, evolution, and propagation of online disinformation; (2) to employ, evolve, and articulate an innovative, mixed-method approach for understanding online disinformation; and (3) to adapt and translate tools and methods for conducting research on online disinformation for use by non-researchers, primarily journalists.
The primary contribution of this work (described in [1]) was advancing the understanding of online disinformation as participatory — i.e. taking place through collaborations between witting agents and unwitting (though often willing) crowds. This perspective challenged earlier conceptualizations of disinformation as happening to online systems and users, and instead argued that disinformation occurs through online systems and users [2]. More recently, we proposed a collective sensemaking framework for understanding how disinformation (and its cousin, rumor) takes shape through interactions between — and manipulations of — both “facts” and “frames” [3,4].
Findings from this research were translated into dozens of research papers, presentations, and media articles as well as new curricula and policy recommendations. Perhaps most significantly, this work helped lay the foundations for the UW Center for an Informed Public (CIP), which PI Starbird co-founded with four collaborators, and directed from 2021-2024. The Center’s work to resist strategic misinformation, promote an informed society, and strengthen democratic discourse continues. With support from the CIP’s data engineers, our research team maintains infrastructure to collect data from a wide variety of social media platforms and quickly process and make that data available for rapid analysis by our research team, our collaborators, and other researchers.
This grant has supported the training of 9 PhD students and dozens of undergraduate and master's students in concepts and theories related to online rumoring, disinformation, and collective sensemaking, as well as in an interpretative, mixed-method approach to analyzing digital trace data. Former students continue to work in academia and industry to better understand and address harmful disinformation and manipulation of online platforms.
[1] Starbird, Kate, Ahmer Arif, and Tom Wilson. "Disinformation as collaborative work: Surfacing the participatory nature of strategic information operations." Proceedings of the ACM on Human-Computer Interaction 3, no. CSCW (2019): 1-26.
[2] Starbird, Kate. "Disinformation's spread: bots, trolls and all of us." Nature 571, no. 7766 (2019): 449-450.
[3] Starbird, Kate. "Facts, frames, and (mis)interpretations: Understanding rumors as collective sensemaking." Center for an Informed Public Blog (December 6, 2023). https://www.cip.uw.edu/2023/12/06/rumors-collective-sensemaking-kate-starbird/
[4] Starbird, Kate, and Stephen Prochaska. "Misinformation is more than just bad facts." The Conversation (October 30, 2024). https://theconversation.com/misinformation-is-more-than-just-bad-facts-how-and-why-people-spread-rumors-is-key-to-understanding-how-false-information-travels-and-takes-root-241748
Last Modified: 11/15/2024
Modified by: Kate Starbird