Award Abstract # 2334061
CRII: SaTC: RUI: Understanding and Addressing the Security and Privacy Needs of At-Risk Populations

NSF Org: CNS
Division of Computer and Network Systems
Recipient: NORTHEASTERN UNIVERSITY
Initial Amendment Date: July 13, 2023
Latest Amendment Date: November 10, 2023
Award Number: 2334061
Award Instrument: Standard Grant
Program Manager: Sara Kiesler
skiesler@nsf.gov
(703)292-8643
CNS: Division of Computer and Network Systems
CSE: Directorate for Computer and Information Science and Engineering
Start Date: March 15, 2023
End Date: March 31, 2024 (Estimated)
Total Intended Award Amount: $175,000.00
Total Awarded Amount to Date: $71,209.00
Funds Obligated to Date: FY 2020 = $71,209.00
History of Investigator:
  • Ada Lerner (Principal Investigator)
    ada@ccs.neu.edu
Recipient Sponsored Research Office: Northeastern University
360 HUNTINGTON AVE
BOSTON
MA  US  02115-5005
(617)373-5600
Sponsor Congressional District: 07
Primary Place of Performance: Northeastern University
360 HUNTINGTON AVE
BOSTON
MA  US  02115-5005
Primary Place of Performance Congressional District: 07
Unique Entity Identifier (UEI): HLTMVS2JZBS6
Parent UEI:
NSF Program(s): Secure & Trustworthy Cyberspace
Primary Program Source: 01002021DB NSF RESEARCH & RELATED ACTIVITIES
Program Reference Code(s): 025Z, 7434, 8228, 9229
Program Element Code(s): 806000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070, 47.075

ABSTRACT

Technology and the internet are increasingly involved in our personal and public lives, providing many benefits but also creating risks to privacy. Minority and marginalized groups especially benefit from the information, community, and social engagement that technology provides. However, they also face greater danger: they are more likely to be targeted, may have fewer resources to protect themselves, and may rely on technology that was not designed for their needs. Such groups are especially at risk when their personal lives are revealed and, in some cases, communicated widely online. They can face a variety of unusual security and privacy concerns, including revelation of their offline identities, targeted harassment or doxing, and usability failures such as account lockout when interacting with security systems that treat abnormal behavior (such as the use of pseudonyms) as a heuristic for suspicion. This project will study how people who may be at risk use technology and what security and privacy dangers they experience, with the goal of creating new technology that better meets the needs of all people in our society.

This project investigates how people's personal lives can be compromised online and what computer security and privacy challenges they face. The project uses surveys and interviews, and designs systems informed by the resulting data. Contributions will include guidelines for inclusive design and their generalization to marginalized groups, new systems that respond to the problems these groups face, and evaluations of how effectively those systems support users' real-world needs.

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available without charge during the embargo (administrative interval).

Some links on this page may take you to non-federal websites, whose policies may differ from those of this site.

Marsh, Abby and Lerner, Ada. "Privacy Norms of Transformative Fandom: A Case Study of an Activity-Defined Community." Proceedings of the ACM on Human-Computer Interaction, v.8, 2024. https://doi.org/10.1145/3637388
Wang, Kelly and Bially Levy, Dan and Nguyen, Kien T. and Lerner, Ada and Marsh, Abigail. "Counting Carrds: Investigating Personal Disclosure and Boundary Management in Transformative Fandom." 2024. https://doi.org/10.1145/3613904.3642664

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

Technology is everywhere today, and everyone faces security and privacy challenges. However, some people face especially strong challenges because they are commonly targeted, are disadvantaged in society, or are otherwise vulnerable to harm. Such groups are important to study, both to protect them and to learn from the expertise they often develop about how to protect themselves individually and collectively. Studying these groups can therefore help make society more equitable, not just for them but for others as well. We used a variety of methods, including interviews, surveys, and measurements of the web and social media, to study vulnerable groups, develop insights and new theories that describe their experiences, and guide technology makers and researchers on how to build and use technology in ways that are safer for all.

Students and early-career scientists from many backgrounds conducted research and explored career options through this project: 9 undergraduate students, 1 PhD student, 1 postdoctoral researcher, and 1 early-career principal investigator carried out the research and authored scientific papers. Here, we summarize key results from the project's peer-reviewed publications:

1) Online survey platforms are often used by researchers who need to survey many people. We wanted to know how representative the people who take surveys on these platforms are of (a) the general population and (b) specific vulnerable populations. We compared the results of a survey we ran on several platforms with the results of the same survey run by the Pew Research Center using best practices for representative sampling. We found that for certain types of questions, the responses of online participants are quite representative, but people in online samples differed significantly in other ways, such as in their factual knowledge about computer security and privacy. We also found that, in general, the more vulnerable a population, the less accurate an online sample is. These results help researchers know when they can use this convenient way of finding people to survey and when they must use other methods to accurately understand people's experiences of security and privacy.

2) Researchers have increasingly recognized the importance of investigating the experiences of vulnerable populations. However, our work found that grouping people together simply by their demographics can be less effective, especially in security and privacy research, because the ways people protect themselves and their communities are intimately related to the specific activities in which they are involved. We learned this by studying members of the fandoms for various media, who often share identity characteristics that make them vulnerable and whose privacy practices are deeply informed by the cultural norms of the fan community. We hope this will support future work, by our group and by others, to extend these lessons to other examples of what we call "activity-defined communities".

3) Bridging the space between research and practical advice, we examined how people understand (or misunderstand) words that are commonly used to discuss both legal and technical aspects of privacy. We found that some words and phrases are misunderstood much more often than others. We also found that these misunderstandings vary significantly in how they make people feel, such as in how much they trust a service or app that uses a given term to describe its data privacy practices. We hope these results can help researchers, app makers, and regulators determine when people are likely to be deceived or tricked by language, and how people can be given a truthful and fair understanding of the apps they use.

4) We explored how vulnerable populations interact within online communities, including how they think about risks to their privacy and risks of hate and harassment. We call these ways of thinking "risk models"; they can help anyone building or regulating technology understand what people different from themselves are afraid of and how technology can help. We also looked at how people in online fandom share personal information, and how sharing personal information and vulnerability help build a trusting community whose members respect and enforce privacy for one another.

5) Much of the moderation that takes place on apps such as Reddit and Discord is done by volunteer users with busy lives, making both practical and emotional support key to helping them protect their communities from hate and harassment. We categorized and described in detail the ways that moderators of vulnerable communities help one another. We also investigated how an app's features can be co-opted or abused by people who harass others, and provided advice for how designers can anticipate these kinds of abuse and make their apps harder to abuse.


Last Modified: 07/30/2024
Modified by: Ada Lerner
