
NSF Org: | CNS Division Of Computer and Network Systems |
Recipient: | Carnegie Mellon University |
Initial Amendment Date: | July 26, 2018 |
Latest Amendment Date: | May 19, 2021 |
Award Number: | 1801316 |
Award Instrument: | Continuing Grant |
Program Manager: | Anna Squicciarini, asquicci@nsf.gov, (703) 292-5177, CNS Division Of Computer and Network Systems, CSE Directorate for Computer and Information Science and Engineering |
Start Date: | September 1, 2018 |
End Date: | August 31, 2023 (Estimated) |
Total Intended Award Amount: | $399,482.00 |
Total Awarded Amount to Date: | $431,482.00 |
Funds Obligated to Date: | FY 2019 = $141,137.00; FY 2020 = $148,979.00; FY 2021 = $16,000.00 |
Recipient Sponsored Research Office: | 5000 Forbes Ave, Pittsburgh, PA 15213-3815, US, (412) 268-8746 |
Primary Place of Performance: | 5000 Forbes Avenue, WQED Building, Pittsburgh, PA 15213-3890, US |
NSF Program(s): | Special Projects - CNS; Secure & Trustworthy Cyberspace |
Primary Program Source: | 01001920DB NSF RESEARCH & RELATED ACTIVITIES; 01002021DB NSF RESEARCH & RELATED ACTIVITIES; 01002122DB NSF RESEARCH & RELATED ACTIVITIES |
Award Agency Code: | 4900 |
Fund Agency Code: | 4900 |
Assistance Listing Number(s): | 47.070 |
ABSTRACT
Current user-facing computer systems apply a "notice and consent" approach to managing user privacy: the user is presented with a privacy notice and then must consent to its terms. Decades of prior research show that this approach is unmanageable: policies are vague, ambiguous, and often include legal terms that make them very difficult to understand, if they are even read at all. These problems are magnified across Internet of Things (IoT) devices, which may not include displays to present privacy information, and may become so ubiquitous in the environment that users cannot possibly determine when their data is actually being captured. This project aims to solve these problems by designing new privacy management systems that automatically infer users' context-specific privacy expectations and then use them to manage the data-capture and data-sharing behaviors of mobile and IoT devices in users' environments. The goals of this research are to better understand privacy expectations, design privacy controls that require minimal user intervention, and demonstrate how emergent technologies can be designed to empower users to best manage their privacy.
The theory of "Privacy as Contextual Integrity" (CI) postulates that privacy expectations are based on contextual norms, and that privacy violations occur when data flows in ways that defy these norms. The framework can be applied by modeling data flows in terms of the data type, sender, recipient, as well as the specific context (i.e., the purpose for which data is being shared). While this model makes intuitive sense, there are several open research questions that have prevented it from being applied in computer systems. Specifically, the project investigates how privacy expectations change across varying contexts through the use of surveys, interviews, and behavioral studies, and designs systems to automatically infer contextual information so that the process of determining whether or not a data flow is likely to defy user expectations can be automated. The investigators develop a prototype of the novel privacy controls and validate their usability and privacy-preserving properties through iterative laboratory and field experiments.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
The theory of privacy as contextual integrity (CI) asserts that people’s privacy expectations are shaped by contextual informational norms; it predicts that practices breaching these norms will be experienced as privacy violations. The theory further asserts that contextual informational norms prescribe data flows according to social context, actors, information types, and transmission principles, and that all of these parameters must be specified in modeling privacy expectations. Failure to do so results in faulty practices and ambiguous notices. Thus, to improve end-user privacy, research is needed to: 1) map information flows in terms of CI parameters, 2) inform users about them, and 3) ascertain whether these flows meet contextual norms and/or users’ expectations.
An important part of this project was to develop methods and techniques to identify attributes that influence people’s privacy expectations in different contexts and in relation to different applications and platforms. What are the important constraints that lead to nuanced preferences? How can systems detect when an application has shifted from an allowable data-sharing context to an unallowable one? How can knowledge of a user’s preferences in one context be used to infer her preferences in other contexts? The project focused in particular on contexts associated with mobile app privacy and Internet of Things (IoT) privacy.
The following summarizes some of the project's main outcomes:
- The project contributed extensive new insights into people's privacy attitudes across a wide range of contexts representative of recent deployments of video analytics functionality. Results from this research were shared with regulatory agencies, including the Federal Trade Commission.
- The project also contributed to the development of a novel privacy infrastructure for the Internet of Things (IoT), which allows people who control and/or deploy IoT resources (e.g., cameras, smart sensors) to publicize the presence of these devices and their data practices. The infrastructure includes an IoT assistant app, available in both the iOS App Store and the Google Play store, that enables people to discover nearby IoT resources, learn what data those resources collect, and find any settings available to restrict the collection and/or processing of their data (a sketch of what such a machine-readable resource description might look like follows this list). The infrastructure hosts well over 100,000 IoT resource descriptions, and the IoT assistant app has been downloaded by tens of thousands of users.
- The project showed how Contextual Integrity can be extended to study public-health privacy issues, such as the acceptance of COVID vaccination mandates and certificates, leading to a study that shed new light on how different groups of people feel about such mandates and certificates in different contexts.
- The project also improved understanding of how security and privacy nudges based on protection motivation theory can help people protect themselves and adopt safer security and privacy practices.
- The project also contributed a framework for organizing the space of privacy choices that can be made available to users in IoT scenarios.
- The project contributed to the design, evaluation, and adoption of a "Do Not Sell My Personal Information" button and accompanying text, which have been adopted by the State of California under the California Consumer Privacy Act (CCPA).
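As referenced in the IoT infrastructure outcome above, here is a minimal sketch of what a machine-readable resource description and a naive proximity filter for a discovery app might look like. The schema, field names, coordinates, and URL below are invented for illustration and are not the project's actual registry format:

```python
import json
import math

# Hypothetical resource description; fields are invented for illustration.
CAMERA = {
    "resource_type": "camera",
    "location": {"lat": 40.4433, "lon": -79.9436},   # device coordinates
    "data_collected": ["video"],
    "purpose": "building security",
    "retention_days": 30,
    "opt_out_url": "https://example.org/opt-out",    # placeholder URL
}

def nearby(descriptions, lat, lon, max_m=100.0):
    """Keep resources advertised within max_m meters of the user.

    Uses an equirectangular approximation, adequate for the short
    distances a discovery app cares about."""
    m_per_deg = 111_320.0  # meters per degree of latitude
    hits = []
    for d in descriptions:
        loc = d["location"]
        dlat = (loc["lat"] - lat) * m_per_deg
        dlon = (loc["lon"] - lon) * m_per_deg * math.cos(math.radians(lat))
        if math.hypot(dlat, dlon) <= max_m:
            hits.append(d)
    return hits

# A user standing a few meters away would discover the camera and see
# its advertised data practices and opt-out mechanism.
print(json.dumps(nearby([CAMERA], 40.4434, -79.9435), indent=2))
```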
Research results from this project have also been incorporated into courses taught at Carnegie Mellon University, in particular in the University's Privacy Engineering Program.
Last Modified: 01/02/2024
Modified by: Norman M Sadeh