
NSF Org: CNS Division Of Computer and Network Systems
Recipient:
Initial Amendment Date: May 31, 2013
Latest Amendment Date: September 6, 2017
Award Number: 1253204
Award Instrument: Continuing Grant
Program Manager: Nina Amla, namla@nsf.gov, (703) 292-7991, CNS Division Of Computer and Network Systems, CSE Directorate for Computer and Information Science and Engineering
Start Date: September 1, 2013
End Date: August 31, 2019 (Estimated)
Total Intended Award Amount: $545,623.00
Total Awarded Amount to Date: $545,623.00
Funds Obligated to Date: FY 2014 = $237,928.00; FY 2016 = $123,854.00; FY 2017 = $98,094.00
History of Investigator:
Recipient Sponsored Research Office: 4200 Fifth Avenue, Pittsburgh, PA, US 15260-0001, (412) 624-7400
Sponsor Congressional District:
Primary Place of Performance: Pittsburgh, PA, US 15213-2303
Primary Place of Performance Congressional District:
Unique Entity Identifier (UEI):
Parent UEI:
NSF Program(s): Secure & Trustworthy Cyberspace
Primary Program Source: 01001415DB NSF RESEARCH & RELATED ACTIVIT; 01001617DB NSF RESEARCH & RELATED ACTIVIT; 01001718DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s):
Program Element Code(s):
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070
ABSTRACT
To date, the application of quantitative security and privacy metrics has seen its greatest successes when exploring the worst-case properties of a system. That is, given a powerful adversary, to what extent does the system preserve some relevant set of properties? While such analyses allow experts to build systems that are resistant to strong attackers, many deployed systems were not designed in this manner. In fact, there is growing evidence that users' privacy is routinely compromised as a byproduct of using social, participatory, and distributed applications. Given that people find inherent utility in using systems that are not secure against worst-case adversaries, this project investigates a complementary question: Can we help users better manage their participation in systems that are not privacy-preserving in an absolute sense?
This project is developing a principled approach that enables individuals to (i) quantitatively specify and assess their security, privacy, and utility goals; (ii) qualitatively express preferences on the relative importance of these goals; (iii) explore the implications of their system interactions by leveraging the trade-off spaces resulting from these quantitative and qualitative specifications; and (iv) enact locally-enforceable changes to their system usage to better balance competing needs. This project is designing computational tools that enable everyday users to better manage their system participation by understanding the interplay between security, privacy, and utility. Educational materials are being developed to support two undergraduate courses---one for computer science majors and one for non-majors---that explore the social, technical, and privacy implications of our increasingly digitized society.
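To make steps (i) through (iii) concrete, the following is a minimal, hypothetical sketch and not one of the project's actual tools: candidate usage configurations are scored on quantitative privacy and utility metrics, dominated options are discarded, and a qualitative preference (here, privacy over utility) orders the survivors. All class and function names are illustrative.

```python
# Hypothetical sketch: ranking system-usage configurations against
# quantitative privacy/utility scores plus a qualitative preference.
from dataclasses import dataclass

@dataclass
class Config:
    name: str
    privacy: float   # higher is better, e.g., 1 - estimated re-identification risk
    utility: float   # higher is better, e.g., fraction of desired features retained

def dominates(a: Config, b: Config) -> bool:
    """a dominates b if it is no worse on every metric and strictly better on one."""
    return (a.privacy >= b.privacy and a.utility >= b.utility and
            (a.privacy > b.privacy or a.utility > b.utility))

def pareto_front(configs):
    """Keep only configurations not dominated by any other."""
    return [c for c in configs
            if not any(dominates(o, c) for o in configs if o is not c)]

def rank(front, privacy_first=True):
    """Apply a qualitative preference over the remaining trade-off space."""
    key = (lambda c: (c.privacy, c.utility)) if privacy_first else \
          (lambda c: (c.utility, c.privacy))
    return sorted(front, key=key, reverse=True)

if __name__ == "__main__":
    options = [Config("share-all", 0.2, 1.0),
               Config("share-friends", 0.6, 0.7),
               Config("share-none", 0.9, 0.3),
               Config("share-none-slow", 0.9, 0.2)]  # dominated, will be pruned
    for c in rank(pareto_front(options)):
        print(c.name, c.privacy, c.utility)
```

Step (iv), enacting locally enforceable changes, would then correspond to applying the top-ranked configuration to the user's actual system settings.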
PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH
Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available without a charge during the embargo (administrative interval). Some links on this page may take you to non-federal websites, whose policies may differ from those of this site.
PROJECT OUTCOMES REPORT
Disclaimer
This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.
As we come to rely ever more heavily on networked information systems like social networks, IoT-enabled homes and sensor-instrumented workplaces, and cloud-backed data services, our agency to control our own data steadily erodes. In this project, we sought to understand the contexts surrounding individuals' perceived losses of privacy in networked and data-centric systems, and to develop user-centric mechanisms that let them control how their personal data is made accessible to such systems. We considered these questions across a number of domains.
The declarative nature of SQL has traditionally been a major strength: users simply state what information they are interested in, and the database management system determines the best plan for retrieving it. However, because query optimizers typically choose the fastest plan for execution, in distributed database scenarios this can result in situations where (i) sensitive aspects of a user's query are unknowingly leaked to lesser-trusted servers, violating querier privacy constraints, and/or (ii) sensitive data from one server flows through multiple other servers in violation of data-owner policies. To address these issues, we developed query optimization approaches that treat user privacy constraints and data-owner access constraints as first-class citizens. The result is that our optimizers produce query evaluation plans that are optimal in terms of execution time while also upholding the privacy and security constraints imposed by stakeholders. This work was further refined by the development of query interfaces that support interactive exploration of data flows, so that non-expert users can understand what their privacy concerns might be.
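As an illustration of the general idea rather than the project's actual optimizer, the hypothetical sketch below enumerates candidate distributed query plans, discards any plan that places a sensitive operator on a server the querier does not trust or routes a data owner's tuples through a disallowed server, and selects the cheapest remaining plan. The Plan/Operator model and the policy representation are assumptions made for the example.

```python
# Hypothetical sketch of constraint-aware plan selection for distributed queries.
from dataclasses import dataclass
from typing import Dict, List, Optional, Set

@dataclass
class Operator:
    name: str          # e.g., "filter(diagnosis = ...)"
    site: str          # server chosen to execute this operator
    touches: Set[str]  # data owners whose tuples flow through this operator
    sensitive: bool    # does the operator reveal the querier's intent?

@dataclass
class Plan:
    operators: List[Operator]
    cost: float        # estimated execution time

def admissible(plan: Plan,
               querier_trusted: Set[str],
               owner_allowed: Dict[str, Set[str]]) -> bool:
    for op in plan.operators:
        # Querier privacy: sensitive predicates only on servers the querier trusts.
        if op.sensitive and op.site not in querier_trusted:
            return False
        # Data-owner policy: an owner's data may only flow through servers it allows.
        if any(op.site not in owner_allowed.get(owner, set())
               for owner in op.touches):
            return False
    return True

def choose_plan(plans: List[Plan],
                querier_trusted: Set[str],
                owner_allowed: Dict[str, Set[str]]) -> Optional[Plan]:
    legal = [p for p in plans if admissible(p, querier_trusted, owner_allowed)]
    return min(legal, key=lambda p: p.cost) if legal else None
```

A real optimizer would interleave this admissibility check with plan enumeration rather than filtering afterward, but the filtering view makes the role of the stakeholder policies explicit.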
In the context of streaming data management systems, data from multiple sources (e.g., fitness trackers, IoT appliances, and monitoring systems) flows constantly through largely static queries that are typically deployed to cloud infrastructures to meet reliability and elasticity goals. While useful, this comes at the cost of exposing sensitive user data to potentially untrusted infrastructure. We aimed to let the users who provide data to streaming data management systems specify controls on who can access their data (e.g., their clinician, but not the cloud infrastructure used for data processing) while still enabling in-network processing to take advantage of the economies of scale offered by cloud infrastructures. Our initial solutions used computation-enabling encryption techniques to allow certain operations to occur in-network without directly revealing information. Unfortunately, this approach imposes runtime constraints, permits inferences about the underlying data distribution, and does not completely support the query languages used by streaming data systems. We then leveraged commodity trusted execution environments to both improve performance and extend support to the full expressive power of streaming query languages. This allows users to specify rich controls over the uses of their data, even when it is processed on infrastructure whose configuration remains opaque to them.
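The computation-enabling encryption idea mentioned above can be illustrated with additively homomorphic encryption: an untrusted in-network aggregator can sum encrypted readings without ever seeing plaintexts. The sketch below is a simplified stand-in, not the project's system, and it does not show the later TEE-based design; it assumes the third-party python-paillier (phe) package and invented sensor/clinician roles.

```python
# Illustrative sketch: in-network aggregation over ciphertexts using
# additively homomorphic (Paillier) encryption via the `phe` package.
from phe import paillier

# The data owner (e.g., a fitness-tracker user) generates the keypair, shares
# only the public key with the cloud, and gives the private key to the clinician.
public_key, private_key = paillier.generate_paillier_keypair()

def sensor_stream():
    """Stand-in for readings flowing from a device."""
    for reading in [72, 75, 90, 68]:          # e.g., heart-rate samples
        yield public_key.encrypt(reading)     # encrypted at the source

# Untrusted in-network operator: adds ciphertexts, learns nothing about values.
encrypted_sum = None
count = 0
for ct in sensor_stream():
    encrypted_sum = ct if encrypted_sum is None else encrypted_sum + ct
    count += 1

# Authorized consumer (the clinician) decrypts only the aggregate.
average = private_key.decrypt(encrypted_sum) / count
print("average reading:", average)
```

The limitation noted above is visible even here: only operators expressible as additions over ciphertexts can run in-network, which is part of why the project moved to trusted execution environments for full query-language support.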
Similar to the above, cloud storage infrastructures have become a ubiquitous feature of today's computational landscape. These platforms allow seamless data sharing across a user's devices and across groups of users. However, they expose private user data to back-end infrastructure that is often not understood by the user, and compromise or misconfiguration of these services can expose user data to unauthorized individuals, as frequent high-profile data breaches demonstrate. A naive solution is to use purely cryptographic and traditional key-distribution mechanisms to prevent unauthorized disclosure of data at rest; however, an analysis conducted by our team uncovered the massive overheads this approach entails when policies are dynamic. To counter this, we developed a novel approach that uses client-side trusted execution environments to enable flexible and efficient enforcement of user-specified access control policies without requiring server-side support. This approach allows the use of unmodified, commodity storage platforms to store and distribute users' data without exposing the unencrypted data to those platforms.
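The client-side enforcement idea can be sketched as a small reference monitor that encrypts data before it is uploaded and checks the user's policy before releasing plaintext. This is a minimal illustration, not the project's implementation: it uses the third-party cryptography package for symmetric encryption, invents a toy policy model, and omits the key management and attestation that a real TEE-backed deployment would require.

```python
# Minimal sketch of client-side policy enforcement over commodity cloud storage.
from cryptography.fernet import Fernet

class ClientSideMonitor:
    def __init__(self, policy):
        # In a real deployment the key material would live inside the TEE.
        self._cipher = Fernet(Fernet.generate_key())
        self._policy = policy          # e.g., {"report.pdf": {"alice", "bob"}}

    def seal(self, name: str, data: bytes) -> bytes:
        """Encrypt locally; only the ciphertext is uploaded to the storage provider."""
        return self._cipher.encrypt(data)

    def open(self, name: str, blob: bytes, requester: str) -> bytes:
        """Decrypt only if the user-specified policy authorizes the requester."""
        if requester not in self._policy.get(name, set()):
            raise PermissionError(f"{requester} may not read {name}")
        return self._cipher.decrypt(blob)

monitor = ClientSideMonitor({"report.pdf": {"alice"}})
blob = monitor.seal("report.pdf", b"quarterly numbers")   # upload `blob` unchanged
print(monitor.open("report.pdf", blob, "alice"))          # authorized: plaintext
# monitor.open("report.pdf", blob, "mallory")             # raises PermissionError
```

Because enforcement happens entirely on the client, the storage provider needs no modification and never observes plaintext, which matches the "no server-side support" goal described above; the dynamic-policy efficiency comes from updating the local policy rather than re-encrypting and re-distributing keys.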
A final area of exploration during this award concerned the increasing extent to which users are monitored by sensing infrastructures in smart homes, connected workplaces, and instrumented mobile devices. To this end, we carried out formative studies exploring how user sentiment toward sensing varies as a function of sensing modality, sensing goal, and data dissemination. We developed design recommendations that balance the utility of these systems with user comfort, and created prototypes to demonstrate the efficacy of our approaches.
A common theme throughout this work has been that system functionality need not come at the cost of user privacy, and that balancing these two goals can be achieved without excessive overheads or cumbersome interfaces. We have the tools and techniques at our disposal to enable rich user control over data without impeding service providers, and it is worth reconciling these goals.
Last Modified: 05/13/2020
Modified by: Adam J Lee