Award Abstract # 1254169
CAREER: The Value of Privacy

NSF Org: CNS
Division Of Computer and Network Systems
Recipient: CALIFORNIA INSTITUTE OF TECHNOLOGY
Initial Amendment Date: May 15, 2013
Latest Amendment Date: November 21, 2017
Award Number: 1254169
Award Instrument: Continuing Grant
Program Manager: Dan Cosley
dcosley@nsf.gov
 (703)292-8832
CNS
 Division Of Computer and Network Systems
CSE
 Directorate for Computer and Information Science and Engineering
Start Date: June 1, 2013
End Date: May 31, 2018 (Estimated)
Total Intended Award Amount: $541,993.00
Total Awarded Amount to Date: $541,993.00
Funds Obligated to Date: FY 2013 = $105,093.00
FY 2014 = $105,695.00
FY 2015 = $108,002.00
FY 2016 = $110,378.00
FY 2017 = $112,825.00
History of Investigator:
  • Adam Wierman (Principal Investigator)
    adamw@caltech.edu
  • Katrina Ligett (Former Principal Investigator)
Recipient Sponsored Research Office: California Institute of Technology
1200 E CALIFORNIA BLVD
PASADENA
CA  US  91125-0001
(626)395-6219
Sponsor Congressional District: 28
Primary Place of Performance: California Institute of Technology
CA  US  91125-0001
Primary Place of Performance Congressional District: 28
Unique Entity Identifier (UEI): U2JMKHNS5TG4
Parent UEI:
NSF Program(s): Secure & Trustworthy Cyberspace
Primary Program Source: 01001314DB NSF RESEARCH & RELATED ACTIVIT
01001415DB NSF RESEARCH & RELATED ACTIVIT
01001516DB NSF RESEARCH & RELATED ACTIVIT
01001617DB NSF RESEARCH & RELATED ACTIVIT
01001718DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 1045, 7434, 9102
Program Element Code(s): 806000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

This project takes a new approach to problems involving sensitive data by focusing on rigorous mathematical modeling and characterization of the value of private information. By quantifying both the loss that affected individuals incur when their information is used and the benefits that such use provides, the approaches advanced by this work enable concrete reasoning about the relative risks and rewards of a wide variety of potential computations on sensitive data.

Specifically, this work has four main technical thrusts. The first is the development of new models and definitions that allow privacy considerations to be incorporated into agent utility functions. The second is analysis of the feasibility and costs of eliciting sensitive information in light of these models. The third is enabling more sophisticated computations in settings where individuals value their privacy. The fourth extends these models to more complex settings that incorporate the interests of additional actors.
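
As an illustration of the first thrust, one common modeling choice in the differential privacy literature (offered here only as a sketch, not necessarily the exact model developed in this project) is to give each agent i a utility of the form

    u_i(o, epsilon) = v_i(o) - c_i * epsilon

where v_i(o) is agent i's value for the outcome o of an epsilon-differentially-private computation and c_i is the agent's privately known cost per unit of privacy loss epsilon. Eliciting data, or the privacy costs c_i themselves, from agents with utilities of this kind is the type of question the second thrust examines.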

One of the goals of this project is not only to develop a science of the value of private information, but also to build bridges between computer science and economics that will enable such work. Further, the models and algorithms developed by this project could inform future regulation regarding the use, exchange, and monetization of sensitive data. The project supports, and is supported by, a wide variety of educational goals, including significant research involvement of students at a range of stages, development of a course series with a substantial research component, and assessment of a pedagogical technique created to facilitate meaningful engagement with research literature.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

(Showing: 1 - 10 of 21)
Blum, Avrim and Ligett, Katrina and Roth, Aaron "A Learning Theory Approach to Noninteractive Database Privacy" J. ACM , v.60 , 2013 , p.12:1--12: 10.1145/2450142.2450148
Heffetz, Ori and Ligett, Katrina "Privacy and Data-Based Research" Journal of Economic Perspectives , v.28 , 2014 , p.75-98 10.1257/jep.28.2.75
Jon Kleinberg and Katrina Ligett "Information-sharing in social networks" Games and Economic Behavior , v.82 , 2013 , p.702 - 716 http://dx.doi.org/10.1016/j.geb.2013.10.002
Omer Tamuz, Shai Vardi and Juba Ziani "Non-Exploitable Protocols for Repeated Cake Cutting" AAAI Conference on Artificial Intelligence , 2018
Palma London, Shai Vardi, Adam Wierman and Hanling Yi "A Parallelizable Acceleration Framework for Packing Linear Programs" AAAI Conference on Artificial Intelligence , 2018
Palma London, Xiaoqi Ren, Adam Wierman, Juba Ziani "Datum: Managing Data Purchasing and Data Placement in a Geo-Distributed Data Market" Transactions on Networking , 2018
Rachel Cummings, David M. Pennock, and Jennifer Wortman Vaughan. "The Possibilities and Limitations of Private Prediction Markets" Economics and Computation (EC) , 2018
Rachel Cummings, Katrina Ligett, Jaikumar Radhakrishnan, Aaron Roth, and Zhiwei Steven Wu. "Coordination Complexity: Small Information Coordinating Large Populations." Innovations in Theoretical Computer Science (ITCS) , 2016
Rachel Cummings, Katrina Ligett, Kobbi Nissim, Aaron Roth, and Zhiwei Steven Wu "Adaptive Learning with Robust Generalization Guarantees." Conference on Learning Theory (COLT) , 2016

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

This project has focused on the study of incentive issues related to data privacy, including questions such as: who might share data with whom, and why; how people could be compensated for the use of their private data; and how to gather and use data from people who might have an incentive to misrepresent or hide it. To this end, we have developed new models of incentives in information-sharing in social networks, reflecting a trade-off between the risks and benefits of sharing information.

This project has also considered issues that stem from making the formal mathematical tools used in the study of privacy more applicable to real-world problems. This grant has supported work on a paper written for economists who work with personal data, in order to introduce them to the computer science privacy literature. One surprising result from this work highlights the risks of blindly applying privacy-preserving technologies without considering the broader context: if the parties involved change their behavior in response to increased use of such technologies, the effect can be the opposite of what was intended. We have also studied the problem of maximizing the level of privacy that can be offered to participants in a dataset, subject to accuracy constraints on the computations that will be done with the data.
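
In generic terms (an illustrative formulation only, not a verbatim statement of this project's results), that last problem asks: given a statistic f to be computed on a dataset D, choose the strongest privacy guarantee, i.e., the smallest epsilon, such that some epsilon-differentially-private mechanism M still satisfies

    Pr[ |M(D) - f(D)| > alpha ] <= beta

for a prescribed error tolerance alpha and failure probability beta.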

Last Modified: 09/06/2018
Modified by: Adam C Wierman
