Award Abstract # 1064688
TC: Medium: Semantics and Enforcement of Privacy Policies: Information Use and Purpose

NSF Org: CNS (Division of Computer and Network Systems)
Recipient: CARNEGIE MELLON UNIVERSITY
Initial Amendment Date: April 12, 2011
Latest Amendment Date: August 12, 2013
Award Number: 1064688
Award Instrument: Continuing Grant
Program Manager: Fen Zhao
CNS (Division of Computer and Network Systems)
CSE (Directorate for Computer and Information Science and Engineering)
Start Date: August 1, 2011
End Date: July 31, 2017 (Estimated)
Total Intended Award Amount: $1,197,126.00
Total Awarded Amount to Date: $1,197,126.00
Funds Obligated to Date: FY 2011 = $293,651.00
FY 2012 = $624,322.00
FY 2013 = $279,153.00
History of Investigator:
  • Anupam Datta (Principal Investigator)
    danupam@andrew.cmu.edu
  • Jeannette Wing (Former Principal Investigator)
  • Anupam Datta (Former Co-Principal Investigator)
Recipient Sponsored Research Office: Carnegie-Mellon University
5000 FORBES AVE
PITTSBURGH
PA  US  15213-3890
(412)268-8746
Sponsor Congressional District: 12
Primary Place of Performance: Carnegie-Mellon University
5000 FORBES AVE
PITTSBURGH
PA  US  15213-3890
Primary Place of Performance Congressional District: 12
Unique Entity Identifier (UEI): U3NKNFLNQ613
Parent UEI: U3NKNFLNQ613
NSF Program(s): TRUSTWORTHY COMPUTING,
Secure & Trustworthy Cyberspace
Primary Program Source: 01001112DB NSF RESEARCH & RELATED ACTIVIT
01001213DB NSF RESEARCH & RELATED ACTIVIT
01001314DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 7434, 7795, 7923, 7924, 9102
Program Element Code(s): 779500, 806000
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

Organizations, such as hospitals, financial institutions, and universities, that collect and use personal information are required to comply with privacy regulations, such as the Health Insurance Portability and Accountability Act (HIPAA), the Gramm-Leach-Bliley Act (GLBA), and the Family Educational Rights and Privacy Act (FERPA). Similarly, to ensure customer trust, web services companies, such as Google, Facebook, Yahoo!, and Amazon, publish privacy policies stating what they will do with the information they keep about customers' individual behaviors. These policies impose constraints on disclosure (or transmission) of personal information, articulate obligations (e.g., notifying customers about privacy breaches), and identify purposes for which personal information may or may not be used. Prior work has focused on formalisms for disclosure and obligations, but no such foundation has been developed for information use for specified purposes.

Intellectual Merit. This project addresses the central problem of developing a formal semantics that explains what it means to use information for a set of purposes, a logic for specifying such policies, and algorithmic methods for their enforcement. It advances the state of knowledge in the field of privacy by providing a foundation for a concept that is commonly used in practice but has not been the subject of careful scientific study. The project also investigates the interaction of this concept with the previously studied concepts of disclosure and obligation, thereby enabling a more comprehensive understanding of privacy. The formal semantics the project develops is novel and draws on insights from prior work on philosophical theories of causation and intentions, and from the computer science literature on formal methods, information flow, and planning. The model is validated through user studies, and its applicability is demonstrated through case studies in the healthcare domain.

Broader Impacts. The project addresses a problem of significant and growing importance to society. It initiates a new direction in providing foundations for privacy by studying the concept of information use for a purpose. This concept appears in privacy policies published by organizations in sectors as diverse as finance, web services, healthcare, insurance, education, and government, the cornerstones of modern society. The semantic foundation serves as the basis for developing practical tools to support the enforcement of such policies in these organizations. The project provides opportunities for engaging graduate and undergraduate students. The PIs plan to integrate the research results into their existing security and privacy courses and, for wider dissemination, to leverage outreach programs in Carnegie Mellon's Computer Science Department and CyLab aimed at K-12 students, women, persons with disabilities, and underrepresented minorities.

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

The project addresses the problem of precisely defining and enforcing a class of privacy properties that restrict information use for certain purposes.

The intellectual merit of the project lies in outcomes that provide precise definitions of what it means to use information for a purpose, together with algorithms and tools for auditing logs and programs for compliance with such policies. Specifically, the project provides a semantics of purpose restrictions that relates them to planning: an action is for a purpose if it is part of a plan for achieving the purpose, and information is used for a purpose if it affects the planning process. These insights enable the design of audit algorithms based on well-known planning methods from artificial intelligence. Another set of outcomes involves a theory of information flow experiments that supports discovering personal information use in black-box web services, such as Google's advertising system.
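The planning-based semantics described above can be illustrated with a minimal sketch. This toy example (the workflow, action names, and breadth-first planner below are hypothetical illustrations, not the project's actual formalism or audit tools) checks whether an audited action is consistent with a stated purpose by asking whether it appears in some plan that achieves that purpose:

```python
from collections import deque

def find_plan(start, goal, actions):
    """Breadth-first search for a shortest plan (sequence of action
    names) transforming `start` into a state satisfying `goal`.
    States are frozensets of facts; each action maps a state to a
    successor state, or to None if it is not applicable."""
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, plan = frontier.popleft()
        if goal(state):
            return plan
        for name, step in actions.items():
            nxt = step(state)
            if nxt is not None and nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, plan + [name]))
    return None  # purpose unachievable

# Hypothetical hospital workflow: reading a record enables both a
# treatment-related action and an unrelated advertising action.
actions = {
    "read_record": lambda s: s | {"has_record"},
    "send_to_lab": lambda s: (s | {"lab_notified"}) if "has_record" in s else None,
    "send_to_ads": lambda s: (s | {"ads_sent"}) if "has_record" in s else None,
}

treatment = lambda s: "lab_notified" in s  # the stated purpose

plan = find_plan(frozenset(), treatment, actions)
# Audit rule under this semantics: an action is for the purpose if it
# occurs in a plan achieving it.
print("send_to_lab" in plan)  # True: needed for treatment
print("send_to_ads" in plan)  # False: not part of a treatment plan
```

The actual audit algorithms in the project build on far richer planning models (with quantities such as rewards and information states); this sketch only conveys the core idea that "for a purpose" is judged relative to plans that achieve the purpose.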

The broader impact of the project includes a tool chain for privacy compliance that was built jointly with collaborators at Microsoft Research and is now deployed on their production system. Another significant outcome is a rigorous study of the Google advertising system that discovered the use of personal information in ways that raise significant concerns about privacy and fairness.


Last Modified: 09/06/2017
Modified by: Anupam Datta
