New Program Abstractions for Privacy
Author: | Sebastian Hunt, David Sands |
Contributors: | Di Pierro, A., Malacaria, A., Nagarajan, P. |
Year of publication: | 2020 |
Subject: |
QA75
Computer science, computer security, differential privacy, programming languages |
Source: | From Lambda Calculus to Cybersecurity Through Program Analysis, ISBN: 9783030411022 |
Description: | Static program analysis, once seen primarily as a tool for optimising programs, is now increasingly important as a means to provide quality guarantees about programs. One measure of quality is the extent to which programs respect the privacy of user data. Differential privacy is a rigorous, quantified definition of privacy which guarantees a bound on the loss of privacy due to the release of statistical queries. Among the benefits enjoyed by the definition of differential privacy are compositionality properties that allow differentially private analyses to be built from pieces and combined in various ways. This has led to the development of frameworks for the construction of differentially private program analyses which are private-by-construction. Past frameworks assume that the sensitive data is collected centrally and processed by a trusted curator. However, the main examples of differential privacy applied in practice (for example, Google Chrome's collection of browsing statistics, or Apple's training of predictive messaging in iOS 10) use a purely local mechanism applied at the data source, thus avoiding the collection of sensitive data altogether. While this is a benefit of the local approach, with systems like Apple's, users are required to completely trust that the analysis running on their system has the claimed privacy properties.

In this position paper we outline some key challenges in developing static analyses for analysing differential privacy, and propose novel abstractions for describing the behaviour of probabilistic programs not previously used in static analyses. |
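The local mechanism mentioned in the abstract can be illustrated by randomized response, the classic building block behind deployments such as Chrome's browsing-statistics collection. This is a minimal sketch under standard assumptions (a single boolean attribute per user, parameter epsilon); it is not the paper's construction, and the function names are chosen here for illustration.

```python
import math
import random

def randomized_response(bit: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (e^eps + 1),
    otherwise report its negation. Each report satisfies
    epsilon-local differential privacy on its own."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else not bit

def estimate_proportion(reports, epsilon: float) -> float:
    """Debias the aggregated noisy reports to estimate the
    true fraction of users whose bit is set."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

# Each user perturbs their own bit locally; the curator only ever
# sees the noisy reports, never the raw sensitive data.
random.seed(0)
true_bits = [1] * 700 + [0] * 300          # true proportion: 0.7
noisy = [randomized_response(b == 1, epsilon=1.0) for b in true_bits]
print(estimate_proportion(noisy, epsilon=1.0))
```

The point relevant to the paper's position: the privacy guarantee holds per report at the data source, but a user running such code has no way to check, without some form of analysis of the program itself, that the claimed `epsilon` is what the code actually implements.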
Database: | OpenAIRE |
External link: |