On The Privacy-Utility Tradeoff in Participatory Sensing Systems

Authors: Nouha Sghaier, Mohamed Ali Moussa, Yacine Ghamri-Doudane, Rim Ben Messaoud
Contributors: Laboratoire Informatique, Image et Interaction - EA 2118 (L3I), Université de La Rochelle (ULR), Laboratoire d'Informatique Gaspard-Monge (LIGM), Centre National de la Recherche Scientifique (CNRS)-Fédération de Recherche Bézout-ESIEE Paris-École des Ponts ParisTech (ENPC)-Université Paris-Est Marne-la-Vallée (UPEM), BEN MESSAOUD, Rim, La Rochelle Université (ULR)
Language: English
Year of publication: 2016
Subject:
Source: Proceedings of the 15th IEEE International Symposium on Network Computing and Applications (NCA 2016), Oct 2016, Boston, United States, pp. 1-8
Description: International audience; The ubiquity of sensor-equipped mobile devices has enabled citizens to contribute data via participatory sensing systems. This emergent paradigm supports a range of applications that improve users' quality of life. However, the data collection process may compromise the participants' privacy when the reported data are tagged with, or correlated to, their sensitive information. Anonymization and location-cloaking techniques have therefore been designed to provide privacy protection, though at some cost to data utility, which is a major concern for queriers. Unlike past works, we simultaneously assess the two competing goals of ensuring the queriers' required data utility and protecting the participants' privacy. First, we introduce a trustworthy entity into the traditional participatory sensing system. We then propose a general privacy-preserving mechanism that runs on this entity and releases a distorted version of the sensed data in order to minimize the leakage of the associated private information. We demonstrate how to identify a near-optimal solution to the privacy-utility tradeoff by maximizing a privacy score subject to a utility metric set by the data queriers (service providers). Furthermore, we tackle the challenge of data with large alphabets by investigating quantization techniques. Finally, we evaluate the proposed model on three real datasets while varying the prior knowledge and the obfuscation type. The results show that, across applications, a limited distortion can ensure the participants' privacy while maintaining about 98% of the required data utility.
Database: OpenAIRE
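The privacy-utility tradeoff described in the abstract can be made concrete with a toy sketch. The snippet below is an illustrative assumption, not the paper's actual mechanism: it uses a simple randomized-response channel (release a participant's true report with probability `p_keep`, otherwise a uniformly random symbol) and then inverts that known channel to recover an unbiased aggregate for the querier. The function names and the toy location alphabet are made up for illustration.

```python
import random
from collections import Counter

def obfuscate(reports, alphabet, p_keep, rng):
    """Hypothetical obfuscation channel (not the paper's mechanism):
    release each report unchanged with probability p_keep, otherwise
    substitute a uniformly random symbol from the alphabet."""
    return [v if rng.random() < p_keep else rng.choice(alphabet)
            for v in reports]

def estimate_counts(released, alphabet, p_keep):
    """Unbiased estimate of the true histogram, obtained by inverting
    the known channel: E[obs[x]] = p_keep * true[x] + (1 - p_keep) * n / k."""
    n, k = len(released), len(alphabet)
    obs = Counter(released)
    return {x: (obs[x] - (1 - p_keep) * n / k) / p_keep for x in alphabet}

rng = random.Random(0)                       # fixed seed for reproducibility
alphabet = ["home", "work", "park", "mall"]  # toy location labels
truth = ["home"] * 500 + ["work"] * 300 + ["park"] * 150 + ["mall"] * 50
released = obfuscate(truth, alphabet, p_keep=0.8, rng=rng)
est = estimate_counts(released, alphabet, p_keep=0.8)
```

Lowering `p_keep` strengthens privacy (any individual released report reveals less about its sender) but inflates the variance of the querier's estimates; the utility metric can be, for instance, the relative error of `est` against the true counts. Choosing the distortion level that maximizes privacy while keeping that error within the querier's bound is the optimization the paper addresses.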