OptiSGD-DPWGAN: Integrating Metaheuristic Algorithms and Differential Privacy to Improve Privacy-Utility Trade-Off in Generative Models

Authors: Alshaymaa Ahmed Mohamed, Yasmine N. M. Saleh, Ayman A. Abdel-Hamid
Language: English
Year of publication: 2024
Subject:
Source: IEEE Access, Vol. 12, pp. 176070-176086 (2024)
Document type: article
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3502909
Description: Balancing model performance with data confidentiality in synthetic data generation has recently become a significant challenge in deep learning analysis of medical databases. This paper proposes the OptiSGD-DPWGAN model, which incorporates metaheuristic algorithms and differential privacy into the Wasserstein Generative Adversarial Network (WGAN) architecture to protect sensitive data during training. Integrating Simulated Annealing and Backtracking Line Search with Stochastic Gradient Descent (SGD) improves exploration of the complex, non-convex parameter space of deep learning models and helps the optimizer avoid local minima. In differentially private synthetic data generation, the choice of the epsilon value critically influences the trade-off between preserving privacy and maintaining data utility: a lower epsilon strengthens privacy guarantees but can degrade the model's effectiveness because more noise is injected during training. Empirical results demonstrate that OptiSGD-DPWGAN effectively mitigates this trade-off. Compared to other schemes, it consistently achieves lower privacy costs without compromising the quality of the generated synthetic data. These results show that OptiSGD-DPWGAN can set a new standard in privacy-preserving synthetic data generation and highlight its potential to produce high-quality synthetic data for the medical domain, which requires strict confidentiality and high precision. (An illustrative sketch of the optimization idea described here follows this record.)
Database: Directory of Open Access Journals
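
For illustration only, the sketch below shows one way the ideas summarized in the description could fit together: a clipped, noised gradient step in the style of differentially private SGD, an Armijo backtracking line search to choose the step size, and a simulated-annealing acceptance test with a cooling schedule. It is a minimal toy, not the authors' OptiSGD-DPWGAN implementation; the loss function and all names and constants (toy_loss, clip_norm, noise_multiplier, the cooling factor) are assumptions made for this example.

# Minimal illustrative sketch (not the paper's implementation): DP-SGD-style
# noisy gradient step + backtracking line search + simulated-annealing
# acceptance on a toy non-convex loss. All names and constants are assumed.
import numpy as np

rng = np.random.default_rng(0)

def toy_loss(w):
    # Simple non-convex surrogate with many local minima.
    return np.sum(w**2) + np.sum(np.sin(3.0 * w))

def toy_grad(w):
    # Analytic gradient of the toy loss above.
    return 2.0 * w + 3.0 * np.cos(3.0 * w)

def backtracking_step(w, grad, lr=1.0, beta=0.5, c=1e-4, max_iter=20):
    # Armijo backtracking line search: shrink lr until sufficient decrease.
    f0, g2 = toy_loss(w), np.dot(grad, grad)
    for _ in range(max_iter):
        if toy_loss(w - lr * grad) <= f0 - c * lr * g2:
            break
        lr *= beta
    return lr

def optisgd_like_step(w, temperature, clip_norm=1.0, noise_multiplier=1.1):
    # 1) Clip the gradient and add Gaussian noise (DP-SGD style).
    grad = toy_grad(w)
    grad = grad * min(1.0, clip_norm / (np.linalg.norm(grad) + 1e-12))
    grad = grad + rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    # 2) Choose a step size by backtracking line search on the noisy gradient.
    lr = backtracking_step(w, grad)
    w_new = w - lr * grad
    # 3) Simulated-annealing acceptance: always keep improvements, keep worse
    #    moves with probability exp(-delta / temperature).
    delta = toy_loss(w_new) - toy_loss(w)
    if delta <= 0 or rng.random() < np.exp(-delta / max(temperature, 1e-8)):
        return w_new
    return w

w = rng.normal(size=5)
temperature = 1.0
for step in range(200):
    w = optisgd_like_step(w, temperature)
    temperature *= 0.98  # geometric cooling schedule (assumed)
print("final toy loss:", toy_loss(w))

In the paper's actual setting, the update would be applied to the WGAN critic and generator parameters and the noise would be calibrated to a target epsilon via a privacy accountant; the toy loss and fixed constants here serve only to illustrate how line search and annealing can coexist with a noisy SGD step.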