
Differential Privacy Protection Against Membership Inference Attack on Machine Learning for Genomic Data

By Junjie Chen, Wendy Hui Wang, Xinghua Shi

Posted 04 Aug 2020
bioRxiv DOI: 10.1101/2020.08.03.235416

Machine learning is a powerful tool for modeling massive genomic data, but genome privacy is a growing concern. Studies have shown that not only the raw data but also the trained model can potentially infringe on genome privacy. One example is the membership inference attack (MIA), in which an adversary who can only query a given target model, without knowing its internal parameters, determines whether a specific record was included in the model's training dataset. Differential privacy (DP) has been used to defend against MIA with rigorous privacy guarantees. In this paper, we investigate the vulnerability of machine learning models to MIA on genomic data and evaluate the effectiveness of DP as a defense mechanism. We consider two widely used machine learning models, namely Lasso and convolutional neural networks (CNNs), as the target models. We study the trade-off between the defense power against MIA and the prediction accuracy of the target model under various DP privacy settings. Our results show that the relationship between the privacy budget and target model accuracy can be modeled as a log-like curve: a smaller privacy budget provides a stronger privacy guarantee at the cost of a greater loss in model accuracy. We also investigate the effect of model sparsity on the model's vulnerability to MIA. Our results demonstrate that, in addition to preventing overfitting, model sparsity can work together with DP to significantly mitigate the risk of MIA.

Competing Interest Statement

The authors have declared no competing interest.
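To make the threat concrete, below is a minimal sketch of a loss-threshold membership inference attack (in the style of Yeom et al.) against a Lasso target model. This is not the authors' exact attack: the synthetic genotype-like data, the sparsity level of the true signal, and the choice of threshold `tau` are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Synthetic genotype-like data: 0/1/2 minor-allele counts (illustrative only).
n, p = 200, 500
X = rng.integers(0, 3, size=(2 * n, p)).astype(float)
w = rng.normal(size=p) * (rng.random(p) < 0.02)  # sparse true signal (assumed)
y = X @ w + rng.normal(scale=0.5, size=2 * n)

X_train, y_train = X[:n], y[:n]   # members of the training set
X_out, y_out = X[n:], y[n:]       # non-members

target = Lasso(alpha=0.01).fit(X_train, y_train)

# Loss-threshold attack: an overfit model tends to have lower loss on its
# training members, so guess "member" whenever the squared error is small.
def mia_predict(model, X, y, tau):
    loss = (model.predict(X) - y) ** 2
    return loss < tau

# Threshold choice is a modeling assumption; here, the median loss overall.
all_loss = (target.predict(np.vstack([X_train, X_out]))
            - np.concatenate([y_train, y_out])) ** 2
tau = np.median(all_loss)

guess_in = mia_predict(target, X_train, y_train, tau)
guess_out = mia_predict(target, X_out, y_out, tau)

# Attack accuracy: fraction of correct membership guesses (0.5 = random).
acc = 0.5 * (guess_in.mean() + (1 - guess_out.mean()))
print(f"membership inference accuracy: {acc:.3f}")
```

An attack accuracy noticeably above 0.5 indicates that the model leaks membership information; the more the target overfits, the larger the gap between member and non-member losses that the threshold can exploit.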
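The privacy-budget/accuracy trade-off described above can also be illustrated with the standard Laplace mechanism, where noise scaled by sensitivity/epsilon is added to a released quantity (here, model coefficients via output perturbation). This is a generic DP sketch, not the paper's training procedure; the coefficient values and the sensitivity bound are placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def laplace_mechanism(value, sensitivity, epsilon, rng):
    """Release value + Laplace(sensitivity / epsilon) noise.

    A smaller epsilon (privacy budget) means a larger noise scale, hence a
    stronger privacy guarantee but a less accurate release.
    """
    scale = sensitivity / epsilon
    return value + rng.laplace(scale=scale, size=np.shape(value))

coef = np.array([0.8, 0.0, -1.2])  # e.g. fitted Lasso coefficients (assumed)
sensitivity = 0.1                  # placeholder; must be derived per model

for eps in [10.0, 1.0, 0.1]:
    noisy = laplace_mechanism(coef, sensitivity, eps, rng)
    err = np.abs(noisy - coef).mean()
    print(f"epsilon={eps:>5}: mean absolute perturbation {err:.3f}")
```

Running this shows the perturbation growing as epsilon shrinks, which is the mechanism behind the log-like budget/accuracy curve the abstract reports: utility degrades steadily as the privacy guarantee is tightened.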

