Browsing by Subject "Differential privacy"
Now showing 1 - 2 of 2
Item · Open Access
Differential privacy with bounded priors: Reconciling utility and privacy in genome-wide association studies (ACM, 2015-10)
Tramèr, F.; Huang, Z.; Hubaux, J.-P.; Ayday, Erman

Differential privacy (DP) has become widely accepted as a rigorous definition of data privacy, with stronger privacy guarantees than traditional statistical methods. However, recent studies have shown that for reasonable privacy budgets, differential privacy significantly degrades the expected utility. Many alternative privacy notions that relax DP have since been proposed, with the hope of providing a better tradeoff between privacy and utility. At CCS'13, Li et al. introduced the membership privacy framework, which aims to protect against set membership disclosure by adversaries whose prior knowledge is captured by a family of probability distributions. In the context of this framework, we investigate a relaxation of DP by considering prior distributions that capture more reasonable amounts of background knowledge. We show that for different privacy budgets, DP can be used to achieve membership privacy for various adversarial settings, thus leading to an interesting tradeoff between privacy guarantees and utility. We re-evaluate methods for releasing differentially private χ²-statistics in genome-wide association studies and show that we can achieve higher utility than in previous works, while still guaranteeing membership privacy in a relevant adversarial setting. © 2015 ACM.

Item · Open Access
On the tradeoff between privacy and utility in genomic studies: differential privacy under dependent tuples (2020-08)
Alserr, Nour M. N.

The rapid progress in genome sequencing and the decrease in sequencing costs have made genomic data widely available. Studying these data can greatly help answer key questions about disease associations and our evolution. However, due to growing privacy concerns about participants' sensitive information, access to the key results and data of genomic studies (such as genome-wide association studies, GWAS) is restricted to trusted individuals. On the other hand, paving the way to biomedical breakthroughs and discoveries requires granting open access to genomic datasets. Privacy-preserving mechanisms can be a solution for granting wider access to such data while protecting their owners. In particular, there has been growing interest in applying the concept of differential privacy (DP) when sharing summary statistics about genomic data. DP provides a mathematically rigorous approach to prevent the risk of membership inference when sharing statistical information about a dataset. However, DP has a known drawback: it does not take into account the correlation between dataset tuples, which is common in genomic datasets due to the inherent correlations between the genomes of family members. This can degrade the privacy guarantees offered by DP. In this thesis, focusing on static and dynamic genomic datasets, we demonstrate this drawback of DP and propose techniques to mitigate it. First, using a real-world genomic dataset, we demonstrate the feasibility of an attribute inference attack on differentially private query results by exploiting the correlations between the entries in the dataset. We show the privacy loss in count, minor allele frequency (MAF), and chi-square queries. The results show the scale of the vulnerability when the dataset contains dependent tuples.
Our results demonstrate that the adversary can infer sensitive genomic data about a user from the differentially private results of a sum query by exploiting the correlations between the genomes of family members. They also show that, using the results of differentially private MAF queries on static and dynamic genomic datasets and the dependency between tuples, an adversary can reveal up to 50% more sensitive information about the genome of a target (compared to the original privacy guarantees of standard DP-based mechanisms), while differentially private chi-square queries can reveal up to 40% more. Furthermore, we show that the adversary can use the genomic data obtained from the attribute inference attack to infer the membership of the target in another genomic dataset (e.g., one associated with a sensitive trait). Using a log-likelihood-ratio (LLR) test, our results also show that the adversary's inference power can be significantly high in such an attack, even when using inferred (and hence partially incorrect) genomes. Finally, we propose a mechanism for privacy-preserving sharing of statistics from genomic datasets that attains privacy guarantees while taking the dependence between tuples into consideration. By evaluating our mechanism on different genomic datasets, we empirically demonstrate that it can achieve up to 50% better privacy than traditional DP-based solutions.
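Both items above center on releasing GWAS test statistics under DP. As a concrete illustration of the first abstract's setting, the sketch below releases a χ² association statistic for a single SNP via the Laplace mechanism. It is a minimal sketch under stated assumptions, not the authors' implementation: the table counts are made up, and the `sensitivity` argument must come from a valid bound on how much the statistic can change between neighboring datasets (such bounds exist in the literature for balanced case/control designs; the value 4.0 below is purely illustrative).

```python
import numpy as np
from scipy.stats import chi2_contingency

def dp_chi2(table, epsilon, sensitivity):
    """Release a chi-square association statistic under epsilon-DP.

    `sensitivity` must bound how much the statistic can change when one
    individual's record is replaced; it depends on the study design, so
    it is left as an explicit, caller-supplied parameter here.
    """
    chi2_stat = chi2_contingency(table, correction=False)[0]
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return chi2_stat + noise

# Hypothetical 2x3 case/control-by-genotype table for one SNP
table = np.array([[30, 50, 20],   # cases:    AA, Aa, aa
                  [45, 40, 15]])  # controls: AA, Aa, aa
print(dp_chi2(table, epsilon=1.0, sensitivity=4.0))  # sensitivity is illustrative
```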
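The thesis's attack targets differentially private count and MAF query results. A minimal sketch of how such queries are typically released with the Laplace mechanism (standard DP, which ignores dependent tuples) follows; the synthetic genotype data and the bounded-neighborhood sensitivity bounds (1 for a count; 1/n for the MAF, since replacing one individual changes at most 2 of the 2n alleles) are assumptions for illustration.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release a value with epsilon-differential privacy via Laplace noise."""
    return true_value + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Toy genotype data: one value per individual, the minor-allele count (0, 1, or 2)
genotypes = np.random.randint(0, 3, size=200)
n = len(genotypes)

# Count query: "how many individuals carry at least one minor allele?"
# Replacing one individual changes the count by at most 1, so sensitivity = 1.
count = float(np.sum(genotypes > 0))
dp_count = laplace_mechanism(count, sensitivity=1.0, epsilon=1.0)

# MAF query: minor allele frequency over 2n alleles.
# One individual contributes at most 2 alleles, so the MAF changes by at most 1/n.
maf = genotypes.sum() / (2 * n)
dp_maf = laplace_mechanism(maf, sensitivity=1.0 / n, epsilon=1.0)

print(f"count: {count:.0f} -> DP release: {dp_count:.2f}")
print(f"MAF:   {maf:.4f} -> DP release: {dp_maf:.4f}")
```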
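The membership inference step relies on a log-likelihood-ratio test. The sketch below shows the general LLR scoring idea (in the style of Sankararaman et al.): compare the likelihood of a target's, possibly inferred, genotypes under the study pool's allele frequencies against a reference population's frequencies. All inputs here are synthetic, and the binomial genotype model is an illustrative assumption, not necessarily the exact statistic used in the thesis.

```python
import numpy as np

def llr_membership_score(target_genotypes, pool_mafs, ref_mafs):
    """Sum per-SNP log-likelihood ratios; a large positive total suggests
    the target is a member of the study dataset."""
    score = 0.0
    for g, p_pool, p_ref in zip(target_genotypes, pool_mafs, ref_mafs):
        # Genotype likelihood under g ~ Binomial(2, p) for minor-allele frequency p
        def lik(p):
            return (p ** g) * ((1 - p) ** (2 - g)) * (2 if g == 1 else 1)
        score += np.log(lik(p_pool) / lik(p_ref))
    return score

# Hypothetical inputs: per-SNP MAFs for the study pool vs. a public reference
# panel, and the (possibly partially incorrect) genotypes of the target.
rng = np.random.default_rng(0)
ref = rng.uniform(0.05, 0.5, size=1000)                       # reference MAFs
pool = np.clip(ref + rng.normal(0, 0.02, 1000), 0.01, 0.99)   # study-pool MAFs
genotypes = rng.binomial(2, pool)                             # a true member

print("LLR score:", llr_membership_score(genotypes, pool, ref))
```

In practice the score is compared against a threshold calibrated to a target false-positive rate; the higher the score, the stronger the evidence of membership, which is why even partially incorrect inferred genomes can still give the adversary significant inference power.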
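Finally, the mitigation direction the thesis pursues is accounting for dependence between tuples. The sketch below shows only the generic dependent-DP recipe of inflating a query's sensitivity by the size and strength of the largest group of mutually dependent records (e.g., a family), in the spirit of earlier dependent-DP work; it is an assumption-laden illustration, not the thesis's actual mechanism, and both `dependence_size` and `dependence_coeff` are hypothetical parameters.

```python
import numpy as np

def dependent_laplace_release(true_value, base_sensitivity, epsilon,
                              dependence_size, dependence_coeff=1.0):
    """Laplace release with sensitivity inflated for correlated tuples.

    dependence_size:  largest group of mutually dependent records (e.g. a family).
    dependence_coeff: in [0, 1], how strongly one record determines its
                      relatives' records (1.0 = fully determined, worst case).
    This is a sketch of the general dependent-DP idea, not the thesis's mechanism.
    """
    effective_sensitivity = base_sensitivity * (1 + dependence_coeff * (dependence_size - 1))
    return true_value + np.random.laplace(loc=0.0, scale=effective_sensitivity / epsilon)

# A count over a dataset containing families of up to 3 strongly correlated
# genomes: the noise scale is inflated ~3x relative to standard DP.
print(dependent_laplace_release(true_value=120.0, base_sensitivity=1.0,
                                epsilon=1.0, dependence_size=3, dependence_coeff=1.0))
```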