Distributionally-robust machine learning using locally differentially-private data
Source Title: Optimization Letters
Author(s): Farokhi, Farhad (University of Melbourne)
Affiliation: Electrical and Electronic Engineering
Document Type: Journal Article
Citation: Farokhi, F. (2021). Distributionally-robust machine learning using locally differentially-private data. Optimization Letters. https://doi.org/10.1007/s11590-021-01765-6
Access Status: Open Access
We consider machine learning, particularly regression, using locally differentially-private datasets. The Wasserstein distance is used to define an ambiguity set centered at the empirical distribution of the dataset corrupted by local differential privacy noise. The radius of the ambiguity set is selected based on the privacy budget, the spread of the data, and the size of the problem. Machine learning with the private dataset is then rewritten as a distributionally-robust optimization problem. For general distributions, this problem can be relaxed to a regularized machine learning problem with the Lipschitz constant of the machine learning model as a regularizer. For Gaussian data, the distributionally-robust optimization problem can be solved exactly to find an optimal regularizer, and training with this regularizer can be posed as a semi-definite program.
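The Lipschitz-regularized relaxation described in the abstract can be illustrated with a minimal sketch. The code below is not the paper's implementation; it assumes a linear model (whose Lipschitz constant is the Euclidean norm of its weight vector), a hand-picked privacy budget `epsilon`, data sensitivity, and ambiguity-set radius `rho`, and uses plain subgradient descent rather than the semi-definite program the paper derives for the Gaussian case.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic regression data: y = 2 x + small noise (illustrative only).
n = 200
X = rng.normal(size=(n, 1))
y = 2.0 * X[:, 0] + 0.1 * rng.normal(size=n)

# Local differential privacy: each record is perturbed independently
# with Laplace noise before it leaves the data owner.
epsilon = 1.0      # assumed privacy budget
sensitivity = 1.0  # assumed data scale
X_priv = X + rng.laplace(scale=sensitivity / epsilon, size=X.shape)
y_priv = y + rng.laplace(scale=sensitivity / epsilon, size=y.shape)

# DRO relaxation for a linear model x -> w @ x: the Lipschitz constant
# is ||w||_2, so training becomes
#   min_w  mean((X_priv w - y_priv)^2) + rho * ||w||_2,
# where rho is the Wasserstein ambiguity-set radius (assumed here).
rho = 0.5

def loss(w):
    r = X_priv @ w - y_priv
    return np.mean(r ** 2) + rho * np.linalg.norm(w)

# Simple subgradient descent on the convex regularized objective.
w = np.zeros(1)
lr = 0.05
for _ in range(500):
    r = X_priv @ w - y_priv
    grad = 2 * X_priv.T @ r / n + rho * w / (np.linalg.norm(w) + 1e-12)
    w = w - lr * grad

print(w)
```

Because the Laplace noise inflates the apparent variance of the features, the regularization shrinks the learned weight toward zero relative to the noise-free least-squares fit, which is the intended robustness effect.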