Mechanisms for Robust Local Differential Privacy

Milan Lopuhaä-Zwakenberg*, Jasper Goseling

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review



We consider privacy mechanisms for releasing data X = (S, U), where S is sensitive and U is non-sensitive. We introduce the robust local differential privacy (RLDP) framework, which provides strong privacy guarantees while preserving utility. This is achieved by providing robust privacy: our mechanisms provide privacy not only with respect to a publicly available estimate of the unknown true distribution, but also with respect to similar distributions. Such robustness mitigates the potential privacy leaks that might arise from the difference between the true distribution and the estimated one. At the same time, we mitigate the utility penalties that come with ordinary differential privacy, which involves making worst-case assumptions and dealing with extreme cases. We achieve robustness in privacy by constructing an uncertainty set based on a Rényi divergence. By analyzing the structure of this set and approximating it with a polytope, we can use robust optimization to find mechanisms with high utility. However, this relies on vertex enumeration and becomes computationally intractable for large input spaces. Therefore, we also introduce two low-complexity algorithms that build on existing LDP mechanisms. We evaluate the utility and robustness of the mechanisms using numerical experiments and demonstrate that our mechanisms provide robust privacy while achieving utility that is close to optimal.
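As an illustration of the ingredients named in the abstract, the sketch below combines a standard epsilon-LDP mechanism (k-ary randomized response, a common baseline, not necessarily the mechanism proposed in the paper) with a Rényi-divergence ball around an estimated distribution. The function names, the divergence order alpha, and the orientation of the divergence in the membership test are illustrative assumptions, not taken from the paper.

```python
import math
import random

def randomized_response(x, eps, domain):
    """k-ary randomized response: a classic epsilon-LDP mechanism (illustrative baseline).

    Reports the true value x with probability e^eps / (e^eps + k - 1),
    and a uniformly random other value from the domain otherwise.
    """
    k = len(domain)
    p_true = math.exp(eps) / (math.exp(eps) + k - 1)
    if random.random() < p_true:
        return x
    return random.choice([v for v in domain if v != x])

def renyi_divergence(p, q, alpha):
    """Renyi divergence D_alpha(p || q) of discrete distributions, for alpha > 0, alpha != 1."""
    total = sum(pi ** alpha * qi ** (1.0 - alpha) for pi, qi in zip(p, q) if pi > 0.0)
    return math.log(total) / (alpha - 1.0)

def in_uncertainty_set(q, p_est, alpha, radius):
    """Check whether q lies in the Renyi-divergence ball of the given radius
    around the public estimate p_est (divergence orientation is an assumption)."""
    return renyi_divergence(q, p_est, alpha) <= radius
```

For example, with the estimate `p_est = [0.5, 0.5]`, the distribution `[0.55, 0.45]` falls inside a small Rényi ball while `[0.9, 0.1]` does not; a robust mechanism would have to satisfy its privacy constraint for every distribution in that ball, not just for `p_est` itself.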
Original language: English
Article number: 233
Issue number: 3
Publication status: Published - 6 Mar 2024


Keywords

  • Local differential privacy
  • Robust optimization
  • Rényi divergence


