Protecting Privacy with MATH

This video provides a good layman's explanation of how Differential Privacy works in the context of the 2020 US census.

I was introduced to differential privacy at Uber while our team worked with a pair of UC Berkeley grad students on a system that automatically calculated the sensitivity of a query and adjusted the amount of noise injected accordingly. Ask for a million users' data in your aggregation and your noise is negligible, even with an aggressive privacy budget. Ask for a thousand, or a hundred, or one, and you're out of luck.

https://medium.com/uber-security-privacy/differential-privacy-open-source-7892c82c42b6

https://arxiv.org/pdf/1706.09479.pdf
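
To give a concrete sense of why query size matters, here's a minimal sketch. This is not the system described in the paper above, just the standard Laplace mechanism with made-up parameter values: the noise scale is sensitivity divided by epsilon, so the noise has roughly the same absolute size regardless of the query, and whether it is negligible depends entirely on how many people are in the aggregate.

```python
import numpy as np

def private_count(true_count, epsilon=0.1, sensitivity=1.0):
    """Release a count with Laplace noise scaled to sensitivity / epsilon.

    One person can change a count by at most 1, so sensitivity is 1;
    epsilon is the privacy budget spent on this query (smaller = noisier).
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_count + noise

# Over a million users the noise (on the order of 1/epsilon = 10)
# is a vanishing fraction of the answer.
print(private_count(1_000_000))

# Over a handful of users the same noise swamps the signal --
# the "out of luck" case described above.
print(private_count(3))
```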

There's something interesting in the idea of quantifying, at both a personal and a societal level, the jitter ranges people are comfortable with for various personal facts, or collections of them, but I haven't spent much time thinking about how that would work. Loose thread for Moving Towards Real Privacy