As the Census Bureau begins transitioning to differential privacy to protect against re-identification of individuals in its numerous data products, increasing attention is being paid to the effect of differential privacy on data users. Differential privacy (DP) is a framework of perturbative statistical disclosure control methods that provides a formal privacy guarantee: a quantifiable measure of disclosure risk that does not rely on assumptions about the information held by potential attackers attempting record linkage. It also allows users to make inferences that account for the protection methods applied to the data, something not typically possible when methods such as top-coding, suppression, or data swapping are used. This transition entails a wholesale change in how statistical information is generated and consumed. Many challenges remain unsolved, and addressing them is an active area of research.
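To make the idea of a perturbative method with a formal guarantee concrete, the sketch below shows one standard DP building block, the Laplace mechanism, which adds calibrated noise to a count query. This is an illustrative example, not the Census Bureau's actual mechanism; the function and parameter names are hypothetical.

```python
import math
import random

def laplace_mechanism(true_count, epsilon, sensitivity=1.0):
    """Return a noisy count satisfying epsilon-differential privacy.

    For a counting query, adding or removing one individual changes the
    result by at most 1 (the sensitivity), so Laplace noise with scale
    sensitivity / epsilon yields an epsilon-DP release.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) noise via the inverse CDF of a uniform draw.
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

Smaller epsilon means more noise and a stronger privacy guarantee; the noise distribution is public, so users can account for it in downstream inference.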