Microsoft and Harvard University partner on open source differential privacy platform

Microsoft has partnered with Harvard University's OpenDP Initiative to develop and open source a platform for differential privacy. The work has been under way for almost a year; last September, John Kahan, Microsoft's chief data analytics officer, announced the project: "We need a way to analyze data that unlocks its full potential without risking the privacy of the people the data describes."


The concept of differential privacy was introduced in 2006 by Cynthia Dwork, then at Microsoft Research and now the Gordon McKay Professor of Computer Science at Harvard University. Differential privacy makes it possible to analyze a data set as a whole and draw useful conclusions without disclosing any individual's private information, and it defends against differencing attacks.
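To see why differencing attacks matter, here is a minimal illustration (with made-up names and salaries) of how two exact aggregate queries that differ by a single person can leak that person's value:

```python
# Hypothetical data: each person's salary is private.
salaries = {"alice": 52000, "bob": 61000, "carol": 58000}

# Query 1: total salary of everyone.
total_all = sum(salaries.values())

# Query 2: total salary of everyone except carol.
total_without = sum(v for k, v in salaries.items() if k != "carol")

# Subtracting the two exact answers reveals carol's salary,
# even though neither query returned it directly.
carols_salary = total_all - total_without
print(carols_salary)  # 58000
```

Differential privacy blocks this attack by ensuring that the answers to such queries are randomized, so their difference no longer pins down any one individual's contribution.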

This is achieved primarily by adding carefully calibrated noise: just enough statistical noise is added to each query result to mask the contribution of any individual data point. Under differential privacy, no one can infer a particular person's information from a data set, or even tell whether that person is included in it.
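A common way to calibrate that noise is the Laplace mechanism: a query with sensitivity 1 (such as a count, which any single person can change by at most 1) becomes epsilon-differentially private when Laplace(1/epsilon) noise is added to its result. A minimal sketch, not taken from the OpenDP codebase:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(values, epsilon: float) -> float:
    """epsilon-DP count: a counting query has sensitivity 1,
    so Laplace(1/epsilon) noise masks any single contribution."""
    return len(values) + laplace_noise(1.0 / epsilon)

# Hypothetical data set of ages; the true count is 7.
ages = [34, 29, 41, 52, 38, 45, 27]
noisy_count = dp_count(ages, epsilon=0.5)  # 7 plus random noise
```

Smaller values of epsilon mean more noise and stronger privacy; larger values give more accurate answers at the cost of weaker guarantees.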

The technology is still in its infancy, and Microsoft says an open source platform is important for maturing the technology and driving its widespread use. "Large, open data sets have more potential than you might think, and a differential privacy platform paves the way for people to contribute to, collaborate on, and leverage them."

The OpenDP platform is now open source and available for testing and building. There are currently eight repositories, covering the platform system, core, algorithms, samples, and more, written mainly in Python and Rust.