Differential Privacy
Data protection now has a universal standard: Differential Privacy. No other definition of privacy provides satisfactory protection against the risk of re-identification using auxiliary information (e.g., private or public datasets, or triangulation across overlapping queries).
A data processing computation is deemed differentially private if its output does not change significantly when a single protected entity is added to or removed from the input. This property guarantees that the recipient of the output cannot make any significant inference about any individual protected entity, whether or not that entity belongs to the dataset. The definition is powerful because it makes no assumption about what data or means may be available to the recipient of the information.
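This intuition can be stated formally. In the standard ε-differential-privacy formulation (not spelled out in the text above, where ε quantifies how much the output is allowed to change), a randomized mechanism $M$ is ε-differentially private if, for every pair of datasets $D$ and $D'$ differing in a single protected entity and every set of possible outputs $S$:

$$\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S]$$

The smaller ε is, the closer the two output distributions must be, and the less any recipient can learn about whether a given entity was present.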
At Sarus, Differential Privacy is at the core of all interactions with sensitive data. Whether it is metadata, synthetic data, SQL analyses, or machine learning models, every bit of information derived from sensitive datasets is measured and controlled using the principles of Differential Privacy to ensure privacy is protected.
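As an illustration of how such a guarantee is obtained in practice, here is a minimal sketch of a differentially private count using the classic Laplace mechanism. This is a generic textbook example, not Sarus's actual implementation; the function name and dataset are hypothetical.

```python
import numpy as np

def dp_count(values, epsilon):
    """Return a differentially private count via the Laplace mechanism.

    Adding or removing one record changes the true count by at most 1
    (sensitivity = 1), so noise drawn from Laplace(1/epsilon) makes the
    output distributions on neighboring datasets statistically close,
    which is exactly the differential privacy guarantee.
    """
    true_count = len(values)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

# A dataset and its neighbor (one record removed) yield noisy counts
# that are hard to tell apart for any recipient of the output.
dataset = [42.0, 17.5, 33.1, 8.9]
neighbor = dataset[:-1]
print(dp_count(dataset, epsilon=1.0))
print(dp_count(neighbor, epsilon=1.0))
```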