Privacy Umbrella

Health data is becoming ever more detailed and is accessible to many parties. This also makes it vulnerable to attacks that breach the privacy of individual patients. Anonymization can protect patient privacy by ensuring that individuals can no longer be identified in a data set. However, protecting the data is not the only challenge: the shared data must also remain informative enough for data analysis in personalized medicine and for machine learning on a large scale.

In the PrivacyUmbrella project, Fraunhofer ITEM scientists are collaborating with the university hospitals in Frankfurt and Mainz and with MCS Data Labs GmbH. The goal is to achieve more reliable anonymization by combining different established techniques while maximizing the data's usability for analysis. The project is being financed by the German Federal Ministry of Education and Research (BMBF) and through NextGenerationEU funding.

Rethinking Data Privacy: From Storage to Source

Traditional approaches to data privacy primarily focus on secure data storage. However, ensuring patient privacy requires a paradigm shift—protection must begin at the point of data collection, not just at centralized storage facilities. Our contribution to this consortium project is to integrate state-of-the-art privacy-preserving methodologies directly into the source: SmarKo, our wearable device.

How Do We Safeguard Data at the Source? - Configurable Anonymization
  •  Our anonymization framework offers flexible deployment, allowing data anonymization to occur at various points in the data flow.
  •  The system architecture comprises three main components: an administrative server, the user’s smartphone, and the wearable device.
  •  Data administrators can configure where anonymization is applied: on the server, on the mobile device, on the wearable device, or at any combination of these three locations (see the configuration sketch below).
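
The sketch below shows, in Python, one way such a deployment choice could be expressed as configuration. The class and field names (AnonymizationConfig, on_wearable, and so on) are hypothetical illustrations and are not taken from the project's actual software.

```python
# Illustrative sketch only: the names below are hypothetical and do not
# reflect the actual PrivacyUmbrella configuration schema.
from dataclasses import dataclass


@dataclass
class AnonymizationConfig:
    """Selects where in the data flow anonymization is applied (any combination)."""
    on_wearable: bool = False    # anonymize on the wearable device itself
    on_smartphone: bool = True   # anonymize in the paired smartphone app
    on_server: bool = True       # anonymize on the administrative server


def stages(config: AnonymizationConfig) -> list[str]:
    """Return the stages, in data-flow order, at which anonymization runs."""
    order = [
        ("wearable", config.on_wearable),
        ("smartphone", config.on_smartphone),
        ("server", config.on_server),
    ]
    return [name for name, enabled in order if enabled]


if __name__ == "__main__":
    cfg = AnonymizationConfig(on_wearable=True, on_smartphone=False)
    print(stages(cfg))  # ['wearable', 'server']
```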

Empowering Patients with Privacy Control - Autonomous Privacy Management
  •  Our solution enables patients to control the degree of anonymization applied to their data.
  •  The wearable device is paired with a smartphone application that allows users to set their preferred level of anonymization (see the sketch after this list).
  •  This patient-centric design ensures compliance with key principles of the General Data Protection Regulation (GDPR), particularly with regard to data subject autonomy and informed consent.
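
As a purely illustrative example of how a user-selected level might translate into anonymization parameters, the sketch below maps three hypothetical levels to the scale of Laplace noise added to a single heart-rate reading. The level names, the noise mechanism, and the function are assumptions for demonstration, not the project's actual method.

```python
# Illustrative sketch: the privacy levels and the Laplace-noise mechanism are
# assumptions for demonstration, not the project's actual anonymization method.
import random

# Hypothetical mapping from the level chosen in the smartphone app to a noise scale.
PRIVACY_LEVELS = {"low": 0.5, "medium": 1.0, "high": 2.0}


def anonymize_heart_rate(value_bpm: float, level: str) -> float:
    """Perturb a heart-rate value with Laplace noise scaled by the chosen level."""
    scale = PRIVACY_LEVELS[level]
    # The difference of two exponential draws yields a Laplace-distributed sample.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return value_bpm + noise


if __name__ == "__main__":
    print(anonymize_heart_rate(72.0, level="high"))
```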

An Additional Layer of Protection for Downstream Data Use: On-Device Anonymization and Federated Learning

Contemporary research highlights a key limitation of anonymized data: the risk of re-identification by malicious actors cannot be entirely eliminated. To address this vulnerability, our approach introduces an additional layer of protection by combining on-device anonymization with federated learning.

For instance, in scenarios where data is used for downstream applications such as medical research or data mining, federated learning enables machine learning models to be trained directly on the user’s mobile device. This means that neither raw nor anonymized data needs to leave the patient’s device.
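
The minimal sketch below illustrates this idea under simplified assumptions: several simulated devices each train a small linear model on their own private data and share only the resulting model weights, which a central server averages. It is a generic federated-averaging illustration, not the project's implementation.

```python
# Generic federated-averaging sketch (illustrative only): each "device" trains
# a tiny linear model locally and shares only its weights, never its raw data.
import numpy as np


def local_training(x, y, w, lr=0.01, epochs=20):
    """Run a few gradient-descent steps on one device's private data."""
    for _ in range(epochs):
        grad = 2 * x.T @ (x @ w - y) / len(y)
        w = w - lr * grad
    return w


def federated_round(devices, global_w):
    """Each device trains locally; the server sees and averages only the weights."""
    local_weights = [local_training(x, y, global_w.copy()) for x, y in devices]
    return np.mean(local_weights, axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Simulated private data sets held on three patients' devices.
    devices = []
    for _ in range(3):
        x = rng.normal(size=(50, 2))
        y = x @ true_w + rng.normal(scale=0.1, size=50)
        devices.append((x, y))
    w = np.zeros(2)
    for _ in range(10):
        w = federated_round(devices, w)
    print(w)  # approaches [2.0, -1.0] without raw data ever leaving a device
```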

By keeping data local and combining this with robust anonymization, we significantly reduce the risk of re-identification while preserving the utility of the data for analytical purposes.