Personal data – a treasure trove that AI can safely unlock, or a privacy concern?
Munich, 26 March 2024
Artificial intelligence (AI) is a key technology, but data protection hurdles still appear to be holding back its use in companies. Yet companies often generate valuable data that, with the help of AI, could also be used for the common good – for example through personalised healthcare or solutions for greater road safety. In addition to learning algorithms, this requires large amounts of data, including personal data. Experts discussed the tension between future-oriented business models and informational self-determination with guests at acatech am Dienstag, held on 19 March in cooperation with the Plattform Lernende Systeme (PLS) at the acatech Forum in Munich.
The ever-increasing amount and availability of data, and its intelligent linking, open up new applications and perspectives. Jan Wörner, Co-Chair of the Plattform Lernende Systeme and acatech President, illustrated this in his welcoming address with an image of Mauna Kea generated by fusing data from various satellite images, which made the mountain's full size visible for the first time. Measured from its base on the seabed to its peak, Mauna Kea is 10,203 metres high and thus towers above Mount Everest.
In her introduction, the moderator outlined the tension between the socially relevant possibilities of AI-based applications and data protection hurdles. Erduana Wald, scientific advisor at the Plattform Lernende Systeme, presented two fictitious use cases: sharing individual mobility data can, for example, optimise fleet management for rental vehicles, while health data from fitness apps or pacemakers can help to better predict the development of illnesses. Under current legislation, disclosing such data requires that the individuals concerned are informed and give their consent, and that the data is used only for a specified purpose (purpose limitation). In practice, this often represents a major hurdle for developing innovative business models or using data for research purposes. How to deal with this dilemma was explored in greater depth in the subsequent discussion.
Protecting personal data – technically and individually
Detlef Houdeau, Senior Director Business Development at Infineon Technologies AG, emphasised that developing AI applications often requires only part of the data – such as age and gender, but not other personal characteristics. Technical options for using data flexibly in the interests of the common good while protecting privacy also already exist, such as privacy-preserving machine learning or data trustees. However, these options are often little known and scarcely recognised in law, which leaves room for interpretation and uncertainty and makes the widespread commercial application of AI more difficult.
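No code was shown at the event; the following minimal Python sketch merely illustrates the two ideas Detlef Houdeau referred to: it keeps only the fields an analysis actually needs (data minimisation) and releases an aggregate statistic with Laplace noise, a basic building block of privacy-preserving machine learning. All record fields, values and parameters are hypothetical.

```python
# Illustrative sketch only – not code from the event or from any PLS/Infineon tooling.
# Idea 1: data minimisation – keep only the fields the analysis needs.
# Idea 2: privacy-preserving processing – publish an aggregate with Laplace noise.
import random

raw_records = [  # hypothetical records, as a fitness app might store them
    {"name": "A. Example", "address": "…", "age": 54, "gender": "f", "resting_hr": 61},
    {"name": "B. Example", "address": "…", "age": 47, "gender": "m", "resting_hr": 72},
    {"name": "C. Example", "address": "…", "age": 63, "gender": "f", "resting_hr": 68},
]

ALLOWED_FIELDS = {"age", "gender", "resting_hr"}  # only what the model needs


def minimise(record: dict) -> dict:
    """Drop every field that is not strictly required for the analysis."""
    return {k: v for k, v in record.items() if k in ALLOWED_FIELDS}


def noisy_mean(values: list, epsilon: float = 1.0, upper_bound: float = 200.0) -> float:
    """Mean with Laplace noise (the classic Laplace mechanism).

    Assuming each value lies in [0, upper_bound], the sensitivity of the mean is
    upper_bound / n and the noise scale is sensitivity / epsilon. The difference of
    two exponential variables with rate epsilon / sensitivity is Laplace-distributed
    with exactly that scale.
    """
    n = len(values)
    sensitivity = upper_bound / n
    rate = epsilon / sensitivity
    noise = random.expovariate(rate) - random.expovariate(rate)
    return sum(values) / n + noise


minimised = [minimise(r) for r in raw_records]
print(minimised)  # names and addresses are gone
print(noisy_mean([r["resting_hr"] for r in minimised]))  # privacy-protected aggregate
```

In practice, techniques such as federated learning or data trustees operate on the same principle sketched here: the analysis never sees more personal detail than it needs, and what leaves the protected environment is an aggregate rather than raw individual data.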
Cordula Kropp, Professor of Sociology and Director of the Centre for Interdisciplinary Risk and Innovation Research at the University of Stuttgart (ZIRIUS) and acatech member, emphasised the importance of informational self-determination: users must always be able to decide what happens to their data. Willingness to share data is generally rather high among the population, as the TechnikRadar surveys on sharing mobility, building and energy consumption data for smart city applications show. The fact that releasing usage data is a prerequisite for many mobile applications also contributes to this. According to Cordula Kropp, there is often no real alternative to what is known as informed consent – much as when patients are informed about the risks of anaesthesia before an upcoming operation. Caution is nevertheless sometimes required when AI-based applications interpret the data: does a driver's increased blinking necessarily indicate tiredness, or is it possibly a personal idiosyncrasy?
Digital ecosystem for the sovereign exchange of operational data
According to Jan Fischer, project manager of the Gaia-X Hub Germany, which is based at acatech, it is crucial for companies in the digital transformation to tap existing data treasure troves in order to develop new business models or services. Personal characteristics hardly play a role in this context. Instead, the concerns of SMEs in particular centre on how operational data can be shared securely with partners while observing the principles of data sovereignty. A reliable and legally compliant exchange, says Jan Fischer, requires a European solution. The Gaia-X project, funded by the German Federal Ministry for Economic Affairs and Climate Action (BMWK), is creating a European digital ecosystem of interconnected data spaces on an open source basis. The aim is to enable data exchange in which the participating companies and organisations always retain sovereignty over their data.
The subsequent discussion with the audience focussed on what the EU's recently adopted AI Act will mean for the regulation of AI-based applications. The need for digital literacy – the ability to use digital applications in a self-determined way – was emphasised, both for consumers and for SMEs, doctors and other stakeholders who work with data-based applications. After all, as Detlef Houdeau concluded: “AI is here to stay – in all areas of our lives.”
Further information on the topic and a detailed presentation of the use cases can be found in the current Plattform Lernende Systeme white paper Unlock the wealth of data while protecting privacy with AI. The TechnikRadar also surveys what Germans think about the use of data by various stakeholders: the 2022 edition examined the handling of health data in more detail, and the 2023 edition the handling of data for smart city applications.