Disinformation: acatech topic conference places the dangers and countermeasures on the agenda
Munich, 19 December 2023
Disinformation has become more of an issue in recent years. Reasons include the digitalisation of the public domain and developments in the area of artificial intelligence. This issue is a threat to liberal democratic values: reliable, multifaceted information is essential for the formation of public opinion to function. Given how topical the matter of disinformation is, acatech devoted a special topic conference to it on 14 December 2023, which gave a clear snapshot of the current situation.
The spectrum of targeted disinformation today ranges from the intentional omission of information to the falsification of mainstream media coverage. It spreads quickly, especially on social media platforms. The speakers at the online conference, who come from academia, journalism and the non-profit sector, discussed how institutions, companies and individuals can differentiate between facts and fake news amid the torrent of information, and which technologies can be of help.
“It’s a question of being able to trust – or not to trust – what we hear, read and see,” said acatech President Jan Wörner in his opening address. He made the point that even small pieces of misinformation can lead to conspiracy theories.
acatech member Jörn Müller-Quade from the Karlsruhe Institute of Technology, who moderated the event, next took the participants through three blocks of short presentations, which went into the structures of and strategies behind disinformation. These were followed by discussions delving into the topics.
Theory: The system of disinformation – the methods used by those responsible
The first part of the topic conference shone a spotlight on the theoretical aspect of disinformation and truth.
Ortwin Renn: “We rely more than ever on trust.”
acatech Executive Board member Ortwin Renn from the Research Institute for Sustainability Helmholtz Centre Potsdam (RIFS) began with an overview of how to navigate these times of disinformation. He explained how loss of trust and experience influence public perception. People can no longer verify current, major challenges such as climate change based on their own experience. Also, what the evidence tells us can sometimes seem less plausible the more complex the challenges are.
“We rely more than ever on trust because we don’t have the personal experience to make a judgement. At the same time, we are always more mistrustful of people who can provide us with this evidence,” said Ortwin Renn in summary. The role of science must be to build trust into the communication of messages and to decipher disinformation. “If we offer people the possibility to take part in making the decisions, then they will be inclined to accept evidence-based messages and act accordingly,” said Ortwin Renn in conclusion.
Lucia A. Reisch: “Many people have access to the tools to create deep fakes.”
acatech member Lucia A. Reisch from Cambridge Judge Business School elevated the discussion about disinformation and truth to an international level, explaining why disinformation often slips through so easily. “Many people have access to the tools to create deep fakes; they don’t cost much,” said Lucia A. Reisch. Generative AI can exacerbate the problem: with multi-agent AI systems, for example, disinformation can be generated and spread without any human involvement. Regulation such as the EU AI Act and the EU Digital Services Act can help counteract this.
It is also important for users to have a high level of media competence, which gives them meta-knowledge about authentication and communicators’ motives.
Christoph Neuberger – Verify before you publish.
The next speaker Christoph Neuberger, Weizenbaum Institute, Freie Universität Berlin, member of Plattform Lernende Systeme and acatech member, focused on the platform revolution: opinion formation now largely takes place on social media, which bypass traditional journalistic gatekeeping. The practice of verifying before publishing rarely applies any more.
He also pointed out that counter-messaging and correcting the record (or debunking) can backfire. The risk is that repetition engrains misinformation in the minds of media consumers even more deeply. The polarisation of groups seems to be a bigger problem on platforms than groups becoming isolated in echo chambers or filter bubbles.
Christoph Neuberger proceeded to give an overview of empirical research into fake news and misinformation. Researchers are conducting detailed, contextualised analyses of how fake news and misinformation spread and what impact they have. The spread of misinformation is a real problem, but it is not very extensive and is very unevenly distributed across societal groups: those most affected are groups whose world view the disinformation chimes with. Nor is there general evidence of fake news spreading faster than accurate reporting or having a wide impact. People tend to have a low level of trust in social media.
Practice: How can disinformation be exposed?
The second block of topics focused on actual examples of handling disinformation and methods used in journalism.
Sophie Timmermann – Fact-checking and context
Sophie Timmermann, Deputy Director of CORRECTIV.Faktencheck, shared her experience with fact-checking. She illustrated which tools have become established in journalistic practice for spotting potential disinformation and reliably determining whether a claim is in fact false. Drawing on the fact-checking process at CORRECTIV, she described how far this work can be automated.
In their day-to-day work, various research tools such as reverse image search and satellite images or press-specific tools such as InVID from AFP help them to verify images and their contexts or check video metadata. What’s important here is to find out when such recordings or images were originally published in order to know the correct context. Fact-checking databases can also give an overview of fact checks currently under way. The important thing here is only to check facts and not to express opinions.
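The reverse image search mentioned above rests on perceptual hashing: visually similar images produce similar fingerprints, so a re-uploaded copy of an old photo can be traced back to its original publication. The following toy sketch illustrates the principle with a simplified “difference hash” over pre-extracted brightness values; the data and function names are hypothetical illustrations, not CORRECTIV’s actual tooling (real tools decode pixels from the image file itself):

```python
# Toy "difference hash": visually similar images yield nearby fingerprints.
# Hypothetical brightness grids stand in for decoded image pixels.

def dhash(brightness_rows):
    """Build a bit string: 1 where a pixel is brighter than its right neighbour."""
    bits = []
    for row in brightness_rows:
        for left, right in zip(row, row[1:]):
            bits.append("1" if left > right else "0")
    return "".join(bits)

def hamming(a, b):
    """Count differing bits - a small distance suggests the same image."""
    return sum(x != y for x, y in zip(a, b))

original = [[10, 20, 30], [40, 50, 60]]      # hypothetical source photo
recompressed = [[11, 21, 29], [41, 52, 61]]  # same photo after re-upload
unrelated = [[90, 10, 80], [5, 70, 15]]      # a different image

print(hamming(dhash(original), dhash(recompressed)))  # 0: likely the same image
print(hamming(dhash(original), dhash(unrelated)))     # 2: different content
```

Production systems such as reverse image search engines use far larger hashes and robust pixel preprocessing, but the matching logic is the same: compare fingerprints, not raw files.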
Henriette Löwisch: “People need our scepticism.”
Henriette Löwisch, Head of Journalistenschule München, spoke from the point of view of journalism education, saying that in times of targeted disinformation campaigns, fact-checking and digital verification are two of the technical tools of the journalism trade. The use of AI tools brings with it lots of challenges for journalists: while they are very useful for the analysis of large sets of data, all AI output must be double-checked.
In addition, the use of AI tools can make journalism more anonymous, and make it harder to attribute information to a medium or a media professional. “Journalists must understand the basics of AI for them to be able to follow the technical development with a critical eye – its economic, societal, political and regulatory aspects,” said Henriette Löwisch.
Technological measures to counteract disinformation
The third section of the topic conference dealt with technological means to combat disinformation.
Hanna Katharina Müller: “Disinformation cannot simply be regulated out of existence.”
Hanna Katharina Müller, head of Federal Ministry of the Interior Division H III 4 (Political Systems; Hybrid Threats; Disinformation) described the ways in which governmental bodies deal with the dangers at national level and the strategies that can be taken from this for other stakeholders. The topic of disinformation is relevant to security policy and social policy. Journalism and institutions within civil society can help with countermeasures.
“Disinformation is a threat to our democratic freedom because it is intended to sow division in society and cause harm. The aim of disinformation is to intensify social conflict, undermine trust in institutions of the state and foment hatred and anger. The German government therefore takes decisive action against disinformation – in addition to taking appropriate reactive measures, such as setting the record straight, it focuses on prevention and building up resilience at a national and social level,” explained Hanna Katharina Müller.
But, she said, disinformation is not the biggest problem: narratives that sow doubt are more challenging. Hanna Katharina Müller thus raised the question of how to reach target groups that are very far removed from the state. The Federal Ministry of the Interior has developed a number of formats for this, which are in the process of being set up, e.g. a task force against disinformation and a citizens’ advisory service in 2024.
Johannes Wörle – Disinformation and extremism
Johannes Wörle, senior government official in Division SG E5 of the Bavarian State Ministry of the Interior, for Sport and Integration, spoke from the perspective of the federal state. He gave an insight into the implications of disinformation for internal security; that is, from the perspective more specifically of protection of the constitution. Essentially, this concerns targeted disinformation campaigns from external sources, in some cases from foreign states, which are distributed predominantly in extremist circles, mostly on social media. In particular, he said, working closely with the Federal Ministry of the Interior and stakeholders in other German states helps to combat disinformation.
Nicolas Müller: “It’s much better to get AI on our side.”
Next, Nicolas Müller, Research Associate at Fraunhofer Institute for Applied and Integrated Security (AISEC), focused on technical support for detecting disinformation as well as tried-and-tested countermeasures. In terms of the means of creating and spreading disinformation, Nicolas Müller pointed out that AI is getting easier and easier to use. He presented three kinds of countermeasure in greater detail. Firstly, informing the public and teaching media skills is paramount. Secondly, AI-supported recognition of deep fakes is available. “It’s much better to get AI on our side and recognise fakes,” said Nicolas Müller. He presented the platform “Deepfake Total”, which can be used to verify files or YouTube videos. A third strategy is the use of certificates to verify content. Embedded in metadata, these certificates provide cryptographic security and broad community support. The disadvantage is that metadata can also be removed or tampered with. Despite the numerous promising solutions, Nicolas Müller asserted: “There is no one-size-fits-all solution. A combination is required, depending on the particular case.”
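The certificate approach Nicolas Müller described can be sketched in a few lines: a signature computed over the content together with its metadata makes any later alteration detectable. The sketch below is a minimal illustration, assuming a shared-key HMAC in place of the public-key certificates real provenance schemes use; the key, metadata and file bytes are all hypothetical:

```python
import hashlib
import hmac
import json

# Minimal sketch of signed content credentials: tampering with either the
# content or its metadata invalidates the signature. Real certificate schemes
# use public-key signatures; an HMAC stands in here to keep the sketch short.

SIGNING_KEY = b"publisher-secret"  # hypothetical publisher key

def sign(content: bytes, metadata: dict) -> str:
    payload = content + json.dumps(metadata, sort_keys=True).encode()
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()

def verify(content: bytes, metadata: dict, signature: str) -> bool:
    return hmac.compare_digest(sign(content, metadata), signature)

meta = {"author": "Jane Doe", "published": "2023-12-14"}
sig = sign(b"video-bytes", meta)

print(verify(b"video-bytes", meta, sig))                    # True: intact
print(verify(b"video-bytes", {"author": "unknown"}, sig))   # False: altered metadata
```

The disadvantage Müller named is visible here too: if the metadata carrying the signature is stripped from the file entirely, there is nothing left to verify, so the scheme only protects content whose credentials travel with it.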
Isabelle Sonnenfeld – Prevention and “mental antibodies” against disinformation
Isabelle Sonnenfeld, Head of EMEA, News Lab @Google, showed how a global information conglomerate counters targeted disinformation, and which developments can help in the future. One way is by giving users additional tools to get more context when doing a web search, thereby enabling them to do further research into sources. Google’s fact-checking tool, Fact Check Explorer, can also help, including with media content that is deliberately taken out of context and reused as misinformation. Isabelle Sonnenfeld also explained prebunking, a communication technique that aims to develop “mental antibodies” to disinformation. This involves building up people’s preemptive resilience before they see misleading information on their phones, explained Isabelle Sonnenfeld. Users thus find out how manipulation techniques work and learn how to spot them, so they are able to react to them.
Hans Brorsen – Recognising quality news
Hans Brorsen, co-founder and co-CEO of valid.tech, was the final speaker of the third part of the conference. He spoke about reliable news sources and how to recognise quality news across platforms. Using an app, journalists can digitally sign their articles before publishing them and anchor the signatures in a blockchain, so the name of the publication and the author are stored. Any subsequent tampering with this metadata therefore becomes detectable. The verification can thus be accessed in all channels and checked at any later point.
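Why blockchain anchoring makes tampering evident can be illustrated with a toy hash chain, where each entry commits to the hash of the one before it. This is a minimal sketch of the general technique only; the article does not describe valid.tech’s actual design, and the record fields below are hypothetical:

```python
import hashlib
import json

# Toy hash chain: each entry's hash covers the previous hash, so altering any
# earlier record changes every hash that follows it. Illustrative only.

def entry_hash(record: dict, prev_hash: str) -> str:
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64  # genesis value
    for rec in records:
        prev = entry_hash(rec, prev)
        chain.append(prev)
    return chain

articles = [
    {"publication": "Example Post", "author": "A. Writer", "title": "Story 1"},
    {"publication": "Example Post", "author": "B. Writer", "title": "Story 2"},
]
chain = build_chain(articles)

# Changing the author of the first record breaks the whole chain from there on:
tampered = [dict(articles[0], author="Someone Else"), articles[1]]
print(build_chain(tampered) == chain)  # False
```

A public blockchain adds one more property the toy version lacks: the chain is replicated across many parties, so no single actor can quietly rewrite it.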
Better media skills and fact-checking tools against disinformation
The discussion that followed made it clear how varied the systems and methods used in disinformation campaigns can be. So, too, are the countermeasures. The need for greater media competence was mentioned a number of times as an important tool to enhance resilience to disinformation.
In his closing words, acatech President Jan Wörner emphasised the array of challenges and countermeasures. He welcomed the broad discussion that took place, from the terminology to the AI-supported countermeasures. “Precisely this is acatech’s forte – covering the entire spectrum.”
Further information
- Programme of the topic conference (in German)
- Technology Communication Working Group
- Topic network Safety and Security
- Book: Gefühlte Wahrheiten (Subjective Truth – Finding your way in times of post-factual uncertainty) by Ortwin Renn (in German)
- Event: Artificial intelligence improves society?