Information disorder: Exploring remedial potential

12 Nov 2018 09:00h - 10:00h

Event report

[Read more session reports and live updates from the 13th Internet Governance Forum]

Information disorder as an issue is here to stay. False and fabricated information causes mistrust, disengagement and societal division. To mitigate these unwanted consequences, a response that is multifaceted, context-based and co-operative is needed. Potential remedies will have to address digital intermediaries as the new gatekeepers and the impact of algorithms as drivers of content distribution.

The session moderator, Mr Rasmus Nielsen, Director, Reuters Institute for the Study of Journalism and Professor of Political Communication, University of Oxford, provided initial observations on how research and practice connect in tackling the issues of information disorder in different contexts. Nielsen mentioned the Council of Europe (CoE) report, ‘Information disorder’, and reminded the audience that information disorder is like light pollution – often the unintended by-product of the interplay between humans and technology. The goal is to minimise the unwanted consequences of technological use and to avoid searching for a single, definitive solution.

He stressed the problem with using the term ‘fake news’. First, the term is dangerous: politicians, and even independent researchers, use it in ways that undermine the credibility of independent information providers. Second, it is misleading, as the information in question is often factually accurate but strategically released with malicious intent. He further stressed that disinformation plays out differently around the world. Most of the research is US-centred, and we should be careful when applying its findings elsewhere. There is a need for an accurate, evidence-based grasp of the scale and scope of the problem. For the majority of people, the term ‘fake news’ means ‘poor journalism’. According to Nielsen, poor journalism is not the central driving force of information disorder as we see it today, but the fact that most of the public perceives it as such is relevant.

Ms Tanja Kerševan-Smokvina, Expert, Committee of Experts on Human Rights Dimensions of Automated Data Processing and Different Forms of Artificial Intelligence (MSI-AUT), Council of Europe (CoE), presented international approaches to regulating information disorder. Nation states mostly act as discussion facilitators and establish task forces, but rarely engage at the administrative or legislative level. The main problem with information disorder is that it undermines trust in democratic institutions and creates fertile ground for conspiracy theories. This lowers trust in evidence and, in the long run, creates disengagement and sharpens societal division. At the regulatory level, the challenge is to balance intervention against the fear of censorship. ‘Digital intermediaries are the new gatekeepers’, she said. In March 2018, the CoE published guidelines on the roles and responsibilities of intermediaries. Balance and co-operation between the responses of states and the self-regulation of digital platforms are key in tackling information disorder. Co-operation should also extend to civil society organisations and academia, because a relevant, independent analysis of the implications of different interventions is necessary.

Mr Giacomo Mazzone, Head of Institutional Relations, European Broadcasting Union (EBU), presented the Eurovision Social Newswire project. This is an EBU community-based newswire made up of over 450 journalists collaborating in real time to discover and verify user-generated, eyewitness content. It addresses the fact that more and more of the broadcast news we rely on is based on material found on the Internet, rather than provided by traditional media actors. The project has been successful so far, but since disinformation has to be tackled with an immediate reaction, the co-operation of social networks is fundamental. The EBU is further working on supporting quality journalism and the safety of journalists, investigating the benefits of open source tools for verifying information, and creating tutorials for journalists to improve their information technology (IT) skills. Lastly, understanding algorithms is crucial: social media companies remain the only ones able to assess what has been done, and they are currently not obliged to disclose their tools.

Mr Olaf Steenfadt, Project Director, Journalism Trust Initiative, Reporters Without Borders (RWB), and Expert, EU High Level Expert Group (HLEG) on Fake News and Online Disinformation, stressed that journalism has to ‘follow the money’ in order to survive. An economically healthy and sustainable media system is necessary for quality journalism. The World Press Freedom Index assesses traditional physical and legal threats against journalists, but is now also considering new technological and economic threats to journalism, so-called ‘invisible prisons’. Analysing these threats means understanding the main instruments that shape the production and distribution of information. These two spheres are currently disconnected: production remains traditional, guided by codes of ethics and practice, while distribution is driven by algorithms created by the technical community. RWB sees this disconnect as a problem and has created the Journalism Trust Initiative to connect the two spheres through a ‘standard-setting’ mechanism.


By Jana Misic