Disinformation Online: Reducing Harm, Protecting Rights

Report

Session:
Open Forum 44


[Read more session reports and updates from the 14th Internet Governance Forum]

According to Mr Sebastian Bay (Senior Expert, NATO Strategic Communications Centre of Excellence), roughly 90% of the online disinformation currently produced targets commercial platforms, such as Hotels.com and TripAdvisor.

Yet there is a difference in policy standards between democracies and non-democracies, and this gap results in a deliberate disinformation war between the two, opined Mr Damian Tambini (Associate Professor, London School of Economics). Bay expressed concern over the globalisation of disinformation, with certain countries specialising in developing software for social media manipulation. Mr Jakub Kalensky (Senior Fellow, Digital Forensic Research Lab) added that local actors in the EU are learning from these bad actors.

Regarding policies for addressing disinformation, Tambini highlighted the need for a mature approach to the regulation of freedom of expression, with justified rules. The current trend of devising national laws to address disinformation, as France and Germany have done with their recent procedural and liability rules, may not address all issues. Ms Miranda Sissons (Director of Human Rights, Facebook) expressed concern over current legislation on fake news and hate speech, citing a lack of rigour and a failure to address the whole issue.

On content moderation, Sissons highlighted the open engagement process adopted in creating the Facebook Oversight Board, and noted that Facebook's latest transparency report shows a significant rise in the number of fake accounts identified using artificial intelligence. Tambini, for his part, expressed concern over the implications of labelling content for censorship and trust.

Regarding the liability of intermediaries, Tambini highlighted the need for platforms to share more data, citing Facebook with reference to competition policy and questions of censorship and free speech. He questioned Facebook content take-downs that could be considered censorship in the EU; more generally, he argued that as platforms develop take-down solutions, they need to engage civil society in order to build trust. Bay called on social media companies to put more resources into combatting the problem so as to create a level playing field, highlighting the differences between social media platforms, and even between platforms owned by the same company. He also stressed the need to regulate the market to prevent social media manipulation, and expressed his appreciation of WhatsApp's decision to sue companies that spread misinformation.

Bay suggested standardising companies' transparency reports: these should disclose how each social media company blocks accounts and the amount of resources the platform allocates for this. This step, he said, could make a 50% difference in the ability to combat the issue.

To address the issue of misinformation, all social media companies, large and small, need to work together, said Tambini.

By Amrita Choudhury
