Media and content

12 Nov 2018 11:30h - 12:45h

Event report


The session discussed the central role of the media and its independence in assuring the health of a democracy. Firstly, the panellists considered the detrimental impact of fake news on democratic institutions and concluded that social media represent a perfect vehicle for this phenomenon, as information flows in every direction and is often unchecked before being shared. Secondly, the speakers questioned whether self-regulation is an effective solution for tackling both fake news and hate speech. Although self-adopted measures were welcomed, the speakers warned against solutions pushed by a single actor, preferring instead a multistakeholder approach involving the private sector, users, governments, and civil society. All of them stressed the importance of capacity building in providing critical education to audiences/users about the information they read.

The Chair, Ms Salanieta Tamanikaiwaimaro, Executive Director, Pasifika Nexus, Fiji, organised the discussion on how to address fake news, media freedom and media trust, as well as hate speech regulation. 

Ms Rasha Abdulla, Professor of Journalism and Mass Communication, the American University in Cairo, considered that the media’s role in society and democracy depends on the political system under consideration. Whether the media can be seen as a watchdog for democracy or as a mere appendage of those in power depends on the degree of freedom of expression (FoE) granted in a given country. She also clarified that defining fake news requires two elements: intent and consequence.

Mr Giacomo Mazzone, Head of Institutional Relations, European Broadcasting Union, World Broadcasting Union, explained that media can be used as tools to create hate. He affirmed that digitalisation has a disruptive effect on society: with social media, ‘we can say anything to everyone without the presence of mediators’. He also underlined that digitalisation offers many opportunities to make journalism more interactive, as communication is no longer unidirectional (as it used to be with television).

Ms Ankhi Das, Director of Public Policy, Facebook, illustrated Facebook’s three-pronged framework for countering misinformation. Firstly, content that violates the platform’s standards is removed. Secondly, the distribution of content that undermines authenticity is reduced. Lastly, users are provided with more context and information about the content on the platform. She explained that Facebook works with third-party fact-checkers in order to quickly target suspicious content.

Ms Luz E. Nagle, Stetson University College of Law, stressed the media’s crucial role in assuring the health of a democracy by safeguarding the transparency of democratic processes. She warned against the rising mistrust towards traditional media at a time when social media offer an easy channel for spreading misinformation. She explained that this mistrust is partly due to news agencies’ business model: media outlets are pressured to be the first to break a story, sometimes overlooking fact-checking. When trust in the media is undermined, their mistakes are perceived as fake news purposefully spread by the outlets.

Ms Shmylah Khan, Digital Rights Foundation, Pakistan, considered that content regulation oscillates between two extremes: underregulation and overregulation. She explained that we usually hold an idealised version of freedom of expression, whereas in reality it is rather constrained in its application. For example, in Pakistan, FoE is limited by constitutional constraints, such as restrictions on content threatening the national interest or contravening Islamic law. She concluded by saying that fake news as a phenomenon relies on an old mechanism: people’s tendency to believe rumours and speculation.

Ms Yik Chan Chin, Lecturer in Media and Communication Studies, Xi’an Jiaotong-Liverpool University, explained that media around the world are facing the same three challenges. Firstly, in a post-truth era, journalists are advocating for specific causes rather than being objective. Secondly, the business model of the social media platforms, and the growing profits it generates, threaten traditional media’s capacity to generate revenue. Thirdly, new technologies challenge the work of traditional journalists, as in the case of Artificial Intelligence (AI) filtering content.

Hate speech: self-regulation or government action needed?

Rasha warned against single actors being the sole regulators of online content, as the choice of which information to filter would be too discretionary and insufficiently transparent. ‘Governments, like corporations, are after their own specific interests, which determine what hate is or is not.’

Giacomo explained that fake news is part of a broader phenomenon called ‘information disorder’, in which the reliability of traditional media is under threat. He also affirmed that self-regulation by the Internet platforms is not a successful strategy for tackling hate speech and fake news, because the business model of such platforms relies on attracting the largest possible number of users, not on the accuracy of the information distributed. ‘The current business model of the Internet platforms is incompatible with the truth.’

Luz stated that the role and responsibilities of corporations in controlling content are crucial because they have great influence over how news is perceived. Having media agencies controlled by corporations driven only by profit is dangerous for democracy. She stressed the crucial role of education in countering the manipulation of citizens’ perceptions. ‘Most university graduates have problems understanding the difference between facts and opinions.’

Ankhi maintained that ‘behind fake news there is always a fake profile’, and that Facebook’s efforts therefore treat accountability as a priority. Transparency is also crucial: Facebook is making its third-party policies available on its website. She further affirmed that specific measures, such as limiting the number of times a message can be forwarded on WhatsApp to four, could significantly help stop the viral spread of fake news.

Yik joined Giacomo in affirming the inadequacy of self-regulation in tackling fake news, putting forth two reasons. Firstly, fake news is financially motivated because of the revenue derived from ads placed on such pages. Secondly, major corporations have high-profit business models that push them to promote their own interests. That is why she suggested a multistakeholder approach to tackling fake news.
By Marco Lotti