EBU Big Data Conference: The discussions during Day 1

Author: Barbara Rosen Jacobson

The EBU Big Data Conference 2018 started with an opening statement by Mr Noel Curran, director general of the EBU, who noted that while in the past big data seemed like a distant goal, digital is now taken for granted and big data is no longer just a buzzword. Data has become an essential asset for many media companies. Still, there is ‘huge untapped potential’ for data journalism, and the field of big data remains active and vibrant. This year’s Big Data Conference therefore focuses on change: how can media organisations make the right choices and innovate without losing sight of their core values and identity? And how can they remain relevant in a future economy driven by artificial intelligence (AI)?

Curran’s opening remarks were followed by a keynote address by Prof. Fernand Gobet, professor of decision-making and expertise at Liverpool University, who explored how AI, despite the fears it raises, can benefit all citizens and enhance media services. After pointing out some of the remarkable developments in AI innovation, he urged optimism, as AI can address some of the limits of human expertise. Since humans often do not act rationally, they are likely to benefit from AI in many different fields. In the media, AI could assist with data visualisation, news alerts, and fact-checking, and it could even write autonomously. However, there will still be a place for humans, as they will need to ‘fill the gaps between different AI systems’ in the absence of general AI.

Mr Carl-Johan Nakamura, chief data steward at IBM, provided a second keynote address, focusing on how data strategies can change a business, and how this change can be managed. Sharing his experience at IBM on how to institute a chief data office in 100 days, he addressed five key themes:

  1. developing a clear strategy
  2. executing enterprise-wide management and governance
  3. becoming the central and trusted data source
  4. building data and analytics partnerships
  5. developing and scaling talent and capabilities across the company

He urged organisations to start small with scalable use-cases.

The two keynotes were followed by a panel discussion on the theme of 'Implementing big data strategies: insights from the media sector and beyond', which brought together representatives of the media, insurance, and retail sectors. First, Ms Annick Deseure, data and insights manager for the readers market at Mediahuis, highlighted Mediahuis’s transition towards a data-driven culture. Describing her organisation’s development, she emphasised that the role of the data officer needs to be flexible, as it changes over time. She also shared a number of challenges that need to be addressed when implementing a big data strategy, including preventing data leaks, ensuring data quality, attracting talented data scientists, and enhancing personalisation while avoiding being perceived as ‘big brother’.

Ms Linda van Dijk, adjunct director of analytics at DKV, provided the perspective of a health insurer. She described the characteristics of an ‘intelligent organisation’, in which insights are harvested consistently and continuously through analytics, and identified four pillars that need to be addressed to get there: technology, data governance, analytics competence, and a data-driven decision-making culture. Mr Xavier Valentini, customer insights manager at Delhaize, shared his experience from the retail sector. He explained that data analytics used to be performed by separate departments, which made it difficult to centralise and coordinate data collection and analysis. The teams have now merged into a single ‘consumer intelligence’ team, tasked with both generating insights and designing tools and applications. As key challenges for 2018, Valentini identified moving from analysis to story-telling; creating a cultural shift in the organisation; and complying with the EU’s General Data Protection Regulation (GDPR), which requires intensive collaboration with the data protection team.

The discussion that followed addressed the challenge of avoiding ‘being swallowed by the Facebooks and Googles of this world’. The speakers agreed that collaboration is necessary, sometimes even with competitors in the sector. In addition, Valentini recommended playing on the organisation’s strengths compared to the Internet industry. Furthermore, to be able to recruit talent, it is important to create the right environment for data scientists within the organisation, for example by giving them flexibility and a sense of ownership.

The next panel, entitled ‘Awake the sleeping giants: bringing data into newsrooms’, focused on the promises of data journalism, and how they can be put into practice. Moderator Mr Mirko Lorenz, innovation manager at Deutsche Welle, highlighted that data is so far used by only a small number of newsrooms, and that in time-constrained situations there is often no incentive to check and analyse data. Bringing data into newsrooms will therefore require new skills and a new way of thinking. At the same time, data journalism is not confined to big news outlets; starting with a small team is often a better first step. Mr Julian Schmidli, reporter and editor for SRF data, is part of such a small data team. His team benefits from the flexibility provided by SRF, and he explained that the team works as transparently as possible, publishing all of its code together with its data analyses.

Mr David Bauer, head of storytelling at NZZ, presented ‘Q’, a toolbox that NZZ created to make it easy for journalists to add visual elements to their stories, such as supporting maps and charts. Bauer and his team started from the aim that creating a visual element should take a journalist no more than five minutes. Yet, he emphasised that the availability of tools is not the crucial point; ‘it’s about getting people to use them’. Shifting towards a data culture requires both bottom-up initiatives and top-level decisions. Ms Sylke Gruhnwald, reporter at Republik and chairwoman of Journalismfund.eu, presented the recently founded Swiss magazine Republik, which combines data journalism with story-telling, and Mr Gian-Paolo Accardo, co-founder and CEO of VOXEurop and editorial coordinator of the European Data Journalism Network, presented some of the tools developed by the European Data Journalism Network, which aim to make complex issues as simple as possible, so that Europeans can better understand Europe.

The discussion that followed raised the question of whether these initiatives risked reinventing the wheel, bearing in mind the many tools that are already ‘out there’. Bauer explained that although there are a lot of tools, they all serve different purposes, have different interfaces, and might not be compatible. Lorenz added that tools developed specifically for journalists allow news outlets to apply their own layout and branding. Schmidli cautioned against relying on third-party software, as it could cease to exist in the future.

When asked about their hopes for the future of data journalism, Bauer and Gruhnwald argued that the term ‘data journalism’ should be put to rest: ‘data journalism is journalism when it’s good’, and journalists should be expected to use data whenever it is available for reporting. Schmidli added that innovation in journalism will always evolve, and that it is important to build data literacy among journalists. Accardo wished for the public to have more confidence in data journalism’s ability to counter nonsense.

Continuing on the topic of data journalism, the next panel presented two case studies of data journalism by public broadcasters. Ms Christine Jeavans, senior data journalist at the BBC, talked about the creation of the NHS Tracker, which presents the average waiting time for each hospital in the UK. Mr Teemo Tebest, data journalist at YLE, presented the Municipality Radar, which was used during the municipal elections in Finland. Both cases emphasised the need to create analyses that are relevant for citizens, affect a large number of people, and can be used in multiple ways. Both speakers also stressed the need for data teams to be well integrated within the rest of the organisation, and to make use of the expertise that already exists there.

The day closed with two breakout sessions, one of which was devoted to 'data, elections, polls and the democratic game'. The session aimed to explore the influence of data on democracy and elections. How is data misused to influence election results? Are polls still relevant? Can we use big data to contribute to a better democracy? Prof. Arturo Di Corinto, professor, journalist and hacktivist, spoke about the role of fake news, in particular in relation to the upcoming Italian elections. He explained that there are still many unknowns about the role of fake news in guiding the electorate. While it can ‘pollute public debate’, fake news rarely creates a new way of thinking; rather, it confirms existing prejudices. The role of fake news in elections also depends on the credibility of public media and institutions.

Ms Vidya Narayanan, researcher at the Computational Propaganda Project, Oxford University, spoke about how social media supports misinformation campaigns, how these campaigns muddy the democratic discourse, and how they may influence voter preferences. Presenting two case studies, she concluded that while there is extensive ‘junk news’ in the social media ecosystem, there is insufficient evidence that it changes voting preferences.

Mr Leendert de Voogd, project director of Social Intelligence and Analytics at IPSOS, reflected on the continued relevance of election and public opinion polls. He recommended a smart combination of traditional methods with big data and AI: ‘The future of polling is to leverage the power of AI in methodology, mixed with a sound analytical framework that has been successful over the last 80 years’.
