This high-level exchange was devoted to the role of corporate social responsibility (CSR) in the digital age. Mr Paul Mitchell (Senior Director, Tech Policy, Microsoft) defined CSR in this context as the sustainable governance of digital technologies that respects human rights and new labour-tech relationships, and that achieves equitable access across all sectors of society.
Participants of the high-level exchange addressed three main questions:
What does CSR in the digital world mean and what are the challenges of digital transformation?
For Mr Jan Kleijssen (Director of Information Society and Action against Crime, Council of Europe), CSR is the awareness and willingness of the private sector to shoulder responsibilities that come with the development of their products in a way that respects the rule of law. The participation of the private sector in drafting common legal standards and regulations, which is the primary role of the Council of Europe, is crucial.
Ms Miya Baratang (CEO, Girlhype Women Who Code) connected CSR to the 17 Sustainable Development Goals (SDGs), which focus on transforming the world and provide guidance for corporate social investment. She stressed that inclusive business models do not necessarily come from governments, but directly from business leaders whose company budgets exceed the national budgets of some African countries. Tech leaders and creators need to realise that humanity is at stake: people are scared of losing their jobs and, more broadly, their future. Baratang also pointed to gender inequality in the tech industry.
Mr Lucio Adrian Ruiz (Dicastery for Communication of the Holy See) recalled Newton’s third law and applied it to CSR: for every action, there is an equal and opposite reaction. Thus, CSR in the digital world means ‘being attainable to the design, accountable by the design, should look at all consequences: positive ones to promote, negative ones to avoid’.
Is it possible to regulate digital technologies without slowing down innovation? What could be regulated, and how should CSR be addressed?
Mr Noel Curran (Director General, European Broadcasting Union) cited broadcasting as an example of an innovative business operating under regulation. Regulation and innovation are not mutually exclusive; instead, tech regulation should focus on basic values and principles. Curran noted several shifts in the debate:
the responsibilities of the tech sector
a greater focus on what causes disinformation rather than on how it spreads, reflected in new developments at the EU level: the Digital Services Act (DSA), the Digital Markets Act (DMA), and commissions on disinformation
a search for voluntary, non-regulatory ways for tech giants to ensure that the public receives information from trusted sources
Baratang added that innovation has now created a massive digital divide between rich and poor, and said: ‘We need ethical regulation with good standards, models and proper governance that is united globally to make sure that both quantitative and qualitative research from industry and society is done properly and it is increasing the real average global human life’.
Kleijssen, in turn, argued that regulation and innovation are mutually reinforcing. However, he said that self-regulation is not enough: ‘Self-regulation, ethical standards are useful as a source of inspiration, they don’t confer rights of users or citizens, and they do not confer legal obligations on the companies, on those that use them, importantly, they also don’t offer remedies if something goes wrong’. Government regulation or co-regulation not only enhances innovation, but also confers rights, provides for remedies, and creates obligations. Curran echoed Kleijssen’s view that self-regulation does not work for tech companies: ‘I think some of these larger organisations need to be pushed and they need to know that regulation will follow if they don’t adhere to set goals and fulfil them’.
Where should the boundary lie between self-regulation and formal regulation?
From the perspective of the Council of Europe, the boundary should be defined by the effect that technology has on the real day-to-day exercise of human rights. For example, facial recognition technologies have discriminatory potential for citizens and need to be regulated, while algorithms behind recommendations in music streaming services have less impact on human rights. Kleijssen concluded that self-regulation simply does not offer enough protection.
By Ilona Stadnik
Session in numbers and graphs
Diplo’s AI Lab experiments with automated summaries generated from the IGF sessions. They will complement our traditional reporting. If you would like to learn more about this experiment, please contact us at firstname.lastname@example.org.
The automated summary of this session can be found at this link.