Child online safety: Industry engagement and regulation | IGF 2023 Open Forum #58

10 Oct 2023 09:00h - 10:00h UTC


Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Julie Inman Grant

The analysis covered a range of topics related to online safety and abuse. One of the main points discussed was Australia’s strong online content scheme, which has been in place for over 22 years. The scheme is primarily extraterritorial, as almost all of the illegal content it deals with is hosted overseas. This highlights Australia’s commitment to tackling online content and ensuring a safe online environment for its citizens.

Another important aspect highlighted in the analysis is the need for a more individual-centered approach to addressing online abuse. Schemes have been put in place to address individual abuse cases, and understanding current trends in online abuse is deemed integral to applying systemic powers effectively. Taking into account the experiences and needs of individuals affected by online abuse can lead to more targeted and effective interventions.

A concerning finding from the analysis is the significant increase in cases of online child sexual exploitation and sexual extortion. It is reported that there has been a doubling of child sexual exploitation cases and a tripling of sexual extortion reports. Shockingly, one in eight analyzed URLs involves coerced and self-produced abuse through smartphones and webcams. These figures highlight the urgent need for robust measures to combat online child sexual abuse and protect vulnerable children.

The role of online platforms in preventing abuse was also discussed. Currently, online platforms are being used as weapons for abuse. However, platforms like Snap and Instagram have been provided with intelligence reports on how to prevent this abuse. The analysis suggests that online platforms should do more to proactively guard against their services being exploited for abusive purposes.

The analysis also touched upon the topic of corporate responsibility in online safety. The introduction of the basic online safety expectations tool allows the government to ask transparency questions and compel legal answers from companies. Furthermore, companies can be fined if they fail to respond truthfully and completely. These expectations play a pivotal role in compelling companies to operate safely and protect their users.

Global collaboration and transparency were identified as crucial factors in tackling online child abuse. Initiatives like the Heat Initiative are putting pressure on large companies, such as Apple, to do more to address child sexual abuse. Additionally, further enforcement announcements focused on five more companies are planned for next year, indicating the ongoing commitment to global collaboration in combating online child abuse.

The analysis also highlighted the challenges faced in safeguarding children online. While the internet has become an essential tool for children’s education, communication, and exploration, it was not initially built with children in mind. Reports of cyberbullying among younger children increased notably during COVID-19 lockdowns. It is imperative to strike a balance between safeguarding children appropriately and allowing them to benefit from the internet.

Regarding age verification, the analysis presented differing viewpoints. Companies were encouraged to take responsibility in verifying users’ ages and facilitating meaningful checks. However, it was suggested that age verification should not restrict children’s access to necessary and beneficial content. Trials for age verification are currently being conducted by platforms like Roblox, and Tinder and Instagram have begun implementing age verification in Australia. However, there are concerns about the effectiveness and potential restrictions on access for marginalized communities.

The effectiveness of Meta’s Oversight Board in reviewing content moderation decisions was called into question. In the past year, the board received around 1.3 million requests for content moderation reviews but was able to take up only 12 cases. This raises concerns about the board’s capacity to handle the sheer volume of cases.

Lastly, the analysis emphasized the importance of multinational regulation for online platforms and the need for specialized agencies to handle investigations. The gray area of regulation poses significant challenges, requiring multi-layered investigations to effectively address abuse and ensure accountability.

In conclusion, the analysis shed light on various aspects of online safety and abuse. It highlighted Australia’s strong online content scheme, the need for individual-centered approaches in tackling online abuse, the concerning increase in cases of online child sexual exploitation, and the role of online platforms in preventing abuse. The importance of global collaboration, corporate responsibility, and safeguarding children online was also emphasised. Critical evaluations were made regarding age verification measures, Meta’s Oversight Board, and the need for multinational regulation and specialised agencies. These insights provide valuable information for policymakers, platforms, and organisations to address online safety and combat abuse effectively.

Audience

The discussion revolves around striking a balance between children’s right to access information online and ensuring their safety, particularly in relation to sexuality education. It is important to provide children with accurate and scientific information while also protecting them from potentially harmful content. This highlights the need for a comprehensive and inclusive approach to online education.

There are ongoing discussions regarding the implementation of new regulations to safeguard children online. The speaker questions whether there is a balance between raising awareness and imposing obligations on service providers under these regulations. This reflects the growing recognition of the importance of protecting children from abuse, exploitation, and violence online.

In terms of ensuring child safety online, the audience argues for not only blocking but also removing harmful content. Simply blocking such content may not be sufficient, as individuals seeking it may find ways to circumvent these blocks. Therefore, the removal of harmful content becomes crucial to guarantee the safety of children.

In conclusion, the discussion emphasizes the need for a balanced approach that upholds children’s right to access accurate information while safeguarding them from harmful content. The introduction of new regulations and the emphasis on removing, not just blocking, harmful content further demonstrate the commitment towards ensuring online child safety. This signifies progress in protecting children from abuse, exploitation, and violence in the digital realm.

Noteworthy topics discussed include children’s rights, online safety, access to information, and sexuality education. Additionally, the discourse touches upon the relevance of the UN Convention on the Rights of the Child and the impact of digital regulation on children’s rights and internet safety. These aspects contribute to a comprehensive understanding of the subject matter and highlight the interconnections between various global initiatives, such as SDG 4: Quality Education, SDG 5: Gender Equality, and SDG 16: Peace, Justice, and Strong Institutions.

Tatsuya Suzuki

During the discussion, the speakers emphasised the need to enhance internet safety for children. They highlighted the importance of having a comprehensive plan in place to ensure the secure use of the internet by children. This plan involves collaborative efforts with various stakeholders, including academics, lawyers, communications companies, and school officials. These groups can work together to develop strategies and guidelines that promote responsible internet use among children.

The speakers also expressed their support for public-private initiatives aimed at addressing online child abuse and exploitation. They recognised the crucial role of the Children and Families Agency in respecting the voluntary initiatives of the private sector in these efforts. Additionally, they highlighted the agency’s active collaboration with the Ministry of Education, Culture, Sports, Science and Technology, as well as the involvement of the Japan Committee for UNICEF. These collaborations are important in developing effective and comprehensive approaches to combating online child abuse and exploitation.

Overall, the sentiment expressed during the discussion was positive, with a strong emphasis on implementing measures to protect children online. The speakers recognised the urgency and importance of ensuring the safety and security of children in their online activities.

Through the analysis, it is evident that this issue is aligned with Sustainable Development Goal 16.2, which aims to end abuse, exploitation, trafficking, and all forms of violence and torture against children. By addressing the challenges of internet safety and working towards its improvement, progress can be made towards achieving this goal.

In summary, the discussion highlighted the necessity of implementing initiatives to improve the safe and secure use of the internet for children. Collaboration with various stakeholders, such as academics, lawyers, communications companies, and school officials, is essential in developing a comprehensive plan. Support for public-private initiatives in tackling online child abuse and exploitation was also emphasised, acknowledging the roles of the Children and Families Agency, the Ministry of Education, Culture, Sports, Science and Technology, and the Japan Committee for UNICEF. Overall, there was a positive sentiment towards implementing measures that protect children online, in line with Sustainable Development Goal 16.2.

Moderator – Afrooz Kaviani Johnson

Child exploitation on the internet is an ongoing issue that has evolved over the years. It now encompasses more than just explicit materials, but also the ways in which technology enables abuse. To effectively address this issue, collaboration across sectors is crucial.

Australia’s eSafety Commissioner is at the forefront of combating online abuse. This government agency has implemented a range of regulatory tools to drive industry-wide change. The role of Australia’s eSafety Commissioner in spearheading these efforts is commendable.

The involvement of the private sector is also vital in protecting children online. Companies are increasingly being called upon to take proactive measures and be accountable for their responsibilities in ensuring online child safety. These discussions involve industry experts from various countries, including Japan’s private sector and BSR Business for Social Responsibility.

Japan is making significant strides in enhancing internet safety for young adolescents. The country’s Children and Families Agency and multiple stakeholders, such as academics, lawyers, communications companies, school officials, and PTA organisations, are actively involved in creating a safe and secure online environment for young people. Japan’s measures in this regard have been positively received and appreciated.

Recognising the importance of private sector involvement, Japan’s Children and Families Agency administers the Internet Environment Management Act, which respects the individual and voluntary initiatives of private organisations. These organisations are actively engaged in ensuring the safe and secure use of the internet by children.

Addressing online child abuse is a complex and challenging task. Mr Suzuki, a prominent speaker, highlighted the various ways in which children can fall victim to online abuse, emphasising the need for parental involvement and proper ‘netiquette’. In Ghana, collaborative regulation involving tech companies has been adopted to tackle online child abuse.

Continued learning and knowledge exchange are crucial in combating online child abuse. A recent discussion on internet literacy and online child abuse served as a fruitful exercise and a positive step in addressing the issue. Ultimately, promoting sustainable development by ensuring all learners acquire the necessary knowledge and skills is vital.

In conclusion, addressing the issue of child exploitation on the internet requires collaboration across sectors, involvement of government agencies like Australia’s eSafety Commissioner, proactive engagement of companies, efforts from countries like Japan, and continued learning. These various approaches collectively work towards protecting children online and making the digital world a safer space for young people.

Toshiyuki Tateishi

The Japanese private sector has adapted over the past decade to address the challenges of online child sexual abuse and exploitation. Japan’s constitution protects the secrecy of communication, which generally prohibits the blocking of websites, so blocking is applied only narrowly. To combat online child abuse, Japan has established mechanisms (referred to in the session as the “Jeopardy” system) under which DNS servers block access to known illegal sites. If an abusive site is hosted within Japan, the Internet Service Provider (ISP) deletes it and the police investigate; if the site is overseas and found to contain sexually abusive material, it is promptly blocked. Japan’s approach was recognised in a 2016 UN report for preserving digital freedoms with minimal government interference, and it emphasises balancing freedom of communication, security, and innovation online. Before taking down any content, Japan places importance on first communicating with the relevant parties, even those located overseas, underscoring the value of dialogue and collaboration with foreign entities. Overall, this comprehensive approach demonstrates Japan’s commitment to creating a safe online environment and addressing online child sexual abuse and exploitation.
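The DNS-level blocking described here can be sketched in a few lines. The following is purely a hypothetical illustration, not the actual ICSA/ISP implementation: the domain names, addresses, and the weekly "fetch" are invented placeholders.

```python
# Hypothetical sketch of DNS-based blocking: a resolver consults a
# periodically refreshed blocklist and answers listed domains with a
# block-page address instead of the real one. All names and addresses
# below are placeholders, not the actual ICSA/ISP setup.

BLOCK_PAGE_IP = "192.0.2.1"  # RFC 5737 documentation address, standing in for a block page

def fetch_blocklist():
    # In the scheme described, the association publishes a list and DNS
    # servers retrieve it weekly; here the fetch is a static set.
    return {"abusive.example"}

UPSTREAM = {  # stand-in for a real upstream DNS lookup
    "abusive.example": "198.51.100.7",
    "karaoke.example": "203.0.113.9",
}

def resolve(domain, blocklist):
    """Answer listed domains with the block page; resolve the rest normally."""
    if domain in blocklist:
        return BLOCK_PAGE_IP
    return UPSTREAM.get(domain)

blocklist = fetch_blocklist()
print(resolve("abusive.example", blocklist))   # 192.0.2.1 (block page)
print(resolve("karaoke.example", blocklist))   # 203.0.113.9 (normal answer)
```

As the session notes, this only affects resolvers that load the list; the content itself remains reachable from other countries, which is why the approach stresses deletion at the source when the host is domestic.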

Dunstan Allison-Hope

Human rights due diligence plays a vital role in upholding child rights and combating the alarming issue of online child sexual exploitation and abuse. Business for Social Responsibility (BSR) emphasises that incorporating human rights due diligence is essential for companies to demonstrate their commitment to the well-being of children. BSR has conducted over 100 different human rights assessments with technology companies, highlighting the significance of this approach.

A comprehensive human rights assessment involves a systematic review of impacts across all international human rights instruments, focusing on safeguarding rights such as bodily security, freedom of expression, privacy, education, access to culture, and non-discrimination. It is crucial to adopt a human rights-based approach, which includes considering the rights of those most vulnerable, particularly children who are at a greater risk.

The European Union Corporate Sustainability Due Diligence Directive now mandates that all companies operating in Europe must undertake human rights due diligence. As part of this process, companies must evaluate the risks to child rights and integrate this consideration into their broader human rights due diligence frameworks. By explicitly including child rights in their assessments, companies can ensure that they are actively addressing and preventing any potential violations.

However, it is important to maintain a global perspective in human rights due diligence while complying with regional laws and regulations. Numerous regulations from the European Union and the UK require human rights due diligence, but there is a concern that the time and attention devoted to the EU and UK draws resources away from places where human rights risks may be more severe. Therefore, while adhering to regional requirements, companies should also pursue broader global approaches to effectively address human rights issues worldwide.

A holistic human rights-based approach seeks to balance different human rights, with a specific focus on child rights. Human rights assessments typically identify child sexual exploitation and abuse as the most severe risks. To ensure the fulfilment of all rights, a comprehensive assessment must consider the relationships between different rights, including tensions between them and the ways in which fulfilling one right can enable the fulfilment of others.

Another crucial aspect of human rights due diligence is the application of human rights principles to decisions about when and how to restrict access to content. Cases before the Meta Oversight Board have shown that having the time to analyse a case can provide insights and ways to unpack the relationships between rights. Applying principles such as legitimacy, necessity, proportionality, and non-discrimination to decisions about restricting access to content helps ensure a balanced approach.

It is also important to provide space to consider dilemmas and uncertainties and to make recommendations in cases relating to human rights, particularly child rights. The deliberative space available to the Meta Oversight Board was highlighted, and the idea of similar processes for child rights was welcomed. This helps ensure informed decisions, consideration of different perspectives, and sound recommendations.

In conclusion, human rights due diligence is vital to respect and safeguard child rights and combat online child sexual exploitation and abuse. By integrating child rights into their broader human rights due diligence, companies can demonstrate their commitment to the well-being of children. While complying with regional laws, it is crucial to adopt a global approach to effectively address human rights risks. A holistic human rights-based approach considers the interrelationships between different rights, while the application of human rights principles guides decisions about content access. Providing space for deliberation and recommendations in cases involving child rights is fundamental to making informed decisions and ensuring the protection of children’s rights.

Albert Antwi-Boasiako

The approach adopted by Ghana in addressing online child protection is one of collaborative regulation, with the objective of achieving industry compliance. In line with this, Section 87 of Ghana’s Cybersecurity Act establishes industry responsibility for safeguarding children online. The act contains provisions that compel industry players to take action to protect children from online threats.

Furthermore, Ghana’s strategy involves active engagement with industry players, such as the telecommunications chamber, to foster mutual understanding and collaboratively develop industry obligations and commitments. This collaborative approach highlights the importance of involving industry stakeholders in shaping regulations and policies, rather than relying solely on self-regulation.

The evidence supporting Ghana’s collaborative regulation approach includes the passing of a law that includes mechanisms for content blocking, takedown, and filtering to protect children online. These measures demonstrate the government’s commitment to ensuring the safety of children in the digital space.

The argument put forth is that self-regulation alone cannot effectively keep children safe online, as it may not provide sufficient guidelines and accountability. On the other hand, excessive regulation can stifle innovation and hinder the development of new technologies and services. Ghana’s approach strikes a balance by fostering collaboration between the government and industry players, promoting understanding, and establishing industry obligations without impeding innovation.

In conclusion, Ghana’s collaborative approach to online child protection aims to ensure industry compliance while striking a balance between regulation and innovation. By actively engaging with industry stakeholders, Ghana seeks to develop effective measures that safeguard children online without stifling technological advancement. This approach acknowledges the limitations of self-regulation and excessive regulation, thus presenting a more holistic and effective approach to online child protection.

Session transcript

Moderator – Afrooz Kaviani Johnson:
Okay well welcome everyone, welcome everyone in the room and I understand we’ve got at least 20 that have logged on online as well to join this evening session in Kyoto so I know it’s been a long day for many people and we appreciate you taking the time and joining us in this session. We are going to be exploring different models of industry engagement and regulation to tackle online child sexual abuse and exploitation. My name is Afrooz Kaviani, I work for UNICEF headquarters in New York as the global lead on child online protection. I’m joined by my colleague Josie who leads our work on child rights and responsible business conduct in the digital environment. So Josie is managing our online moderation today and she’ll be looking out for questions and comments that may be coming from our online participants and we’re delighted to have with us expert speakers from different sectors and really from around the globe joining us representing Australia’s eSafety Commissioner, Japan’s Children and Families Agency, Japan’s private sector, Ghana’s Cyber Security Authority and BSR Business for Social Responsibility. Our aim today is to foster collaboration and the exchange of ideas, experiences and innovative strategies on this difficult topic of child sexual abuse online. So I do want to give the content warning that we are speaking about a difficult topic and it may be disturbing for people in the room or online so please feel free to step out or do what you need to do to, you know, safeguard your own well-being. 
Many of you already know that this challenge of child exploitation on the internet is not new, however its nature has changed over the last decades and in the early stages efforts primarily were looking at halting the spread of child sexual abuse materials on the internet but today we’re seeing how technology is also being used to enable or facilitate child sexual abuse in a wide range of ways including the live streaming of child sexual abuse, the grooming of children for sexual abuse, the coercion, deception and pressuring of children into creating and sharing explicit images of themselves. So obviously it goes without saying that addressing this issue requires collaboration across sectors and it requires strengthening of systems for protection for children you know in their homes and their communities and in their countries but today we’re zooming in on a specific dimension of this response and it’s about how different jurisdictions are engaging companies in this effort and we’ve got one round of questions for our panelists and then we’re going to open the floor for questions and discussions from the audience. So I’m really delighted to turn to Australia to start us off and we’re so pleased to have with us Julie Inman Grant, who is Australia’s eSafety Commissioner. Thank you Commissioner for joining us and the question for you is really, being the world’s pioneering government agency for online safety, I’m interested to hear from you about the suite of regulatory tools that you’re deploying to really drive systemic change in industry against online child abuse.

Julie Inman Grant:
It’s important to start with the fact that Australia has had a strong online content scheme for more than 22 years, which means almost none of the content, illegal content that we’re dealing with online, is hosted in Australia. It’s almost all extraterritorial and overseas. So you see the world moving towards some much more process and systemic types of laws. We’re seeing with the online safety bill in the UK, with the Digital Services Act. We do have process and safety powers, but I also want to start by talking a little bit about the complaint schemes that we have, because I believe it’s one of the most important things that we do. We seem to forget that it’s individuals who are being abused online, and that’s how harm manifests, and the ability to take down that content to prevent the re-traumatisation, but also to understand the trends that are happening through engaging with the public is really critical to our success in applying the systems and process powers. So just to give you an example, we’ve seen a doubling this year of child sexual exploitation. When we analysed about 1,300 URLs, we found that one in eight is now, instead of inter-familial abuse, which tends to be more typical, one in eight is coerced and self-produced through smartphones and webcams in children’s bedrooms and bathrooms, in the safety of the family home. So that’s really significant. It just shows that the internet is becoming a new receptacle for targets, for predators, and it’s no longer one of convenience. The other huge trend that a number of us are seeing is we’ve had a tripling of sexual extortion reports coming into our image-based abuse schemes. So image-based abuse, the non-consensual sharing of intimate images and videos, we are seeing younger children being subject to that, but it’s now young men between the ages of 14 and 24 that are largely being targeted. And while 18 is the year that we consider young people adults, they’re not totally cognitively developed. 
They may be leaving school, so they don’t have the pastoral care and protections that they might once have had. So it’s a very distressing kind of crime, and sometimes it can happen very rapidly. Organized criminals have figured out that young men will take off their clothes and perform sexual acts for the camera more readily than young women, and they will negotiate down. We’ve seen some negotiations where they’ll try and extract $15,000 from a teenager, and they’ll say, well, I’m just a teenager, I don’t have that money, well, how much can you give me? And it’s relentless. But they’ll also use guilt and shame and other tools of social engineering. So all this is really important for us to understand. We’ve actually developed some intelligence reports for companies like Snap and Instagram to say this is how we see your platform being weaponized. If you use some AI and machine learning, you can see that these same images are being used in 1,000 different reports, and if you use some natural language processing, you’ll see that they’re using the same language. So we need to encourage the companies to step up, and that’s where safety by design is a key systemic tool. But I guess the most potent one that we have is what we call the basic online safety expectations, and that’s where we lay out a set of foundational expectations we have for online companies, whether they’re gaming or dating sites or social media sites or messaging sites, to operate in our country. And it gives us the opportunity to ask transparency questions and compel legal answers. Questions we’ve been asking for six years, basic things like PhotoDNA has been used for more than ten years. Which services are you using it on? Are you using it at all? Are you looking at traffic indicators for live-streamed child sexual abuse material? Again, we can fine the companies based on whether or not they respond truthfully and fulsomely in the manner and form. So that’s where the penalties are. 
It was a stunning report, I think, in December of 2022, looking at the most powerful countries and companies in the world that have the financial resources and the capability to do things but are not doing enough. So shining that light, with sunlight being the best disinfectant, is, I think, an effective tool. We’ve already seen in the United States the Heat Initiative and others, you know, putting pressure on companies like Apple to target child sexual abuse material. You can’t tell me that in 2022 they only had 234 cases of child sexual abuse when they’ve got more than a billion handsets and iPads in the market and iCloud and iMessage. So we really need to lift the hood. We’ll be making a similar set of enforcement announcements next year focused on five more companies. We need to continue to work together. We need to lift the lid. We need to focus sunlight so that we don’t let darkness fester in the darkest recesses of the web.

Moderator – Afrooz Kaviani Johnson:
Thank you, Commissioner. That is so fascinating, just the breadth of tools. I really have to apologise and let everyone in the room know that I’ve given a very small amount of time to each speaker. The Commissioner did an amazing job there really covering the breadth, but I think we’re going to have time to unpack and understand better just what you’ve managed to do, and those analogies of, you know, shining the light and using those regulatory tools to lift the hood.

Julie Inman Grant:
I forgot just to mention that we have mandatory codes and standards covering eight sectors of the technology ecosystem, five of which we’ve finalised, including a search engine code which now covers generative AI and synthetic generation of CSAM and TVEC, and we’re creating standards for a broader range of what we call designated internet services and relevant electronic services.
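The point about the same images recurring across a thousand reports can be illustrated with a small fingerprint-grouping sketch. This is purely a hypothetical illustration, not eSafety’s or any platform’s actual tooling: production systems use perceptual hashes such as PhotoDNA that match near-duplicate images, whereas the plain SHA-256 used here matches only byte-identical files.

```python
# Illustrative sketch: group incoming abuse reports by an image fingerprint
# to spot the same image reused across many reports. SHA-256 stands in for
# a perceptual hash (e.g. PhotoDNA) and only matches byte-identical images.
import hashlib
from collections import defaultdict

def fingerprint(image_bytes: bytes) -> str:
    """Fingerprint an image; a real system would use a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def group_reports(reports):
    """reports: iterable of (report_id, image_bytes).

    Returns {fingerprint: [report_ids]} for images seen in more than one
    report, i.e. the candidates an intelligence report would flag.
    """
    groups = defaultdict(list)
    for report_id, image_bytes in reports:
        groups[fingerprint(image_bytes)].append(report_id)
    return {h: ids for h, ids in groups.items() if len(ids) > 1}

# Toy data: the same image bytes appear in reports 1, 3 and 4.
reports = [(1, b"imgA"), (2, b"imgB"), (3, b"imgA"), (4, b"imgA")]
reused = group_reports(reports)
print(list(reused.values()))  # [[1, 3, 4]]
```

The same grouping idea extends to the natural-language-processing point in the transcript: clustering the extortion messages attached to reports by shared wording would surface repeated scripts in the same way.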

Moderator – Afrooz Kaviani Johnson:
Thank you, Commissioner. We’re now going to move to Japan. We’re here in Japan, so it’s very timely and it’s actually very exciting to introduce the next speaker. Mr. Tatsuya Suzuki. He’s the director of a newly formed agency in Japan, which is very significant for the child protection and child rights architecture in this country. So he’s the director of the Child Safety Division of the Children and Families Agency. He’ll be joining us online. And the question for Mr. Suzuki is to understand with his extensive experience, which includes roles at Japan’s National Police Agency, we’re wanting to know more about how this newly formed agency is really going to push forward public-private initiatives in order to tackle the specific issue of online child sexual abuse and exploitation. Do we have…

Tatsuya Suzuki:
[Speaking in Japanese; interpreted.] The first point is for children, when using mobile devices, to appropriately select and use information on the internet, so as to develop their ability to use the internet appropriately. That is the first point. [Several sentences of the interpretation are unintelligible in the transcript.] We are also working on a comprehensive plan for the proper use of the internet by children. We worked on this at the Cabinet Office for a long time, but this April the Children and Families Agency was established, so we are now taking this work forward. As I mentioned earlier, under the Internet Environment Management Act, the Children and Families Agency respects the private sector’s own voluntary initiatives, and we are working with private organisations. For children’s safe and secure use of the internet, we are working with experts in various fields, including academics, lawyers, communications companies, school officials, and PTA organisations, and holding meetings on the maintenance of the internet environment for young people. We are also discussing the revision of the basic plan. Finally, I will explain a little about measures to prevent child sexual exploitation. 
In the past, the National Police Agency and the rest of the Japanese government worked together on measures to prevent child sexual abuse. Last year we implemented the Child Sexual Abuse Prevention Plan 2022, but from this year the Children and Families Agency is in charge of these prevention measures. In promoting them, we are also working actively with the Ministry of Education, Culture, Sports, Science and Technology, the National Police Agency, and the Japan Committee for UNICEF. That's all I have to say. Thank you.

Moderator – Afrooz Kaviani Johnson:
Thank you very much. We'll have time later for questions and answers, but it's fantastic to hear how these basic standards are in the law and that you are now starting the implementation measures, taking that multi-sectoral approach with strong engagement of the private sector. On that note, I'm very pleased to shift the mic to the private sector representative from Japan, Mr. Toshiyuki Tateishi, who is representing the Japan Internet Providers Association as well as the Internet Content Safety Association. My question is whether you can let us know how private sector initiatives in Japan have adapted over the last decade to address emerging challenges relating to online child sexual abuse and exploitation.

Toshiyuki Tateishi:
Thank you. I'm very happy to be here. I'd like to explain the situation in Japan, so could you show the slides? In Japan we have the secrecy of communications, which is a constitutional guarantee, and the blocking system is very hard for ordinary people to understand, so I made some small slides. Could you put up the first one? Ordinary people imagine it like this: we go outside, we want to enter a house or some other building, and we find we cannot go in. But that is not blocking; that is just a real-world block. Next, please. A normal website access uses DNS: we ask the resolver for an address, and it replies with the correct IP address. With blocking, the DNS server instead answers with a different, wrong IP address, so we cannot reach the original place. That is different from what I first described about entering buildings. Next, please. It's like this: when I want to go to the karaoke, the gatekeeper says, okay, go ahead. Next, please. But in fact the blocking system is like this, next please: I say I want to go to house A, but I cannot go; next, please; you cannot even go out of your own house. That is the blocking system. And from another point of view, if we block content in Japan, people in many other countries can still access it; only people in Japan cannot. Next, please. This is the blocking scheme as a measure against these sites. On the left side of this slide, internet users report illegal content. The reports come in, and our association, the Internet Content Safety Association, compiles them into a list. The DNS servers automatically retrieve the list weekly and update themselves, so the illegal content is blocked. Next, please.
So if the website is located in Japan, it is deleted by the ISP. Next, please. And the police will investigate. Next, please. And arrest the criminals. But if the site is not located in Japan but overseas, next please, we check whether the site really exists. Then we validate whether it is sexual abuse material. After that, we create a list and distribute it to the ISPs, and the sites are blocked. We then check weekly whether each site still exists, and when it no longer does, we delete the URL from the list. Thank you very much. And lastly, one more point. In 2016, the UN report on freedom of expression in Japan was prepared, and I spoke with the rapporteur. He said Japan presents a kind of great model in the area of internet freedom: the very low level of government interference with digital freedoms reflects the government's commitment to freedom of expression. As the government considers legislation related to wiretapping and new approaches to cybersecurity, he hoped that this spirit of freedom, communications security, and innovation online would be kept at the forefront of regulatory efforts. So I'd like to keep this situation. Thank you.
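The scheme Mr. Tateishi describes, a DNS resolver that answers listed domains with a different address, a centrally maintained list fetched weekly, and weekly revalidation to drop dead URLs, can be sketched roughly as follows. This is an illustrative sketch only: the domain names, the sinkhole address, and the function names are invented for the example, and ICSA's actual list format and distribution mechanism are not described in the session.

```python
# Hypothetical sketch of the DNS-based blocking workflow described above.
# All entries and names are invented for illustration.

BLOCKLIST = {"abuse.example.net"}  # the weekly-distributed list (illustrative entry)
SINKHOLE_IP = "192.0.2.1"          # RFC 5737 documentation address, standing in for a block page
REAL_DNS = {                       # stand-in for real DNS resolution
    "example.com": "93.184.216.34",
    "abuse.example.net": "203.0.113.7",
}

def resolve(domain: str) -> str:
    """Answer a DNS query; listed domains get the sinkhole IP instead of the real one."""
    if domain in BLOCKLIST:
        return SINKHOLE_IP  # the user is steered to the wrong place, as in the slide analogy
    return REAL_DNS.get(domain, "NXDOMAIN")

def weekly_revalidation(blocklist, still_online) -> set:
    """Keep only URLs whose content still exists, mirroring the weekly check-and-delete step."""
    return {d for d in blocklist if still_online(d)}
```

A real deployment would serve actual DNS (for example, a resolver configured with a response policy zone built from the list) rather than a dictionary lookup, but the control flow is the same: check the list first, answer with a sinkhole address, and refresh the list on a weekly cycle.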

Moderator – Afrooz Kaviani Johnson:
Thank you so much, Mr. Tateishi. That was very helpful to have the images; I appreciate your effort in those bespoke images. And I think you raise some important points that we may get to discuss as we go on, looking at the various rights that are implicated and making sure that we advance human rights and children's rights holistically, ensuring that every child has the right to protection from sexual abuse and exploitation. So from Japan, we're now going to move to Ghana. And I'm so delighted to have Dr. Albert Antwi-Boasiako, who's the Director-General of Ghana's Cybersecurity Authority. So Director-General, as the Cybersecurity Authority really pioneers its role, because it is also relatively new in the scheme of things, we're interested to hear how Ghana is championing industry responsibility and fostering innovation to tackle this issue of online child sexual abuse and exploitation.

Albert Antwi-Boasiako:
Thank you, colleagues, speakers, and everybody here and, hopefully, online, and thank you for the invitation to contribute to the discussion. I'm a government leader, and I'm impressed by how far Australia has advanced, but as a government lead for the past close to seven years, I think there are different maturity levels, and I want to speak from the developmental context. First things first is very important: if you jump without doing the first things, you're likely to create problems. So I think the first thing is to have a commonality, and that's one of the things I heard here: whether you're starting or whether you're advanced, the baseline requirement is key. But one also needs to appreciate a bit of context, the developing country perspective. Sometimes my Western colleagues tell me, "but you have this law," and I say, well, the culture of other people is different. You have an interest in making progress, but there is the question of the technical competency or capacity of the host country. In the early part of this job, as national adviser to the government before I was appointed Director-General, I realized that there are other factors that affect enforcing certain measures. In fact, we exchanged a lot of ideas with my partners. Once you mention regulation, my Western colleagues said no regulation, especially my US folks. But over the past few years we have converged on the matter. There is now a sort of consensus that self-regulation alone cannot, what, keep our children safe. By and large, some of my colleagues have shifted a little bit. At the same time, I didn't want to go to the other extreme either, because there is that concern.
If you over-regulate, you are also going to kill innovation, especially from a private sector perspective. But Ghana came up with a strategy we call collaborative regulation. Is it regulation? Yes, because without regulation I don't think we'll be able to achieve this. But how is it unique? I realized that it's not just a government making a law and expecting the industry to, what, comply. Sometimes, and I can confirm this, the industry that we expect to follow certain best practices and implement certain measures does not itself appreciate the risks our children are facing, whether in the content they access, their conduct, or the contact they are exposed to. When you have this realization, you will be very careful in how you start your regulatory process. So, taking inspiration from Julie and the basic online safety expectations, we had to pass the law, and the law incorporated the issues of blocking, taking down content, and filtering. It was quite a difficult one because of, of course, the suspicion from civil society. Again, we had to sit together to debate, and eventually Section 87 of our Cybersecurity Act made provisions to compel industry to act in a manner that will protect children on the internet. But that is just a basic framework. My colleagues from common law countries will appreciate that alongside the primary law you need an LI, a legislative instrument, to effectively and practically operationalize the law. And Afrooz, we're grateful; we had to invite you yourself. We opened up, not just to industry but to international partners. Afrooz visited Ghana for the first time to take part in a public consultation to formulate the specific mechanisms by which industry plays its part, and I think she saw that the industry is sitting together with us; in fact, they are making suggestions. And as I sit here, I can say that Ghana's industry is active.
The most active private sector player is the Ghana Chamber of Telecommunications; arguably that is the most important industry body in this space, and they have been actively involved even in developing the LI. That is what I refer to as collaborative regulation: if we are doing this together, industry loses the moral ground to say it is not complying. Of course, that doesn't mean these are the only tools. Ghana's law incorporates sanctions, both administrative and criminal. Of course, we needed to fund cybersecurity, and in the developing context you don't just get that money, so we incorporated it: if you do not comply, you are sanctioned, and the telecommunications firms have money, so you pay, and that is used to, what, finance cybersecurity. So we have these tools available in our law. Nonetheless, at the core, what I wanted to share as a model from our perspective is this collaborative approach: you engage with industry because you need to build understanding. The concept of regulation in this age is not like the headmaster telling the student, "go and do it." We need to engage. And I think it has been successful: even at the governance level of my authority, of eleven board members, three are from industry. That approach has worked, and other international practices, such as the Guidelines for Industry on Child Online Protection by the ITU and UNICEF, have been incorporated into the LI as best practices. But currently what we are doing most is intensifying awareness creation. The LI is in process, because that is really what is going to operationalize the industry's obligations and commitment. But I don't think we will achieve much without really raising awareness among industry players that these are the risks, and that is the reason why you need to comply if you need to take down content.
This is why you need to comply with the law if you need to block certain content as far as the protection of children is concerned. So, in a nutshell, ours is a developing situation, I must admit. Ours is collaborative regulation, because I think that is the best: it's not really a government just giving instructions to industry; it doesn't work like that. If you have a case, you discuss it, you argue at the table, and that's what Ghana has been able to use to get the industry sitting at the table. Some of our international partners who visit see the discussions; they are open and transparent. There are risks, the government has to lead, and industry needs to get on board. But we do that by way of talking. Thank you.

Moderator – Afrooz Kaviani Johnson:
Thank you. Thank you, Director-General. No, that’s really fascinating, and indeed, the purpose of this whole discussion is that exchange of experiences, because there are very different approaches, different contexts, but what I really heard from you was going along that journey together with industry and looking at what was fit for purpose in your context, and really moving from just what is on paper to practice, and the best way to do that is bringing industry along with you. Now I’m really pleased to introduce Mr. Dunstan Allison-Hope, who’s the Vice President for Human Rights at BSR. Now, as I mentioned just earlier, this issue of online child sexual abuse and exploitation, it’s a human rights issue, it’s a children’s rights issue, and we do know that there are various tools in the human rights suite of tools, including human rights due diligence, including impact assessments, which are conducted by companies, and these can be key instruments in advancing responsible business conduct. So the question for you is, what does robust human rights due diligence entail, and how can it play a role in addressing this particular issue of online child sexual abuse and exploitation? Thanks. Great. Well, first of all, thank you

Dunstan Allison-Hope:
for the invitation to speak. Much appreciated. I’d love an invitation to Ghana as well, if that’s forthcoming. That was quick. So the main purpose of my comments today is to share how human rights due diligence, based on the UN guiding principles on business and human rights, can form an essential part of company efforts to respect child rights and to address online child sexual exploitation and abuse. I have really two main thoughts to share. The first thought is around the value of human rights due diligence, and the second is about some regulatory trends that are going to transform the landscape of human rights due diligence that I think it’s important to think about. So for context, the technology and human rights team at BSR has now conducted well over 100 different human rights assessments with technology companies. They come in a wide variety of different shapes and sizes. Sometimes it’s new products, sometimes it’s content policy, sometimes it’s market entry, market exit as well. They come in lots of different shapes and sizes. And in doing those assessments, I think we’ve experienced three main benefits of taking the human rights-based approach that you mentioned. So the first is the systematic review of impacts across all international human rights instruments, including all rights in the Convention on the Rights of the Child. So in a child rights context, that forces us to consider rights such as bodily security, freedom of expression, privacy, education, access to culture, and non-discrimination. It forces us to consider all of these rights holistically and to consider the relationship between them. So these rights are interdependent. Sometimes there’s tension between them. Sometimes the fulfillment of one right enables the fulfillment of other rights. So one clear benefit has been to take that holistic approach. 
The second is that a human rights-based approach requires us to give special consideration to those at greatest risk of becoming vulnerable, which clearly includes children. So this means that a robust human rights assessment would need to consider, and find ways to consider, the best interests of the child. The third is that the UN Guiding Principles provide a framework for appropriate action to address adverse human rights impacts. And one thing that we've really noted in the technology industry is that appropriate action may vary considerably according to where in the technology stack a company sits. The UN Guiding Principles have been written for all companies, in all industries, in all countries, of all sizes. They apply to everybody, but that forces us to think through how to apply them in the context of the company that you're working with. Now, until now, all this human rights due diligence has mainly been a voluntary activity by companies. It is about to become much more mandatory, with some very important implications. And this is my second point, which I'm going to share as a long list in slide form, too. I started writing this long list, and I thought actually putting it on a slide might be helpful. So there is a very long list of things that companies are now having to respond to. We have the European Union Corporate Sustainability Due Diligence Directive, which is going to require all companies doing business in Europe, so not just European companies, to undertake human rights due diligence. The Corporate Sustainability Reporting Directive will require all companies doing business in Europe to report material topics informed by the outcome of human rights due diligence. People often think of this as a reporting directive, which it is, but it has this really important phrase, "informed by the outcome of human rights due diligence," in it.
And we've mentioned already the EU Digital Services Act, which requires large online platforms and search engines to assess their impacts on fundamental rights, and it specifically calls out child rights as something to be assessed. We have the UK Online Safety Bill, which requires social media companies to assess content that may be harmful or age-inappropriate for children. We have the EU AI Act, which is still being debated as we speak, but essentially it includes the EU Charter of Fundamental Rights as the basis for understanding risk. In Japan, we have the Guidelines on Respecting Human Rights in Responsible Supply Chains. So if you put yourself in the shoes of a company, that's a lot to take on in one go. And what we've noted, and what we advise companies about a lot, is that throughout these regulations, the human rights assessment requirements are very similar to, and based on, the UN Guiding Principles on Business and Human Rights. So our position has been: if you want to prepare yourself to comply not just with the letter of these laws but with their spirit, the outcomes that they're seeking to achieve, taking an approach based on the UN Guiding Principles is going to get you to the right place with not just one of these regulations, but all of them. My point here is quite a simple one: the rights of children, including efforts to address online sexual exploitation and abuse, should be fully embedded into these broader methods of human rights due diligence. We need to make sure that assessment of risks to the rights of children is fully embedded into these broader methods. So that could mean, for example, child rights impact assessments being a modular part of much broader human rights due diligence.
It might mean making sure that children, or those with insight into the best interests of the child, are meaningfully consulted and included in the process of undertaking human rights due diligence. There's lots that we can unpack there, but my advice is to invest a lot of effort and thought into these processes. This trend towards mandatory human rights due diligence is, I think, a massive regulatory and cultural shift for companies, and one that companies would be well advised to harness for the child rights outcomes that we want to see. I am reasonably optimistic about all this, with one caveat: you'll notice the European Union and the UK feature very strongly on this list, and I do fear that so much time and attention goes towards the European Union and the United Kingdom that it takes time and attention away from places where human rights risks may be more prominent and more severe. So one flag that we're raising is to make sure that companies take global approaches while responding to these quite regional laws and regulations. I'll stop

Moderator – Afrooz Kaviani Johnson:
there. Fantastic. Thank you so much. Another very impressive effort of condensing a lot of information for us in that short time. Thank you, Dunstan. That was fantastic. A lot of food for thought, and it's really a timely discussion because of this massive shift in the global landscape, and at the same time these massive child rights and child protection challenges that we're facing. So, online participants and people in the room, we now have a few moments for questions. We have a microphone behind us here, and we also have Josie monitoring the chat. I'm not sure if there are any questions. Please, if you could come up to the mic and put your question. We can take a few and then we can open to the panel to discuss. Thank you so much for this

Audience:
great discussions and presentations. My name is Yulia, I work for UNESCO, and I would like to bring up a rather challenging topic and a somewhat provocative question. We are talking here about protection and safety, which is of course key to children's lives online, but at the same time we must consider children's right of access to information, and this becomes more pressing when we are talking about, for instance, sexuality education. It is easy to ban all content on sexuality online, but children also have the right to access correct, scientific information and content on sexuality. I wonder what your thoughts and ideas are on how we might proceed with these challenging intersections between safety and the right to access this information. Thanks a lot. I might just take a couple of questions

Moderator – Afrooz Kaviani Johnson:
so just in the interest of time and then really it will be open to the panel to answer so please

Audience:
go ahead. Thank you, my name is Jutta Kroll from the German Digital Opportunities Foundation. Before I put my question, I would refer to General Comment No. 25 on the UN Convention on the Rights of the Child. I've brought some copies for those who have not come across it, and it will probably answer some of the questions that have been put. I have a question to the first speaker; I have to apologize that I came in a bit late, but what I heard about the new law and regulation was that it also covers raising awareness among parents and children with regard to their protection. I wanted to know whether there is a relation, or a balance, between raising awareness and education on the one hand, and the obligations on service providers on the other. And my second question goes to this colleague: you have been talking about DNS blocking, but we would also need removal of the content, not only blocking, because otherwise it would still be there, and those who are looking for that content might find ways to circumvent the blocking you've been talking about. Could you explain that a little more deeply? Thank you so much.

Moderator – Afrooz Kaviani Johnson:
Okay, it looks like we don't have any other questions from the floor for now. So there are three main questions, if I can summarize for our panel, and then we can pass the mic along and you can respond to one or all. The first is the really important balancing of children's right to access information, particularly sexual and reproductive health information, and getting that balance right when we're talking about harmful content and restricting access, making sure that doesn't inadvertently restrict children's other rights. The second, I think, was for our second speaker, on the Japanese law: understanding more about the awareness-raising provisions. And the third, which was addressed to the Japanese private sector, but other jurisdictions might like to share as well, is how they're making sure that it's not only about blocking but also about taking down content, and also responding and safeguarding children. So I think there is that whole system. Any volunteers from the panel? Commissioner?

Julie Inman Grant:
I was just saying earlier that clearly the internet was not built for children, although one-third of the users of the internet are children, and their online lives are inextricably intertwined with their everyday lives. It's their schoolroom, their friendship circle, their place for learning, communicating, creating, and exploring, whether it's exploring their sexuality or affinity groups. And we need to make sure that as we try to make this safer, we are mitigating the harms while harnessing the benefits as well. We came up against this when we did a two-year consultation on age verification, which was probably one of the most difficult processes I've gone through, because there's just so much polarization. One of the things that we were so conscious of was the ability of marginalized communities, and particularly young people, to do that exploration. Age verification doesn't mean restricting their access to everything. Again, I think there are a lot of things that companies can do beyond age gating. Roblox is trialing some age verification, and Tinder just announced they're doing so in Australia, as is Instagram. So it's good to see that companies are starting to think about their responsibility to make sure that children are 13 and above, and that they're making meaningful checks. And I can say, from our experience of youth-based cyberbullying, what we saw post-COVID is that because parents were so much more permissive with technology use when we were locked down, we now have kids of eight or nine reporting cyberbullying to us, whereas prior to COVID the average age was about 14. Once you are permissive with technology use, you really cannot ratchet that back. So I would just say, I'm the Australian regulator, and yes, we have powers.
But we have a model where we talk about the three Ps of prevention, protection, and proactive and systemic change. You've got to prevent the harms from happening in the first place by having fundamental research, by understanding how harms manifest against different vulnerable communities in different ways, and then co-designing solutions with those communities, doing this with communities rather than to communities. I think I heard Albert say, and we struggle with this too, that one of the biggest challenges is raising awareness and encouraging young people to engage in help-seeking behaviors. And I'd say parents are the hardest cohort to reach. So all of these things are interrelated. If they weren't hard, then we would have nailed this

Dunstan Allison-Hope:
already, but they are. I just have a comment on, I think, the first question, maybe the second one. It's a great question because it enables me to say that a human rights-based approach is designed to achieve precisely that. So a couple of things to say. First of all, when we do human rights assessments, it is quite typical for child sexual exploitation and abuse to rise to the surface as one of the most severe human rights risks that companies need to address. That risk tends to come up as one of the top-priority risks, based on the criteria that the UNGPs set out. However, we do take this holistic approach, so we consider the relationship between different human rights. When does the fulfillment of one human right enable the fulfillment of other rights? When does a violation of one right, like the violation of the right to privacy, present risks to other human rights, like the ability to access information or express yourself freely? When there are tensions between rights, how do you address those tensions? How do you apply human rights principles like legitimacy, necessity, proportionality, and non-discrimination to decisions about when and how to restrict access to content? And just one idea to throw out into the room that came to me when the question was asked. One of the interesting developments in the business and human rights field and the tech industry over recent years has been the Meta Oversight Board, which publishes decisions on particular cases that come before it and makes recommendations for what Meta should do to address whatever failings it has identified. I read a lot of those cases; they're very long, and each includes a segment that undertakes a human rights analysis of that particular case. And the Oversight Board has the time and space to do that, because they're not making rapid decisions like Meta does. They have weeks and months to do this analysis.
And I find it a really helpful source of insight. I’m not sure that there have been many child rights-related cases before the Oversight Board, but some place where we can do that type of thinking to unpack tensions between rights, the relationship between rights in a child rights context, I think would be really useful, because we come across this all the time when we do human rights assessments. Dilemmas, uncertainties, we’re not sure what recommendations we should be making sometimes. And I’d love space for that thinking to take place.

Julie Inman Grant:
Can I make a comment? I'm glad you're reading the cases of the Meta Oversight Board. It raises an interesting issue, because there's a lot of discussion now about multi-stakeholder regulation of the platforms. I believe that in its last transparency report, the Meta Oversight Board received about 1.3 million requests to review content moderation decisions. And because these are such long, drawn-out decisions, they were able to cover 12 in 12 months. Now, we're a very small agency, but we've dealt with tens of thousands of investigations; you're just able to be a lot more nimble. So I think there's a really important role for that kind of body in interrogating some of those more difficult and contextual issues. It's always the gray area that's going to be challenging. But I'm also not sure how many of the Oversight Board's recommendations Meta has actually accepted; you might have a better sense.

Moderator – Afrooz Kaviani Johnson:
I’m just wondering if if you would like to respond to the question around the blocking and take down

Toshiyuki Tateishi:
So, first of all, we try to take down the content. We try this many times, and sometimes, before we block, we make requests even to foreign countries; sometimes the police or other governmental offices make the request. If, after we ask, there is no reply, then the last-resort measure is to block the sexual abuse content.

Moderator – Afrooz Kaviani Johnson:
Thank you. And I'm not sure if Mr. Suzuki is still online, because there was a question about explaining more about the provision in the law on raising awareness, and whether that is broad, covering all of children's rights in relation to the digital environment. Would Suzuki-san like to answer that question? Or we can

Tatsuya Suzuki:
[Interpreted from Japanese] In Japan, when children suffer sexual abuse via the internet, one pattern is that they are deceived into sending nude photos of themselves; this probably happens in other countries as well. Another pattern, which may be somewhat particular to Japan, is that children meet someone they got to know online in the real world and suffer sexual abuse there. Teaching children about these uses of the internet and their dangers is the work not only of the Children and Families Agency but also of various other bodies, including the National Police Agency, the Ministry of Education, Culture, Sports, Science and Technology, and the Ministry of Internal Affairs and Communications. The other element is parents and guardians. What we ask of parents is what is called parental control: every child is at a different developmental stage, so parents and children should talk things over together and decide what to do. As one teacher often says, children follow rules well when they have decided them through their own discussion. That is what we are asking for.

Moderator – Afrooz Kaviani Johnson:
Thank you so much, Mr. Suzuki, and thank you again to all our online and in-person participants. We have come to the end of our time together, though I think this is a topic that deserves a lot more time because, as was just mentioned, there is a lot of complexity to this. There are very challenging dilemmas that regulators are dealing with, that companies are dealing with, that civil society and people working on these issues are dealing with. So it is something that I hope we can keep up the exchange on. I hope that everyone found this a fruitful exercise, at least a start of the discussion. We’re meant to capture key takeaways and key actions from each of the sessions. I don’t know that they’re fully formulated yet, but certainly I think I’ve taken away that there is this need to continue the learning and the exchange, that there is this need to ensure that these solutions are consultative, that everyone is involved in the journey, particularly companies when we’re talking about regulation, co-regulation or collaborative regulation as Ghana is doing. Obviously tech companies are vital stakeholders in this effort to protect children from online abuse, but we also see this massive global landscape shifting, I think I really took that away from your points, Dunstan, and just this opportunity to fully embed protection of children from online sexual abuse and exploitation into these broader mechanisms that are becoming increasingly mandatory. So thank you to our esteemed panellists; the Commissioner had to dash away to catch her Shinkansen to Tokyo, so she sends her apologies, but a huge thank you to our panellists, a huge thank you to our interpreters and everyone supporting today. Thank you.

Albert Antwi-Boasiako — speech speed: 179 words per minute; speech length: 1310 words; speech time: 438 secs

Audience — speech speed: 164 words per minute; speech length: 405 words; speech time: 148 secs

Dunstan Allison-Hope — speech speed: 176 words per minute; speech length: 1708 words; speech time: 583 secs

Julie Inman Grant — speech speed: 165 words per minute; speech length: 1803 words; speech time: 656 secs

Moderator – Afrooz Kaviani Johnson — speech speed: 150 words per minute; speech length: 2003 words; speech time: 800 secs

Tatsuya Suzuki — speech speed: 72 words per minute; speech length: 531 words; speech time: 445 secs

Toshiyuki Tateishi — speech speed: 124 words per minute; speech length: 761 words; speech time: 368 secs