Dynamic Coalition on Children’s Rights in the Digital Environment | IGF 2023
Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.
Full session report
Sophie
The importance of children’s rights in the digital world is underscored by the United Nations. These rights encompass provision, protection, and participation, which are essential for children’s empowerment and safety in online spaces. General Comment 25 by the UN Committee on the Rights of the Child specifically emphasises the significance of children’s digital rights. It is crucial to ensure that children have access to digital resources, that they are protected from harm and exploitation, and that they have the opportunity to actively engage and participate in the digital world.
Young children often seek support from their parents and teachers when faced with online risks. They rely on them as safety contact persons for any issues they encounter on the internet. As they grow older, children develop their own coping strategies by employing technical measures to mitigate online risks. This highlights the importance of parental and teacher support in assisting children in navigating the digital landscape and promoting their online safety.
Furthermore, the design of online spaces needs to be tailored to cater to the diverse needs of different age groups. Children, as active users, should have digital platforms that are user-friendly and age-appropriate. Children are critical of long processing times for reports on platforms, advocating for more efficient and responsive mechanisms. It is important to consider children’s perspectives and ensure that their voices are heard when designing and developing online spaces.
Human resources play a significant role in fostering safe interactions online. Children are more likely to use reporting tools that establish a human connection, thereby enhancing their sense of safety and anonymity. The THORN study conducted in the United States supports this viewpoint and suggests that human involvement positively affects children’s willingness to report online incidents.
The introduction of the Digital Services Act in the European Union is seen as a critical tool for protecting children’s data. This legislation, set to come into force next year, aims to enhance data protection measures for individuals, including children, in the digital sphere. The act addresses issues related to privacy, security, and the responsible use of digital services in order to safeguard children’s personal information.
Children’s rights by design and their active participation in decision-making processes regarding the digital environment should be prioritised. The United Nations’ General Comment 25 highlights the importance of young people’s participation in decisions about the digital space. The German Children’s Fund has also conducted research that emphasises the need for quality criteria for children’s participation in digital regulations. By involving children in decision-making, their perspectives and experiences can inform policies and ensure that their rights are respected and protected.
Creating safe socio-digital spaces for children and adolescents is of paramount importance. These spaces should not be primarily influenced by product guidelines or market-driven interests but rather should prioritise the well-being and safety of children and young people. Civil society and educational organisations are seen as key stakeholders in shaping and creating these safe social spaces for children to engage in the digital world.
In conclusion, a holistic approach is necessary to advocate for children’s rights in the digital world. This entails promoting children’s digital rights, providing support and guidance from parents and teachers, adapting the design of online spaces to meet the needs of different age groups, harnessing the potential of human resources for safe interactions, and enacting legislation such as the Digital Services Act for protecting children’s data. Children and young people should be actively involved in their rights advocacy and be included in decision-making processes in the digital environment. The involvement of all stakeholders, including governments, organisations, and communities, is essential in advancing and safeguarding children’s rights in the digital world.
Steve Del Bianco
In the United States, the states of Arkansas and California faced legal action for implementing a controversial rule that required verifiable consent from a parent or guardian for individuals under the age of 18 to use social media sites. Steve Del Bianco, representing an organization that sued the states, deemed this measure aggressive.
The sentiment expressed towards this rule was negative, as it was seen as a potential infringement upon the rights of children and young individuals. The argument presented was that broad child protection laws have the potential to restrict a child’s access to information and their ability to freely express themselves. Judges who presided over the case acknowledged the importance of striking a balance between child rights and the need for protection from harm.
Steve Del Bianco, in the course of the proceedings, emphasized the significance of considering the best interest of the child. He argued that the state’s laws should undergo a test that balances the rights of the child with their protection from potential harm. According to Del Bianco, these laws should not excessively limit a child’s access to information or their ability to express their beliefs.
Moreover, it became evident that lawmakers lacked an understanding of the broader implications of their laws. This led to legal challenges and raised concerns about the effectiveness of these policies. Del Bianco’s organization obtained an injunction that effectively blocked the states from enforcing these laws. It was suggested that lawmakers should be educated and gain a better understanding of the potential consequences of their legislative decisions to avoid such legal challenges.
To summarize, the implementation of a rule requiring verifiable consent for underage individuals to use social media sites in certain US states sparked controversy and legal disputes. The negative sentiment towards this rule arose from concerns about potential limitations on the rights of children to access information and express themselves freely. The need to strike a balance between child rights and protection from harm was highlighted. Additionally, the lack of understanding by lawmakers about the broader implications of their laws was emphasized, underscoring the importance of better education and consideration in the legislative process.
B. Adharsan Baksha
AI adoption among children can pose significant risks, particularly in terms of data privacy. The presence of chatbots such as Synapse and MyAI has raised concerns as these tools have the capability to rapidly extract and process vast amounts of personal information. This raises the potential for exposing children to various cyber threats, targeted advertising, and inappropriate content.
The ability of chatbots to collect personal data is alarming as it puts children at risk of having their sensitive information compromised. Cyber threats, such as hacking or identity theft, can have devastating consequences for individuals, and children are especially vulnerable in this regard. Moreover, the information gathered by chatbots can be used by marketers to target children with ads, leading to potential exploitation and manipulation in the digital realm.
Inappropriate content is another concerning aspect of AI adoption among children. Without proper safeguards, chatbots may inadvertently expose children to age-inappropriate material, which can have a negative impact on their emotional and psychological well-being. Children need a secure and regulated online environment that protects them from exposure to harmful content.
It is crucial to recognise the need to ensure a secure cyberspace for children. This includes focusing on the development and implementation of effective measures related to artificial intelligence, children, and cybersecurity. Governments, organisations, and parents must work together to mitigate the risks associated with AI adoption among children.
In conclusion, AI adoption among children brings forth various risks, with data privacy issues at the forefront. Chatbots that possess the ability to collect personal data may expose children to cyber threats, targeted advertising, and inappropriate content. To safeguard children’s well-being and protect their privacy, it is essential to establish a secure online environment that addresses the potential risks posed by AI technology. The responsibility lies with all stakeholders involved in ensuring a safe and regulated cyberspace for children.
Katz
Child rights are considered fundamental and should be promoted. Katz’s child-focused agency actively advocates for the promotion of child rights. However, conflicts between child rights and freedom of expression can arise. Survey results revealed such conflicts, underscoring the need for balance between these two important aspects.
Misunderstandings or misinterpretations of child rights are common and must be addressed. Some people mistakenly believe that virtual child sexual abuse or exploitation material (CSAM/CSEM) can prevent real crime, indicating a lack of understanding or misinterpretation of child rights. Efforts should be made to educate and provide correct information regarding child rights to combat these misunderstandings.
Regulating AI in the context of child protection is a topic under discussion. Many respondents believe that AI should be regulated to ensure child protection, particularly in relation to CSAM/CSEM. However, opinions on this matter are mixed, highlighting the need for further dialogue and research to determine the most appropriate approach.
Public awareness of the risks and opportunities of AI needs to be raised. Approximately 20% of respondents admitted to having limited knowledge about AI matters and associated risks. This signifies the need for increased education and awareness programs to ensure the public understands the potential benefits and dangers of AI technology.
Japan currently lacks regulations and policies concerning AI-generated imagery. Katz’s observation reveals a gap in the legal framework, emphasizing the necessity of establishing guidelines and regulations to effectively address this issue.
There is also a need for greater awareness and information dissemination about AI developments. Katz suggests that the media should take more responsibility in informing the public about advancements and implications of AI. Currently, people in Japan are not adequately informed about ongoing AI developments, highlighting the need for improved communication and awareness campaigns.
Katz recommends that the public should gather information from social networking services (SNS) about AI developments. This highlights the importance of utilizing various platforms to stay updated and informed about the latest developments in the field of AI.
A rights-based approach is crucial in designing regulation policies. It is essential to ensure that the rights of children and humans are protected in the digital world. Advocating for the enhancement of child and human rights in the digital sphere is a vital aspect of creating an inclusive and safe environment.
In conclusion, promoting child rights is essential, although conflicts with freedom of expression may arise. Addressing misunderstandings and misinterpretations of child rights is crucial. The regulation of AI in the context of child protection requires further examination and consideration. Public awareness about the risks and opportunities of AI needs to be improved. Japan lacks regulations for AI-generated imagery, and greater awareness about AI developments is necessary. Gathering information from SNS can help individuals stay informed about AI happenings. A rights-based approach is needed when designing regulation policies, and enhancing child and human rights in the digital world is vital.
Amy Crocker
During the event, the speakers highlighted the significant importance of children’s digital rights in creating a safe and secure online environment. They stressed that children’s rights should be protected online, just as they are in the offline world. General Comment Number 25 to the UN Convention on the Rights of the Child was mentioned as a recognition of the importance of children’s digital rights, with state parties being obligated to protect children from all forms of online exploitation and abuse.
In terms of internet governance, the speakers advocated for a proactive and preventive approach, rather than a reactive one. They argued that governments often find themselves playing catch-up with digital issues, reacting to problems after they have already occurred. A shift towards a preventive model of online safety was deemed necessary, which involves designing for safety before potential issues arise.
Effective implementation was seen as the key to turning digital policies into practice. The speakers emphasized the need to understand how to implement policies in specific local contexts to realize the full benefits. They argued that implementation is crucial in ensuring that children’s rights are protected and upheld online.
The need for public understanding of technology and its risks and opportunities was also highlighted. It was mentioned that improving public understanding is necessary for individuals to make informed decisions about their online activities. Empowering parents to understand technology and facilitate their children’s rights was seen as an important aspect of ensuring a safe online environment for children.
Trust was identified as a crucial element in the digital age, particularly with the growing reliance on technology. The speakers discussed the importance of trust against the backdrop of emerging risks related to data breaches, data privacy problems, and unethical practices. Building and maintaining trust were seen as essential for a secure online environment.
Safeguarding the younger generations online was viewed as a collective responsibility. The speakers stressed that parents and guardians cannot solely shoulder this responsibility and must have a certain level of knowledge of online safety. The importance of all stakeholders, including businesses, industries, and governments, working together to protect children’s rights online was emphasized.
Regulation was seen as an important tool for keeping children safe online. However, it was noted that regulation alone is not a solution for the challenges posed by emerging technologies. The speakers argued that both regulation and prevention through education and awareness are crucial in effectively addressing these challenges.
Differentiated regulation based on context was advocated for. The speakers highlighted that different online services offer different opportunities for children to learn and be creative. They also emphasized that children’s evolving capacities are influenced by various factors, such as their geographical and household contexts. Understanding the link between online and offline contexts was seen as essential in developing effective regulation.
Transparency, a culture of child rights, and collaborative efforts were identified as crucial for the protection of children’s rights online. All stakeholders, including businesses, industries, and governments, were urged to work together and have a shared understanding of child rights. The need for transparency in their commitment to protecting child rights was emphasized.
The challenges faced by developing countries in terms of technology and capacity building were acknowledged. The speakers discussed the specific challenges faced by countries like Bangladesh and Afghanistan in terms of accessing technology and building the necessary capacity. Opportunities for codes of conduct that can be adapted to different contexts were also explored.
Consulting children and young people was highlighted as an important approach to addressing online safety issues. The speakers emphasized the need to understand how children and young people feel about these issues and to learn from approaches to regulation that have been successful.
Amy Crocker, one of the speakers, encouraged people interested in children’s rights issues to join the Dynamic Coalition and continue similar conversations. Flyers and a QR code were mentioned as ways to sign up for the mailing list. The importance of creating more space within the IGF for discussing children’s rights issues was also emphasized.
In conclusion, the event highlighted the significant importance of protecting children’s digital rights and creating a safe and secure online environment for them. It emphasized the need for proactive and preventive internet governance, effective implementation of digital policies, public understanding of technology, empowering parents, trust, collective responsibility, regulation along with education and awareness, differentiated regulation based on context, transparency, and collaborative efforts. The challenges faced by developing countries were acknowledged, and the involvement of children and young people was seen as essential in addressing online safety issues.
Ahmad Karim
In a discussion concerning the design of advancing technology, Ahmad Karim, representing the UN Women Regional Office for Asia and the Pacific, stressed the importance of carefully considering the needs of girls, young women, and marginalized and fragile groups. It was noted that, in such discussions, there is often a tendency to overlook gender-related issues, which indicates a gender-blind approach.
Another argument put forth during the discussion underscored the significance of making the design of the metaverse and technologies more considerate towards marginalized and fragile groups, especially girls and women. The rapid advancements in technology were acknowledged as having disproportionate effects on females and marginalized sectors of society. It was highlighted that national laws frequently do not adequately account for the specific needs and challenges faced by these groups.
The supporting evidence provided includes the fact that girls, young adults, and women are often underrepresented and encounter barriers in accessing and benefiting from technological advancements. Additionally, marginalized and fragile groups, such as those from low-income backgrounds or with disabilities, are particularly vulnerable to exclusion and discrimination in the design and implementation of technology.
The conclusion drawn from the discussion is that there is an urgent need for greater attention and inclusivity in the design of advancing technology. Consideration must be given to the unique needs and challenges faced by girls, young women, and marginalized and fragile groups. It is imperative that national laws and policies reflect these considerations and ensure that these groups are not left behind in technological progress.
This discussion highlights the significance of addressing gender inequality and reducing inequalities in the design and implementation of technology. It sheds light on the potential pitfalls and repercussions of disregarding the needs of marginalized and fragile groups, and calls for a more inclusive and equitable approach to technological advancements.
Tasneet Choudhury
During the discussion, the speakers highlighted the importance of ensuring the protection and promotion of child rights within AI strategies, policies, and ethical guidelines. They particularly emphasized the significance of these efforts in developing countries, such as Bangladesh. Both speakers stressed the need to include provisions that safeguard child rights in AI policies, especially in nations that are still in the process of development.
The speakers also connected their arguments to the Sustainable Development Goals (SDGs), specifically SDG 4: Quality Education and SDG 16: Peace, Justice, and Strong Institutions. They proposed that by embedding measures to protect child rights in AI strategies and policies, countries can contribute to the achievement of these SDGs. This link between AI development and the attainment of global goals highlights AI’s potential role in promoting inclusive and sustainable development.
Although no specific supporting facts were mentioned during the discussion, the speakers expressed a neutral sentiment towards the topic. This indicates their desire for a balanced and equitable approach to integrating child rights into AI strategies and policies. By addressing this issue neutrally, the speakers emphasized the need for a comprehensive and ethical framework that protects the rights and well-being of children in the context of AI development.
One notable observation from the analysis is the focus on child rights in the discussion of AI policies. This underscores the growing recognition of the potential risks and ethical implications that AI may pose for children, particularly in countries with limited resources and regulations. The emphasis on child rights serves as a reminder that as AI continues to advance, it is crucial to ensure that these technologies are developed with the best interests of children in mind.
In conclusion, the discussion underscored the importance of protecting and upholding child rights within AI strategies, policies, and ethical guidelines. The speakers highlighted the specific significance of this endeavor in developing countries like Bangladesh. The incorporation of child rights in AI policies aligns with the Sustainable Development Goals of Quality Education and Peace, Justice, and Strong Institutions. The neutral sentiment expressed by both speakers indicates the need for a balanced approach to addressing this issue. Overall, the discussion shed light on the need for a comprehensive and ethical framework that safeguards the rights of children amidst the development of AI technologies.
Jenna
Children today are immersed in the online world from a very young age, practically being born with access to the internet and technology. This exposure to the digital age has led to an increased need for trust in this new environment. Trust is seen as a cornerstone of the digital age, particularly as we rely on technology for almost every aspect of our lives. Without trust, our reliance on technology becomes more precarious.
Creating a reliable and ethical digital environment for younger generations requires imparting fundamental digital knowledge and nurturing trust. Building trust and instilling digital literacy are essential steps in safeguarding children online. Parents play a crucial role in this process, but it is also a shared responsibility that extends to all stakeholders. Informed parents are key as they are often the first line of defense for children facing challenges online. However, they cannot do it alone, and it is important for all stakeholders to be aware of their responsibility in protecting younger generations.
The challenges faced by teenagers today in the online world are more multifaceted and harmful than ever before. Cyberbullying has evolved from early internet flaming and harassment via email to more advanced forms like cyberstalking and doxing. The rise of generative AI has made creating hate speech and image-based abuse relatively easy, contributing to growing concern about online safety. It is important to address these issues effectively and efficiently to ensure the well-being of young people online.
The approach to online safety varies across different jurisdictions, with each adopting their own strategies and measures. For example, Australia has an industry code in place, while Singapore employs a government-driven approach. This diversity highlights the need for clear definitions and standards regarding online safety threats. A cohesive understanding of these threats is imperative to effectively combat them and ensure consistency across different regions.
Capacity building is essential for addressing the challenges of the digital age. Empowering young people and ensuring their voices are heard can lead to a better understanding of their needs and concerns. Additionally, understanding the technical aspects of internet governance is vital in developing effective solutions to address issues of online safety and security.
Inclusion and diversity are crucial in creating a safe online space. It is important to include the voices of different stakeholders and ensure that everyone has a seat at the table. Language can be a barrier, causing loss in translation, so efforts must be made to overcome this and make conversations more inclusive.
The perspective and insights of young people are valued in discussions on gender and technology. Gaining fresh and unique insights from the younger generation can contribute to the development of more inclusive and gender-responsive approaches. Jenna, a participant in the discussion, highlighted the need to engage young people in discussions related to explicit content and self-expression, as well as providing safe spaces for their voices to be heard.
Modernizing existing legal frameworks is seen as a more effective approach to addressing the impacts of AI and other technological advancements. Rather than a single legislative solution, updating legislation such as the Broadcasting Act, Consumer Protection Act, and Competition Act is seen as crucial in integrating present issues and adapting to the digital age.
Collaboration among stakeholders is essential for success. Capacity building requires research support, and the cooperation of multiple stakeholders is crucial in terms of legislation and regulations. By working together and leveraging each other’s strengths, stakeholders can more effectively address the challenges faced in the digital world.
Lastly, inclusive involvement of the technical community in the policy-making process is advocated. The technical community possesses valuable knowledge and insights that can contribute to the development of effective policies. However, it is acknowledged that their involvement may not always be the best fit for all policy-making decisions. Striking a balance between technical expertise and broader considerations is key to ensuring policies are robust and comprehensive.
In conclusion, children today are growing up in a digital age where they are exposed to the internet and technology from a young age. Building a reliable and ethical digital environment requires imparting digital knowledge and nurturing trust. Safeguarding younger generations online is a shared responsibility, requiring the involvement of all stakeholders. The challenges faced by teenagers today, such as cyberbullying and hate speech, are advanced and harmful. Different jurisdictions have varying approaches to online safety, emphasizing the need for clear definitions and standards. Capacity building and the inclusion of diverse voices are crucial in creating a safe online space. The perspective and insights of young people are valuable in discussions on gender and technology. Modernizing existing legal frameworks is advocated, and engaging young people in discussions on explicit content and self-expression is important. Collaboration among stakeholders and the inclusion of the technical community in policy-making processes are considered essential for success in addressing the impacts of the digital age.
Larry Magid
In the analysis, the speakers engage in a discussion regarding the delicate balance between protecting children and upholding their rights. Larry argues that protection and children’s rights are sometimes in conflict. He cites examples of proposed US laws that could suppress children’s rights in the guise of protection. Larry also highlights the UN Convention, which guarantees children’s rights to freedom of expression, participation, and more.
On the other side of the debate, another speaker opposes legislation that infringes upon children’s rights. They point out instances where such legislation may limit children’s rights, such as requiring parental permission for individuals under 18 to access the internet. Their sentiment towards these laws is negative.
Lastly, a speaker emphasises the need for a balanced approach to regulation, one that can protect and ensure children’s rights while acknowledging the inherent risks involved in being active in the world. They argue for a fair equilibrium between rights and protection. Their sentiment remains neutral.
Throughout the analysis, the speakers recognize the challenge in finding the proper balance between protecting children and preserving their rights. The discussion highlights the complexities and potential conflicts that arise in this area, and stresses the importance of striking a balance that safeguards children’s well-being while still allowing them to exercise their rights and freedoms.
Katarzyna Staciewa
In a recent discussion focusing on the relationship between the metaverse and various sectors such as criminology and child safety, Katarzyna Staciewa, a representative from the National Research Institute in Poland, shared her insights and emphasized the need for further discussions and research in criminology and other problematic sectors. Staciewa drew upon her experiences in law enforcement and criminology to support her argument.
Staciewa discussed her research on the metaverse, highlighting its significance in guiding the development of developing countries. The metaverse, an immersive virtual reality space, has the potential to shape the future of these countries by offering new opportunities and addressing socio-economic challenges. Staciewa’s positive sentiment towards the metaverse underscored its potential as a tool for fostering quality education and promoting peace, justice, and strong institutions, as outlined in the relevant Sustainable Development Goals (SDGs).
However, concerns were raised during the discussion regarding the potential misuse of the metaverse and AI technology, particularly in relation to child safety. Staciewa analyzed the darknet and shed light on groups with a potential sexual interest in children, revealing alarming trends. The risks associated with the metaverse lie in the possibility of AI-generated child sexual abuse material (CSAM) and the potential for existing CSAM to be transformed into virtual reality or metaverse frames. The negative sentiment expressed by Staciewa and others reflected the urgency of addressing these risks and preventing harm to vulnerable children.
The speakers placed strong emphasis on the importance of research in taking appropriate actions to ensure child safety. Staciewa’s research findings highlighted the constant revictimization faced by child victims, further underscoring the need for comprehensive measures to protect them. By conducting further research in the field of child safety and child rights, stakeholders can gain a deeper understanding of the challenges posed by the metaverse and AI technology and develop effective strategies to mitigate these risks.
In conclusion, the discussion on the metaverse and its impact on various sectors, including criminology and child safety, highlighted the need for more research and discussions to harness the potential of the metaverse while safeguarding vulnerable populations. While acknowledging the metaverse’s ability to guide the development of developing countries and the positive impact it can have on education and institutions, concerns were expressed about the possibility of misuse, particularly with regards to child safety. The importance of research in understanding and addressing these risks was strongly emphasized, particularly in the context of the continuous victimization of child victims.
Patrick
During the discussion on child safety and online policies, the speakers emphasised the importance of taking a balanced approach. While regulation was acknowledged as a crucial tool in ensuring child safety, the speakers also highlighted the significance of prevention, education, and awareness.
It was noted that regulation often receives more attention due to its visibility as a commitment to child safety. However, the lack of proportional investment in prevention aspects, such as awareness-raising and education, was seen as a gap.
Addressing the specific needs of children in relation to their evolving capacities and contexts was deemed crucial. A differentiated approach to regulation was recommended, taking into consideration the diverse services and opportunities available for children to learn digital skills. The household environment, geographical context, and access to non-digital services were identified as factors that influence children’s evolving capacities.
A unified understanding and commitment to child rights were highlighted as prerequisites for effective regulation. The speakers pointed out that there is often a significant variation in how child rights are interpreted or emphasised in different regional, cultural, or religious contexts. It was stressed that a transparent commitment and culture of child rights are necessary from industries, businesses, and governments for any successful regulation to be established.
The tendency of developing countries to adopt policies and legislation from key countries without critically analysing the unique challenges they face was criticised. The speakers observed this trend in policy-making from Southern Africa to North Africa and the Asia Pacific region. The need for developing countries to contextualise policies and legislation according to their own specific circumstances was emphasised.
An issue of concern raised during the discussion was the reluctance of countries to update their legislation dealing with sexual violence. The process for legislation update was noted to be lengthy, often taking up to five to ten years. This delay was seen as a significant barrier to effectively addressing the issue and protecting children from sexual violence.
The role of industries and companies in ensuring child safety was also highlighted. It was advocated that industries should act as frontrunners in adopting definitions and staying updated on technologically enhanced crimes, such as AI-generated child sexual abuse material (CSAM). The speakers argued that industries should not wait for national policies to change but should instead take initiative in adhering to certain definitions and guidelines.
The importance of engaging with children and listening to their experiences and voices in different contexts was emphasised. The speakers stressed that children should have a critical say in the internet space, and adults should be open to challenging their own thinking and assumptions. Meaningful engagement with children was seen as essential to understanding their needs and desires in using the internet safely.
In addition, the speakers highlighted the need for cross-sector participation in discussing internet safety. They recommended involving experts from various fields, such as criminologists, educators, social workers, public health specialists, violence prevention experts, and child rights legal experts. A holistic and interdisciplinary approach was deemed necessary to address the complex issue of internet safety effectively.
Overall, the discussion on child safety and online policies emphasised the need for a balanced approach, taking into account regulation, prevention, education, and awareness. The importance of considering the evolving capacities and contexts of children, a unified understanding and commitment to child rights, and the role of industries and companies in taking initiative were also highlighted. Additionally, the speakers stressed the significance of engaging with children and adopting a cross-sector approach to ensure internet safety.
Andrew Campling
The discussions revolve around the significant impact that algorithms have on child safety in the digital realm. One particularly tragic incident occurred in the UK, where a child took their own life after being exposed to suicide-related content recommended by an algorithm. This heartbreaking event highlights the dangerous potential of algorithms to make malicious content more accessible, leading to harmful consequences for children.
One key argument suggests that restrictions should be placed on surveillance capitalism as it applies to children. The aim is to prevent the exposure of children to malicious content by prohibiting the gathering of data from known child users on platforms. These restrictions aim to protect children from potential harms caused by algorithmic recommendations of harmful content.
Another concerning issue raised during these discussions is the use of AI models to generate Child Sexual Abuse Material (CSAM). It is alarming that in some countries, this AI-generated CSAM is not yet considered illegal. The argument is that both the AI models used in generating CSAM and the circulation of prompts to create such content should be made illegal. There is a clear need for legal measures to address this concerning loophole and protect children from the creation and circulation of CSAM.
Furthermore, it is argued that platforms have a responsibility towards their users, particularly in light of the rapid pace of technological change. It is suggested that platforms should impose a duty of care on themselves to ensure the safety and well-being of their users. This duty of care would help manage the risks associated with algorithmic recommendations and the potential harms they could cause to vulnerable individuals, especially children. Importantly, the argument highlights the difficulty regulators face in keeping up with the ever-evolving technology, making it crucial for platforms to step up and take responsibility.
In conclusion, the discussions surrounding the impact of algorithms on child safety in the digital realm reveal significant concerns and arguments. The tragic incident of a child’s suicide underscores the urgency of addressing the issue. Suggestions include imposing restrictions on surveillance capitalism as it applies to children, making AI-generated CSAM illegal, and holding platforms accountable for their users’ safety. These measures aim to protect children and ensure a safer digital environment for their well-being.
Amyana
The analysis addresses several concerns regarding child protection and the legal framework surrounding it. Firstly, there is concern about the unequal application of international standards for child protection, particularly between children from the Global South and the Global North. This suggests that children in developing countries may not receive the same level of protection as those in more developed regions. Factors such as resource distribution, economic disparities, and varying levels of political commitment contribute to this discrepancy in child protection standards.
Another notable concern highlighted in the analysis is the inadequacy of current legislation in dealing with images of child abuse created by artificial intelligence (AI). As technology advances, AI is increasingly being used to generate explicit and harmful content involving children. However, existing laws appear ineffective in addressing the complexities associated with such content, raising questions about the efficacy of the legal framework in the face of rapidly evolving technology.
On a positive note, there is support for taking proactive measures and demanding better protection measures from online platforms. Efforts are being made to provide guidelines and recommendations to agencies working with children and adolescents, aimed at enhancing child protection in the digital space and promoting the well-being of young individuals online. This demonstrates an awareness of the need to keep pace with technological advancements and adapt legal frameworks accordingly.
Overall, the analysis underscores the importance of addressing the unequal application of international standards for child protection and the challenges posed by AI-generated images of child abuse. It emphasizes the need for updated legislation that aligns with emerging technologies, while also advocating for proactive measures to enhance protection on online platforms. These insights provide valuable considerations for policymakers, child protection agencies, and stakeholders working towards establishing robust and inclusive frameworks for child protection globally.
Jim
The discussion emphasized the importance of regulating and supporting internet technology in developing countries, as evidenced by the interest and concern of participants from Bangladesh and from Kabul University in Afghanistan. This real-world engagement highlights the relevance and urgency of the issue in developing regions.
Jim, during the discussion, summarised and acknowledged the questions raised by participants from developing nations, demonstrating his support for addressing the challenges and needs specific to these countries. He stressed the need to consider these perspectives when dealing with the issues surrounding internet technology in developing countries. This recognition of diverse needs and experiences reflects a commitment to inclusivity and ensuring that solutions are tailored to the circumstances of each country.
The overall sentiment observed in the discussion was neutral to positive. This indicates a recognition of the importance of regulating and supporting internet technology in developing countries, and a willingness to address the challenges and concerns associated with it. The positive sentiment suggests support for efforts to enhance access to, and the effectiveness of, internet technology in these regions, contributing to the United Nations Sustainable Development Goals of Industry, Innovation and Infrastructure (SDG 9) and Reduced Inequalities (SDG 10).
In conclusion, the discussion highlights the crucial role of regulation and support for internet technology in developing countries. The participation and engagement of individuals from these regions further validate the significance and necessity of addressing their specific needs and challenges. By considering the perspectives of those in developing nations and taking appropriate actions to bridge the digital divide, we can work towards achieving a more inclusive and equitable global digital landscape.
Liz
In a recent discussion on online safety, Microsoft emphasised its responsibility in protecting users, particularly children, from harmful content. They acknowledged that tailored safety measures, based on the type of service, are necessary for an effective approach. However, they also highlighted the importance of striking a balance between safety and considerations for privacy and freedom of expression.
One speaker raised an interesting point about the potential risks of a “one size fits all” approach to addressing online safety. They argued that different services, such as gaming or professional social networks, require context-specific interventions. Implementing broad-scoped regulation could inadvertently capture services that have unique safety requirements.
Both legislation and voluntary actions were deemed necessary to address children’s online safety. Microsoft highlighted their focus on building safety and privacy by design. By incorporating safety measures from the very beginning during product development, they aim to create a safer online environment for users.
However, concerns were also raised about the current state of legislation related to online safety and privacy. It was noted that legislative efforts often lack a holistic approach and can sometimes contradict each other. Some safety and privacy legislation contains concepts that may not optimise online safety measures.
Microsoft also recognised the risks posed by AI-generated child sexual abuse material (CSAM) and emphasised the need for responsible AI practices. They are actively considering these risks in their approach to ensure the responsible use of AI technologies.
The discussion strongly advocated for the importance of regulation in addressing online harms. Microsoft believes that effective regulation and a whole society approach are crucial in tackling the various challenges posed by online safety. They emphasised the need for ongoing collaboration with experts and stakeholders to continuously improve online child safety measures and access controls.
Another key aspect discussed was the need for a better understanding of the gendered impacts of technology. It was highlighted that current research lacks a comprehensive understanding of youth experiences, particularly for females and different cultures. Additional research, empowerment, and capacity building were suggested as ways to better understand the gendered implications of technology.
In conclusion, the discussion stressed the importance of collaboration, open-mindedness, and continuous learning in addressing online safety. Microsoft’s commitment to protecting users, especially children, from harmful content was evident in their approach to building safety and privacy by design. The speakers highlighted the complexities of the topic and emphasised the need for context-specific interventions and effective regulation to ensure a safer online environment for all users.
Session transcript
Amy Crocker:
Thank you very much. Sorry for the short delay, but it was a good opportunity to bring more people into the room. So thank you very much for being here for the 2023 session of the Dynamic Coalition on Children’s Rights in the Digital Environment. I know you can go and navigate many paths in the agenda, the impressive agenda of the IGF, and so we’re really happy that you are here. There are also, as we speak, some similar child rights-focused sessions going on, so thank you for choosing this, and I hope that you’ll have the opportunity to perhaps watch online some of the other sessions and engage with the speakers in those sessions as well. So as we all know, the theme for this year’s IGF is ‘The Internet We Want: Empowering All People’, and the Dynamic Coalition, which I will explain a little bit and we can talk about throughout this session, has a clear starting point: that for us as children’s rights advocates, there can be no empowerment on or through the internet without a foundation of safety, and the internet we want and the internet we need is one where children’s rights are guaranteed, and that includes speaking to them about their views about their digital lives and the online world. And of course that’s not just me or our coalition or my fellow panelists saying this. We can also refer, and for those of you coming from the previous session on digital rights in different regions around the world, we have now something called General Comment Number 25 to the UN Convention on the Rights of the Child that recognizes children’s rights in relation to the digital environment, and that was adopted two years ago. And this obliges state parties to protect children in digital environments from all forms of exploitation and abuse. So what this means is the rights that children have in the offline world, if we can call it that, are also guaranteed online, and I think this is crucial for the context in which we are meeting today. So in that context, when we talk about AI, the metaverse, new technologies, frontier technologies, as we’ve seen at this IGF, it’s clearly at the forefront of discussion. It’s across the agenda very heavily. There are a lot of sessions talking about regulation, frameworks, guidance, opportunities, risks of these kinds of new technologies, and we know they are increasingly embedded in the lives of digital platform users worldwide. So we see that legislation, digital policy, safety policy, design practice, digital experiences are at a critical moment of transition, and innovation is not new. It’s core to our human societies. It does actually define us. There is a pace of change, perhaps, that we’re seeing right now that requires us to really stop and pay attention, and consider what these implications may be, and how we can harness the positive opportunities for the next generations. Yes, indeed, I think we all agree, and a starting point for this panel, we will be balancing this conversation about the transformative power of technologies, but also looking at how we mitigate the risks, and address harms, some of which we can talk about very directly and concretely today, some of which we can probably predict, and some which we cannot predict. This is the nature of the evolving environment. We do know that governments often find themselves playing catch-up. There is a huge regulatory debate right now, but in many ways, in too many ways, it’s responsive to the problem after it’s happened.
We’ll be talking a little bit about moving to a more preventive upstream model of safety by design. How do we prevent things happening before they take place, and how can we build, at the same time, those environments and communities online for children and everyone to thrive, and be well, and progress. We’ve also seen that online companies and technology providers are not equal in their understanding of and commitment to safe design. I think that’s something that’s crucial for us to address. How can we all work with companies of different sizes to actually scale, and share best practice and knowledge in these areas. The questions I think that we need to ask are how we move from talk to action, how we move from policy to practice. This is also something that has come up in many of the sessions I have attended. We need to act. We need to be smart about the policies and laws we develop, but really the proof is in the implementation. The proof is in how we actually use these for the benefit of society, and how we localize these and make them relevant to the specific context in which we are implementing these policies and practices. We also need to think very seriously about how we assess and mitigate the risks of new technologies, so that we can assure safety, but also champion the opportunity that tech provides for millions, billions of people living on this planet. Some of the goals of the session are to identify the main impacts of AI and new technologies on children globally, understand, hear from one young panelist, but also I see some younger participants in the room. I’m really looking forward to hearing your views on this, and to raise awareness for a child rights-based approach to AI-based service development. Perhaps at the end I’ll take the opportunity to talk a little bit about the Dynamic Coalition on children’s rights as a vehicle within the IGF to really bring together organizations interested in ensuring children’s rights are mainstreamed within Internet governance policies worldwide, and we would love you to join us. We have some flyers and some QR codes, so you can’t escape. You don’t have to write anything down, and you can consider joining the coalition so we can actually move forward. I’m really pleased to introduce our speakers as well. We have two speakers online and three speakers sitting next to me. Perhaps I’ll start with the online participants, since they’ve joined us very early, so they get the special prize. I have Patrick Burton, who is the Executive Director of the Center for Justice and Crime Prevention in South Africa. Patrick, good morning. I have Sophie Poehler. She’s a media education consultant at the German Children’s Fund. Thank you very much for joining us, Sophie. Here in the room I have, to my right, Liz Thomas, who is the Director of Public Policy and Digital Safety at Microsoft. I have Jenna Fung, who is a youth advocate for youth-led initiatives online. She’s representing the Asia-Pacific Youth IGF, and she’s part of the Youth Track Organizing Team as well. And last, but very much not least, I have Katz Takeda, who is the Executive Director of ChildFund Japan, who can also give us a perspective from the wonderful country in which this event is taking place. So thank you very much. Before we go forward, I wanted to just take a show of hands, because this is a round table. The seating makes it a little bit harder to make it a round table. So, you know, a bit of audience participation.
So perhaps you could raise your hand if you’re from civil society in the room. And from government? Raised hand. From private sector? Good to have you. And from any other? And from the different regions? We have some, I think, some colleagues from the Asia-Pacific region and European. Yeah. Any colleagues from the Middle East? Hello. Thank you for joining us. Latin America? No. And Europe? Some Europe? Well, I’m from Europe, so. Yeah. And the Americas. Yeah. Great. Great to have you. So we can have a global conversation, I think. We are lacking some regions, but it’s really great to have you all here. Thank you. Thank you for being here. I should introduce myself. My name is Amy Crocker. I work for an organization called ECPAT International. We are a global civil society network dedicated to ending the sexual exploitation of children. And I’m here as moderator today, as the chair or coordinator of the Dynamic Coalition on Children’s Rights in the Digital Environment here at the IGF. So this is a 90-minute roundtable. I’ve already taken up a lot of the time, so we will go on. We’re going to organize this in terms of three themes. And what we’d like is, you know, within each theme to hear your reflections and take your questions, so we make this as much of a conversation as we can. And the first theme is broad, but crucial. And it’s on safety and children’s rights being a cornerstone of the internet that we want and need. This is our proposition, but it’s also a challenge, I think, to the internet governance community and to governments and companies and society worldwide. So what I’m going to do is perhaps start with you, Sophie, online, if I may. And perhaps you could tell us a little bit about your views on why children’s rights, digital rights, are so fundamental to our construction of a safe, equitable and secure online world.
Sophie:
Yes, thanks, Amy. And hello, everybody, from Germany. It’s very early here in the morning, but I hope I can give you some insights into the German perspective on children’s rights in the digital world. Maybe just a quick background. I work in the coordination office for children’s rights at the German Children’s Fund, and we accompany the strategies of the European Union and the Council of Europe on the rights of the child here in Germany, among other things with a strong focus on children’s rights in the digital world. Yes, Amy, you’ve already mentioned it. General Comment 25, published by the UN Committee on the Rights of the Child in 2021, sums up the importance of children’s digital rights and provides a very comprehensive framework for this context. And yeah, the rights are crucial really to protect children from harm, but also to promote their access to information and empower them with digital skills. Also important is that the rights of provision, protection and participation must be given equal consideration; they are really of fundamental importance for the digital world. So upholding these rights is not only an ethical imperative, but also an investment in the well-being of future generations and society as a whole. And maybe a quick German perspective, which is quite concrete. As the German Children’s Fund, we have looked into needs and challenges voiced by children when it comes to risks arising from online interaction. We have analyzed the research field on the question of how children deal with interaction risks such as insults, bullying or harassment in online environments. We conducted meta-research and compiled an overview of relevant studies with a focus on German children, how they develop coping strategies and how we can promote this, focused on the age group from nine to 13. And we’ve gained some interesting findings from the reviewed studies when it comes to children’s perspectives on online safety. Just a quick disclaimer: this was not in the context of artificial intelligence, but we still consider the results relevant to our discussion today. And I’d like to pick some important points for the discussion today and later maybe. The younger the children, the more important it is for them to have a social safety net. In case of online risks, they particularly want support from parents, confidants or teachers. Parents especially are perceived as the most significant and desired safety contact persons for young children. As children grow older, they increasingly resort to technical strategies to deal with online risks, such as blocking, reporting, deleting comments, or disabling the comment function. And this points to the considerable importance of safe design in online spaces, which must be adapted to the needs of each age group. The youngsters voiced that platform-related reporting functions are seen critically by them, because the platform-side processing of reports takes too long in their eyes and sometimes even fails to occur altogether. They want more information on how to report people, how to block people, and how to protect themselves from uncomfortable or risky interactions, especially sexual interactions. And in any case, they need more education to make more informed decisions when coping. And last but not least, two points from a study from THORN, conducted two years ago in the US, so not from Germany, but they have some interesting findings when it comes to reporting.
First, anonymity plays an important role for adolescents, especially for young girls. They report that they would be more likely to use technical tools if they could be sure their report would remain anonymous. And very interestingly, at the same time, the study results also show that adolescents would welcome a human connection in the reporting process in addition to anonymity. The big majority of the 9 to 12 year olds surveyed said they would be more willing to use reporting tools that connect users with a human than with an automated system. There are more findings, but those highlight the importance of human resources, as well as safe design, for children in coping with risks online.
Amy Crocker:
Thank you, Sophie. You’ve touched upon the second and third themes that we will be talking about, which are, on the one side, regulation and policy for safety, and that can be government policy or platform policy, and then the issue of safe design, and we’ll go into those. And I think it’s really interesting, obviously drawing on research conducted with children. You said this study wasn’t specifically talking about AI, but indeed, if we take a rights-based approach, it is about rights, about principles and values, and the technology itself should be responding to those needs rather than the other way around. So before we go on to some of the other issues around regulation and policy, I also want to turn to you, Katz, if I may. We’ve heard from Germany; could you talk about the Japanese perspective, your experience of doing your work based on children’s rights, and what that means in terms of creating safety nets, meeting the needs of children, and understanding their thoughts, so that you can help advocate for them based on their rights?
Katz:
Thank you for inviting me to this IGF, and especially to this Dynamic Coalition session. Let me share some facts from Japan. But before that, I have to say that child rights are a fundamental part of this work and of societies everywhere, and as child-focused agencies we are promoting child rights everywhere. But we have faced several challenges so far. We recently conducted an omnibus survey, in August, covering ages 15 to 75, so quite a wide range; it gives an image of public opinion. We asked questions about the definition of CSAM, and also some questions about AI. So let me share some of the challenges. First, the results point to an internal conflict between human rights, especially child rights, and freedom of expression. This is maybe a never-ending conflict everywhere, not only in Japan but in other countries too; I want to hear about other countries’ practices or situations later on. But we think we need to find a balance between the two; otherwise, the discussion between child rights and freedom of expression will never end. Secondly, I want to share a kind of misunderstanding or misinterpretation of human rights. We asked respondents about virtual CSAM and CSEM, and some of the narrative comments said that virtual CSAM or CSEM would prevent real crime. This is a misunderstanding or misinterpretation of human rights, especially child rights, and we need more awareness-raising and education for the public on this. And thirdly, one result of the public opinion survey concerns the questions about AI. Many respondents said we should regulate AI in the context of CSAM and CSEM, while a minority disagreed with such regulation. But interestingly, 20% answered that they don’t know about AI matters or AI risks. This is quite interesting, and also a kind of risk for the future. So we should probably focus more on raising public awareness about the risks of AI, and also the opportunities of AI in the future. Those are some of our results. I have shared these three points, but later on I want to hear from you about other thoughts, insights, or similar research in your countries. So, yeah, that’s it. Thank you.
Amy Crocker:
Thank you. And I think you pick up on a really crucial point: how children’s rights are understood and made real within societies, how they’re realized, which will often depend on a local context, based on principles that we have agreed on globally. But also helping people understand technology and its risks and opportunities. I think this is a challenge, and maybe something you will speak to later, Liz: how you make technology explainable enough that people understand the different sides of it when they’re using it. And indeed, I think we will talk a little later about parents and the empowerment of parents, which is something that has come up many times in conversations I’ve been hearing this week. So, speaking about children’s rights, Jenna, I’ll turn to you to tell us that we’re all talking rubbish. No, I’d love to hear your perspective on how children’s rights can be used to advocate for youth, and whether you think we’re doing that in the right way.
Jenna:
Sure, I will try my best. As I work so closely with the youth in my own region, Asia-Pacific, most of the people who are involved in the Youth IGF have some sort of knowledge about what we’re doing here, and the youth engaged in those conversations are over 18. But when we talk about children, they’re very young. So today I will bring out some points from the outcomes we have discussed in Asia-Pacific and try to add some more representation, though we definitely cannot represent teenagers, who I personally see as the ones facing a lot of challenges online these days. I will touch on that a little later, as I’ve prepared some notes here, and I hope I won’t disappoint the audience today. I believe kids today, not youth, let me correct myself, live and breathe the online world. They are practically born with the internet and tech gadgets in hand, which many of us didn’t really get to experience or even dream of back in the day, even myself as a Gen Z-er. I didn’t get to experience that; I was only introduced to a computer and the internet when I was in kindergarten. But kids these days have smartphones or iPads in hand. As soon as their parents play Baby Shark, they stop crying, right? That’s what they’re dealing with these days. It may be a bit dramatic to frame it this way, but before they’re even born, photos of them are filling up their parents’ social media feeds. That’s basically how I find out my high school buddies have become parents, probably because those parents are Gen Z and post on social media a lot. These kids don’t really get to choose, because they’re not born yet, but they’re already online. So it complicates our conversation even more. It might not always be the case, because there are people who choose to be online, but it is happening a lot more because of how different generations use the internet and technologies. And I think with all this, we must talk about trust. This is one of the biggest things we also touch on a lot in the Asia-Pacific Youth IGF and within our own youth community, because we believe this is the bedrock of the digital age. In a world where we rely on technology for almost everything, and I guess we don’t have to explain too much after the pandemic, when we couldn’t really live without the internet and technology, trust really becomes the glue that holds everything together. The digital age makes trust crucial against the backdrop of growing reliance on technologies and possible risks related to data breaches, data privacy problems and unethical practices. So building trust and imparting fundamental digital knowledge are essential steps in creating a reliable and ethically responsible digital environment for the younger generations. Our society has evolved a lot to embrace diversity in terms of backgrounds, cultures, sexual orientations, and more. But with the progress we have accomplished, potential harms and risks multiply, and the challenges teenagers face and encounter today are probably way more multifaceted than those of the past. I myself can’t even relate. I really hope we will have a mechanism to engage those teenagers, who are technically underage, in the conversation, so I can hear from them. I can’t speak for them, because I am not them. To name a classic example: cyberbullying. We’re still talking about it.
In the early stage of the internet, it was flaming, trolling, harassment through emails. Now it’s different; it’s much more than emails. Younger generations today face more than just social media bullying; they’re encountering a wider range of challenges like hate speech, doxing, cyberstalking, or, one of the most concerning ones, image-based abuse, especially with the rise of generative AI, which just makes everything relatively easier to do. So the challenges they encounter are just more complicated than before. And when underage users face such challenges, it’s very natural for them to turn to their parents, to talk to someone they trust. Sometimes it may not be their own parents, but someone else they trust. But to provide a safety net for the underage, guardians can’t do it alone, and they must know something, and not all parents or guardians have the same level of knowledge as anyone in this room, especially when there are nuances in the risks that young children and teenagers face, which are totally different things. I think safeguarding the younger generations on this very topic is a responsibility shared by all stakeholders. I should probably stop here and save the rest of my points for when we move on to themes two and three. I hope I have already brought some new insights from the younger generations, because, as I’ve observed, there are only a few youth interested in these kinds of topics, and I hope I represent a small portion of them here today. Thank you so much.
Amy Crocker:
No, you absolutely did. You’ve touched upon some really good points that set us up for the next topic, though of course they’re all interrelated. And I really liked that you mentioned the word trust. I think this is a really important word in these times: trust in algorithms when we talk about AI, trust in institutions, trust in companies, trust in parents. Some children, many children, don’t have a trusted adult they can rely upon to help them. So I think we have a lot of different issues we need to unpack. Before we go on to talk about everyone’s favorite topic of regulations and policies, just after lunch, when we’re at risk of everyone falling asleep, I’d love to hear from the room if there are any perspectives on how you’ve found building your work upon a basis of children’s rights: useful, challenging, difficult? I could call on, I think, some colleagues from Brazil who just did a wonderful session with videos of children themselves speaking. I don’t know if you’d like to speak, or anyone else in the room, about how you’ve used children’s rights practically in your work. Please use the microphone, because we have online participants.
Larry Magid:
Yeah. Thank you, I’m Larry Magid from Connect Safely. So in previous IGFs, we’ve had some workshops that I would co-lead called children’s rights versus child protection and the tension between the two. We could protect everyone in this room by putting in bubble wrap and never letting you out of your bed, although you would probably die from some bed-related disease. But the point is that being active in the world automatically creates some risk and clearly being online creates some risk, everyone knows that. And so we want to protect children, but at the same time, we want to protect their rights. And sometimes those are in conflict. And where it becomes particularly critical is in the area of legislation. Because even the United States, which as you all know, has something we call the First Amendment, which if you read the First Amendment in the American Constitution, it says nothing about how old you have to be. It doesn’t say people over 18 have the right to free speech. Everyone has the right to free speech. Well, it doesn’t really say that, but that’s how it’s interpreted. But at the same time, there are laws being proposed in America which would, for example, prohibit children from under 18 to go online without parental permission. So that means a 17-year-old exploring their sexuality, their politics, their religion, or whatever, would have to go to their parents for the right to express themselves. As everybody here I’m sure is aware, the UN Convention on the Rights of the Child guarantees children the right of freedom of expression, participation, assembly, et cetera. So these are in conflict, which is not to say that we should allow five-year-olds to look at hardcore pornography. I mean, I’m not arguing that we completely enable, empower all children to do all things, but at the same time, how do we ensure their rights and protect them at the same time without suppressing their rights? And frankly, if you were to ask some legislators, at least in the United States, and I think it’s true in other countries, they would favor protection over rights and would take away their rights in the name of protection. And it becomes particularly of an issue when there are marginalized groups that are engaged in controversial activities, whether it’s politics or transsexual issues or other issues, where their rights are being suppressed by legislation in name, reportedly, to protect them. So I just think that’s an important backdrop. And even though that workshop is not on the agenda at this IGF, it’s probably more important today than it was even the last time we had that conversation two or three years ago, because again, I can only speak for my country, there is more and more legislation that would essentially deny children their rights for participation online. Thank you.
Amy Crocker:
Thanks, I don’t know if anyone wants to speak to that, but I think absolutely, and there may not be a session on the agenda, but it’s certainly something that has come up many times in the conversations we’ve all been having and at different sessions. And it is a huge challenge we face. I wish I had the answer. In some ways, I feel like we need to embrace those conflicts because we’re always gonna be navigating those conflicts. But I think when we go on to regulation and policy, we need to really critically assess what we’re trying to gain through different regulations and how those should be shaped. I’ll be speaking to that. So we have two questions online. Bangladesh, can you comment on that? Maybe you’d like to ask your question while we’re waiting, yeah.
Steve Del Bianco:
Well, thank you, and it’s a follow-up on what Larry Magid pointed out. I’m Steve Del Bianco with NetChoice, and two of the US states which have aggressively attempted to ostensibly protect children extended, all the way up to the age of 18, a requirement that any user of any social media site, even something like YouTube.com, would have to present two forms of government-issued ID to make sure the service knew they were an adult. And if they were younger than 18, they would have had to show that a legal guardian or parent had given verifiable consent for them to use the site. It’s fine to protect a 13-year-old or a 12-year-old, but it was a little ridiculous applied to a 17-year-old. My organization, NetChoice, sued two states that had these laws, the state of Arkansas and the state of California. And last month, just a few weeks ago, we obtained a preliminary injunction blocking those states from enforcing the laws. It looks terrible for the tech industry to be suggesting that a state was wrong to try to protect children, but in fact the judges ruled that the states were wrong to do it the way they were doing it. And in that mix will be an argument about the rights of a 17-year-old to access the kind of content that Larry brought up. And since your question was specifically about the rights of the child: if you dive into the document that’s on every other chair, the best interests of the child is supposed to be a balancing test. Whenever I say that, I get heartburn thinking about GDPR, but it’s a balancing test between the rights of the child to access and express and the need to protect the child from harm. So I think you bring up the right framing of the question. And I realize that other nations that run into the same problem, Larry and I are in the United States, may not be able to rely upon a court system and a First Amendment and a Constitution to block a state from going that way. But we need to educate lawmakers, or they will write laws that are mainly messaging bills, where they get to claim they’re trying to protect children, when in fact the mechanisms to do it, on age verification, just don’t exist. Thank you.
Amy Crocker:
Thank you, yeah. Of course we could have a whole week-long session about these topics. In the interest of time, I’m going to move now to the Bangladesh remote hub. There seem to be many of you. Great, please, go ahead. Tell us your question. I’m from England, but my English is poor. I do apologize. Please, please give us your question.
Tasneet Choudhury:
Hello, all. I am Tasneet Choudhury, Joint Secretary of Women IGF Bangladesh, and a media personality. Dear moderator, greetings to all present at today’s event. Thank you for giving me this opportunity to ask my question. How do we ensure that AI strategies, policies, and ethical guidelines protect and uphold child rights across the world, especially in developing countries like Bangladesh?
Amy Crocker:
Thank you. Thank you so much for the question.
B. Adharsan Baksha:
We have another question from the Bangladesh remote hub. Can we speak? Yes, please. Okay. Thanks a lot to all of you. I’m B. Adharsan Baksha, from Bangladesh IGF. My question is: AI adoption among children can present many real risks, data privacy being chief among them. Popular chatbots like Synapse and MyAI can quickly extract and process vast amounts of personal data, potentially exposing children to cyber threats, targeted advertising, and inappropriate content. How do we ensure a secure cyberspace for children? Thank you.
Amy Crocker:
Thank you very much for those questions. They are big questions, but they lead us very well to the topic of regulation and policy around some of these really challenging child rights and child protection issues. And I’m going to put a question to you, Liz, from Microsoft: what are the risks? Are there risks in a one-size-fits-all approach to dealing with some of these issues? Because clearly we have a number of different harms, and, as our colleagues from Bangladesh have just said, different contexts in which we have to consider these issues.
Liz:
Fantastic. Thanks so much, Amy. And thank you for the great questions online. It’s awesome to see the remote hub; I didn’t know folks were gathering in different spaces, but that’s brilliant. So, starting from our starting point as Microsoft: we absolutely recognize that we have a responsibility to protect our users, and particularly our youngest users and children, from illegal and harmful online content and conduct. And part of the way in which we have to do that is through that incredibly necessary balancing of rights. So children’s rights in the round, thinking about it as holistically as possible: advancing safety, but also thinking about privacy, and, as the questions just raised, about freedom of expression, access to information and everything else. And partly in answer to the question that was just raised, I think the way that happens is going to be a combination of an ongoing need for regulation, but also voluntary activities, as we look to build in safety and privacy by design. But for us at Microsoft to really do that balancing effectively, one of the things we really have to think about is differentiation: the differences between the wide variety of online services that we have. I suspect most of you in the room will be familiar with one or more of the wide variety of Microsoft’s product suite. What we have to really think about, when we’re thinking about gaming versus a professional social network versus productivity tools, is how we tailor our safety interventions to the nature of the service. That’s really at the heart of our approach: how we think about safety and rights in a way that’s proportionate and really tailored to the service and the harms in place. That’s at the heart of our internal standards and the way we think about safety by design as a company. And that includes what we think is appropriate in terms of parental controls and the guardrails that are in place, whether we’re thinking about the business model and the kind of platform architecture, or what’s needed given the culture of the service and what we want to try to foster in terms of user behavior, and the way we educate users and parents on those services. And we have seen some challenges start to arise internationally where regulation has been really broadly scoped, creating that sort of risk of one-size-fits-all requirements. A really good example that we see a lot is a real enthusiasm and desire to address some of the well-known issues arising from some of the social media services, but the definitions that come through may inadvertently capture a range of other services, with measures that might not be appropriate or proportionate for those services. So again, we really want to help think through what the appropriate safety measures are, to think about rights in a holistic way. And then that comes a little bit to the points that have just been made about thinking about privacy and safety in isolation. Particularly in legislation on kids’ privacy and safety, we see some kids’ privacy bills and some safety bills, and again, these are not taking that holistic approach.
Or actually, there are some laws coming through where concepts from safety legislation and concepts from privacy legislation are combined in ways that may not entirely work together. It’s a challenge for us all, because I don’t think there is a perfect regulatory model for this yet; we are all still learning. One of the things we are starting to see come through more is a focus on outcomes-based codes: really thinking about the flexibility different services have, within the scope of those codes, to achieve the safety and privacy outcomes that are desired. That does start to create a bit more of a web of granular and complex secondary regulation. But I think it’s the starting point for a place where we can evolve our approaches, think systematically about risks, about rights, about impact on kids, and really think about what that looks like for the products where children are most vulnerable, but also where the opportunities arise. That enables us to think holistically about risks and the mitigations for them through design and other choices. And we are also still learning about what that looks like for some of those products. I know there are folks here at the IGF who are doing some amazing work in this space. One of the things we’ll talk about as we go on, too, is that there is still a need, I think, to grow the evidence base, particularly on emerging tech, to think about how we do this best; I’ll come to that in the next part of the conversation. But the other piece I just want to flag, as we think about different legal regimes culturally, is that there is a risk that, globally, existing economic and social disparities and other inequities are really exacerbated if regimes are created where kids are unable to access technology. Thank you.
Amy Crocker:
And that really brings together the importance of elevating children’s rights in how we design, and how those rights are reflected within policies. And indeed, Patrick, I’ll go to you now. I think it’s interesting; there’s been some talk of fragmentation of regulatory policies, though I’m also told we shouldn’t be using the word fragmentation in this context. But it is interesting: in the United States, I know it’s been a challenge that you have state-based laws that may conflict with federal laws, and that will be the case in other countries with those kinds of structures as well. I think there’s richness in diversity, perhaps in testing what goes wrong, but regulations take a long time to develop, so we can’t just pivot in one month and decide we’re going to create something new. And I think this is a challenge. So Patrick, you’ve seen this issue from many perspectives, from South Africa and your region, and of course globally. And I know you and I have also spoken in the past about prevention versus regulatory approaches. So I just wonder what your perspective is on differentiated approaches to regulation in different digital spaces, and also on the balance between these different, not conflicting, but different factors.
Patrick:
Yeah, thanks, Amy. And it’s quite hard to come after these amazing speakers, who have taken all your thoughts and put them far more coherently than you could have. So I’m just going to start off by reiterating what almost every speaker has said: while we speak quite glibly about child rights and what they mean in different contexts, I’m not sure that we can altogether agree on how child rights, even as they are contained in the CRC and in General Comment No. 25, translate into practice in different cultural, religious, national and geographic contexts. There’s huge variation in how child rights are interpreted, and in where countries or states choose to place the emphasis, and inevitably we see that emphasis being placed on particular rights rather than an equitable embrace of all child rights. And that really translates so much into the digital space. I apologize, it’s six o’clock, I’m still not altogether coherent. But I also want to say that I don’t think we can regulate our way out of the challenges that emerging technologies, immersive spaces and AI present us with. Regulation, we need to bear in mind, is just one of those tools, one of those arrows in our quiver. We often place so much emphasis on regulation, and states, and when I say states, I mean nation states or states or provinces within national boundaries, place so much emphasis on regulation because they see it, not as an easy win, but as a very visible commitment to making sure that children stay safe online, without putting a proportionate investment into, as you say, the prevention side of things: the education, the awareness-raising, building the capacity of parents, building the capacity of children, and building children’s resilience, the one thing that we haven’t spoken about. So regulation is critical, we can’t do away with it, but it really is just one component of what we need in order to make sure that children’s rights are realized online. Now, what does that mean for regulation? Liz mentioned this increasing focus on secondary regulation, which is often quite messy. I think there is a lot to be said for that approach, because ultimately platforms operate in different ways, services operate in different ways. There are some global standards: how data is managed, how data is protected, how data is collected, how data is used, for example, relating to children’s privacy online and the right to protection. Those are standard. But at the same time, different services offer different opportunities for children to learn digital skills and to be creative online. We need to recognize that children have different evolving capacities at different ages and in different contexts, and those evolving capacities are largely influenced by the geographical contexts in which they live, by the households they live in, and by the non-digital services they have access to. We know the link between what happens online and what happens offline. So I think having a differentiated approach makes sense. It is a logical approach, but we can’t wait for that sort of regulatory environment to concretize. Amy, I think you just summed it up perfectly.
Regulations take a long time to implement, and we need to learn from the failures of regulation; we need to see what’s working and what isn’t. The same with legislation. You started off the session talking about the gap between legislation and implementation. Well, from the time we start formulating policy to the implementation and the evaluation of that implementation, you’re talking 10 years, by which point we are in a whole different universe in terms of emerging technology. So we need to look at what individual services and platforms can do. And I can’t think about this without thinking that, in order to achieve that, we need to make sure that we are all singing from the same hymn sheet when it comes to what child rights are, and the transparent commitment to a culture of child rights that any business, industry or government needs to work from, and transparency around that. Am I making sense? Hopefully you can bring all that together. I’m going to stop, otherwise I’ll just keep talking.
Amy Crocker:
No, thank you, and I hope you have some coffee or tea by your side. But absolutely, it does make sense. And indeed, I’ll ask for your input on this, Jenna, but we will talk now about a design approach, a child rights-based design approach, because we can’t wait for regulation. I think there is a strong role for regulation to provide a framework, a legal basis on which we can have conversations and decide how to act. But each one of us in this room probably has five or ten stories about the uptake of AI models or AI products, and we won’t name any in particular. Some of those are good, some are bad, but it’s happening faster than we have the ability to take action. So we need to think very critically about where we go, and actually build those considerations into decision-making processes earlier on, in the design and building of products, and that’s what we will go on to. But Jenna, before we do that, and then we can take reflections or questions from other participants in the room or online: from engaging with youth and through the Youth IGF perspective, what is your view on regulation, not as the solution, but as part of the solution to some of the challenges we face? How do young people see that? What are the priorities for building a safe and empowering environment?
Jenna:
I’ve prepared some notes around it, of course. But before I respond to your questions or theme two overall, I wanna quickly respond to what Patrick mentioned earlier about how cultural factors and just culture in general, let’s frame it that way, will be so different. Because earlier this year, I partnered with a group of amateur, we just do it voluntarily, all this policy research actually, we had from Bangladesh local hub, actually we worked together to make a study and try to see how different jurisdiction in Asia-Pacific deal with online safety. And from part of our study is that Australia adopted industry code to mitigate this issue, where Singapore use a more government driven way. So it’s kind of reflected some cultural influence in how we approach things. And I just find that it’s really a fact that we have to admit, because especially in Asia-Pacific is really different. Myself and East Asian, there are things that I can’t understand completely from those who are from Southeast Asian and South Asian. And sometimes we will be unconsciously biased and people sometimes from Western world do not think that Indians are Asian as well. I find it quite interesting when I hear from some people sometimes. But anyway, that’s my quick respond to it. And I will try to touch on the question that you asked with the notes I prepared. But most of these are part of the outcome that we have from the discussion we had last month in Brisbane in our annual meeting. I think to deal with this very topic that we are trying to address today, the youth think that we need to have a clear definition and scope about all this online safety threat. Because sometimes different people of different background will have different definition and it’s important to have international standard of course, but also to have some localization to adapt into it. And so it’s relevant to their environment. I think the other day I was attending a workshop. and then they were doing capacity building even at a municipal level, because that might be even more effective, because I personally work so closely with youth as a project manager for the Asia Pacific or IGF. I figure out that we have to empower them at many level in order to get their voice heard, especially when we talk about internet governance, child rights online. If they don’t really know about the technical aspect, sometimes they will suggest something that is not really relevant. Putting my other hat on, I actually work for the top level domain registry as well. Sometimes we think that we understand how the technology of internet work, but then when I talk to those engineer, they were like, all this details that entered a head and they were like, that’s not exactly what it is, but sure. So we need to have more stakeholder get into the conversation, because there’s no way for everyone to understand everything. So we need to put all of them together. And if we are circling back to here, I’m going too far. If we are trying to bring in the younger voice, I really want to shout out to Bangladesh actually. They started way far ahead, because I know that they have this kids IGF happening in the past two years, which is very progressive. It’s hard to get a five years old into our conversation here, because there’s like different levels, but at a kid’s level, it’s really a good way to start early for them to start engaging them. There’s no way for my mom to understand what we are here talking about. Been here for a long time, she still have no idea what I’m doing. 
But what we really want to stress is that we need a multi-stakeholder approach, and in order to achieve that, we must have capacity building alongside it, try to make information accessible, and use more accessible language as well, so people with different levels of knowledge can understand. Some of us, myself included, don’t speak English as our mother tongue, so there’s sometimes loss in translation; that’s also one of the barriers. So if we really want to regulate, I think we need to bring different voices into the process, and eventually democratize the process.
Amy Crocker:
Thank you so much. You’ve hit on so many important points, and I love this: I often think of regulation as being top-down, but your point about the bottom-up approach, not only among children themselves but in communities, and actually building solutions through that, and then the safety dimension that helps support that, I think that will be crucial. I know, Jim, we have some collected comments or questions.
Jim:
Yeah. I’ll just summarize it, but just to pick up on the Bangladesh point, the second question was actually from the vice chair of the Bangladesh Youth IGF, so they’re actively engaged there. But between those questions, and we have a question here as well from Mohammed, who’s an instructor at Kabul University in Afghanistan. I think as you’re addressing these issues going forward, what about the perspective, and what can be done to help in developing countries like Bangladesh and Afghanistan address these problems? I think we all know the history of the challenges that these countries have with technology, and access, and capacity building. So as we’re discussing this forward, maybe think about that as part of your comments.
Amy Crocker:
Absolutely. Would anyone on the panel like to talk about how we can address some of those issues? Jenna, you even spoke a little about looking at different opportunities for codes of conduct that are not copied exactly, but that are based on values, principles and possibly guidelines that can be translated into your own context, for the participant from Afghanistan. Learning from approaches to regulation that can work, while obviously understanding the context there. And, back to Jenna’s point, making sure that children and young people are consulted: find out what they think and how they feel about these issues, and try to drive that. But I don’t know if anyone in the room would like to comment on that. Yeah, otherwise, we’ll take a question.
Andrew Campling:
Okay, thank you. Andrew Campling. I run a public policy and public affairs consultancy, but I’m also a trustee of the Internet Watch Foundation, so I’m probably speaking more with that hat on. It’s a very big topic, so I’m going to make two fairly narrow points that are at least linked in some way to AI. First, algorithms quite obviously make malicious content much more accessible through their recommendations. For example, in the UK, we’ve seen a child who unfortunately was shown suicide-related content and committed suicide. It’s highly improbable she would have found that content had the algorithm not shown it to her. So, first question: should there be restrictions on the application of surveillance capitalism to children? A blanket prohibition on gathering the data of known child users on platforms in the first place, to try to prevent that from happening. Secondly, AI models are already being used to generate CSAM. So should AI-generated CSAM be illegal? It is in some countries, but it’s a loophole in others. And should the circulation of prompts that are deliberately intended to generate CSAM be made illegal? Because there’s an active trade, if that’s the right phrase, in the best prompts to use to get the images. And then more generally, given the pace of technology change, and you’ve said how difficult it is to create regulation, it’s easily outpaced by changes in the tech. Dare I say it, learning from the UK experience, should we try to avoid being caught out by the pace of change simply by imposing a duty of care on platforms toward their users? Because otherwise it’s pretty much impossible for regulators to keep up with the changes. So just set the blanket duty of care and put the problem on the platform operators to handle responsibly. Thank you.
Amy Crocker:
Thank you. Big questions. I know that Patrick wants to come in. Oh, do you want to quickly speak to that, and then we’ll bring Patrick in? Patrick, go ahead.
Patrick:
Thanks, Amy. Just two very quick responses. The first, to the question from Afghanistan, is just a general observation. In so many of the countries in which I work, where governments are trying to catch up on policy and on legislation, they’re looking to key countries for model legislation; they’re desperate to find best practice. And so what tends to happen is, there are three or four countries that come to mind, and they look at those countries and try to model their own legislation on them, without recognizing some of the challenges and dilemmas that those pieces of legislation face, or where they haven’t got it right. So there’s a real danger in a developing country saying, okay, this is what country A has done, we’re going to follow that model, without any critical engagement with what some of the challenges in implementation might be. I do see that a lot in many of the countries I work in: Southern Africa, North Africa, some of the Asia-Pacific smaller island countries and territories. And then, if I can just use my position and my mic in response to the question, the observation, from the IWF colleague: the other thing that I’ve seen in so many of the developing countries where I work is this issue around definitions. You raised the example of AI-generated CSAM. What tends to happen is that countries are loath to update the legislation in which their child sexual abuse and exploitation crimes and offenses are contained, because it takes so long. And that’s why I think it’s also up to individual industries and companies to say: we are going to adhere to these definitions of CSAM, and that includes AI-generated CSAM. That way industry is actually a step ahead of changing national policy, because it is going to take five to 10 years for that policy to update; it’s such a process for legislation to be changed. Thanks.
Amy Crocker:
Thanks, Patrick. Go ahead, Liz, and then I’ve got many follow-ups to give to people in the room. Great.
Liz:
Well, I will try to be brief. A couple of great questions from the IWF here in the room, and things that are really top of mind for us. Actually, this goes to some of the points I was hoping to raise anyway, so it’s an excellent segue. On the topic of AI-generated CSAM: certainly for us in industry, thinking about these risks has absolutely been at the core of our responsible AI approach at Microsoft, and also of how we’re thinking about applying safety by design across the services where AI is being deployed and the features in them. On the question of legality, I think this really goes to some of the conversation we’ve just had around, A, the criticality of regulation, but also, B, regulation not being the only tool in the toolkit. Again, we have to have a whole-of-society approach to addressing these problems. Part of that will be us taking responsibility to make sure that this particular horrific harm type is not being created or disseminated on our services. But there is also that need for urgency in some regulation. I know in some jurisdictions there have already been statements around the legality of AI-generated CSAM, but I think it speaks to some of the great work by the WeProtect Global Alliance and others with the Model National Response, to really help support harmonization of legal regimes here, so there are no spaces where this crime is permitted. On the question of whether children should be able to access some services or not, two quick points in response. Part of this goes to the references I made before to safety by design across diverse services, and part of that is really thinking about where there are recommendation systems or other features, what impact they have on the risks to young people on the service, and understanding the potential mitigations. But more broadly, you’ve raised one of the major topics under discussion in child rights and child safety conversations at the moment, which is age assurance and the ability to identify whether users are indeed actually children. There are multiple strands of work needed here: A, to help us find the right tech solutions, noting that there are a range of trade-offs between getting the right degree of accuracy around the age of a child versus privacy, security and other factors; and B, once we do know the age of a child, the choices we make around safety interventions and indeed access to services. And this is where we, certainly as Microsoft, are very keen to continue the conversations with the experts and grow our evidence base on these topics.
Amy Crocker:
Thanks. I know we have some questions, but Sophie, I know you’re waiting there with us online. Picking up on the point made about the use of children’s data, I wonder if you have anything you’d like to say about, for example, the Digital Services Act and what it may mean for protecting children’s data within the EU. Is that something you’d like to speak to, or to the European context more broadly?
Sophie:
Yes, I can give a short insight. We have the Digital Services Act in the European Union, which is going to come into force next year, and we also have regulations following the DSA in Germany. Right now, we are discussing it a lot, and from a child rights perspective, we consider it a really important and good way to protect the data of children, especially when it comes to advertising, but also when it comes to the responsibility of very large online platforms to protect children and young people from certain risks. I’d also like to add something to the idea of children’s rights by design, and children’s participation in regulation, because I think this is a crucial aspect if we really want to think about children’s rights in a holistic way: not to focus on the protection point all the time, but to look at how we can empower children, how regulation can support the empowerment of children, and how regulation can support the participation of children. Because how digital media are regulated and designed has a really direct influence on the lives of children and young people, but, if we are honest, they rarely have a say in these issues. General Comment 25 also addresses this right of young people to participate in questions and decisions about the digital environment. Here in Germany, we’ve already seen some efforts to involve children and young people in the design and implementation of legal youth media protection. As the German Children’s Fund, we’ve conducted exploratory research and concluded that we need quality criteria for participation here. We’ve encountered a wide variety of participation-oriented formats, such as consultations or comment processes where children are included in regulation processes, youth juries, editorial boards, and young people who design products, even design and conduct events on their own, and get involved in peer-to-peer networks or consultations. And I’d be very interested in experiences from other countries. This also leads me to the point of safety by design and child rights by design. Children and adolescents need social spaces where they can really implement their own ideas without being primarily affected by product guidelines or market-driven interests, allowing them to exercise their right to open creative processes. This likely clashes a bit with the metaverse concept, whose hosts also target young audiences. Safe social spaces are more likely created by civil society and educational organizations; that’s what we’ve seen so far. The approach of children’s rights by design offers providers the opportunity to place children and adolescents’ self-realization and participation at the forefront, and to develop ideas on how to involve them as informants and full-fledged design partners. And this is also, as Patrick already mentioned, an opportunity to bring in the aspect of evolving capacities, and to really look at how to develop age-appropriate social online spaces.
Amy Crocker:
Thank you. Maybe that’s it on this part. Yeah, thank you so much, Sophie, and I’m sorry to cut you off, but we have a queue of questions in the room. So we’ll take some questions. Please go ahead.
Amyana:
Hello, I’m Amyana, from Brazil. Right now, in the National Council for Children’s Rights, we are preparing a document with guidelines and recommendations for prosecutors, the public ministry, and all the services that work with children and adolescents, on what these agencies should do and require from platforms to protect children. Because how can platforms manage to remove content from films, for example, and yet not remove violent or dangerous content for children? So how can we focus on protection by design, like you were saying? Because yes, there are international standards, but they are not applied equally. Children, especially from the global south, have a much lower level of protection than those from the north, and we already have data to affirm that. And another question is about how we can create a legal framework for, for example, images of child abuse created by AI, because we are thinking about this now, and our legislation doesn’t fit these actions. So how have you been dealing with this in your countries, for example as apology for crime or incitement? So that’s it. Thank you.
Amy Crocker:
I will quickly just see if, and then we’ll take your question, Kasia. Katz, I don’t know if you would like to respond on this point of how you can think about legislating for this, because this is the point you raised earlier about Japan, and how you can build awareness about the need to criminalize these types of content.
Katz:
Japan, so we don’t have any regulation and policies to regulate that kind of AI generated image so far. As quite recently, the BBC, how they focus on some kind of AI generated, some kind of a system, and but we couldn’t, how say, we couldn’t know about that kind of news from the Japanese media. I think kind of a more responsibility of media in Japan, so they have to inform us that kind of a situation right now. Otherwise, the normal people, we don’t know about what’s going on, the AIs. So I think we need to know more about that kind of a new information. Maybe not only the media, we can, how say, collect information from SNS, whatever, so. Thank you. I’m going to declare that we’ll all stay here for
Amy Crocker:
I’m going to declare that we’ll all stay here for three more hours, so I hope you all have time. Unfortunately, we cannot. So, Kasia, please.
Katarzyna Staciewa:
Thank you very much. Hello, everyone. My name is Katarzyna Staciewa, and I represent the National Research Institute in Poland, but I would like to link my intervention to my previous experience in law enforcement and in research, based on my education in criminology. It’s such a lively discussion that it only proves we need more room in the future for these sorts of discussions, and I wanted to thank you, Katsuhiko, if my Japanese pronunciation is right, and Liz, for all the comments related to research and child rights in this dynamically developing space. I have recently conducted research on the metaverse, and I believe research is key; research can also guide developing countries, because there is a chance to benefit from what has already been found out, and it can guide our future actions. In this research, I analyzed the darknet, and I analyzed the themes of conversations of people who are potentially sexually interested in children, and I found three themes that are absolutely worrying. The first is that it’s an environment in which such people can meet a child, or can move conversations away from publicly available spaces. The second, which has already been mentioned, is that they can create AI-generated CSAM. Imagine that someone uses a picture or a video of a real child and transforms it into that sort of material: it would be the constant re-victimization of a child who was absolutely innocent. And the third is even scarier, because it was about updating, upgrading, existing CSAM into a VR- or metaverse-oriented frame. That means, for the victims, past and future, constant re-victimization. We should definitely be looking at this perspective, and the call for more robust research has never been more valid. So I would just like to finalize this intervention with a focus on research as a potential gateway to more tailor-made actions for the safety of children. Thank you.
Amy Crocker:
Thank you so much. And actually, it points to a really interesting point that you made, Sophie, about safe spaces being created by civil society organizations, communities and families offline: what should that look like in the metaverse? What can it look like? And are we really ready for that? We are short on time; we could speak about safety by design for a long time, but these are crucial issues we have to grapple with as we allow children to operate as they want to. Young people want to be engaged in these environments. And picking up on the point about what that means in different contexts: a tool or an environment designed by a company in one country or region will not necessarily meet the needs of children in other environments, or of children of diverse identities. So please.
Ahmad Karim:
Hi, thank you so much for all the interventions. My name is Ahmad Karim. I’m from the UN Women Regional Office for Asia and the Pacific, and I come at the discussion from the angle that whenever we have these kinds of big topics, we tend to be gender-blind in the conversation. I wonder if there are specificities related to gender in design that would give more attention to girls and young women, and those who could be affected more by the advancement of technology, where national laws are not considerate, where we put all children in one basket even though there are marginalized and fragile groups that deserve more attention, especially in the design of technology itself. Thank you.
Liz:
I can jump in briefly on that. Fundamentally, the lens we’re coming at this from is that we want to unlock the economic, social and educational power of technology, but really find a way to do that with people using it mindfully and safely, and you can’t do that without being alive to the gender element. So, absolutely. Where I think we are still in need of a better understanding: we’ve done consumer research for a long time now, and there’s a lot of good work underway, but I still don’t think we necessarily have the right level of understanding of some of those gendered impacts. I think one of the only ways to get that actually goes back to some of the first conversation we had around youth participation, because as a millennial who got a device in high school rather than in kindergarten, I know that I don’t have an understanding of what it looks like for a teenage girl online, let alone in a diverse range of cultures. I’m a New Zealander; I come with that particular lens, and there are a whole range of lenses I don’t bring. So we need to find ways to do that research and get those perspectives, and we know that as a company we don’t always have the right ways of doing that either: doing it mindfully, in a way that is really asking questions of kids at the right age in the right places, and doing it safely as well, so that they feel really empowered to share. And I think it goes a little bit to some of the capacity building you talked about as well.
Jenna:
Maybe I can jump in quickly to respond to Ahmad’s points about gender and youth participation. Actually, my colleagues right here are going to talk about gender tomorrow morning. They are even younger than me, let’s be real, and they often bring up points that I don’t even touch on. They designed the workshop from that perspective because they think it’s very important. Their interpretations of gender are different from what we have historically defined, and that’s really important. I got invited to a panel about how we leverage AI to ensure gender inclusivity, and when I prepared the session I thought, why am I even invited? I am just an ordinary heterosexual person with really ordinary points. So I feel that by talking to more young people, you will get new insights into how they think. As much as we dedicate time to talk about CSAM, which is really important to address, I do think that instead of creating one big bill to deal with how AI influences all these matters, government and all stakeholders should modernize the different existing legal frameworks, like the Broadcasting Act, the Consumer Protection Act, and the Competition Act, to make sure all these matters are integrated, so that the public interest and the younger generation’s ideas are considered while we create these policies. And while we talk so much about CSAM: last month, when I was in Brisbane talking with Asia-Pacific youth, they designed a workshop about explicit content with a totally different approach. When it comes to CSAM, as adults we care about how we protect them, which is very important, but they actually want to explore how they, and maybe we, use explicit content to express themselves. So they talked about platforms like OnlyFans and how to create a safe space for those who want to express themselves through that content, which we sometimes forget to talk about. It is also their right to express themselves if they want to. That is one thing that really surprised me, because I had never thought about it; probably I’m too conservative in some ways. But that is why we must bring them in, because we will always find something new. We as adults think they need this, but maybe they actually don’t, so we should have them at the table.
Amy Crocker:
We have a few minutes left for final reflections, but that’s a perfect place to bring us home, because ultimately this is about creating safe, empowering spaces, where you need regulation to do certain things and you need design to be mindful and informed by child consultation and participation. So in two minutes, though I may take an extra minute if we can, I’d like to invite all our panellists to give a final reflection on what they’ve heard today: something that really stands out, your takeaway, or even the thing you would do tomorrow in response to this session. I’ll go first online, so Patrick.
Patrick:
Thanks Amy, and it’s really hard to follow Jenna, because as you say, I think that is the perfect way to wrap it up. I had two notes. The first was: speak with and engage, not speak to; meaningfully engage with and hear from children in different contexts, their understanding, their experiences, both positive and negative, and how they want to use the internet. That means we need to be open as adults to challenging our own thinking, because we need to let young people, who are the core focus here, feed into and shape that space. The second point I wanted to conclude with: it was great to hear the speaker from Poland in the audience, who is a criminologist. The other point I wanted to make when I was speaking is that we need criminologists, violence prevention and public health specialists, educators, social workers, and child rights legal experts, all of those sectors and specialities, in this conversation. It cannot come down only to industry, to government, to regulation. We need to make sure all of those pieces fit together in order to make this work. Thank you Amy, and thanks to the speakers for a great conversation.
Amy Crocker:
Thank you, Sophie. Very, very short, if possible, just your main reflection.
Sophie:
Yeah, thanks to everyone for your inputs, to the speakers and to the audience. My learning from today is that, to advocate for children’s rights in the digital world with a holistic approach, we need so many stakeholders, and it’s really important to bring them all along, and especially to go this way with children and young people themselves as a really important participant group in this context. Thank you. Kat, I’ll go to you for a final reflection, if I may.
Katz:
Yeah, thank you so much for this brainstorming session; I really appreciate your input and encouragement. I think whatever the design, whatever the regulation or policy, we should always move toward a rights-based approach. That is most important, whether human rights or child rights: a very significant approach. Also, in the past we probably made more of an effort to reach the public, to people, but in the future we may also need to address AI itself, so the targets of our approach will increase in the future, I think.
Amy Crocker:
Yeah, thank you. Thank you. Very briefly, Jenna and then Liz.
Jenna:
I will be really brief, because I think I’ve taken enough air time. My last takeaway is collaboration, because as someone who works on capacity building, I need research to back up all the things that I do, and all the stakeholders need to work together: in terms of legislation and regulation, we need government, the private sector, and everyone working together to provide a safe environment. And of course, please don’t leave out the technical community, because they are very important, they have all the knowledge, and sometimes they are not well included in the policymaking process. So yeah, those are my final words. Thank you.
Liz:
I’ll be really brief. My takeaway today is to continue to approach this in the spirit of learning: learning from others and learning to keep the holistic approach in mind. We need to grapple with different harms, but we need to find a way to do that while also thinking about rights. It’s a complex area, and we will have to keep learning together.
Amy Crocker:
Hello? Yeah, sorry, I won’t summarize, as we are over time, but it’s been a really fascinating conversation and I genuinely wish we had more time. As someone commented, we need to continue this conversation. If anyone is interested in joining the Dynamic Coalition and continuing these types of conversations, we have some flyers with a QR code; you can go to the website, or go to the IGF website, find us, and sign up to the mailing list. We want to help create a space within the IGF, a bigger, renewed space for children’s rights issues to be discussed. I will end it now. Thank you so much for being here, thank you to all our speakers, thank you to Jim as our online moderator, and thank you to the Bangladesh remote hub, it was so lovely to have you here, and to all participants online. Thank you.
Speakers
Ahmad Karim
Speech speed
172 words per minute
Speech length
130 words
Speech time
45 secs
Arguments
Need for attention to girls, young adults, females, and marginalized and fragile groups in the design of advancing technology
Supporting facts:
- Ahmad Karim is from UN Women Regional Office for Asia and Pacific
- Points out the general tendency to be gender-blind in such discussions
Topics: Gender design, Marginalized groups, Technological advancement
Report
In a discussion concerning the design of advancing technology, Ahmad Karim, representing the UN Women Regional Office for Asia and the Pacific, stressed the importance of carefully considering the needs of girls, young adults, females, and marginalized and fragile groups.
It was noted that, in such discussions, there is often a tendency to overlook gender-related issues, which indicates a gender-blind approach. Another argument put forth during the discussion underscored the significance of making the design of the metaverse and technologies more considerate towards marginalized and fragile groups, especially girls and women.
The rapid advancements in technology were acknowledged as having disproportionate effects on females and marginalized sectors of society. It was highlighted that national laws frequently do not adequately account for the specific needs and challenges faced by these groups. The supporting evidence provided includes the fact that girls, young adults, and women are often underrepresented and encounter barriers in accessing and benefiting from technological advancements.
Additionally, marginalized and fragile groups, such as those from low-income backgrounds or with disabilities, are particularly vulnerable to exclusion and discrimination in the design and implementation of technology. The conclusion drawn from the discussion is that there is an urgent need for greater attention and inclusivity in the design of advancing technology.
Consideration must be given to the unique needs and challenges faced by girls, young adults, females, and marginalized and fragile groups. It is imperative that national laws and policies reflect these considerations and ensure that these groups are not left behind in the technological progress.
This discussion highlights the significance of addressing gender inequality and reducing inequalities in the design and implementation of technology. It sheds light on the potential pitfalls and repercussions of disregarding the needs of marginalized and fragile groups, and calls for a more inclusive and equitable approach to technological advancements.
Amy Crocker
Speech speed
178 words per minute
Speech length
4308 words
Speech time
1452 secs
Arguments
Children’s digital rights are integral to a safe, equitable and secure online world
Supporting facts:
- General Comment Number 25 to the UN Convention on the Rights of the Child recognizes children’s digital rights, obliging state parties to protect children from all forms of online exploitation and abuse
- The internet can provide positive opportunities for children and young people if safety can be assured
- The rights that children have in the offline world should also be assured online
Topics: Children’s Rights, Online Safety, Digital Environment, Internet Governance
Amy Crocker emphasizes the importance of understanding children’s rights within local contexts
Supporting facts:
- Amy mentioned that realization of children’s rights can often be dependent on a local context
Topics: Children’s rights, Local context
Amy Crocker highlights the need for improving public understanding of technology for its risks and opportunities
Supporting facts:
- Amy put forth the challenge of making technology explainable enough for people to understand the risks and opportunities associated with it
Topics: Technology, Public Understanding
Trust is crucial in the digital age
Supporting facts:
- In a world where we rely on technology for almost everything, trust becomes the glue that holds everything together
- Trust is crucial against the backdrop of growing reliance on technologies and possible risks related to data breaches, data privacy problems and unethical practices
Topics: Online safety, Youth engagement, Digital literacy
Regulation alone is not a solution for the challenges of emerging technologies. Both regulation and prevention through education and awareness are crucial.
Supporting facts:
- Regulation is often seen as a visible commitment to keeping children safe online. However, it takes a long time to formulate and implement policy. Thus, while critical, it is just one component of what is needed.
- We need to invest in prevention, building capacity of parents and children, raising awareness, and building resilience.
Topics: Regulation, Emerging Technologies, Education, Awareness
Addressing technology issues in developing countries
Supporting facts:
- Discussion on the challenges faced by countries like Bangladesh and Afghanistan in terms of technology and capacity building
- Seeking answers to questions from the vice chair of the Bangladesh Youth IGF and an instructor at Kabul University
- Considering different opportunities for codes of conduct that can be adapted to different contexts
Topics: Technology Access, Capacity Building
Amy Crocker wishes the conversation to continue and is interested in creating more space within the IGF for children’s rights issues to be discussed.
Supporting facts:
- Amy invites anyone interested to join the Dynamic Coalition and continue similar conversations; she mentioned flyers and a QR code directing to the website, as well as the IGF website, where people can sign up for the mailing list.
Topics: children’s rights, IGF, Dynamic Coalition
Report
During the event, the speakers highlighted the significant importance of children’s digital rights in creating a safe and secure online environment. They stressed that children’s rights should be protected online, just as they are in the offline world. General Comment Number 25 to the UN Convention on the Rights of the Child was mentioned as a recognition of the importance of children’s digital rights, with state parties being obligated to protect children from all forms of online exploitation and abuse.
In terms of internet governance, the speakers advocated for a proactive and preventive approach, rather than a reactive one. They argued that governments often find themselves playing catch-up with digital issues, reacting to problems after they have already occurred. A shift towards a preventive model of online safety was deemed necessary, which involves designing for safety before potential issues arise.
Effective implementation was seen as the key to turning digital policies into practice. The speakers emphasized the need to understand how to implement policies in specific local contexts to realize the full benefits. They argued that implementation is crucial in ensuring that children’s rights are protected and upheld online.
The need for public understanding of technology and its risks and opportunities was also highlighted. It was mentioned that improving public understanding is necessary for individuals to make informed decisions about their online activities. Empowering parents to understand technology and facilitate their children’s rights was seen as an important aspect of ensuring a safe online environment for children.
Trust was identified as a crucial element in the digital age, particularly with the growing reliance on technology. The speakers discussed the importance of trust against the backdrop of emerging risks related to data breaches, data privacy problems, and unethical practices.
Building and maintaining trust were seen as essential for a secure online environment. Safeguarding the younger generations online was viewed as a collective responsibility. The speakers stressed that parents and guardians cannot solely shoulder this responsibility and must have a certain level of knowledge of online safety.
The importance of all stakeholders, including businesses, industries, and governments, working together to protect children’s rights online was emphasized. Regulation was seen as an important tool for keeping children safe online. However, it was noted that regulation alone is not a solution for the challenges posed by emerging technologies.
The speakers argued that both regulation and prevention through education and awareness are crucial in effectively addressing these challenges. Differentiated regulation based on context was advocated for. The speakers highlighted that different online services offer different opportunities for children to learn and be creative.
They also emphasized that children’s evolving capacities are influenced by various factors, such as their geographical and household contexts. Understanding the link between online and offline contexts was seen as essential in developing effective regulation. Transparency, a culture of child rights, and collaborative efforts were identified as crucial for the protection of children’s rights online.
All stakeholders, including businesses, industries, and governments, were urged to work together and have a shared understanding of child rights. The need for transparency in their commitment to protecting child rights was emphasized. The challenges faced by developing countries in terms of technology and capacity building were acknowledged.
The speakers discussed the specific challenges faced by countries like Bangladesh and Afghanistan in terms of accessing technology and building the necessary capacity. Opportunities for codes of conduct that can be adapted to different contexts were also explored. Consulting children and young people was highlighted as an important approach to addressing online safety issues.
The speakers emphasized the need to understand how children and young people feel about these issues and to learn from approaches to regulation that have been successful. Amy Crocker, one of the speakers, encouraged people interested in children’s rights issues to join the Dynamic Coalition and continue similar conversations.
Flyers and a QR code were mentioned as ways to sign up for the mailing list. The importance of creating more space within the IGF for discussing children’s rights issues was also emphasized. In conclusion, the event highlighted the significant importance of protecting children’s digital rights and creating a safe and secure online environment for them.
It emphasized the need for proactive and preventive internet governance, effective implementation of digital policies, public understanding of technology, empowering parents, trust, collective responsibility, regulation along with education and awareness, differentiated regulation based on context, transparency, and collaborative efforts. The challenges faced by developing countries were acknowledged, and the involvement of children and young people was seen as essential in addressing online safety issues.
Amyana
Speech speed
120 words per minute
Speech length
206 words
Speech time
103 secs
Arguments
Concern about unequal application of international standards for child protection
Supporting facts:
- Children from the Global South have a lower level of protection than those from the Global North
Topics: Child Protection, International Standards
Question about legal framework for AI-created images of child abuse
Supporting facts:
- Existing legislation is not equipped to handle images of child abuse created by AI
Topics: Artificial Intelligence, Child Abuse
Report
The analysis addresses several concerns regarding child protection and the legal framework surrounding it. Firstly, there is concern about the unequal application of international standards for child protection, particularly between children from the Global South and the Global North. This suggests that children in developing countries may not receive the same level of protection as those in more developed regions.
Factors such as resource distribution, economic disparities, and varying levels of political commitment contribute to this discrepancy in child protection standards. Another notable concern highlighted in the analysis is the inadequacy of current legislation in dealing with images of child abuse created by artificial intelligence (AI).
As technology advances, AI is increasingly being used to generate explicit and harmful content involving children. However, existing laws appear ineffective in addressing the complexities associated with such content, raising questions about the efficacy of the legal framework in the face of rapidly evolving technology.
On a positive note, there is support for taking proactive measures and demanding better protection measures from online platforms. Efforts are being made to provide guidelines and recommendations to agencies working with children and adolescents, aimed at enhancing child protection in the digital space and promoting the well-being of young individuals online.
This demonstrates an awareness of the need to keep pace with technological advancements and adapt legal frameworks accordingly. Overall, the analysis underscores the importance of addressing the unequal application of international standards for child protection and the challenges posed by AI-generated images of child abuse.
It emphasizes the need for updated legislation that aligns with emerging technologies, while also advocating for proactive measures to enhance protection on online platforms. These insights provide valuable considerations for policymakers, child protection agencies, and stakeholders working towards establishing robust and inclusive frameworks for child protection globally.
Andrew Campling
Speech speed
162 words per minute
Speech length
355 words
Speech time
131 secs
Arguments
Algorithms make malicious content more accessible through their recommendations, leading to harmful consequences for children.
Supporting facts:
- A child in the UK committed suicide after being shown suicide-relevant content by an algorithm.
Topics: AI, Algorithms, Child Safety, Digital Policy
Restrictions should be placed on surveillance capitalism applied to children to prevent malicious content exposure.
Supporting facts:
- Suggested a blanket prohibition of data gathering of known child users on platforms.
Topics: AI, Child Safety, Digital Policy, Surveillance Capitalism
AI models are used to generate Child Sexual Abuse Material (CSAM) and should be made illegal, as well as the circulation of prompts to generate CSAM.
Supporting facts:
- It’s a loophole in some countries where AI-generated CSAM isn’t illegal.
Topics: AI, Child Safety, Digital Policy, CSAM
A duty of care on platforms towards their users should be imposed given the pace of technology change.
Supporting facts:
- Suggests that otherwise it would be impossible for regulators to keep up with changes.
Topics: AI, Digital Policy, Platform Responsibility
Report
The discussions revolve around the significant impact that algorithms have on child safety in the digital realm. One particularly tragic incident occurred in the UK, where a child took their own life after being exposed to suicide-relevant content recommended by an algorithm.
This heartbreaking event highlights the dangerous potential of algorithms to make malicious content more accessible, leading to harmful consequences for children. One key argument suggests that restrictions should be placed on surveillance capitalism as it applies to children. The aim is to prevent the exposure of children to malicious content by prohibiting the gathering of data from known child users on platforms.
These restrictions aim to protect children from potential harms caused by algorithmic recommendations of harmful content. Another concerning issue raised during these discussions is the use of AI models to generate Child Sexual Abuse Material (CSAM). It is alarming that in some countries, this AI-generated CSAM is not yet considered illegal.
The argument is that both the AI models used in generating CSAM and the circulation of prompts to create such content should be made illegal. There is a clear need for legal measures to address this concerning loophole and protect children from the creation and circulation of CSAM.
Furthermore, it is argued that platforms have a responsibility towards their users, particularly in light of the rapid pace of technological change. It is suggested that platforms should impose a duty of care on themselves to ensure the safety and well-being of their users.
This duty of care would help manage the risks associated with algorithmic recommendations and the potential harms they could cause to vulnerable individuals, especially children. Importantly, the argument highlights the difficulty regulators face in keeping up with the ever-evolving technology, making it crucial for platforms to step up and take responsibility.
In conclusion, the discussions surrounding the impact of algorithms on child safety in the digital realm reveal significant concerns and arguments. The tragic incident of a child’s suicide underscores the urgency of addressing the issue. Suggestions include imposing restrictions on surveillance capitalism as it applies to children, making AI-generated CSAM illegal, and holding platforms accountable for their users’ safety.
These measures aim to protect children and ensure a safer digital environment for their well-being.
B. Adharsan Baksha
Speech speed
173 words per minute
Speech length
104 words
Speech time
36 secs
Arguments
AI adoption among children presents many risks, including data privacy issues
Supporting facts:
- Chatbots like Synapse and MyAI can quickly extract and process vast amounts of personal data
- This can potentially expose children to cyber threats, targeted advertising and inappropriate content
Topics: Artificial Intelligence, Children, Data Privacy
Report
AI adoption among children can pose significant risks, particularly in terms of data privacy. The presence of chatbots such as Synapse and MyAI has raised concerns as these tools have the capability to rapidly extract and process vast amounts of personal information.
This raises the potential for exposing children to various cyber threats, targeted advertising, and inappropriate content. The ability of chatbots to collect personal data is alarming as it puts children at risk of having their sensitive information compromised. Cyber threats, such as hacking or identity theft, can have devastating consequences for individuals, and children are especially vulnerable in this regard.
Moreover, the information gathered by chatbots can be used by marketers to target children with ads, leading to potential exploitation and manipulation in the digital realm. Inappropriate content is another concerning aspect of AI adoption among children. Without proper safeguards, chatbots may inadvertently expose children to age-inappropriate material, which can have a negative impact on their emotional and psychological well-being.
Children need a secure and regulated online environment that protects them from exposure to harmful content. It is crucial to recognise the need to ensure a secure cyberspace for children. This includes focusing on the development and implementation of effective measures related to artificial intelligence, children, and cybersecurity.
Governments, organisations, and parents must work together to mitigate the risks associated with AI adoption among children. In conclusion, AI adoption among children brings forth various risks, with data privacy issues at the forefront. Chatbots that possess the ability to collect personal data may expose children to cyber threats, targeted advertising, and inappropriate content.
To safeguard children’s well-being and protect their privacy, it is essential to establish a secure online environment that addresses the potential risks posed by AI technology. The responsibility lies with all stakeholders involved in ensuring a safe and regulated cyberspace for children.
Jenna
Speech speed
169 words per minute
Speech length
2471 words
Speech time
877 secs
Arguments
Children nowadays are more exposed to the internet and technology from a very young age
Supporting facts:
- Kids today live and breathe the online world; they are practically born with the internet
- Before they are born, their photos are filling their parents’ social media feeds
Topics: Internet Exposure, Childhood Development, Parental Control
Trust is a bedrock of digital age
Supporting facts:
- In a world where we rely on technology for almost everything, trust becomes more essential
Topics: Digital Age, Internet Security
Building trust and imparting fundamental digital knowledge are essential steps in creating a reliable and ethically responsible digital environment for the younger generations
Supporting facts:
- Potential harms and risks multiply with the progress we have accomplished in embracing diversity
Topics: Digital Education, Online Safety
The rise of cyberbullying and other advanced forms of online abuse like hate speech, doxing, and cyberstalking
Supporting facts:
- Cyberbullying has evolved from early stages of internet flaming and harassment via emails to more advanced forms like cyberstalking and doxing
- With the rise of generative AI, creating hateful, image-based abuse has become considerably easier
Topics: Cyberbullying, Online Abuse, Hate Speech, Internet Safety
Need for clear definition and scope about online safety threats
Supporting facts:
- Different jurisdictions have different approaches, e.g., Australia adopts industry code, Singapore uses a government driven way
Topics: online safety, localization, international standards
Capacity building at multiple levels is crucial
Supporting facts:
- Empowering young people helps in making their voice heard
- Importance of understanding the technical aspects of Internet governance
Topics: Capacity building, Multi-stakeholder approach
Inclusion of diverse voices and democratizing the process
Supporting facts:
- Youth voices need to be heard
- More stakeholders need to be included in conversations
- Language can be a barrier and cause loss in translation
Topics: Multilingualism, Multistakeholderism, Inclusion
Jenna believes that gaining perspective from young people can bring fresh, unique insights into discussions about gender and technology
Supporting facts:
- Jenna’s younger colleagues are set to speak on gender related matters
- She was part of a panel discussing how AI can be leveraged for gender inclusivity
- She emphasizes the different interpretations of gender from the younger generation
Topics: Gender, Youth in Technology, Perspective
Jenna highlights the importance of engaging young people in discussions related to explicit content and creating safe spaces for self-expression
Supporting facts:
- She recounts a workshop with Asia-Pacific youth who designed something about explicit content
- She brings up the example of OnlyFans as a platform for self-expression
Topics: Explicit Content, Youth Engagement, Self-expression
Collaboration is crucial to success
Supporting facts:
- Jenna works on capacity building and needs research support
- Stakeholders need to work together on legislation and regulation
Topics: Collaboration, Capacity Building, Legislation Regulations
Report
Children today are immersed in the online world from a very young age, practically being born with access to the internet and technology. This exposure to the digital age has led to an increased need for trust in this new environment.
Trust is seen as a cornerstone of the digital age, particularly as we rely on technology for almost every aspect of our lives. Without trust, our reliance on technology becomes more precarious. Creating a reliable and ethical digital environment for younger generations requires imparting fundamental digital knowledge and nurturing trust.
Building trust and instilling digital literacy are essential steps in safeguarding children online. Parents play a crucial role in this process, but it is also a shared responsibility that extends to all stakeholders. Informed parents are key as they are often the first line of defense for children facing challenges online.
However, they cannot do it alone, and it is important for all stakeholders to be aware of their responsibility in protecting younger generations. The challenges faced by teenagers today in the online world are more multifaceted and harmful than ever before.
Cyberbullying has evolved from early stages of internet flaming and harassment via emails to more advanced forms like cyberstalking and doxing. The rise of generative AI has made it considerably easier to create hateful, image-based abuse, contributing to growing concern about online safety.
It is important to address these issues effectively and efficiently to ensure the well-being of young people online. The approach to online safety varies across different jurisdictions, with each adopting their own strategies and measures. For example, Australia has an industry code in place, while Singapore employs a government-driven approach.
This diversity highlights the need for clear definitions and standards regarding online safety threats. A cohesive understanding of these threats is imperative to effectively combat them and ensure consistency across different regions. Capacity building is essential for addressing the challenges of the digital age.
Empowering young people and ensuring their voices are heard can lead to a better understanding of their needs and concerns. Additionally, understanding the technical aspects of internet governance is vital in developing effective solutions to address issues of online safety and security.
Inclusion and diversity are crucial in creating a safe online space. It is important to include the voices of different stakeholders and ensure that everyone has a seat at the table. Language can be a barrier, causing loss in translation, so efforts must be made to overcome this and make conversations more inclusive.
The perspective and insights of young people are valued in discussions on gender and technology. Gaining fresh and unique insights from the younger generation can contribute to the development of more inclusive and gender-responsive approaches. Jenna, a participant in the discussion, highlighted the need to engage young people in discussions related to explicit content and self-expression, as well as providing safe spaces for their voices to be heard.
Modernizing existing legal frameworks is seen as a more effective approach to addressing the impacts of AI and other technological advancements. Rather than a single legislative solution, updating legislation such as the Broadcasting Act, Consumer Protection Act, and Competition Act is seen as crucial in integrating present issues and adapting to the digital age.
Collaboration among stakeholders is essential for success. Capacity building requires research support, and the cooperation of multiple stakeholders is crucial in terms of legislation and regulations. By working together and leveraging each other’s strengths, stakeholders can more effectively address the challenges faced in the digital world.
Lastly, inclusive involvement of the technical community in the policy-making process is advocated. The technical community possesses valuable knowledge and insights that can contribute to the development of effective policies. However, it is acknowledged that they are not always well included in policy-making processes. Bringing technical expertise together with broader considerations is key to ensuring policies are robust and comprehensive.
Striking a balance between technical expertise and broader considerations is key to ensuring policies are robust and comprehensive. In conclusion, children today are growing up in a digital age where they are exposed to the internet and technology from a young age.
Building a reliable and ethical digital environment requires imparting digital knowledge and nurturing trust. Safeguarding younger generations online is a shared responsibility, requiring the involvement of all stakeholders. The challenges faced by teenagers today, such as cyberbullying and hate speech, are advanced and harmful.
Different jurisdictions have varying approaches to online safety, emphasizing the need for clear definitions and standards. Capacity building and the inclusion of diverse voices are crucial in creating a safe online space. The perspective and insights of young people are valuable in discussions on gender and technology.
Modernizing existing legal frameworks is advocated, and engaging young people in discussions on explicit content and self-expression is important. Collaboration among stakeholders and the inclusion of the technical community in policy-making processes are considered essential for success in addressing the impacts of the digital age.
Jim
Speech speed
202 words per minute
Speech length
140 words
Speech time
42 secs
Arguments
The importance of regulating and supporting internet technology in developing countries
Supporting facts:
- The mention of questions coming from Bangladesh Youth IGF, and a question from an instructor at Kabul University illustrates real-world interest and concern from these developing regions.
Topics: Internet technology, Regulation, Developing countries, Capacity building
Report
The discussion emphasized the importance of regulating and supporting internet technology in developing countries, as evidenced by the interest and concern of participants from regions such as Bangladesh and Kabul University. This real-world engagement highlights the relevance and urgency of the issue in developing regions.
Jim, during the discussion, summarised and acknowledged the questions raised by participants from developing nations, demonstrating his support for addressing the challenges and needs specific to these countries. He stressed the need to consider these perspectives when dealing with the issues surrounding internet technology in developing countries.
This recognition of diverse needs and experiences reflects a commitment to inclusivity and ensuring that solutions are tailored to the circumstances of each country. The overall sentiment observed in the discussion was neutral to positive. This indicates a recognition of the importance of regulating and supporting internet technology in developing countries, and a willingness to address the challenges and concerns associated with it.
The positive sentiment suggests support for efforts to enhance access to, and the effectiveness of, internet technology in these regions, contributing to the United Nations Sustainable Development Goals of Industry, Innovation and Infrastructure (SDG 9) and Reduced Inequalities (SDG 10). In conclusion, the discussion highlights the crucial role of regulation and support for internet technology in developing countries.
The participation and engagement of individuals from these regions further validate the significance and necessity of addressing their specific needs and challenges. By considering the perspectives of those in developing nations and taking appropriate actions to bridge the digital divide, we can work towards achieving a more inclusive and equitable global digital landscape.
Katarzyna Staciewa
Speech speed
141 words per minute
Speech length
375 words
Speech time
159 secs
Arguments
Need for more discussions and research in criminology and problematic sectors
Supporting facts:
- Katarzyna Staciewa represents the National Research Institute in Poland and based her intervention on her experiences in law enforcement and criminology
- She conducted research on the metaverse and argues for the importance of it in guiding developing countries.
Topics: Law enforcement, Criminology, Research, Metaverse
Concern over the misuse of metaverse and AI technology
Supporting facts:
- She analyzed the darknet and the conversations of people potentially sexually interested in children, revealing worrying trends
- Risks include the possibility of AI-generated CSAM, or the updating of existing CSAM into VR or metaverse frames
Topics: Metaverse, Child Rights, AI-generated CSAM, VR
Report
In a recent discussion focusing on the relationship between the metaverse and various sectors such as criminology and child safety, Katarzyna Staciewa, a representative from the National Research Institute in Poland, shared her insights and emphasized the need for further discussions and research in criminology and other problematic sectors.
Staciewa drew upon her experiences in law enforcement and criminology to support her argument. She discussed her research on the metaverse, highlighting the significance of such research in guiding developing countries, which can benefit from what has already been found. The metaverse, an immersive virtual reality space, has the potential to shape the future of these countries by offering new opportunities and addressing socio-economic challenges.
Staciewa’s positive sentiment towards the metaverse underscored its potential as a tool for fostering quality education and promoting peace, justice, and strong institutions, as outlined in the relevant Sustainable Development Goals (SDGs). However, concerns were raised during the discussion regarding the potential misuse of the metaverse and AI technology, particularly in relation to child safety.
Staciewa analyzed the darknet and shed light on groups of people potentially sexually interested in children, revealing alarming trends. The risks associated with the metaverse lie in the possibility of AI-generated child sexual abuse material (CSAM) and the potential for existing CSAM to be transformed into virtual reality or metaverse frames.
The negative sentiment expressed by Staciewa and others reflected the urgency to address these risks and prevent harm to vulnerable children. The speakers placed strong emphasis on the importance of research in taking appropriate actions to ensure child safety. Staciewa’s research findings highlighted the constant revictimization faced by child victims, further underscoring the need for comprehensive measures to protect them.
By conducting further research in the field of child safety and child rights, stakeholders can gain a deeper understanding of the challenges posed by the metaverse and AI technology and develop effective strategies to mitigate these risks. In conclusion, the discussion on the metaverse and its impact on various sectors, including criminology and child safety, highlighted the need for more research and discussions to harness the potential of the metaverse while safeguarding vulnerable populations.
While acknowledging the metaverse’s ability to guide the development of developing countries and the positive impact it can have on education and institutions, concerns were expressed about the possibility of misuse, particularly with regards to child safety. The importance of research in understanding and addressing these risks was strongly emphasized, particularly in the context of the continuous victimization of child victims.
Katz
Speech speed
123 words per minute
Speech length
791 words
Speech time
386 secs
Arguments
Child rights are fundamental and must be promoted.
Supporting facts:
- Child rights is a necessary part of all societal work
- Katz’s child-focused agency promotes child rights
Topics: Child rights, Societies
Misunderstanding or misinterpretation of child rights needs to be addressed.
Supporting facts:
- Some people believe that virtual child sexual abuse material (CSAM/SEM) prevents real crime, indicating misunderstanding or misinterpretation of child rights
Topics: Child rights, Misunderstanding, Public opinion
There is a need to raise awareness about the risks and opportunities of AI.
Supporting facts:
- 20% of respondents said they don’t know about AI matters or risks indicating a need for increased public awareness and education about AI
Topics: AI risks, AI opportunities, Public awareness
Japan does not currently have any regulations or policies regarding AI-generated imagery
Supporting facts:
- Katz revealed that Japan does not have any regulations for AI-generated images
Topics: AI-generated imagery, Regulations, Japan
There is a need for more awareness and information about AI developments
Supporting facts:
- Katz suggested that the media in Japan should have more responsibility in disseminating information about AI developments
- Katz indicated that currently, people in Japan are not being adequately informed about what’s going on with AI
Topics: AI developments, Media responsibility, Awareness
Importance of rights-based approach in designing regulation policies
Supporting facts:
- In the future, the targets of a rights-based approach will increase, extending from people to AI
Topics: Regulation policies, Children’s rights, AI, Human rights
Report
Child rights are considered fundamental and should be promoted. Katz’s child-focused agency actively advocates for the promotion of child rights. However, conflicts between child rights and freedom of expression can arise. Survey results revealed such conflicts, underscoring the need for balance between these two important aspects.
Misunderstandings or misinterpretations of child rights are common and must be addressed. Some people mistakenly believe that virtual child sexual abuse material (CSAM/SEM) can prevent real crime, indicating a lack of understanding or misinterpretation of child rights. Efforts should be made to educate and provide correct information regarding child rights to combat these misunderstandings.
Regulating AI in the context of child protection is a topic under discussion. Many respondents believe that AI should be regulated to ensure child protection, particularly in relation to CSAM/SEM. However, opinions on this matter are mixed, highlighting the need for further dialogue and research to determine the most appropriate approach.
Public awareness of the risks and opportunities of AI needs to be raised. Approximately 20% of respondents admitted to having limited knowledge about AI matters and associated risks. This signifies the need for increased education and awareness programs to ensure the public understands the potential benefits and dangers of AI technology.
Japan currently lacks regulations and policies concerning AI-generated imagery. Katz’s observation reveals a gap in the legal framework, emphasizing the necessity of establishing guidelines and regulations to effectively address this issue. There is also a need for greater awareness and information dissemination about AI developments.
Katz suggests that the media should take more responsibility in informing the public about advancements and implications of AI. Currently, people in Japan are not adequately informed about ongoing AI developments, highlighting the need for improved communication and awareness campaigns.
Katz recommends that the public should gather information from social networking services (SNS) about AI developments. This highlights the importance of utilizing various platforms to stay updated and informed about the latest developments in the field of AI. A rights-based approach is crucial in designing regulation policies.
It is essential to ensure that the rights of children and humans are protected in the digital world. Advocating for the enhancement of child and human rights in the digital sphere is a vital aspect of creating an inclusive and safe environment.
In conclusion, promoting child rights is essential, although conflicts with freedom of expression may arise. Addressing misunderstandings and misinterpretations of child rights is crucial. The regulation of AI in the context of child protection requires further examination and consideration. Public awareness about the risks and opportunities of AI needs to be improved.
Japan lacks regulations for AI-generated imagery, and greater awareness about AI developments is necessary. Gathering information from SNS can help individuals stay informed about AI happenings. A rights-based approach is needed when designing regulation policies, and enhancing child and human rights in the digital world is vital.
Larry Magid
Speech speed
203 words per minute
Speech length
547 words
Speech time
162 secs
Arguments
Protection should not cost children their rights
Supporting facts:
- Larry argues that protection and children’s rights are sometimes in conflict
- He cites examples of proposed US laws that could suppress children’s rights out of alleged protection
- He cites the UN Convention that guarantees children’s rights to freedom of expression, participation etc
Topics: Children’s rights, Online safety, Legislation
Rights and protection should be balanced
Supporting facts:
- Asserts that being active in the world automatically exposes kids to some risks
- Not arguing for an unregulated space but for a balanced regulation that can protect and ensure children’s rights
Topics: Children’s rights, Child protection
Report
In the analysis, the speakers engage in a discussion regarding the delicate balance between protecting children and upholding their rights. Larry argues that protection and children’s rights are sometimes in conflict. He cites examples of proposed US laws that could suppress children’s rights in the guise of protection.
Larry also highlights the UN Convention, which guarantees children’s rights to freedom of expression, participation, and more. On the other side of the debate, another speaker opposes legislation that infringes upon children’s rights. They point out instances where such legislation may limit children’s rights, such as requiring parental permission for individuals under 18 to access the internet.
Their sentiment towards these laws is negative. Lastly, a speaker emphasises the need for a balanced approach to regulation, one that can protect and ensure children’s rights while acknowledging the inherent risks involved in being active in the world. They argue for a fair equilibrium between rights and protection.
Their sentiment remains neutral. Throughout the analysis, the speakers recognize the challenge in finding the proper balance between protecting children and preserving their rights. The discussion highlights the complexities and potential conflicts that arise in this area, and stresses the importance of striking a balance that safeguards children’s well-being while still allowing them to exercise their rights and freedoms.
Liz
Speech speed
228 words per minute
Speech length
2029 words
Speech time
534 secs
Arguments
Microsoft acknowledges its responsibility in protecting their users and especially children from harmful online content.
Supporting facts:
- Microsoft is tailoring safety interventions based on service type for an effective approach at safety.
- Microsoft also recognizes a need for balancing safety measures with considerations for privacy and freedom of expression.
Topics: Online safety, Children’s rights, Regulation
Risks from AI-generated CSAM central to Microsoft
Supporting facts:
- Microsoft has considered risks from AI-generated CSAM in its responsible AI approach
Topics: Artificial Intelligence, Online child safety, Content regulation
The application of safety by design across services
Supporting facts:
- Microsoft is looking at how safety by design can be applied across services to prevent dissemination or creation of CSAM
Topics: Artificial Intelligence, Online child safety, Content regulation
Major discussions on age assurance and children’s access to online services
Supporting facts:
- Strands of work are needed to find tech solutions for accurate age verification, whilst considering trade-offs with privacy and security.
Topics: Online child safety, Age verification, Children’s rights
The lens for economic, social, and educational power of technology should include gender elements
Supporting facts:
- She acknowledges there is a need for better understanding of gendered impacts
- She believes true understanding will only be achieved through youth participation
Topics: Gender Equality, Technology, Education, Economy, Digital Safety
Approach in the spirit of learning from others
Topics: Collaboration, Open-mindedness, Holistic Approach
Thinking about rights in addressing different harms
Topics: Rights, Harms, Policy
Report
In a recent discussion on online safety, Microsoft emphasised its responsibility in protecting users, particularly children, from harmful content. They acknowledged that tailored safety measures, based on the type of service, are necessary for an effective approach. However, they also highlighted the importance of striking a balance between safety and considerations for privacy and freedom of expression.
One speaker raised an interesting point about the potential risks of a “one size fits all” approach to addressing online safety. They argued that different services, such as gaming or professional social networks, require context-specific interventions. Implementing broad-scoped regulation could inadvertently capture services that have unique safety requirements.
Both legislation and voluntary actions were deemed necessary to address children’s online safety. Microsoft highlighted their focus on building safety and privacy by design. By incorporating safety measures from the very beginning during product development, they aim to create a safer online environment for users.
However, concerns were also raised about the current state of legislation related to online safety and privacy. It was noted that legislative efforts often lack a holistic approach and can sometimes contradict each other. Some safety and privacy legislations contain concepts that may not optimise online safety measures.
Microsoft also recognised the risks posed by AI-generated child sexual abuse material (CSAM) and emphasised the need for responsible AI practices. They are actively considering these risks in their approach to ensure the responsible use of AI technologies. The discussion strongly advocated for the importance of regulation in addressing online harms.
Microsoft believes that effective regulation and a whole society approach are crucial in tackling the various challenges posed by online safety. They emphasised the need for ongoing collaboration with experts and stakeholders to continuously improve online child safety measures and access controls.
Another key aspect discussed was the need for a better understanding of the gendered impacts of technology. It was highlighted that current research lacks a comprehensive understanding of youth experiences, particularly for females and different cultures. Additional research, empowerment, and capacity building were suggested as ways to better understand the gendered implications of technology.
In conclusion, the discussion stressed the importance of collaboration, open-mindedness, and continuous learning in addressing online safety. Microsoft’s commitment to protecting users, especially children, from harmful content was evident in their approach to building safety and privacy by design. The speakers highlighted the complexities of the topic and emphasised the need for context-specific interventions and effective regulation to ensure a safer online environment for all users.
Patrick
Speech speed
169 words per minute
Speech length
1483 words
Speech time
526 secs
Arguments
Regulation is just one of the tools in the child safety quiver, prevention, education and awareness are also critical
Supporting facts:
- Emphasis is often put on regulation due to its visibility as a commitment to child safety
- Lack of proportional investment in prevention aspects like awareness-raising and education
Topics: Regulation, Child Safety, Online Policies, Education
A unified understanding and commitment to child rights are prerequisites for any successful regulation
Supporting facts:
- There’s a huge variation in how child rights are interpreted or emphasized in different regional, cultural or religious contexts
- A transparent commitment to and culture of child rights are needed from any industry, business or government
Topics: Child Rights, Online Policies, Regulation
Governments in developing countries often model their policies and legislation on those of key countries without critiquing the inherent challenges
Supporting facts:
- Working in countries from Southern Africa, North Africa to Asia Pacific, Patrick has observed such tendencies in policy making
Topics: Policy development, Legislation, Developing countries
Countries are reluctant to update their legislation dealing with sexual violence due to the lengthy process
Supporting facts:
- The process for legislation update can take up to five to ten years
Topics: Law & legislation, Sexual violence, Policy updates
Industries and companies must take initiative to adhere to certain definitions such as AI-generated CSAM, and not wait for national policies to change
Supporting facts:
- Companies can act as frontrunners in adopting definitions and staying abreast of technologically enhanced crimes
Topics: Companies & Industries, AI-generated CSAM, Policy change
Engage and meaningfully hear from children in different contexts to understand their experiences and how they want to use the internet
Topics: Child participation, Internet regulation, Online safety
There needs to be cross-sector participation including criminologists, educators, social workers, public health, violence prevention, child rights legal experts in the conversation around internet safety
Topics: Interdisciplinary approach, Internet safety
Report
During the discussion on child safety and online policies, the speakers emphasised the importance of taking a balanced approach. While regulation was acknowledged as a crucial tool in ensuring child safety, the speakers also highlighted the significance of prevention, education, and awareness.
It was noted that regulation often receives more attention due to its visibility as a commitment to child safety. However, the lack of proportional investment in prevention aspects, such as awareness-raising and education, was seen as a gap. Addressing the specific needs of children in relation to their evolving capacities and contexts was deemed crucial.
A differentiated approach to regulation was recommended, taking into consideration the diverse services and opportunities available for children to learn digital skills. The household environment, geographical context, and access to non-digital services were identified as factors that influence children’s evolving capacities.
A unified understanding of, and commitment to, child rights were highlighted as prerequisites for effective regulation. The speakers pointed out that there is often significant variation in how child rights are interpreted or emphasised in different regional, cultural, or religious contexts.
It was stressed that a transparent commitment to and culture of child rights are necessary from industries, businesses, and governments for any successful regulation to be established. The tendency of developing countries to adopt policies and legislation from key countries without critically analysing their own unique challenges was criticised.
The speakers observed this trend in policy-making from Southern Africa to North Africa and the Asia Pacific region. The need for developing countries to contextualise policies and legislation according to their own specific circumstances was emphasised. An issue of concern raised during the discussion was the reluctance of countries to update their legislation dealing with sexual violence.
The process of updating legislation was noted to be lengthy, often taking five to ten years. This delay was seen as a significant barrier to effectively addressing the issue and protecting children from sexual violence. The role of industries and companies in ensuring child safety was also highlighted.
It was advocated that industries should act as frontrunners in adopting definitions and staying updated on technologically enhanced crimes, such as AI-generated child sexual abuse material (CSAM). The speakers argued that industries should not wait for national policies to change but should instead take initiative in adhering to certain definitions and guidelines.
The importance of engaging with children and listening to their experiences and voices in different contexts was emphasised. The speakers stressed that children should have a critical say in the internet space, and adults should be open to challenging their own thinking and assumptions.
Meaningful engagement with children was seen as essential to understanding their needs and desires in using the internet safely. In addition, the speakers highlighted the need for cross-sector participation in discussing internet safety. They recommended involving experts from various fields, such as criminologists, educators, social workers, public health specialists, violence prevention experts, and child rights legal experts.
A holistic and interdisciplinary approach was deemed necessary to address the complex issue of internet safety effectively. Overall, the discussion on child safety and online policies emphasised the need for a balanced approach, taking into account regulation, prevention, education, and awareness.
The importance of considering the evolving capacities and contexts of children, a unified understanding and commitment to child rights, and the role of industries and companies in taking initiative were also highlighted. Additionally, the speakers stressed the significance of engaging with children and adopting a cross-sector approach to ensure internet safety.
Sophie
Speech speed
137 words per minute
Speech length
1442 words
Speech time
631 secs
Arguments
Children’s digital rights are imperative for their protection and empowerment in the digital world
Supporting facts:
- General Comment 25 by the UN emphasizes the importance of children’s digital rights.
- Rights of provision, protection, and participation are vital for the digital world.
Topics: Children’s Rights, Digital World, Online Safety
Young children seek support from parents and teachers when facing online risks
Supporting facts:
- Young children want parents and other trusted adults as safety contact persons for online issues.
- As children grow, they resort to technical strategies to cope with online risks.
Topics: Online Risks, Parental Support, Teacher Support
The design of online spaces needs to be adapted according to the needs of different age groups
Supporting facts:
- Children are critical of long processing times for reports on platforms.
Topics: Online Safety, User Experience, Design
Digital Services Act in European Union is a critical tool for protecting children’s data
Supporting facts:
- The Digital Services Act will come into force in the EU next year
Topics: Digital Services Act, Child Rights, European Union
Children’s rights by design and children’s participation in regulation processes should be priorities
Supporting facts:
- GC25 addresses the right of young people to participate in decisions about digital environment
- The German Children’s Fund has conducted research and concluded that quality criteria for participation are needed
Topics: Children’s Rights, Regulation, Data Protection
Safe socio-digital spaces for children and adolescents are crucial
Supporting facts:
- Spaces should not be affected primarily by product guidelines or market-driven interests
- Civil society and educational organizations are seen as creators for safe social spaces
Topics: Safe Spaces, Digital Media, Adolescents
A holistic approach is needed to advocate children’s rights in the digital world.
Topics: children’s rights, digital world, holistic approach
Engaging all stakeholders in children’s rights advocacy in the digital world.
Topics: stakeholders, children’s rights, digital world
Report
The importance of children’s digital rights in the digital world is underscored by the United Nations. These rights encompass provision, protection, and participation, which are essential for children’s empowerment and safety in online spaces. General Comment 25 by the UN specifically emphasises the significance of children’s digital rights.
It is crucial to ensure that children have access to digital resources, that they are protected from harm and exploitation, and that they have the opportunity to actively engage and participate in the digital world. Young children often seek support from their parents and teachers when faced with online risks.
They rely on them as safety contact persons for any issues they encounter on the internet. As they grow older, children develop their own coping strategies by employing technical measures to mitigate online risks. This highlights the importance of parental and teacher support in assisting children in navigating the digital landscape and promoting their online safety.
Furthermore, the design of online spaces needs to be tailored to cater to the diverse needs of different age groups. Children, as active users, should have digital platforms that are user-friendly and age-appropriate. Children are critical of long processing times for reports on platforms, advocating for more efficient and responsive mechanisms.
It is important to consider children’s perspectives and ensure that their voices are heard when designing and developing online spaces. Human resources play a significant role in fostering safe interactions online. Children are more likely to use reporting tools that establish a human connection, thereby enhancing their sense of safety and anonymity.
The THORN study conducted in the United States supports this viewpoint and suggests that human involvement positively affects children’s willingness to report online incidents. The introduction of the Digital Services Act in the European Union is seen as a critical tool for protecting children’s data.
This legislation is set to come into force next year and aims to enhance data protection measures for individuals, including children, in the digital sphere. The act aims to address issues related to privacy, security, and the responsible use of digital services to safeguard children’s personal information.
Children’s rights by design and their active participation in decision-making processes regarding the digital environment should be prioritised. The United Nations’ General Comment 25 highlights the importance of young people’s participation in decisions about the digital space. The German Children’s Fund has also conducted research that emphasises the need for quality criteria for children’s participation in digital regulations.
By involving children in decision-making, their perspectives and experiences can inform policies and ensure that their rights are respected and protected. Creating safe socio-digital spaces for children and adolescents is of paramount importance. These spaces should not be primarily influenced by product guidelines or market-driven interests but rather should prioritise the well-being and safety of children and young people.
Civil society and educational organisations are seen as key stakeholders in shaping and creating these safe social spaces for children to engage in the digital world. In conclusion, a holistic approach is necessary to advocate for children’s rights in the digital world.
This entails promoting children’s digital rights, providing support and guidance from parents and teachers, adapting the design of online spaces to meet the needs of different age groups, harnessing the potential of human resources for safe interactions, and enacting legislation such as the Digital Services Act for protecting children’s data.
Children and young people should be actively involved in their rights advocacy and be included in decision-making processes in the digital environment. The involvement of all stakeholders, including governments, organisations, and communities, is essential in advancing and safeguarding children’s rights in the digital world.
Steve Del Bianco
Speech speed
216 words per minute
Speech length
455 words
Speech time
126 secs
Arguments
State laws requiring two forms of government-issued ID for any user of social media sites are an aggressive measure
Supporting facts:
- Steve Del Bianco pointed out that two states, Arkansas and California, were sued by his organization for implementing this rule
- Under these laws, a legal guardian or parent had to give verifiable consent for anyone younger than 18 to use a site
Topics: Child Protection Laws, Social Media, Identity Verification, Legal
Broad child protection laws can potentially limit the rights of the child to access and express
Supporting facts:
- Steve suggested that judges ruled the states were wrong to implement such laws due to the potential hindrance to a child’s rights
- Steve considers the best interest of the child as a balancing test between rights and protection from harm
Topics: Child Protection Laws, Child Rights, Information Access
Report
In the United States, the states of Arkansas and California faced legal action for implementing a controversial rule that required verifiable consent from a parent or guardian before individuals under the age of 18 could use social media sites. Steve Del Bianco, whose organization sued the states, deemed this measure aggressive.
The sentiment expressed towards this rule was negative, as it was seen as a potential infringement upon the rights of children and young individuals. The argument presented was that broad child protection laws have the potential to restrict a child’s access to information and their ability to freely express themselves.
Judges who presided over the case acknowledged the importance of striking a balance between child rights and the need for protection from harm. Steve Del Bianco, in the course of the proceedings, emphasized the significance of considering the best interest of the child.
He argued that the states’ laws should undergo a balancing test that weighs the rights of the child against their protection from potential harm. According to Del Bianco, these laws should not excessively limit a child’s access to information or their ability to express their beliefs.
Moreover, it became evident that lawmakers lacked an understanding of the broader implications of their laws. This led to legal challenges and raised concerns about the effectiveness of these policies. Del Bianco’s organization obtained an injunction that effectively blocked the states from enforcing these laws.
It was suggested that lawmakers should be educated and gain a better understanding of the potential consequences of their legislative decisions to avoid such legal challenges. To summarize, the implementation of a rule requiring verifiable consent for underage individuals to use social media sites in certain US states sparked controversy and legal disputes.
The negative sentiment towards this rule arose from concerns about potential limitations on the rights of children to access information and express themselves freely. The need to strike a balance between child rights and protection from harm was highlighted. Additionally, the lack of understanding by lawmakers about the broader implications of their laws was emphasized, underscoring the importance of better education and consideration in the legislative process.
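To make concrete what such a rule demands of a platform, the following is a minimal sketch, in Python, of the kind of registration gate the Arkansas and California measures describe. Every name in it, including the two-ID check and the consent token, is a hypothetical illustration for discussion, not any state’s actual statutory mechanism or a real platform’s API.

```python
from dataclasses import dataclass
from datetime import date

ADULT_AGE = 18  # age threshold in the state rules discussed above

@dataclass
class SignupRequest:
    birth_date: date
    # Hypothetical: the two forms of government-issued ID the rule required
    government_ids: list[str]
    # Hypothetical token proving a parent or guardian gave verifiable consent
    guardian_consent_token: str | None = None

def age_on(birth_date: date, today: date) -> int:
    """Age in whole years as of `today`."""
    years = today.year - birth_date.year
    # Subtract one if the birthday has not yet occurred this year.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def may_register(req: SignupRequest, today: date) -> bool:
    """Apply the (hypothetical) statutory gate: every user must present two
    government IDs, and minors additionally need verifiable guardian consent."""
    if len(req.government_ids) < 2:
        return False  # the ID rule applied to *any* user of the site
    if age_on(req.birth_date, today) >= ADULT_AGE:
        return True
    return req.guardian_consent_token is not None

# Example: a 16-year-old with two IDs but no guardian consent is refused.
teen = SignupRequest(birth_date=date(2009, 5, 1),
                     government_ids=["passport", "state_id"])
assert not may_register(teen, today=date(2025, 6, 1))
```

Even this toy version makes the courts’ concern visible: a minor who cannot produce two forms of ID or a consenting guardian is locked out entirely, regardless of the content they sought to access.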
Tasneet Choudhury
Speech speed
159 words per minute
Speech length
69 words
Speech time
26 secs
Arguments
Ensuring AI strategies, policies, and ethical guidelines protect and uphold child rights across the world, especially in developing countries like Bangladesh
Topics: AI strategies, child rights, ethical guidelines, developing countries
Report
During the discussion, the speakers highlighted the importance of ensuring the protection and promotion of child rights within AI strategies, policies, and ethical guidelines. They particularly emphasized the significance of these efforts in developing countries, such as Bangladesh. The speakers stressed the need to include provisions that safeguard child rights in AI policies, especially in nations still in the process of development.
The speakers also connected their arguments to the Sustainable Development Goals (SDGs), specifically SDG 4: Quality Education and SDG 16: Peace, Justice, and Strong Institutions. They proposed that by embedding measures to protect child rights in AI strategies and policies, countries can contribute to the achievement of these SDGs.
This link between AI development and the attainment of global goals highlights AI’s potential role in promoting inclusive and sustainable development. Although no specific supporting facts were mentioned during the discussion, the speakers expressed a neutral sentiment towards the topic.
This indicates their desire for a balanced and equitable approach to integrating child rights into AI strategies and policies. By addressing this issue neutrally, the speakers emphasized the need for a comprehensive and ethical framework that protects the rights and well-being of children in the context of AI development.
One notable observation from the analysis is the focus on child rights in the discussion of AI policies. This underscores the growing recognition of the potential risks and ethical implications that AI may pose for children, particularly in countries with limited resources and regulations.
The emphasis on child rights serves as a reminder that as AI continues to advance, it is crucial to ensure that these technologies are developed with the best interests of children in mind. In conclusion, the discussion underscored the importance of protecting and upholding child rights within AI strategies, policies, and ethical guidelines.
The speakers highlighted the specific significance of this endeavor in developing countries like Bangladesh. The incorporation of child rights in AI policies aligns with the Sustainable Development Goals of Quality Education and Peace, Justice, and Strong Institutions. The neutral sentiment expressed by the speakers indicates the need for a balanced approach to addressing this issue.
Overall, the discussion shed light on the need for a comprehensive and ethical framework that safeguards the rights of children amidst the development of AI technologies.