WS #235 Judges on Human Rights Online

18 Dec 2024 06:45h - 08:15h


Session at a Glance

Summary

This discussion focused on the challenges and opportunities of integrating digital technologies and artificial intelligence into judicial systems. The panel, which included judges, lawyers, and technology experts, emphasized the importance of engaging the judiciary in internet governance forums to address emerging digital rights issues.


Key points included the need for judges to adapt to rapidly evolving technologies, with examples given of how AI is already being used in courts for tasks like scheduling, case assignment, and legal research. Panelists stressed the importance of developing comprehensive legal frameworks to safeguard digital rights, privacy, and transparency in the digital age. They also highlighted challenges such as cross-border jurisdiction issues in cybercrime cases and the need for better training and resources for judges in developing countries.


The discussion touched on the potential benefits of AI in improving judicial efficiency, while also cautioning about risks like AI hallucinations and the need for human oversight. Panelists agreed on the importance of including marginalized groups in digital justice initiatives and called for more inclusive policies.


The session concluded with calls for greater collaboration between the judiciary, technology experts, and policymakers. Participants emphasized the need for ongoing education and capacity building for judges and lawyers to keep pace with technological advancements. Overall, the discussion underscored the critical role of the judiciary in shaping the future of digital rights and internet governance.


Keypoints

Major discussion points:


– The importance of including judges and the judiciary in Internet governance discussions and forums


– Challenges judges face in handling digital evidence and cybercrime cases


– The use of AI and technology in judicial processes and decision-making


– The need for legal frameworks and capacity building to address digital rights issues


– Ensuring access to digital justice for marginalized groups


Overall purpose/goal:


The main goal of this discussion was to explore ways to engage and empower judges to address digital rights issues and adapt legal systems to the challenges of the digital age. The session aimed to highlight the importance of judicial involvement in Internet governance.


Tone:


The overall tone was informative and collaborative. Speakers shared insights from their experiences in different countries and contexts. There was a sense of enthusiasm about bringing judges into Internet governance discussions for the first time. The tone became more urgent when discussing challenges, but remained optimistic about finding solutions through cooperation and capacity building.


Speakers

– Nazarius Kirama: Moderator, Tanzania Internet Governance Forum


– Umar Khan Utmanzai: Advocate, practicing law at Peshawar High Court, Pakistan


– Martin Koyabe: Cybersecurity expert, Global Forum on Cyber Expertise


– Eliamani Isaya Laltaika: Judge, High Court of Tanzania


– Rachel Magege: Lawyer specializing in data protection and governance, Tanzania


Additional speakers:


– AUDIENCE: Various audience members who asked questions


Full session report

Revised Summary of Judicial Engagement in Internet Governance Discussion


This comprehensive discussion focused on the challenges and opportunities of integrating digital technologies and artificial intelligence (AI) into judicial systems. The panel, comprising judges, lawyers, and technology experts, emphasised the critical importance of engaging the judiciary in internet governance forums to address emerging digital rights issues.


Key Themes and Discussion Points:


1. Importance of Judiciary Engagement in Internet Governance


The panellists unanimously agreed on the necessity of including judges and the judiciary in Internet governance discussions and forums. Judge Eliamani Isaya Laltaika stressed that judges need to understand digital issues to properly adjudicate cases, while moderator Nazarius Kirama pointed out that the judiciary has been notably absent from Internet Governance Forum discussions. Advocate Umar Khan Utmanzai highlighted the need for legal frameworks to be updated to address digital rights effectively.


There was a strong consensus that judges should embrace AI and other technologies to improve court processes. However, this point also revealed some differences in approach. While Judge Laltaika advocated for enthusiastically embracing AI, Utmanzai cautioned about the challenges judges face due to lack of training and understanding of technology.


2. Challenges in Applying Law to Digital Spaces


The discussion highlighted several key challenges that judges and legal systems face in the digital age:


a) Complexity of Digital Evidence: Martin Koyabe, a cybersecurity expert, emphasised that digital evidence is complex and requires new skills from judges to interpret and use effectively in court proceedings.


b) Cross-border Jurisdiction: Utmanzai pointed out that the cross-border nature of the internet creates significant jurisdictional issues for courts, particularly in cybercrime cases. He elaborated on the difficulties judges face in determining jurisdiction, collecting evidence, and enforcing judgments across borders.


c) AI Hallucinations: Judge Laltaika raised concerns about AI systems potentially “hallucinating” and presenting inaccurate information, which could have serious implications in legal proceedings.


d) Lack of Precedent: Utmanzai noted that the lack of precedent in cyber cases creates difficulties for judges in making consistent and informed decisions.


3. Strategies for Improving Digital Rights Protection


The panel and audience members proposed several strategies to address these challenges:


a) Comprehensive Legal Frameworks: There was a call for the development of comprehensive legal frameworks specifically designed to address digital rights issues. Martin Koyabe emphasised the need for robust digital frameworks in countries.


b) Cross-border Collaboration: Audience members suggested strengthening cross-border judicial collaboration to tackle jurisdictional challenges in cyber cases.


c) Judicial Training: Utmanzai emphasised the need for increased judicial training on technology issues to bridge the knowledge gap. This includes incorporating cyber law into law school curricula.


d) Showcasing AI Benefits: Rachel Magege, a lawyer specialising in data protection, suggested demonstrating the benefits of AI to increase its acceptance within the legal community.


e) Judiciary Global School on Internet Governance: Judge Laltaika announced the launch of this new initiative to train judges on internet governance issues.


4. AI Integration in Judicial Systems


Judge Laltaika shared insights on the use of AI in Tanzanian courts for various purposes:


– Scheduling court sessions


– Case assignment to judges


– Language translation


– Legal research assistance


He also mentioned an upcoming session on AI ethics for judges, highlighting the proactive approach to addressing AI-related challenges in the judiciary.


5. Inclusivity in Digital Rights


The discussion touched on the importance of ensuring inclusivity in digital rights:


a) Nazarius Kirama stressed the need for policies to prevent digital exclusion of marginalised groups.


b) Rachel Magege highlighted how gender-based violence can be exacerbated online and how the digital divide affects access to justice.


c) Martin Koyabe emphasised that frameworks should embed human rights protections to ensure inclusivity.


Thought-Provoking Insights:


1. Judge Laltaika shared an anecdote about a high court judge in East Africa who was summoned by a disciplinary committee for allegedly using ChatGPT in writing part of a judgment, highlighting real-world challenges of AI use in judicial processes.


2. Judge Laltaika provided a positive example from Tanzania, where a strategic five-year plan was developed to identify judiciary needs and secure executive support for technological advancements.


3. Martin Koyabe praised Tanzania’s approach to developing its cybersecurity strategy, which embedded fundamental tools and instruments within the strategy.


4. Umar Khan Utmanzai described the situation in Pakistan, where many high court judges lack basic knowledge about the internet and AI due to their isolation from public interaction and outdated legal education.


5. Judge Laltaika used a metaphor of rebuilding a house to illustrate the need for adapting the judiciary to accommodate the digital world.


Resolutions and Action Items:


1. Launch of the Judiciary Global School on Internet Governance to train judges on IG issues.


2. Plan to include more judges and legal practitioners in future IGF meetings, including a parliamentary track room session on this topic.


3. A Tanzanian judge’s commitment to request government sponsorship for lawyers to attend the next IGF.


4. Tanzania Internet Governance Forum’s initiatives to engage judges in IG discussions.


5. Recognition of UNESCO’s guidelines for AI use by judiciaries.


Unresolved Issues and Future Considerations:


1. Balancing judicial independence with the need for technology adoption.


2. The extent to which court processes should be digitised (e.g., online marriages).


3. Addressing AI hallucination and ensuring the accuracy of AI-generated legal information.


4. Funding and resource allocation for judiciary digitisation in developing countries.


In conclusion, this discussion underscored the critical role of the judiciary in shaping the future of digital rights and internet governance. It highlighted the urgent need for judges to adapt to rapidly evolving technologies while emphasising the importance of developing comprehensive legal frameworks to safeguard digital rights, privacy, and transparency in the digital age. The session called for greater collaboration between the judiciary, technology experts, and policymakers, emphasising the need for ongoing education and capacity building for judges and lawyers to keep pace with technological advancements.


Session Transcript

Nazarius Kirama: We are on Channel 5 for the session. So we wait for like two minutes before we start. I'm Dr. Nazarius Kirama, I'll be moderating the session and my fellow Daniel Turan will be moderating online, and Atanas will be our reporter, and I would like to take this opportunity to welcome all of you to this very important session, and this session is happening for the first time during the lifetime of the Internet Governance Forum. It is the first time that we're going to have this, and we hope that next year will be bigger and the years after that. If you can put on the presentation, please. Just a moment. So like I said at the beginning, my name is Nazarius Nikola Kirama, a.k.a. Nazar Nikola, so that is my digital identity name, and today I'm going to be your pilot, and I have my co-pilot, Mr. Turan from Italy, and today we're going to have a session on challenges on human rights online. The overview of the session: at this time when we live in the age of artificial intelligence, the digital age actually presents challenges that require safeguards on human rights online, and when you are talking about things like privacy, freedom and inclusion, these are some key issues that need to be taken into account, so our session will focus on actually ways in which we can empower judges to address digital rights, and also we will explore some legal frameworks for inclusion. Like you see at the beginning, I said the judiciary, since the formation of the Internet Governance Forum in 2005, the judiciary has not been engaged in this space, perhaps because of that notion of independence of the judiciary, but the judiciary as one of the branches of government needs to be included in this space so they can not only learn, but also engage properly in the debates for how the Internet should be governed. Our attempt as Tanzania Internet Governance Forum is to engage judges, and we have had like two initiatives that are aiming for that, so you can see in the IGF multi-stakeholder model we are bringing the judiciary to be part of. Can you hear me now? Okay. The aim of this session is to make sure that there are as many stakeholders as possible for collaboration and to protect digital rights online, and the objective is to adapt legal systems to safeguard privacy and freedom of expression, address cross-border enforcement challenges, foster inclusive policy for marginalized communities, and build judicial capacity in Internet Governance. So these are the kind of objectives that we are going to attempt to address through our esteemed panels. And the expected outcomes: we believe the speakers will be able to address the legal systems better and also engage in terms of privacy and freedom of expression. Judicial collaboration is one of the expected outcomes, in terms of a roadmap for cross-border cooperation in enforcing digital rights consistently and fairly; inclusivity in digital policies, that is, the kind of recommendations that we expect in terms of inclusive laws addressing marginalized and disabled communities to prevent digital exclusion. And the final expected outcome, identification of tools and strategies to enhance judicial engagement in Internet Governance. I would like to take this opportunity to thank all of our session makers, including the Honorable Dr. Judge Eliamani Laltaika for the input, myself, Rachel Magege, who is joining us online, Daniel Tura from Italy, who is our online moderator, Atanas Bazihire, who is our on-site rapporteur and is around here, Dr. Martin Koyabe, who is our cybersecurity expert, sorry for the typo, and Pamela Chogo, who is a lecturer and a rapporteur online. We are also very glad today to introduce to you the new baby in town, the Judiciary Global School on Internet Governance, which has been registered by the Dynamic Coalition on Schools of Internet Governance. This basically is a platform for judges to learn about Internet governance, and it is an initiative of Tanzania IGF, ISOC Tanzania Chapter, and the Organization for Digital Africa. And we thank Dr. Eliamani, Honorable Judge from the High Court of Tanzania, for his enormous input and counsel on this initiative. So in the years ahead, we expect to continue to train judges from various jurisdictions around the world, and we look forward to engaging this critical branch of government in the IG space. We thank all of them for the opportunity that we have created to engage judges in this space. Now I would like to take this opportunity to introduce our speakers. The speakers will have like one minute to introduce themselves. You can say your name, I mean, your expertise, and the organization you come from, and then from there we will continue. Welcome to the session, ladies and gentlemen, and I look forward to your interaction. We'll have a Q&A session, and we look forward to receiving your questions, and we hope and believe that we have in front of you our speakers who are very capable of answering and interacting with these questions. Thank you. We start with Umar from Pakistan, Advocate Umar from Pakistan.


Umar Khan Utmanzai: Hello. Thank you so much, moderator of this session. This is Advocate Umar. I basically belong to Peshawar, Pakistan, practicing law at the Peshawar High Court, and I run a civil society organization with the name Citizen Rights and Advocacy Forum (CROP), which basically targets awareness and sensitization around citizens' rights. Digital rights and privacy protection is one of our objectives: sensitizing the local community that access to a free and fair internet is your right, and along with that, how we can protect them and how we can sensitize them in terms of cyberbullying, especially women, who are very vulnerable in Pakistan. So this is what we are doing. Along with that, I'm taking these cyber cases in Pakistan, and so this is what we are doing in Pakistan. Thank you.


Nazarius Kirama: Dr. Koyabe.


Martin Koyabe: Yeah, thank you very much, Naz, and first of all, let me take this opportunity on behalf of the Global Forum on Cyber Expertise, which I work and consult for, to really thank the organizers and the judge for actually pushing this idea. I remember we talked about it in Kigali at some point, but I'm really glad to see it here. So my name is Dr. Martin Koyabe, and my main area of work is cyber security. I've worked in Africa extensively in a number of countries, specifically looking at strategy development but, more importantly, at building the ICT sector within the continent. We'll talk more within this session, and I'm really pleased to see some of you here today. Thank you.


Eliamani Isaya Laltaika: Thank you very much, facilitator. My name is Eliamani Isaya Laltaika. I'm a judge of the High Court of Tanzania and an adjunct faculty member of the Nelson Mandela African Institution of Science and Technology, where I taught cyber security law, bioethics, and other law-related courses to scientists for at least 10 years. And it's true that, like we will discuss, I very much believe that without engaging judges while building the digital economy, we will be shooting ourselves in the leg, because if you have very good policies, very good laws, and you bring a case before a judge, and that judge cannot even tell what a mouse is from another device, then you are just doing nothing. So I believe that judges are a fulcrum of any attempt to protect rights, be it digital or physical. And it is, therefore, my firm belief that in the next few years, we will have a lot of judges, and that is better for humanity. I have a dream.


Nazarius Kirama: Thank you so much, Judge, for always being there for us. I know we have had a lot to work on in terms of making the dream that Dr. Koyabe was talking about. In Kigali, it was just a dream, but now it is a reality. And we hope to continue to borrow your wisdom, engaging the judges into the intergovernance space. Now I will start with questions to our able speakers. I will start with you, Honorable Eliamani. Could you tell us why you have been advocating for inclusion of the judiciary in the IGF? I mean, why is it important?


Eliamani Isaya Laltaika: Yeah. Thank you very much. Hello? Yeah, I can hear you. Yeah, thank you very much. Like I said, this is really something very close to my heart, and it has a personal story. After my PhD, I was employed by the Nelson Mandela African Institution of Science and Technology. It is one in a network across the continent. Our elder Mandela had a dream of making Africa a hub of skills and knowledge and capabilities in STEM, science, technology, engineering, and mathematics. So he engaged with funders and donors from across the world, and through IMF and World Bank, they wrote a concept of establishing the MIT of Africa. They said, OK, we really cannot have one single MIT for Africa because of language barriers. There are countries which are Arabic, others are French, others are English, others are Portuguese. We also cannot have one MIT for the whole continent because education systems differ. At the end, they decided to establish four institutions, and I'm very glad that through diplomacy and efforts of the government of Tanzania by then, through President Jakaya Mrisho Kikwete, who before that was in foreign affairs, one of those prestigious institutions was established in my hometown of Arusha. As soon as I finished my PhD, I got into this institution, and I was tasked with preparing curriculum and course outlines to teach cyber security law, intellectual property, bioethics, and every other law that is required by scientists. It was while reading the available materials and cases that I discovered that there was a huge knowledge gap between judges or lawyers on one hand and scientists on the other. From there, I became a link to try and bridge the gap. I conducted many seminars with the law society and also with the judges to try and alert them on fundamental issues related to ICT. I didn't know that I would become a judge by then, but five years, six years later, I was appointed by the President to become a judge of the High Court in 2021. My friend, Nazar Kirama, doctor here, used to work with me for Tanzania IGF before I became a judge. One day, he came very humbly and tried to ask me whether after being a judge, I would still come to IGF. He was almost trembling, so I just gave him a hug and said, please sit down. It's okay. I'm still the same. We traveled with him to Kigali, where Dr. Koyabe is staying. Throughout that trip, I was saying I need to have judges included because I'm the only one who is getting this knowledge. Luckily, we traveled to Kyoto. During the closing ceremony of last year's IGF, I was given a platform. After Prime Minister Suzuki, I took the stage and I made a joke. I said, there are about 9,000 people here, but I'm the only judge. What happened? I really pray that next year, we have more judges. I'm very glad that the IGF Secretariat took this very seriously. They wrote me an email back in Tanzania and said, we have started. Your colleague, Dr. Kirama Nazar, has brought a proposition. Today at 12, you will now see a session established by IGF, which will cover the ethics of judges and the use of artificial intelligence. Because some judges are afraid of coming here because they think artificial intelligence is eroding their ethical values. So we have a session today to answer, does use of AI make you a better judge or a worse judge? And we'll have a very open discussion to make sure that judges all over the world gain the confidence to come with us. I will stop there for now because I have so many stories to tell.


Nazarius Kirama: Thank you so much, Judge Laltaika. And I think you're one of the judges that, I dare say, is really down to earth. We have Advocate Rachel Magege trying to join us online. And as soon as she joins us, we will have her introduce herself. And we continue with Advocate Umar from Pakistan. Advocate Umar, I know you have been in the space of lawyering for several years. Do you think judges need to learn new tools of statutory interpretation to accommodate human rights in the digital space?


Umar Khan Utmanzai: The digital world is growing rapidly and not just law, every field is going to adopt it. So I believe, having been in the legal fraternity for the last three years, that it is very necessary for the judges as well to adopt the new tools just to be able to try and deal with the cases related to cyber crime and other issues. Because in the traditional system of law, you do not have such opportunities to try cases related to hate speech and freedom of expression. So there are issues, like we have seen the IGF Secretariat has established the Charter of Human Rights and Principles for the Internet, which has been derived from the UDHR 1948, which is important. So I think those rights which have been available to the citizens, to humans, under the UDHR have now come through the Charter of Human Rights and Principles for the Internet. And so I believe that as a judge, you have those things which are not developing the way the internet has developed, the way AI will in the coming years bring more things into the era. So I believe that for the judge, it is very important to adapt himself to the tools which citizens are going to have in the field of law. And the judges who are going to deliver justice are one of the most important stakeholders, as the honorable judge of the High Court of Tanzania has mentioned in a very brilliant way when explaining why he has always had an interest in working on digital rights. So I believe that in the coming years, and even nowadays, it is very important for the judges to adapt to the technology, to the new tools, to interpret the laws. And I believe that the traditional, old laws do not deal with, do not answer and address the issues which currently exist in the system of the digital world.


Nazarius Kirama: Thank you, Advocate Umar. And I think that is very important for judges to be able to keep abreast with the emerging technologies and the tools that they need to deliver justice on time. Because we know justice delayed is justice denied and technology tools have this capacity to be able to shorten the time with which the justice can be delivered. Now, we have Madam Advocate Rachel Magege. Can you hear us?


Rachel Magege: Yes, yes, good morning. I can hear you loud and clear. Can you hear me?


Nazarius Kirama: Yes, we can hear you, Rachel. Thank you so much for joining us online. If we can take one minute to introduce yourself and your institution, and if you can answer this question after that: Advocate, can gender-based violence be exacerbated online? That will be your question after your introduction. Welcome, yes.


Rachel Magege: Thank you, thank you so much. And good morning to everyone. I am Rachel Magege. I am a Tanzanian citizen and I am a lawyer. My areas of practice are in data protection, data governance more generally. And I am very, very honored to be a part of this panel and a part of this conversation. I sit on the board of directors under the Tanzania Privacy Professionals Association. We have an association of privacy and data protection experts in the country. And so I’m really glad to be having this conversation. So Mr. Nazarius, on your question about gender-based violence and the digital and online platforms. Absolutely, gender-based violence can be further moved and amplified and can become even much greater because of all of these different digital and online platforms. It is very important for people to know that many times what happens in the physical is what is going to move and happen even in the online and digital spaces. Likewise, what happens in the online spaces with people who do not even know each other with regards to perpetuating harassment and bullying to specific genders also can move into the physical. So I’m usually very sensitive about this topic with the many clients and the many people that we talk to but even in the judiciary sector, because already if you’re looking at, for example, the physical space where with different backgrounds, different relationships and different structures like that, if it’s already happening in the physical and you have an institution that wants to introduce artificial intelligence or other different technologies, that same mindset is still going to happen even in the online and digital platforms. So it’s very important to make sure that in as much as people are working towards understanding and learning these emerging technologies, they also need to understand and learn more about themselves and about the biases that they carry, which can transfer even into the digital and online space. That’s a very brief response I can give for now. I’m happy to answer more questions as they come. Thank you.


Nazarius Kirama: Thank you very much, Advocate Rachel, for your… And indeed, I was tickled by what the Honorable Judge said at the beginning, that it is very important for the judiciary to be engaged in all these issues because cases about the internet or internet-related issues will always end up in courts. So the judiciary will always be the one that needs to know the issues. And this is what we are trying to achieve in terms of engagement of the judiciary in the IG, internet governance, space. Honorable Judge, how can governments make sure that privacy laws keep up with the fast-changing technology? Because now we have artificial intelligence, we have all these blockchains, and there is resistance against all of them. How can governments from across various jurisdictions keep up with the fast-changing technology?


Eliamani Isaya Laltaika: Judge, Facilitator, one of the issues that are important for the judiciaries to do is to breathe life to legislation. When a law is enacted, be it at international level or at national level, it is a document which is more or less formless. The definitions are vague, the rights are not understandable. Sometimes if it’s an international instrument, it can stay in the cupboards for many years, but as soon as this law goes to the court, the court gives an interpretation, it breathes life, it makes that law something that people can relate with. And that is what we are trying to do currently with the data protection and privacy laws. For example, the first thing that people do not distinguish out there, lay people and many of our technocrats, is that data protection and privacy are totally different. Many people think data protection and privacy are the same. They are not. Before a judge, a judge knows that privacy is part of the human rights, but data protection and protection of one’s information is a fundamental right. If you look at the European Convention, this is article seven, article eight, and they are totally different. In our country, for example, in Tanzania, all data protection cases start with the commission because it’s a fundamental right that is protected administratively. And right issue, you go directly to the high court. You don’t even go to the administrative procedures. If a lower court recognizes that they have a case touching on fundamental human rights, they immediately ask you to go to the high court. So at the moment, what I’m seeing throughout jurisdictions is that judges are defining concepts. Judges are clarifying concepts. Judges are putting, laying down the foundations of integrating fundamental privacy and data protection laws within the larger fabric. Breaches are done by the private sector. It is the private sector which manufactures these gadgets you are seeing. But it is the role of the government to regulate them. The court stands there as an umpire to say, OK, the manufacturer is responsible for protecting privacy. When you release a headset or a computer or whatever gadget, it is upon the regulator to make sure.


Martin Koyabe: It is stored in digital form in zeros and ones. So what that means, there is a huge fundamental challenge in how we actually handle the digital evidence. The second issue is that it is very volatile. Because if you either change the time when it is stored, or if you move it from one point to another, it might actually alter the actual evidence that is attributed to that digital evidence. There are some specific areas that need to be considered. It also traverses boundaries. Because you can carry digital evidence in a USB. You can carry it in a memory stick. You can carry it on any other device. And then move it from one jurisdiction to another. And that brings a huge amount of challenges when it comes to the judgment, when it comes to how we have to handle that issue. So therefore, the processing of evidence has to be based on the principles that have been agreed across jurisdictions. There is also the challenge of the dependency of digital evidence on technology. You will need a device, or at least something that will interpret what is referred to as digital evidence into a form that is admissible. And therefore, the authenticity that the judge talked about of that device being used to interpret that digital evidence becomes fundamental. And therefore, it is important that we understand the realm of association and where we are going when it comes to the issues of evidence that is admissible in court. The other issue that we have to be cognizant of is that when you consider the digital evidence, and when you look at how we need to make sure that we are able to admit it in court, then the preservation of that evidence becomes fundamental. And all this requires what we call expertise. Some expertise in terms of processing, digital evidence officers, and so forth. And then the last point that I wanted to point out is that we have a duty for those of us who work with the local government to ensure that the experts that we have, who provide maybe expertise in courts, who provide what we call the forensic expertise within the judicial system, are compensated well enough so that we can preserve that knowledge in this particular institution. So that's how I view the whole concept of digital evidence and how fundamental it is when it comes to having court cases that are very, very fundamental. Let me just give you an example, just before I end, of what happened in a country that I know of where there was evidence of a crime that had occurred in a specific room. And this evidence was actually on the screens of the computer. But because the collection was so rudimentary, the police and other officers who went to collect this evidence simply pulled the cable, and what was on the screen disappeared, which was critical for judgment within the law courts. So there's that bit. There's also the issue around when you issue a court order. In some cases, court orders are supposed to be issued in written form. So some laws are so arcane, they cannot accept any other form of court order unless it has been written and delivered to the individual person who is supposed to appear. So we have fundamental differences in how we need to update our courts, our judges, and also our legal framework. But we'll come to that issue at some point. Thank you.


Nazarius Kirama: Thank you, Dr. Koyabe. And I think now we are almost going to finish this first segment of the presentations from the speakers. And before we take the questions, I would like to ask Rachel, what steps can policymakers take to ensure people with disabilities and marginalized groups have equal access to digital tools and services?


Rachel Magege: All right. Thank you so much for that question. Here’s what I’m going to say. A lot of frameworks and laws and regulations, subsidiary legislation have already been put in place. And whereas it is good to have a specific law that mentions a specific person or group of people, what I see as more beneficial is for policymakers to continuously remind and maybe even educate the implementers and the regulators to be very creative and deliberate in how they carry out the provisions of the law. Yes, because it is good to say that we want a specific law on digital platforms and the digital innovation space to say this and this about people with disabilities or the elderly and this and that. But if you already have a law, for example, the People with Disabilities Act, yes, and it’s very general, as with many…


Nazarius Kirama: Rachel, I think we lost you. Hello? Anyway, I think we can proceed. Did we lose her? Anyway, I think I will continue with Advocate Umar on… Is Rachel online? No. Can you hear me, sir? Yes, we can hear you.


Rachel Magege: My sincere apologies, the Zoom link just threw me out, but I'm back now. Let me just quickly wrap up my submission on this. So what I was saying was that as long as you have laws that are already in place, the implementers and the regulators have to be very creative in knowing that. For example, I have a provision that says everyone needs to have access to clean and safe water. All right? You as the implementer, as a regulator, are going to go to a place where you will have little children, you will have the elderly, you have people with physical disabilities, and you may have people who have visual impairment, hearing impairment, and things like that. You have to be very creative in looking at all of these different groups and seeing that in order for all of them to get clean water, this person is going to need this, this person is going to need that, this person will need extra assistance in this, but at the end of the day, all of them get water. So it is the very same with digital platforms and with the digital space. And we're already seeing a lot of these things come up once again, because what you are doing is at the end of the day, the policy makers will get to say, we have created this law. And we're already seeing that when it is executed and implemented, that maybe young students or young girls or people with disabilities are all having their needs met, but in a different way as required for each of their different needs. So that's the biggest thing that I would say, Mr. Nazarius. I know Tanzania has already been working on a number of different pieces of legislation, for the benefit of the members in the audience to know. And currently the ICT ministry is working on a national artificial intelligence framework and strategy. I do remember sitting down with them and with some development partners where we advised them that inasmuch as you are now, with the ministries, conducting different assessments and different needs assessments and impact assessments in the country, you have to make sure that this law and this framework that you are creating, first of all, aligns with all the other ICT frameworks in the country, and especially with the Data Protection Act of Tanzania, but also make sure that it caters to every different group of people. So that is the feedback that I can give right now. And I know that once a regulator, once an implementer, you know, the commission, regional commissioner, district commissioner, once they are creative in how they carry out this law, it is actually also going to reduce the number of complaints and lawsuits that Dr. Laltaika gets and receives in court. It's going to reduce that a lot because you will see that this implementer, this government official has actually taken the time to understand and see what the policies have and what different groups of people need and how those different needs can be met. Thank you very much.


Nazarius Kirama: Thank you advocate for, you know, bringing the pieces from the perspective of marginalized and, you know, and sections of the society. Now we are going to go to the audience, but before we go to the audience in the room, we’d like to have a question from Zoom. There is a person from Zoom. There is a question for the court, for the judge. So if…


AUDIENCE: Can you hear me?


Nazarius Kirama: Yes.


AUDIENCE: We have a question for our honorable judge Eliamani. Nana is asking what perspectives are there for the use of AI to support, refine, clarify, enhance, or influence decisions for judges? This is a question from an operational point of view.


Nazarius Kirama: Judge, that comes to you.


Eliamani Isaya Laltaika: Thank you very much for that brilliant question from our Zoom attendee. It is now an open secret that AI is used in the courtroom, including by judges and their assistants. However, there are no guidelines or regulations, and there are very, very different perspectives on how jurisdictions are embracing artificial intelligence. Within the East African community, and I'm not going to mention countries here, there is, in one country, a high court judge who was summoned by the disciplinary committee because a part of his judgment was allegedly written by ChatGPT. I think there was a four and a zero that looked like ChatGPT 4, and some sentences which were not very legal, and he was called to answer, and many lawyers have been reprimanded, and they got scared. Within the same East African community, just across the border, a chief justice is saying, please embrace generative AI. Make use of these tools to clarify presentations by lawyers, and at the end of the day, you are responsible for what you say, but use any tools. Now, we are not sure what happens because these are countries sharing a border, sharing a history of development of law from the British. However, luckily, we now have UNESCO. We talked yesterday, when we were launching the guidelines for use of AI by judiciaries that are being pioneered by UNESCO. I hope that in the next few months or two years down the line, we will have clear guidelines. To answer specifically, from my jurisdiction, we are using artificial intelligence in the court in Tanzania in at least four ways. One, scheduling. We no longer put our schedules together manually. It is automated. There is an electronic case management system where I know the cases that will come to my chamber in two weeks or next month because they are just automated. Secondly is assigning cases. The judge in charge is no longer the only person who assigns cases. In the past, people would say, this boss is giving me investigations with advocates who are sole complainants. Nowadays, it is the artificial intelligence which just says, judge one, judge two, judge three. Cases are filed. The artificial intelligence says, Laltaika, so-and-so, so-and-so. So you are sure that there is no bias. You handle your file very well. Thirdly, we use artificial intelligence in language translation. Our country uses Kiswahili, which is the national language, but the language of the court is English. So if someone speaks, we have already deployed TTS, transcriber and translation systems, where you can transcribe what a person is saying and translate it back. Fourthly, we use it intensively for research. And I said yesterday that when ChatGPT started in 2022, 2021, it was very inaccurate. You would get fake cases. It would come up with cases which are not anywhere in the law report. It has changed dramatically. Now you can be very sure that it provides you with cases that have been decided by the Court of Appeal of Tanzania. Only the judge, at last, has to go and say, this is relevant here, this is not. And lastly, to finalize, we are actually using AI to provide simplified language versions of texts, including legislations or acts of parliament.


Nazarius Kirama: Thank you, Judge, for your critical intervention. And now I would like to open Q&A to the audience in the room. If you have any question to any of the panelists, you are welcome to see your hands. Please, if you can get the microphone.


AUDIENCE: You hear me?


Nazarius Kirama: Yeah, okay, here you go.


AUDIENCE: Thank you so much. Thank you so much. Really appreciate it. This has been a very interesting conversation. I also attended the other session on AI in the judiciary with the Honorable Judge as well. I personally learned a lot, and thank you for the other panel. I wanted to comment on the fact that there are a lot of people in the room who are not familiar with AI. And I think it's very important to comment because we tend to talk a lot about a specific context, but there are other countries that have a different context, for example, in terms of their maturity to use technology. And I'm from, obviously, I'm from Iraq, and I wanted to talk about, first, the political will in the country to use technology, and the judiciary, I know that they are independent, but they are always affiliated somehow with the political, in line with the political will. Before getting to the question, I wanted to comment on what the Honorable Judge also said, that it's unfortunate that we don't have many judges, actually, or lawyers, or legal practitioners, even in the room, that we are very few. So hopefully, in the next year, we will have more judges and more lawyers and more legal practitioners in the room to share their experiences and views on that. My question will be for the Honorable Judge, because we are advocating to bring in all the stakeholders, like in my country, for example, including the judiciary. For a judge who has been heavily depending on paper, and not using technology, as you mentioned in the beginning of your speech, how would you convince someone who needs to change all this, who needs to spend more time to learn technology, to hire experts to preserve evidence, for example, analyze evidence, and things like that? What are the main three arguments, let's say, that you will be using when you advocate for that change? And the other question, do you think at the beginning, judges will need, like in addition to investigators, someone who will be tech-savvy, for example, to help with all this kind of evidence? Thank you so much.


Nazarius Kirama: A very loaded question, I might say, and I think, is it a consensus as they come, or we take questions first, and then they answer? Which one should we take? As they come. Is that a consensus? I am a democratic moderator.


Rachel Magege: It’s okay with me. Thank you.


Nazarius Kirama: Okay. Thank you so much. Judge, if you can intervene, please.


Eliamani Isaya Laltaika: Okay. Very quickly. Thank you very, very much for that question. Unfortunately, it is true that we are not in isolation. The judiciary, in the old-fashioned way of saying separation of powers, I am from the judiciary. My colleague is a minister. We work together. So, I meet my minister and say, okay, look here, this is the law you are proposing. It doesn't work our way, so do this. That's what is happening in the U.K. That's what is happening in the U.S. But if you follow a very strict kind of separation of powers, you will be left behind, so everyone has to start somewhere. Because judges who are so based on paper and writing in Dar es Salaam can be encouraged to start small. In our country, it is mandatory for every judge to use a computer. You cannot avoid this. The former professor of law at the University of Dar es Salaam is at least ten times techier than me. You can confuse him with a computer scientist, because he talks about data protection and everything, and he's the one who has brought Tanzania to that level of use of AI. It is true, question two, we first got assistants who are young lawyers. Every judge has one legal research assistant who knows the computer and has been encouraging judges. So, you are welcome to come to Tanzania to learn. We are welcoming many countries. We get at least ten countries visiting the judiciary of Tanzania in half a year. So you are welcome from Iraq to come and we will deliberate how we can transfer the knowledge so far.


Nazarius Kirama: Thank you, judge.


Rachel Magege: Mr. Nazarious?


Nazarius Kirama: Yes.


Rachel Magege: If I may, I had written on the chat section to add just a very quick response after the honorable judge, if that’s okay.


Nazarius Kirama: Go ahead.


Rachel Magege: Yes. Honorable judge, thank you so much, as always, for your endless wisdom. And to answer this question to the gentleman who asked about what is the political will or the acceptance of these technologies, I do want to say a little something as well, if you may allow me, of course, much in the context of Tanzania and different parts of Africa. So, because, and I'll be very honest, because there is already a fear of technology and I'm looking at diverse groups of people, yes? We are here sitting in this room and virtually, we are lawyers, I'm a lawyer, we have judges, we have different technology experts. But outside of us, there are so many people out there, we're looking at that digital divide. There are people in the rural areas, there are people with different levels even of education and access to education. But these are the same people who in one way or another may find themselves in courts for different matters and different reasons. And here you have the judiciary of Tanzania, for example, already using artificial intelligence. So how do you bridge that gap together? One of the things that I think really helps is also showing people the benefits and the good of AI. It has to be more of a narrative than what they are seeing online and on the internet or hearing from their neighbors, yes? Because if you're here and telling people or telling me that because of artificial intelligence, an honorable judge like Laltaika can now read large volumes of evidence in a shorter amount of time, that it can help him as he writes his judgments, that is already a good thing. In Tanzania, for example, some of the big benefits of AI came in just a number of projects; one of them was from the Sokoine University of Agriculture, where a lecturer has been using artificial intelligence to quickly detect diseases in cash crops, in maize, in corn and things like that. So if there is a language to use to already start communicating to lawyers, to court clerks, to judges, to many different members of society, that artificial intelligence can help you predict floods that are coming in your country, can help you do this and that, I think that might be a good way, a good avenue to use to even start creating that political will for people to see and understand that artificial intelligence can be used to make life easier and therefore frameworks to be established and this and that. So thank you so much. That was my additional contribution.


Nazarius Kirama: Thank you, Advocate. I saw a question there and over here. So you start with him and go to the gentleman over there.


AUDIENCE: Thank you. Thank you, moderator. Sorry to put you on the spot, but I enjoyed your response to that question. Maybe drawing from your experience in Tanzania, it is true really that from when ChatGPT started to now, there's been some level of accuracy, like you said. But even still in 2024, there are still instances, drawing from experience, where we find that the AI systems hallucinate and they tend to present what is not there. So then my first question is, how does the judiciary in Tanzania take care of instances where facts are presented or authorities are presented and they do not really exist? Just on a little bit of background, AI hallucination has given rise to conversations around ensuring the human in and on the loop in the system. So what measures are there in the judiciary to correct, to ensure that materials presented by the AI systems are actually accurate? And the second question, and it's going to be very short, the focus seems to be on judges from what you just said. We also find that lawyers coming before the courts use ChatGPT for their briefs. What measures are there in place in the Tanzanian case to check accuracy in the briefs filed by lawyers in the court? Thank you.


Nazarius Kirama: Thank you. I will also take it to the gentleman, so these answers can be answered.


AUDIENCE: Okay. Thank you. My name is Doron from the UNEGOV. Thank you for this very productive session and opening call to all these relevant questions about judiciary. I will just quickly say, because we mentioned the three branches of government, executive, legislative. There is a common sense that the executive goes with full speed with these digitalization efforts. But judiciaries, there are some countries that catch the pace, but there are plenty of countries where it’s kind of left behind. And this is where the independence is coming to place because judiciary is not strong enough to have to be financially independent to take all the benefits from the digitalization. I will just briefly mention an example. We’re teaching, making courses for public prosecutors in one country. And at the end, the comments was, okay, experts, we know this is all good, data protection works, we know the rules. But the problem is we are ten prosecutors, but we have only three computers. So we wait for one prosecutor to go to court, for the others to work and to write some of the things, which means that we can have access to his evidence and the other, and so much of the privacy. My question first is, how we can push the executive power, the government to understand that the modernization, the capacity building of the judges of the skills must be supported by the government, by the government financially, mostly. And my second one is a more provocative one. How far do you think we can go with the digitalization in the judiciary? I’m saying this on the service supply side. For example, can we allow people to leave their last will on using online service, or like concluding their last will, can they conclude marriages online? How far do you think, where we set the red line, and we say, okay, because there is a reason why you can go and leave your last will in front of the judge and two witnesses to witness that it’s not under pressure. So how far do you think we can go with this digitalization?


Nazarius Kirama: Thank you so much. If we can have the answers, and then we’ll go, you know, to the last segment of our session. We only have like 15 minutes left. So let us keep our, you know, contributions short and intervention short as well. Thank you so much.


Martin Koyabe: Okay. Let me try and answer part of that question, but others, I’ll leave to the judge. The fundamental issue here is the starting point for many countries. And what we are seeing from Iraq and others is the need for having what we call robust frameworks, digital frameworks in the country. So for example, when you look at the case of Tanzania, for example, when they were developing their cyber security strategy, they were very keen that they must have those fundamental tools and instruments embedded within their strategy. So for Iraq, for example, I would urge that in your strategy and framework, you include specific areas of prescribing the type of cases that could arise, being proportionate in terms of the punishment that is allocated, and being able to have those fundamentals within that. The second thing is also to have what we call the human rights component in your framework. Things like freedom of speech, freedom of expression, being embedded within the framework and the structures of the country. And then lastly, also is the area around capacity building. And capacity building, we can’t argue. It is one of the key areas that we need to really look at. So what happens is you’ve got to have what we call the soft approach. So you take the judges. They don’t like going to workshop. They want to go for retreats. So take them where they want to go. Or to impact on the judges in a gradual manner. And take the judiciary, take the legislators, take the executive, and then train them within that concept. So that each of them can see an equitable contribution towards the functioning of what’s supposed to happen. In terms of the budget, which I’ve had here, there has to be a concerted effort politically to be able to support the judiciary to automate its systems. There are so many countries that have broken the backlog of cases. They are really efficient in terms of how they bring also the benefit to the citizens. Because their cases can be heard quicker. There are also cases where platforms have moved. We are now having cases being done online. So the idea about budgetary allocation is critical. But let me also come to the technical people who describe these things to the executives. I think we have a duty as technical people in the room how we describe and also how we explain problems. If you go to a politician, a politician doesn’t want to hear money in the middle of the attack. I don’t know what sort of winning elections is their money. So there’s a way of how you actually interpret these things to the people who make decisions. Like in your case, if you’re trying to convince other people. Thank you.


Nazarius Kirama: Thank you. Judge, if you have anything to add?


Eliamani Isaya Laltaika: Thank you very much. I really do. But I will be brief. The first question is for my brother. Okay. So it’s a new concept. Hallucination is a new concept in AI, generative AI, where the robot simply fails to understand what you’re asking. You ask them what are factual issues related to economy development in Tanzania, and they give you things from Mozambique. You have to tell it several times for it to come back. Hallucination is there to stay. But I want to be positive. Even among lawyers addressing judges, they can experience hallucination. Counsel, did you really mean this? Anyway, on a serious matter, I want to leave on legal issues that gen AI depends on large data and trained on it to become accurate. When it comes to court cases, they are massive, millions. In Tanzania, every single judgment must be uploaded online. If my principal judge gets a report that Judge Laltaika has decided 20 cases today, tomorrow he must see them on the signature and stamp of the court. So if you go to Tanzli, T-A-N-Z-L-I, you will see every case I decided for the past four years. And AI is feeding on this to develop almost exact copy of cases of authorities I need. And that is what is happening in other countries, because judgments are copyright free. The law in Tanzania says if you write anything as a judge, you cannot copyright it. It belongs to the public. That’s the case in Pakistan, in Kenya, everywhere. So out of all other fields, law and AI go so well. In the next 10 years, it will be very easy to just generate something and you get almost what you would have written as a judge. However, the second question, it is upon you to be sure. And in Tanzania, we have 15,000 advocates. These are the guardians of law. If they see anything unusual written by a judge, the screenshot, it is sent all over. Look at this judge. What is he writing? So we are fearful. We are very afraid. So if I generate something from the internet, I will still read it so carefully to make sure that it goes there while it has gone through a factual process. In Tanzania, we got out of this because we started with a strategic plan. We sat down and made a five-year strategic plan. We identified what we need. We handed it over to the executive. The executive said, okay, now we know your needs. In the past, they didn’t know what we need. They would just simply give money this month, next month, but now they have a clear picture of what the judiciary needs in the next 10, 15 years. Lastly, this is a very difficult question to answer from my brother. How far should we go with digitization? If I were to decide, I would go digitization for everything. For example, during COVID, many people were waiting to come and conduct their marriage or meet a judge. The judges are afraid. I could just say, okay, raise your hand where you are. Say this solemnly. I declare you married. Then you go. You will send me a signed copy of your signature. Why should I see you? There is a book by a professor of law from New Zealand, I don’t know, Australia, who is a futurist of virtual courts. He has written how courts will be in the next 50 years. Everything will be online. You will not need to have a lawyer come before you and inspect them as if you are a police officer inspecting. You just need material. We will be working on that. That’s how the world is moving. Thank you.


Nazarius Kirama: Thank you. Thank you, Judge. Now we go to the last segment of our session. Advocate Umar, I don't want to shortchange you in terms of questions. I know you have another question here: what challenges do judges face when handling cybercrime cases, and how can they be addressed?


Umar Khan Utmanzai: Microphone. This is a very important point. Technology is growing rapidly, and judges also have certain issues with it. In cases dealing with cybersecurity or cybercrime, judges face the complexity of the technology because they are not well trained in it. The judge sitting here is lucky to be able to sit in a forum like this; in Pakistan, a judge cannot even sit in public or meet people in public. Those who are judges in the high court graduated before 2000, around 1998, 1999 and 2000, so they do not know about the internet or about AI. That is the complexity of the technology. Along with that come the jurisdictional issues: if a person sitting in Pakistan is harassed by someone outside Pakistan and the case comes before a judge, how will he try that case? That is one of the biggest issues in cyber cases. Then there is the lack of precedent. In Pakistan we have the Prevention of Electronic Crimes Act, PECA, which was passed by the Parliament of Pakistan in 2016, so there are not many decided cases for judges to draw on. There are also evidential challenges: digital evidence can be altered or tampered with, so how will a judge assess whether evidence has been tampered with without forensic support? In so many countries, as the brother from Iraq mentioned, in developing states and Global South countries, there is a shortage of both technology and forensic capacity. There is also the speed of proceedings. In my country, in a whole district there is one judge dealing with cybercrime cases, so a case can take years; it is therefore important to speed things up and appoint the maximum number of judges for cybercrime cases. Along with that is awareness of emerging threats: the technology keeps changing. Five years ago there was no AI, so laws were made for the situation at that time; now we have AI, and in another five years, what will the technology be? Judges have to keep up with these emerging threats. So I believe in training and a proper curriculum for judges on cyber issues and, just as important, that law students should be taught technology and cyber law. In Pakistan, I graduated from a very renowned law college, but I was never taught cyber law, neither as a minor nor as a major subject; when I came into the field, I learned it by myself. So I believe that cyber law and technology-related subjects should be included in the curriculum. That is my intervention. Thank you.
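
On the tampering point, one basic forensic safeguard is cryptographic hashing: a digest recorded when evidence is seized lets a court later confirm that the exhibit produced at trial is bit-for-bit identical. A minimal illustrative sketch follows; the file name and the recorded digest are placeholders, not real case material, and real forensic practice adds chain-of-custody documentation around this step.

```python
import hashlib
from pathlib import Path

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file in chunks (handles large disk images)."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def evidence_unchanged(path: Path, digest_at_seizure: str) -> bool:
    """True only if the exhibit still matches the digest noted in the seizure record."""
    return sha256_of_file(path) == digest_at_seizure.lower()

if __name__ == "__main__":
    exhibit = Path("exhibit_a.img")            # hypothetical exhibit file
    recorded = "put-the-recorded-digest-here"  # value from the chain-of-custody form
    if exhibit.exists():
        print("Integrity intact:", evidence_unchanged(exhibit, recorded))
    else:
        print("Exhibit file not found; this is only an illustration.")
```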


Nazarius Kirama: And as Dr. Martin Koyabe alluded to, I think we need frameworks. Is there a question from online? And then we'll go to that lady. Panelists, please prepare your final two-minute contributions, because the session is about to wind up. Yes, there is a lady. Can you take the microphone to the lady, please? Yes, go ahead.


AUDIENCE: Hello, thank you so much for all the informative presentations. My name is Hassar Taibi. I'm from the Mawadda Association for Family Stability in Riyadh. I actually have not a question but an input, if you allow it, please. Despite rapid advancement in digital laws and regulations addressing online rights, judicial systems face significant challenges in adapting to fast-evolving technologies. Gaps in current legal frameworks hinder the protection of digital freedom of expression, especially in light of cross-border risks like data breaches and discriminatory AI applications. Marginalized groups face heightened barriers to accessing digital justice due to the lack of…


Nazarius Kirama: Is that a question?


AUDIENCE: No, no, no, this is just an input. No, this is just an input, if it’s okay.


Nazarius Kirama: Yeah, if you can keep it short because of time. Yeah, yeah.


AUDIENCE: Yes. Okay, sure. So we call for developing comprehensive legal frameworks, drafting laws aligned with emerging challenges to safeguard digital rights, privacy and transparency, strengthening cross-border judicial collaboration, establishing mechanisms to coordinate judicial decisions across countries for effective handling of cross-border digital issues, and engaging multi-stakeholder…


Nazarius Kirama: Thank you so much, and thank you for your intervention. Yes, please share a copy of that; I will share it extensively with the participants. Because of time, we go back to the panel. Please give your parting remarks, a short two-minute summary of today's session, starting with Umar, please. Thank you.


Umar Khan Utmanzai: First of all, I am just loving it that, for the first time, someone from the legal fraternity has taken the responsibility to convene this. It is my fourth IGF, but this is the first time I am in a session on the judiciary. And we should not include only judges; it should include lawyers, prosecutors, everyone. I think the beauty of this panel is that we have academia, judges, lawyers and civil society. So I believe this should continue, and I am hopeful of collaborating with the judges and with you for the next IGF. We might have judges from Pakistan. The Honorable Judge has talked about a visit to Tanzania; I would love to connect him with some judges in Pakistan. So I believe this should continue and the judicial system should be empowered. Thank you.


Martin Koyabe: Thank you very much for this session. Let me make three very important points. The first is that this initiative, which started in Kigali, and I am very glad we are here, has a lot of potential to ensure, as my colleague Umar said, inclusivity across different facets, whether technology, the judiciary or other areas. That is something we need to build on from this conversation going forward. The second is the frameworks that exist within member countries: let us also try to embed in them some of the conventions, like the Budapest Convention and other very straightforward instruments such as the Malabo Convention for Africa, that can enhance how we approach the judiciary and the law. And then, technology is here to stay. Let us remember one thing: what technology gives, it can also take away in equal measure. So as we move towards embedding technology, let us also put mechanisms in place for when things go wrong, because there are adversaries out there who can attack our systems and take over some of the decisions, and that can be very disastrous. So let us make sure that at every point we put privacy first and security first as we implement these things. Thank you.


Nazarius Kirama: Judge.


Eliamani Isaya Laltaika: Thank you very much. Three issues. First, what the lady was reading is really fundamental, and I invite you and everyone else: just after this, at 12, we are in the parliamentary track room for another session, where the IGF is now planning to have judges included. You will hear how this dream is taking shape. You can very much still share that there, and if you don't get an opportunity, please give a copy to me; I know how to work with it. What you are saying is very important. Secondly, let us all imagine we are like a person who has bought a house and moved in, only to realize that the door doesn't work, the windows are too old, the bed is too small. That is where we find ourselves a quarter of the way into the 21st century: we have to demolish a few walls and rebuild them to accommodate the digital world. Thirdly, I am meeting my minister for communication from Tanzania, and we will be discussing a few things. I will ask him to sponsor a few lawyers next year. If they are not sponsored by the IGF, then, advocate, you can be sure that next year you will see your fellow advocates here, because I am meeting the minister, and the first thing I will tell him is: please tell TCRI to sponsor lawyers to come here. And I will finish with a joke; don't be offended, please. Are you allowing me to say this joke? Yes. They say that if you are doing anything serious and you are not including a lawyer, there are only two possibilities: either you are not serious, or what you are doing is not serious. Thank you.


Nazarius Kirama: Thank you so much. Thank you for attending our session, and be assured that we will continue to interact in the future. For those who would like to follow us through the global judicial network on internet governance and the school on internet governance for judges, we can share contacts as we finish. Thank you for joining us and for your contributions. I know the time was not sufficient for everybody to contribute as much as he or she would have liked; we will take your paper and make it part of our conversation. Thank you so much, and we look forward to collaborating with you. Can we have a group photo? Yes, yes, we should have a group photo. Thank you. It's getting more crowded. Yes, yes.


E

Eliamani Isaya Laltaika

Speech speed

127 words per minute

Speech length

2902 words

Speech time

1360 seconds

Judges need to understand digital issues to properly adjudicate cases

Explanation

Judge Laltaika emphasizes the importance of judges understanding digital technologies to effectively handle cases in the modern era. He argues that without this knowledge, judges cannot properly protect rights in the digital realm.


Evidence

He gives an example of a judge who cannot tell a computer mouse from another device, illustrating the need for technological literacy among judges.


Major Discussion Point

Importance of Judiciary Engagement in Internet Governance


Agreed with

Nazarius Kirama


Umar Khan Utmanzai


Agreed on

Importance of judiciary engagement in Internet Governance


Judges should embrace AI and other technologies to improve court processes

Explanation

Judge Laltaika advocates for the use of AI and other technologies in the courtroom to enhance judicial processes. He argues that these tools can help judges in various aspects of their work, from scheduling to research.


Evidence

He mentions that in Tanzania, AI is used for scheduling cases, assigning cases to judges, language translation, and legal research.


Major Discussion Point

Importance of Judiciary Engagement in Internet Governance


Agreed with

Umar Khan Utmanzai


Martin Koyabe


Agreed on

Need for updated legal frameworks and training


Differed with

Umar Khan Utmanzai


Differed on

Approach to AI adoption in judiciary


AI systems can “hallucinate” and present inaccurate information

Explanation

Judge Laltaika acknowledges the issue of AI hallucination, where AI systems can generate inaccurate or irrelevant information. He emphasizes the need for judges to verify information generated by AI systems.


Evidence

He gives an example of asking AI about economic development in Tanzania and receiving information about Mozambique instead.


Major Discussion Point

Challenges in Applying Law to Digital Spaces


N

Nazarius Kirama

Speech speed

0 words per minute

Speech length

0 words

Speech time

1 seconds

Judiciary has been absent from Internet Governance Forum discussions

Explanation

Kirama points out that the judiciary has not been engaged in the Internet Governance Forum since its formation in 2005. He argues that this absence needs to be addressed to ensure proper engagement in debates about internet governance.


Evidence

He mentions that this session is happening for the first time during the lifetime of the Internet Governance Forum.


Major Discussion Point

Importance of Judiciary Engagement in Internet Governance


Agreed with

Eliamani Isaya Laltaika


Umar Khan Utmanzai


Agreed on

Importance of judiciary engagement in Internet Governance


Policies needed to prevent digital exclusion of marginalized groups

Explanation

Kirama emphasizes the need for inclusive digital policies that address the needs of marginalized and disabled communities. He argues that these policies are necessary to prevent digital exclusion.


Major Discussion Point

Inclusivity in Digital Rights


U

Umar Khan Utmanzai

Speech speed

166 words per minute

Speech length

1162 words

Speech time

419 seconds

Legal frameworks need to be updated to address digital rights

Explanation

Utmanzai argues that existing legal frameworks are inadequate to address the challenges posed by digital technologies. He emphasizes the need for laws that can effectively protect digital rights and handle cyber-related cases.


Evidence

He mentions that in Pakistan, the Prevention of Electronic Crimes Act was only passed in 2016, indicating the recent nature of digital rights legislation.


Major Discussion Point

Importance of Judiciary Engagement in Internet Governance


Agreed with

Eliamani Isaya Laltaika


Nazarius Kirama


Agreed on

Importance of judiciary engagement in Internet Governance


Cross-border nature of internet creates jurisdictional issues

Explanation

Utmanzai highlights the challenges posed by the global nature of the internet in legal proceedings. He points out that judges often struggle with determining jurisdiction in cases involving cross-border cyber activities.


Evidence

He gives an example of a person in Pakistan being harassed by someone outside the country, questioning how a judge would handle such a case.


Major Discussion Point

Challenges in Applying Law to Digital Spaces


Lack of precedent in cyber cases creates difficulties

Explanation

Utmanzai points out that the relative newness of cyber laws means there is a lack of legal precedent for judges to rely on. This absence of established case law makes it challenging for judges to make consistent rulings in cyber-related cases.


Evidence

He mentions that in Pakistan, the Prevention of Electronic Crimes Act was passed in 2016, indicating the recent nature of such laws and the consequent lack of precedents.


Major Discussion Point

Challenges in Applying Law to Digital Spaces


Increase judicial training on technology issues

Explanation

Utmanzai advocates for enhanced training for judges on technology and cyber-related issues. He argues that this is necessary to equip judges with the knowledge needed to handle digital cases effectively.


Evidence

He mentions that in Pakistan, many judges who graduated before 2000 lack knowledge about the internet and AI, highlighting the need for training.


Major Discussion Point

Strategies for Improving Digital Rights Protection


Agreed with

Eliamani Isaya Laltaika


Martin Koyabe


Agreed on

Need for updated legal frameworks and training


M

Martin Koyabe

Speech speed

151 words per minute

Speech length

1416 words

Speech time

560 seconds

Digital evidence is complex and requires new skills from judges

Explanation

Koyabe highlights the challenges posed by digital evidence in legal proceedings. He argues that the volatile and easily alterable nature of digital evidence requires judges to have specific skills and understanding to handle it properly.


Evidence

He gives an example of a case where critical evidence was lost because police officers unplugged a computer, causing the evidence on the screen to disappear.


Major Discussion Point

Challenges in Applying Law to Digital Spaces


Agreed with

Eliamani Isaya Laltaika


Umar Khan Utmanzai


Agreed on

Need for updated legal frameworks and training


Frameworks should embed human rights protections

Explanation

Koyabe emphasizes the importance of incorporating human rights protections into digital frameworks. He argues that elements like freedom of speech and expression should be embedded within a country’s digital strategy and structures.


Major Discussion Point

Inclusivity in Digital Rights


R

Rachel Magege

Speech speed

155 words per minute

Speech length

1506 words

Speech time

579 seconds

Gender-based violence can be exacerbated online

Explanation

Magege points out that digital platforms can amplify and exacerbate gender-based violence. She argues that what happens in physical spaces can be mirrored and intensified in online environments.


Major Discussion Point

Inclusivity in Digital Rights


Digital divide affects access to justice

Explanation

Magege highlights the issue of the digital divide and its impact on access to justice. She argues that disparities in technology access and literacy can create barriers to justice in an increasingly digital legal system.


Evidence

She mentions diverse groups including people in rural areas and those with different levels of education who may struggle with access to digital legal services.


Major Discussion Point

Inclusivity in Digital Rights


Show benefits of AI to increase acceptance

Explanation

Magege suggests that demonstrating the positive aspects and benefits of AI can help increase its acceptance in the legal system. She argues that this approach can help bridge the gap between technology and those who fear or resist it.


Evidence

She gives examples of AI being used to detect crop diseases and predict floods, showing its potential benefits beyond the legal system.


Major Discussion Point

Strategies for Improving Digital Rights Protection


A

AUDIENCE

Speech speed

136 words per minute

Speech length

1148 words

Speech time

505 seconds

Develop comprehensive legal frameworks for digital rights

Explanation

An audience member emphasizes the need for comprehensive legal frameworks to address digital rights. They argue that these frameworks should be aligned with emerging challenges to effectively safeguard digital rights, privacy, and transparency.


Major Discussion Point

Strategies for Improving Digital Rights Protection


Strengthen cross-border judicial collaboration

Explanation

The audience member advocates for enhanced collaboration between judiciaries across different countries. They argue that this is necessary for effectively handling cross-border digital issues.


Major Discussion Point

Strategies for Improving Digital Rights Protection


Agreements

Agreement Points

Importance of judiciary engagement in Internet Governance

speakers

Eliamani Isaya Laltaika


Nazarius Kirama


Umar Khan Utmanzai


arguments

Judges need to understand digital issues to properly adjudicate cases


Judiciary has been absent from Internet Governance Forum discussions


Legal frameworks need to be updated to address digital rights


summary

The speakers agree that the judiciary needs to be more involved in Internet Governance discussions and that judges require a better understanding of digital issues to effectively handle related cases.


Need for updated legal frameworks and training

speakers

Eliamani Isaya Laltaika


Umar Khan Utmanzai


Martin Koyabe


arguments

Judges should embrace AI and other technologies to improve court processes


Increase judicial training on technology issues


Digital evidence is complex and requires new skills from judges


summary

The speakers concur that legal frameworks need to be updated to address digital challenges, and that judges require specialized training to handle technology-related cases effectively.


Similar Viewpoints

Both speakers highlight the challenges posed by digital evidence and AI in legal proceedings, emphasizing the need for judges to have specific skills to handle these technologies effectively.

speakers

Eliamani Isaya Laltaika


Martin Koyabe


arguments

AI systems can “hallucinate” and present inaccurate information


Digital evidence is complex and requires new skills from judges


Both speakers emphasize the importance of addressing the digital divide and ensuring that marginalized groups have access to digital services and justice.

speakers

Nazarius Kirama


Rachel Magege


arguments

Policies needed to prevent digital exclusion of marginalized groups


Digital divide affects access to justice


Unexpected Consensus

Positive aspects of AI in the legal system

speakers

Eliamani Isaya Laltaika


Rachel Magege


arguments

Judges should embrace AI and other technologies to improve court processes


Show benefits of AI to increase acceptance


explanation

Despite concerns about AI hallucination, both speakers unexpectedly advocate for showcasing the benefits of AI in the legal system to increase its acceptance and improve processes.


Overall Assessment

Summary

The main areas of agreement include the need for judiciary engagement in Internet Governance, updating legal frameworks to address digital challenges, providing specialized training for judges, and addressing the digital divide to ensure inclusive access to justice.


Consensus level

There is a high level of consensus among the speakers on the importance of integrating the judiciary into Internet Governance discussions and the need for capacity building. This consensus implies a strong recognition of the challenges posed by digital technologies in the legal realm and a shared commitment to addressing these challenges through education, training, and policy updates.


Differences

Different Viewpoints

Approach to AI adoption in judiciary

speakers

Eliamani Isaya Laltaika


Umar Khan Utmanzai


arguments

Judges should embrace AI and other technologies to improve court processes


Judges face the complexity of the technology because they are not well trained in it


summary

Judge Laltaika advocates for embracing AI in judicial processes, while Utmanzai highlights the challenges judges face due to lack of training and understanding of technology.


Unexpected Differences

Overall Assessment

summary

The main areas of disagreement revolve around the approach to integrating technology in the judiciary and the extent of training required for judges.


difference_level

The level of disagreement among the speakers is relatively low. Most speakers agree on the fundamental issues but have slightly different perspectives on implementation. This suggests a general consensus on the importance of addressing digital rights in the judiciary, which is positive for advancing the topic.


Partial Agreements

All speakers agree on the need for judges to understand digital technologies, but they differ in their approaches. Laltaika emphasizes general understanding, Utmanzai focuses on specific training, and Koyabe highlights the need for skills in handling digital evidence.

speakers

Eliamani Isaya Laltaika


Umar Khan Utmanzai


Martin Koyabe


arguments

Judges need to understand digital issues to properly adjudicate cases


Increase judicial training on technology issues


Digital evidence is complex and requires new skills from judges



Takeaways

Key Takeaways

The judiciary needs to be more engaged in Internet Governance discussions and processes


Judges require training and tools to properly handle digital evidence and cyber cases


Legal frameworks need to be updated to address emerging digital rights issues


Cross-border collaboration is needed to address jurisdictional challenges in cyber cases


AI and other technologies can improve court processes but also present new challenges


Inclusivity and protection of marginalized groups must be considered in digital rights policies


Resolutions and Action Items

Launch of the Judiciary Global School on Internet Governance to train judges on IG issues


Plan to include more judges and legal practitioners in future IGF meetings


Tanzanian judge to request government sponsorship for lawyers to attend next IGF


Unresolved Issues

How to balance judicial independence with the need for technology adoption


Extent to which court processes should be digitized (e.g. online marriages)


How to address AI hallucination and ensure accuracy of AI-generated legal information


Funding and resource allocation for judiciary digitization in developing countries


Suggested Compromises

Gradual introduction of technology in courts, starting with scheduling and case assignment


Use of legal research assistants to help judges navigate new technologies


Framing digitization benefits in terms of efficiency and citizen service to gain political support


Thought Provoking Comments

Within the East African community, and I'm not going to mention countries here, in one country a high court judge was summoned by the disciplinary committee because part of his judgment was allegedly written by ChatGPT.

speaker

Judge Eliamani Isaya Laltaika


reason

This comment highlights the real-world challenges and controversies surrounding the use of AI in judicial processes, illustrating the tension between technological advancement and traditional legal practices.


impact

It sparked a deeper discussion about the ethical implications and practical challenges of integrating AI into judicial systems, leading to considerations of guidelines and regulations for AI use in courts.


In Tanzania, we got out of this because we started with a strategic plan. We sat down and made a five-year strategic plan. We identified what we need. We handed it over to the executive. The executive said, okay, now we know your needs.

speaker

Judge Eliamani Isaya Laltaika


reason

This comment provides a concrete example of how to effectively implement technological changes in the judiciary, emphasizing the importance of strategic planning and collaboration with the executive branch.


impact

It shifted the conversation towards practical solutions and strategies for digital transformation in the judiciary, encouraging other participants to consider similar approaches in their own contexts.


So for example, when you look at the case of Tanzania, for example, when they were developing their cyber security strategy, they were very keen that they must have those fundamental tools and instruments embedded within their strategy.

speaker

Martin Koyabe


reason

This comment emphasizes the importance of integrating cybersecurity considerations into national strategies, highlighting a proactive approach to addressing digital challenges.


impact

It broadened the discussion to include the importance of comprehensive national digital strategies, encouraging participants to consider how legal frameworks, cybersecurity, and digital rights can be integrated at a policy level.


In Pakistan, a judge cannot even sit in public or meet people in public. Those who are judges in the high court graduated before 2000, around 1998, 1999 and 2000, so they do not know about the internet or about AI.

speaker

Umar Khan Utmanzai


reason

This comment provides a stark contrast to the more technologically advanced judicial systems discussed earlier, highlighting the significant disparities in digital literacy and access among judges in different countries.


impact

It brought attention to the global inequalities in judicial digital literacy, prompting a discussion on the need for targeted training and capacity building for judges in less technologically advanced jurisdictions.


Overall Assessment

These key comments shaped the discussion by highlighting the complex challenges of integrating technology into judicial systems across different contexts. They moved the conversation from theoretical discussions about digital rights to practical considerations of implementation, training, and policy development. The comments also underscored the global disparities in judicial digital literacy and access to technology, emphasizing the need for tailored approaches in different countries. Overall, these insights deepened the conversation, making it more nuanced and action-oriented, while also broadening its scope to consider diverse global perspectives on judicial engagement with digital technologies.


Follow-up Questions

How can frameworks be developed to guide the use of AI in courtrooms?

speaker

Eliamani Isaya Laltaika


explanation

There is a lack of guidelines or regulations for AI use in courtrooms, leading to inconsistent approaches across jurisdictions.


What measures can be implemented to check the accuracy of AI-generated content in legal briefs?

speaker

Audience member


explanation

There is concern about lawyers using AI tools like ChatGPT for their briefs without verifying the accuracy of the generated content.


How can the executive branch be encouraged to financially support the modernization and capacity building of the judiciary?

speaker

Audience member (Doron from UNEGOV)


explanation

Many judiciaries lack the financial independence to fully benefit from digitalization efforts.


What are the appropriate limits for digitalization in the judiciary, particularly for sensitive legal processes?

speaker

Audience member (Doron from UNEGOV)


explanation

There is a need to determine which judicial processes can be safely digitalized and which require in-person interactions.


How can legal education be updated to include more technology and cyber law components?

speaker

Umar Khan Utmanzai


explanation

Many law graduates lack formal education in cyber law and technology, which is increasingly important in legal practice.


What strategies can be employed to enhance cross-border judicial collaboration on digital rights issues?

speaker

Audience member (Hassar Taibi)


explanation

There is a need for better coordination of judicial decisions across countries to effectively handle cross-border digital issues.


Disclaimer: This is not an official session record. DiploAI generates these resources from audiovisual recordings, and they are presented as-is, including potential errors. Due to logistical challenges, such as discrepancies in audio/video or transcripts, names may be misspelled. We strive for accuracy to the best of our ability.