Widening Lens: A New Narrative for Media Coverage of Cyberspace

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Faisal J. Abbas

The analysis highlights concerns regarding the spread of fake news and stresses the importance of cybersecurity in addressing this issue. The argument is made that cybersecurity should go beyond basic hacking and also focus on the dissemination and veracity of information. The impact of fake news on society is discussed, with the 2016 ‘Pizzagate’ incident in the United States being cited as an example of the real-world consequences that can arise from the manipulation of information.

The role of social media in propagating fake news is particularly emphasized. It is noted that 80% of Arab youth obtain their information from social media platforms and that, according to a study by MIT, 70% of people are more likely to retweet or repost fake news. Trust in journalists has also declined, with the Edelman Trust Barometer finding that 60% of people no longer trust journalists and believe they are misleading, which makes it challenging for individuals to differentiate between genuine and false information.

There is recognition of the potential of artificial intelligence (AI) in combating fake news. It is argued that AI can play a crucial role in identifying and combating fake news effectively. The creation of realistic fake videos by AI technology is discussed, highlighting the difficulty that humans face in identifying such content. Therefore, AI is seen as an essential tool in addressing this problem.

The analysis also highlights the dangers associated with the misuse of AI to create compelling fake videos. Reference is made to ongoing wars that have been escalated by misinformation. The instantaneous spread of fake news, especially through videos, is seen as a threat to global security.

Due to the severity of the issue, there is a call for a global initiative to combat fake news. The need for collective action is emphasized, and it is stressed that tech companies should take responsibility for the dissemination of fake news. The analysis suggests engaging in serious discussions with tech companies to regulate the content to which users are exposed.

Furthermore, the importance of education and content monitoring for young technology users is emphasized. It is highlighted that young children, as young as three or four years old, are being given access to iPads without proper content monitoring. The lack of literacy and control over exposure is seen as a significant concern.

In conclusion, the analysis underscores the urgency of addressing the spread and impact of fake news through comprehensive cybersecurity measures. While there is a consensus on the need for action, there are differing views on the responsibility of various stakeholders, including media, tech companies, and individuals. The arguments and evidence presented shed light on the complexities of this issue.

Margery Kraus

Summary:

Margery Kraus, the founder of APCO Worldwide, emphasises the importance of embracing technology, particularly artificial intelligence (AI) and cyber technologies, in order to facilitate progress and transformation within organisations. APCO is actively involved in training initiatives that explore how AI can revolutionise and streamline operations. They are utilising AI to automate routine tasks, thereby allowing individuals to focus on higher-order responsibilities. This approach is seen as essential for driving innovation and improving efficiency. The clients of APCO primarily seek assistance with cybersecurity, as well as guidance on how to deploy cyber technologies in a positive manner to shape the future. Common requests include support in combatting online abuse, developing crisis response plans, and exploring the potential benefits of cyber technologies in long-term planning.

Margery Kraus highlights the need for greater media coverage that showcases the positive uses of cyber technologies for social inclusivity. This aligns with the Sustainable Development Goals (SDGs) of Reduced Inequalities and Peace, Justice, and Strong Institutions. There is also an urgent need to teach young people about cyber literacy. Furthermore, the media should focus on instilling a better understanding of how to consume information and highlight the positive uses of cyber platforms. Collaboration across multiple sectors is necessary to address cyber-related challenges. Lastly, it is important to strike a balance between using cyber technologies and understanding the associated risks and fears.

Overall, embracing AI and cyber technologies, promoting positive uses, enhancing media coverage, teaching cyber literacy, promoting digital equality, and encouraging collaboration are all essential in addressing the challenges and opportunities presented by cyber technologies.

John Defterios

The discussion surrounding global cybersecurity coverage in journalism brought forth two related viewpoints: that the topic receives insufficient coverage, and that disproportionate attention to major technology companies crowds it out.

On the first point, it is argued that cyber threats, such as data breaches and attacks on infrastructure, require more informed and comprehensive reporting. Currently, these topics often receive only a brief mention in the media before being forgotten. The lack of in-depth reporting on global cybersecurity, a subject considered less exciting but highly significant, is a cause for concern.

On the second point, the media tends to focus excessively on major tech companies such as Facebook, Google, and Apple. These companies attract a significant amount of traffic, resulting in over-coverage of their activities. As a result, important issues like cybersecurity are overshadowed and receive inadequate attention.

To effectively cover cyber threats, it is suggested that dedicated resources and specialised experts be employed. Currently, the knee-jerk reaction is to call upon national security or IT personnel in the event of a cybersecurity issue. However, it is believed that communities should establish a pool of cybersecurity experts who can be consulted in such situations. This approach would ensure a more informed and efficient response to cyber threats.

The role of algorithms in shaping media consumption patterns and opinions is also scrutinised. Critics argue that algorithms tend to divide individuals into polarised groups, limiting the representation of diverse perspectives. This polarisation not only affects the way we consume media but also damages our attention spans.

Inclusivity and global collaboration are emphasised as essential components in effectively addressing cybersecurity challenges. The COVID-19 pandemic highlighted the exclusion of the global south until much later in decision-making processes. Thus, inclusivity across the board is considered crucial in tackling global issues like cybersecurity. It is also noted that collaboration on cybersecurity already exists on a regional scale in areas such as Asia, the GCC, and the Americas, and it can be extended globally.

The importance of an impartial centre for cybersecurity is also stressed. Given that the US and China are major competitors in technology and data, there is a need for a neutral entity to broker cybersecurity agreements. Saudi Arabia is suggested as a potential unifying force in this regard, playing a role in creating a safe space to address cybersecurity concerns.

In conclusion, while the coverage of global cybersecurity in journalism is a matter of debate, there is a consensus that more in-depth reporting and attention should be directed towards this critical issue. It is imperative to allocate dedicated resources, consult specialised experts, address the influence of algorithms, promote inclusivity and global collaboration, and establish an impartial centre for cybersecurity. By taking these steps, the media can more effectively inform society about the challenges and risks posed by cyber threats.

Massimo Marioni

The analysis highlights several important aspects of cybersecurity reporting and media practices. One key point is the critical role of fact-checking and verification in cybersecurity reporting. This is because false information can spread rapidly and cause significant harm in the realm of cybersecurity. It is crucial for journalists and media professionals to ensure the accuracy and reliability of their reporting when it comes to cybersecurity matters. By diligently fact-checking and verifying information, media outlets can provide the public with trustworthy and credible news.

Another important aspect is the involvement of experts in cybersecurity reporting. By including experts in coverage and reporting, media outlets can tap into their knowledge and experience to provide informed and authoritative perspectives. This adds credibility to the reporting and helps the audience better understand the complexities of cybersecurity issues.

Furthermore, the analysis emphasizes the significance of education and digital literacy in cybersecurity. Many individuals are not sufficiently aware of cybersecurity threats and best practices, making them vulnerable to cyber-attacks. By promoting education and increasing digital literacy, people can become savvier in protecting themselves online. This can be achieved through initiatives that focus on educating the public about cybersecurity risks, providing guidance on best practices, and enhancing digital literacy skills.

The analysis also highlights the need to avoid sensationalism in media reporting. Media outlets have the power to shape public opinion and perception of cybersecurity risks. By hyping up certain aspects unnecessarily, they can spread fear and uncertainty. It is crucial for media professionals to maintain balance in their reporting, focusing not only on problems but also on solutions and progress in the cybersecurity field. This helps provide a comprehensive and accurate understanding of cybersecurity issues.

Additionally, the analysis notes that imparting digital literacy requires collaboration between governments, media, and tech companies. This joint effort ensures that the audience receives the necessary resources and support for developing digital literacy skills. It is important for these stakeholders to work together in designing educational programs, creating digital content, and fostering partnerships to effectively address the digital literacy needs of the audience.

In conclusion, the analysis highlights the importance of fact-checking, the involvement of experts, education, and balanced reporting in cybersecurity journalism. It underscores the need to avoid sensationalism and promote digital literacy. It also emphasizes the significance of collaboration between governments, media, and tech companies in effectively imparting digital literacy skills to the audience. By embracing these practices, media outlets can contribute to a more informed and secure society in the face of cybersecurity challenges.

Intro

The ‘Catalyzing Cyber: Stimulating Cybersecurity Market Through Ecosystem Development’ session convened prominent figures in the field of cybersecurity from various countries. Engineer Waleed Abu Khalid, CEO of Saudi Arabian Military Industries (SAMI), stressed the need for collaboration between the public and private sectors to drive the growth of the cyber industry. He underscored the importance of nurturing local talent and establishing robust educational programs to meet the demand for skilled cyber professionals.

Dr. Miqat Zuhairi Bin Miqat, Chief Executive of Malaysia’s National Cybersecurity Agency, highlighted the significance of proactive measures in addressing cyber threats. He emphasized the development of a strong cybersecurity ecosystem, including effective legislation and regulations, as well as investments in research and development.

Felix Barrio Juarez, Director General of the Spanish National Cybersecurity Institute, discussed the role of government in promoting cybersecurity innovation. He emphasized the need for public-private partnerships in sharing threat intelligence and promoting best practices.

Engineer Abdurrahman Al Malki from Qatar’s National Cybersecurity Agency stressed the importance of tailored cybersecurity solutions that meet each country’s specific needs. He urged governments and organizations to remain vigilant and adapt to rapidly evolving threats.

The panel’s moderator, John Defterios, provided an international perspective to the discussion, drawing on his experience as CNN’s former Emerging Markets Editor. He emphasized the global nature of cyber threats and the need for coordinated efforts to tackle them.

The event’s panelists agreed that ecosystem development plays a pivotal role in stimulating the cybersecurity market. They highlighted the need for international collaboration, information-sharing, and investment in research and development to stay ahead of cyber threats.

Additional contributions from Massimo Marioni, Europe Editor at Fortune; Rebecca McLaughlin-Eastham, an international TV anchor and media trainer; Margery Kraus, founder and executive chairman of APCO Worldwide; and Faisal Abbas, Editor-in-Chief of Arab News, provided valuable insights into various aspects of the cybersecurity landscape.

Overall, the event demonstrated the importance of collaboration and proactive measures in addressing cybersecurity challenges. The diverse perspectives of industry leaders underscored the need for continuous innovation and adaptation to effectively counter cyber threats in an increasingly interconnected world.

Rebecca McLaughlin-Eastham

Rebecca McLaughlin-Eastham, an international TV anchor and media trainer, moderated the panel, steering the discussion on how cybersecurity is covered in the media and pressing the panelists on how to inform and engage audiences without alarming them.

Midway through the session, a drone unexpectedly appeared above the stage, briefly interrupting the discussion and prompting jokes from the panelists about being watched. As the drone was brought down, McLaughlin-Eastham noted that it was an expensive piece of equipment and therefore had to land safely and gently.

The episode served as an impromptu illustration of the panel’s theme; as Faisal J. Abbas remarked, it was ‘a real-life example of how technology can disturb reality’.

Session transcript

Intro:
Catalyzing Cyber: Stimulating Cybersecurity Market Through Ecosystem Development. Engineer Waleed Abu Khalid, Chief Executive Officer, Saudi Arabian Military Industries (SAMI); Dr. Miqat Zuhairi Bin Miqat, Chief Executive, National Cybersecurity Agency, Malaysia; Felix Barrio Juarez, Director General, Spanish National Cybersecurity Institute; His Excellency Engineer Abdurrahman Al Malki, National Cybersecurity Agency, Qatar; John Defterios, Moderator, Former CNN Emerging Markets Editor; Massimo Marioni, Europe Editor, Fortune; Rebecca McLaughlin-Eastham, Moderator, International TV Anchor, MC and Media Trainer; Margery Kraus, Founder and Executive Chairman, APCO Worldwide; Faisal Abbas, Editor-in-Chief, Arab News

Rebecca McLaughlin-Eastham:
Good afternoon everybody It’s wonderful to be here, see a packed room and we have a fantastic conversation coming up You’ve just heard my guests being introduced I have luminaries from the world of media, strategic communications and of course journalism on stage with me and we are going to have a deep dive into how cyber security is being covered in the media Is the narrative correct? Is it balanced? Is it informative? Is it constructive? Is it responsible? It’s our duty to inform and to engage but we also don’t want to spook, we don’t want to deter So how do we strike that right balance? Well here with all of the answers I’m delighted to say are my esteemed guests So let’s start with the bigger picture Faisal let me come to you first from Arab News How has the narrative traditionally been when it comes to covering global cyber security?

Faisal J. Abbas:
Thank you Rebecca and thank you for that very important introduction It is actually very telling that we are here discussing this particular topic at a global cyber security forum I say that because people occasionally or more than occasionally most likely relate cyber security to things like phone hacking, going into your bank account etc But the reality is cyber security should and must encompass much more than that We are living in an era which we have as a humanity We have not experienced this before where every person on the planet provided they have wifi or internet connection can disseminate and receive information at the same time What has happened with the advancement of technology particularly with what we are seeing with AI Is not only are you allowed now or capable of disseminating and receiving information You are also capable of faking realities and faking news And if you think this is not related to cyber security then you are wrong Because just look at some of the world events that have happened as a direct result of fake news spreading I can name so many incidents, I don’t want to get very political so I am going to name a non-political one Which is what we all know as the pizza gate in 2016 in the United States Where somebody posted fake news that there is a child exploitation ring being used Children being exploited in a pizzeria And somebody ended up taking a gun and shooting everybody and it was a completely fake news story I am going to end with three figures on why cyber security should include media And to make it easier for everybody to remember, just remember 80, 70, 60 According to the Arab youth survey, 80% of Arab youth get their information from social media According to a recent study by MIT, 70% of people are more likely to retweet or repost fake news 60% according to the Edelman trust barometer now no longer trust journalists and believe they are misleading The conclusion is we are heading in a direction where we will no longer be able to tell what is true from what is fake And this I believe is at the heart of what cyber security should be

Rebecca McLaughlin-Eastham:
100% trust is integral to that and to our discussion today Marjorie let me come to you, talk to me from both sides At APCO what are you doing when it comes to using systems for generative AI and the like And also what are your clients asking you for help on and what are you telling them? Three pronged question from a journalist there

Margery Kraus:
I think for us, I’m a big believer even though we have about five generations in APCO and I’m at the upper end of that I’ve never ran away from technology I think technology is really important and as we get into cyber it’s even more important Because this is not something we’re going to stop So this is something we have to embrace And I think this is something actually that the media needs to educate us more about In terms of how we embrace in a positive way, how we use cyber for good, how it becomes inclusive And so within APCO what we’re doing is a lot of training and how AI can help transform our organization And how we can then use this so that some of the routine tasks that a company like ours would do Whether it’s monitoring another thing that you can train AI to do And then other things that then you can use your people for higher order things And you’re taking a lot of the routine out of the work and making it more interesting And if we can do that transformation I think that’s a really important way to use this So our clients are coming to us for two things One is the more traditional things of cyber security And the fact that as you read in the media that there’s a lot of abuse that goes on online And how do they protect for it, how do they plan for it, how do they organize their crisis response Things of that sort But the other side is hoping that will be an example is coming to us for how can they deploy cyber In the most positive way in terms of what they’re doing for the future So I think both of those are things that we end up doing

Rebecca McLaughlin-Eastham:
Thank you John, nice to see you again You’ve covered breaches, you’ve covered GCF for many years Talk to me about how you see it from a journalist point of view How are we covering global cyber security? Accurately, adequately, and what is the impact on consumers?

John Defterios:
Yeah, you put a lot in there and you’re correct to do so, Rebecca So thanks very much Rebecca and I had a chance to tackle this subject in a podcast two months ago Which you can find on the GCF website And I think it’s a similar approach to this I think it’s under-covered and we can talk later in the panel about from the broadcast journalism side Which I did for better than 30 years And the challenges there But at the FII last week, I thought it was very interesting They polled the people that were attending FII And they said, what are the most pressing issues of today’s time? It varied between the youth at the end and those kind of in the C-suite And the ministers at the start of the FII But the common ground was the cost of living You know, quality of life, conflict, and climate change, right? They listed those as the four major issues And I have a very strong belief that cyber space and cyber security should be at least in the top five Because of all those things I talked about there You get information and make educated decisions based on what you’re reading But you have to have a common trust And in the previous panel, when we said deciphering or catalyzing cyber We got into this idea that right now the consumer is pretty unaware What the challenges are in the near future where AI, generative AI meets cyber So I think as a duty from the journalistic community I don’t think anybody in this panel would disagree with me We have a duty to inform society of what’s ahead We were even talking about in that panel what sort of certifications If you have a driver’s license, for example, to drive a car What sort of certificate do you have to surf the world? Because it’s going to be a much more complex world in which to navigate So we have a challenge now because of not AI But algorithms that we have people in different echo chambers Not believing what is real news versus fake news, right? And that’s a big challenge today But I think, and this is a personal viewpoint That cyber security should be in the top three subjects of our time Because it’s moving quickly So right now we’re lucky that governments like Saudi Arabia And others that we’ve talked to that are participating in the GCF Take the challenge very seriously We rely on the private sector to continue to innovate and invest And take care of society But it’s going to be much more challenging in the future The numbers indicate it So we’re looking at least $2 trillion of lost commerce in the last year It’s growing by 35% to 40% a year That should get everybody’s attention But I think when consumers cannot trust what they have on technology platforms Whether it’s their banking app or a B2B system Or an e-commerce site like an Amazon Because you’re worried about a transaction It’s when we have as a responsibility as journalists and media Strategic communications group to educate society So I think we should start there The GCF and the Institute is a great place to start that journey But I think we need to go into the next layer here Where consumers have a greater awareness And how do we tell those stories so they understand it

Rebecca McLaughlin-Eastham:
Massimo, how do we tell these stories? We have a responsibility We need to heighten awareness We need to inform How do we best do that in a balanced way?

Massimo Marioni:
Yeah, that’s a good question So I think there’s probably six things which help us do that One of those things is fairly basic And that’s to fact-check and verify It sounds obvious and it’s very core to all journalism But especially with cybersecurity I think it’s even more important Number two is to get experts involved As many experts as we can Have that expert opinion across all coverage of cyberspace and cybersecurity Trying to educate, which John touched on Trying to educate a community who isn’t perhaps as… Well, I don’t think many people are as savvy as they need to be In the cybersecurity game and consuming that news So I think the more education and digital literacy that we can give people The better they will be Constantly reminding people of best practices Whether that’s with passwords or whether that’s with banking My mother is 80 years old And she’s quite savvy with digital and things like that But she still sometimes contacts me whenever something comes through She gets an email and she’s not sure about it Sometimes it’s from me And she’s asking, did you really send me this? I was like, yes, yes I don’t need any money, but… And avoiding sensationalism, I think, is also quite crucial There’s a temptation within all of media To hype up either the really bad or the really good And I think that can spread fear and uncertainty across audiences When perhaps there isn’t that immediate need to do that And lastly, balance, I think Reporting on the solutions and progress of cybersecurity or AI or whatever it is Is just as important as reporting on the problems So those six aspects, I think, are ways that media can really help Bring that knowledge and literacy to the audience

Rebecca McLaughlin-Eastham:
Talk to me just before we go any further About the expertise that you do have in-house Arguably every newsroom around the world will have national security experts IT reporters, tech reporters But what about AI, Massimo? Do you have people dedicated to that?

Massimo Marioni:
Yeah, so we’ve got a reporter called Jeremy Kahn And he’s been an AI expert for many years So we’re very lucky to have him And he’s covered AI in various forms for a long time And people sort of think that it’s just popped up in the last few months AI has been a thing for a very long time And Jeremy’s been an expert for a very long time So at Fortune, we’re very lucky to have someone with deep, deep understanding And deep knowledge of AI But obviously, that is in an area which every newsroom has devoted time and resource to So that’s very important, I think, for newsrooms across the world To invest those resources into experts like Jeremy Don’t leave, Jeremy, if you’re listening to this Into experts who can really deliver value for their audience For this super, super important topic

Rebecca McLaughlin-Eastham:
Thank you. John, I’ll come back to you in just a second Faisal, take me inside your newsroom Who is dedicated to that beat? Cyber security in particular

Faisal J. Abbas:
Here’s where I disagree I don’t think it’s the job of one person It’s the job for the whole newsroom, collectively Look, let me talk about a much bigger example A huge organization such as the BBC They have a whole initiative called BBC Verify Which, all it does, that team, all it does All they do is go on the internet, look for fake news and identify it Great, great initiative But let’s be realistic here This is a drop in the ocean That is not going to be enough There is, in fact, no newsroom in the world No matter how much resources they have That are capable of standing up to this thing My point of view is as follows What AI breaks, only AI can fix And as we are at a global cyber security forum There needs to be global action towards this I’m not here to do fear mongering But this is a reality We’ve seen, we are currently living a war in this region We’ve seen how quickly fake news spreads And this is now just words And this is with AI still tiptoeing Imagine the same war in five years’ time When you can instantly create fake news videos About babies being decapitated or soldiers being killed, etc This is the stuff that starts wars And there is no human capability to be able to identify Because these videos, in terms of sound, in terms of video In terms of the surrounding Are so real that you cannot tell what’s fake from what’s real So the only solution is a global initiative To have AI filters that can identify immediately What is real, genuine footage Which is then our job as news reporters But immediately label things that are manufactured by AI As manufactured by AI

Rebecca McLaughlin-Eastham:
John, do you want to come in on this?

John Defterios:
Sure, I’ll take it from the prism of broadcasting And I think it’s a tall order And Massimo touched on this and so did Faisal It’s a tall order to say that somebody that covers national security Should also cover cyber security It’s also a very tall order, a big ask, if you will For somebody who covers IT And I think there’s a tendency within the television community Which I spent my career in To stray away from the FAANGs Or the biggest technology companies of the world So we know Jeff Bezos We know Meta, Facebook, Google, Apple Twitter, now X And the tendency is to gravitate to those big companies And those personalities Because they get a lot of traffic But again, this is where responsibility comes in, right? So that gets a lot of traffic There’s a tendency to cover those companies way too much Or the latest gadgets that are out there Because consumers interface with those gadgets And they’re so pervasive in our lives But there’s a big gap in the middle To have an expertise And I’m suggesting if the private sector And the governments are paying so much attention To cyber and cyber threats And the future of cyberspace And making it a safe place for people to operate We should be dedicating resources Trained resources to be able to cover it Now the second layer of what I’d like to talk about Is how do you tell this story in television? And Rebecca, you have that experience as well There are no visuals So, you say you had a hack on the air traffic landing system in Heathrow, it’s not fair to be using file video of the Heathrow airport and say they had a big, you know, technology breach today because that lasts for about 30 seconds. We had a case in the last year where South Asian Bank was hit for $200 million, which is extraordinary. It’s an extraordinary event or 80% of the airports and 65% of the power systems in the world are being hacked on a daily basis, but you can’t go in and show it physically as a television correspondent. So I think we should, A, be a lot more clever about how we tell the stories, and this is where data visualization would come into play. It works for Faisal at Arab News, it works for Massimo at Fortune. You use data to tell your story and to educate people at the same time, but I think we’re making a profound mistake in our profession to say that the same person that covers national security and the same person that covers IT, oh, try to give this a whack and cover cyber. I think it’s a profound mistake, and we should take it more seriously as a topic is what I’m saying. I mean, I’m talking to the converted there because you educated me for our broadcast, but the reality is we need to be taking it at a much more higher level of attentiveness. Final point that Massimo brought up in his opening remarks, there’s a pool of experts, but I think we often call as a knee-jerk reaction an IT specialist or an ICT specialist to talk about cyber, or we call a national security person to talk about cyber. Let’s as a community build experts that we share the resources and say, if I’m in America, these are the top 25 people. If I’m in the European Union or in the UK, these are the people that really know cyber. If I’m in the Middle East and North Africa, if I’m in Asia, and we should, I think as a GCF, I would even extend, let’s help that process to say these are the top 50 cyber security specialists in the world. So if there’s a story to be told, let’s respond that way.

Rebecca McLaughlin-Eastham:
Absolutely. Marjorie, please.

Margery Kraus:
So I want to come at this just a little different and follow on something Massimo said as well, and take it more from the other end of the telescope. Because I think that one of the things, you talked about media literacy, and there was a whole time when we were teaching younger people how to discern news, how to check their sources, how to look for various things. And we need to do that. We need to create cyber literacy on the part of young people. How do they know what sources? You said 80% get their information from cyber, from these platforms. Think of how terrifying that is, given fake news. So what skill sets, and what can we bring, and how do we demonstrate this through the media to have more educated consumers, especially young people? So they know where to find the news, or where to find information. They know how to verify it as much as they can. They know if they’re being bullied, where to go for help. They know, you know, there’s certain basic skills. And as this cyber world kind of envelopes all of us, we need to give young people the skills, and older adults. So that when they, you know, it’s great that your mother called you to, if she should open the email. I wonder how many other people would call if they’re getting an email like that. They would open it, and they would be subject to all kinds of scams. So I think that’s one of the things the media could do, is to help us all understand better how to consume what it is that we’re getting. I think one other thing that the media could do is also focus on some of the positive uses of cyber, and the way in which we are using it to become more, to have more equity in society, or to give people access to education, and things of that sort. And it tends to be that, you know, like good news doesn’t get covered, as well as difficult things.

Rebecca McLaughlin-Eastham:
That’s interesting. Thank you, Marjorie. Well, Faisal, let me pick up on that with you, in terms of the positive aspects, the good news stories, rather than the scary headlines, let’s say. The media arguably has a responsibility, of course, to be balanced, but is one overshadowing the other? The urgent is crowding out the other important stories that should be told, but aren’t as much.

Faisal J. Abbas:
Well, look, Journalism 101, right? So if a dog bites a man, that’s not a story. If a man bites a dog, that’s a story. That’s always been the case, and that’s always been human. People don’t want to know what they already know. But here’s where I disagree with my honorable colleagues. It’s not the media to blame for reporting negative stories. That’s your job. Your job is to alert people to important things. But let me take you to the beginning of last year, when the Facebook whistleblower Frances Haugen spoke at Congress. And this is not from me. Take it from the horse’s mouth. This is what she said. She said, at some point at Facebook, we realized that the emotion that triggers the most traffic, the most engagement, the most reaction, is anger. So I’m not saying Facebook or Meta or Twitter are evil. I am saying, given that it is in their interest, this is … means more users on their website, on their pages, means more advertising revenue. There is a fundamental issue that needs to be addressed here with that business model. And again, I’m not pointing fingers, and I’m not accusing them of being deliberately behind this. But we need to remember one important aspect. These companies were not built by journalists, like newspapers or broadcasters. These companies were built by teenagers who were coders and engineers, who probably didn’t understand or appreciate the impact of what they are doing. And the result is what we are having to deal with today. And I reiterate, this is very important, because the next war, this war might be contained, the next one might not be containable because of fake news.

Rebecca McLaughlin-Eastham:
Massimo, coming to you at the end. Is the end nigh? Give me an optimistic outlook, or even a bleak one. Do you agree with Faisal or disagree?

Massimo Marioni:
Yes, to an extent. Meta did discover that, and they fueled a lot of the problems that we’re facing today. But I think every news organization, to an extent, also can identify with the similar sort of sentiments and findings that Meta found, that anger and shock drives more interaction than, say, good news stories, or happy stories, or stories of progress and solutions. So I think there is a responsibility there in the media to not just chase that engagement, but to also try to break down the complexities that your average reader may switch off when they’re reading, because ultimately, the cyberspace world and cybersecurity, they’re complex things, and readers don’t tend to read too deeply into things that they don’t understand unless they’re super, super engaged. So the media’s job is, one, to break down that complexity, to avoid sensationalism where they can. Obviously, as Faisal mentioned, it’s our job to report newsworthy events, events which people find interesting, events which are important for the reader’s everyday lives. And usually, that’s on the spectrum of really good and really bad. And the stuff in the middle doesn’t really generate the interest or engagement that perhaps it needs to. And because all media companies are trying to make money as well and keep themselves afloat, it’s a very challenging time for media. The temptation is to veer into that one side of the spectrum.

Rebecca McLaughlin-Eastham:
So I think somebody at Facebook doesn’t like what I said. Somebody is watching. Yes. They sent a drone. Wow. They sent a drone for revenge. Is Facebook coming for us? I mean… It’s so quiet. I hardly noticed it was there. Please, Massimo, continue if you can.

Massimo Marioni:
I don’t know what I was saying anymore. That’s amazing.

Rebecca McLaughlin-Eastham:
He must have said something very inflammatory. I’m sorry. Can you hear us in the audience?

Massimo Marioni:
Can someone shoot it down?

Rebecca McLaughlin-Eastham:
Stay with us.

Faisal J. Abbas:
Can we get the drone to land?

Rebecca McLaughlin-Eastham:
I think so. There we go. Safe landing. Just like in TV. Expensive piece of kit. So you have to make sure it lands gently.

Faisal J. Abbas:
A real-life example of how technology can disturb reality.

Rebecca McLaughlin-Eastham:
Oh, yes. And you’re always being watched. Yes.

John Defterios:
For ten seconds there, I was a little bit worried that it actually might have been an armed drone. But thank God it wasn’t.

Massimo Marioni:
I must have upset Facebook or something.

Rebecca McLaughlin-Eastham:
Sorry, Massimo, please finish your sentiment.

Massimo Marioni:
I think that was it.

John Defterios:
No, the only thing I would add to this is that, and Faisal alluded to it, we’ve taken algorithms for granted, and I think it’s actually caused a lot of damage in terms of our attention spans. So it came into the market to help us. So if you were searching a story and you said you want to get more of this story, or if you’re looking for a restaurant, it offers you alternatives. That’s the good side of it. But it does divide us into the group that could be far left in its politics, far right in its politics, and there’s very little voice given to the center, which I think is part of the problem. And the reason I bring that up, a lot of people may not be interested in global cybersecurity and a safe cyberspace, but they should know. It’s not the sexiest story in town. It probably won’t get you to click, but we could be much more creative in the way we tell those stories. In a 24-hour news cycle, Rebecca knows this, if there’s a major breach, and it was news, as Faisal was suggesting, you do need to cover it. So if there’s a major breach in the world, our role was to inform, because it was a who, what, where, when, and why. It was big enough. It affected enough people. It threatened our water supply. Our power system went down, right? Could have been contamination in your local drinking waters. That’s just examples of what we’ve seen. A major data breach where people’s private data were breached and went into public. We need to inform it, but we need a much deeper knowledge of the sector to be able to inform people correctly, is my view on this. We should go deep, not quick. And usually, if you’re in a 24-hour news cycle, we’d have a conference call, what are the major stories? Oh, I saw there was a breach on the water system in Australia. It almost killed 50,000 people, but they were able to detect it, but let’s give it 30 seconds, and then it’s gone. It was a Northern Ireland police breach of data, which was a huge story in the UK, but they didn’t have the visuals to tell it, so it lasted for a 24-hour news cycle. But it was the data that goes back 20 years, and it risked lives of those police officers and detectives. So it shows the vulnerabilities of the systems. I don’t think we should be so flippant in our coverage going forward, is my premise here.

Rebecca McLaughlin-Eastham:
Thank you. Marjorie, and I’ll ask all of my esteemed panelists the same question as we conclude. In terms of the theme of GCF 2023, our shared priorities in cyberspace, when it comes to advocacy, strategic communications, what is the most important shared priority that we should all have? Marjorie, what would you say?

Margery Kraus:
I think the shared priority is to try to bring multi-sectors, the way we’re doing it here at GCF, together to come up with belts and suspenders frameworks for solutions of how we can deploy the benefits of cyber with the right frameworks so that the abuses are limited. There are always going to be abuses, and we have to try to keep one step ahead. But if we don’t do what Faisal was saying about engaging with the corporate side, and if we don’t improve the coverage of how we expand the knowledge about cyber, we’re going to end up in a place where this gets so far ahead of us. People are already afraid, and if we feed into that fear, we will never take the full advantage of what cyber has to offer us. So getting this right is really important.

Rebecca McLaughlin-Eastham:
Thank you. Faisal, getting this right is very important. When will we get there?

Faisal J. Abbas:
Well, we need to start with education, education, education. It is mind-boggling. It is unthinkable that we give iPads to three-year-olds and four-year-olds, and we don’t monitor the content that they are watching. To use the metaphor that John used, this is the equivalent of giving a four-year-old keys to a car without a driving license and without brakes. And then we complain when the car jumps off a cliff. There needs to be literacy from a very young age. There needs to be a serious conversation with tech companies to what are we being exposed to, and there needs to be a global initiative to immediately identify what is genuine and what is manufactured by AI, because as I said in my opening remarks, the line is going to become very, very blurry in the very near future.

Rebecca McLaughlin-Eastham:
John, a global initiative much needed.

John Defterios:
Okay, number one, I think, and we’ve addressed this at different sessions, I think it’s extremely important from a policy perspective, and this even trickles into the media, that we are inclusive across the board. So I would say inclusive for the global south. So we can use two examples of the COVID-19 pandemic when there was hoarding taking place and it was every government for themselves until about nine months into it, and then the global south became part of the equation. So it’s very important because cyber has no boundaries. We’ve talked about it. It does cross borders. What happens in the global south, what would happen in Africa would also make the GCC quite vulnerable because they have great connectivity. So we should be aware of that. When it comes to climate, it’s very difficult to try to build a consensus of 190 plus countries but when it comes to cyber, and we’ve heard the collaboration on previous panels, we see collaboration on a regional basis in Asia. We see it in the GCC, and it was articulated, it should extend to the Middle East and North Africa. We see it in the Americas collaboration. How do we make that global collaboration to have this sharing of best practices, the data is where it’s comfortable to protect your sovereignty. And then I think the third leg of this, and this is why I’m very proud to be involved with the Global Cyber Security Forum, we have two elephants in the room. It’s US and China. They compete fiercely when it comes to technology and they’re competing fiercely when it comes to data. We need a center for cyber security that is an equal broker that can go east and it can go west. It tilts north and it tilts south. It can be a unifier. So I think there’s an opportunity in the cyberspace. I think there’s an opportunity when it comes to the regional conflict where the kingdom can serve a role to be a unifier, to find solutions, to allow a safe space to deal with cyber security. And I think that’s our shared responsibility at GCF 2023.

Rebecca McLaughlin-Eastham:
Thank you very much. Massimo, the final word to you. Our shared responsibility and key priorities for GCF and going forward into 2024.

Massimo Marioni:
Yeah, I think, as I said before, I think imparting digital literacy on the audience is probably the key thing because, and that’s a difficult task because you can take a horse to water but you can’t make it drink. So as much as we try to educate and inform and bring the most important news to the audience, it’s difficult to make them consume something that they perhaps don’t understand or don’t have an interest in. So that’s, I think, the key challenge which is not just on one organization to overcome. It’s got to be a combined effort from governments, from media, from tech companies, a very collaborative effort to bring this very important attention or very important topic to the audience and make them interested in what needs to be known.

Rebecca McLaughlin-Eastham:
Thank you very much. And on that collaborative note, that optimistic note, we shall have to wrap up proceedings. But those are your GCF headlines, ladies and gentlemen. For this panel discussion, please join me in thanking my esteemed guests for their contribution to our forum today.

Faisal J. Abbas

Speech speed: 164 words per minute
Speech length: 1232 words
Speech time: 450 secs

Intro

Speech speed: 82 words per minute
Speech length: 104 words
Speech time: 76 secs

John Defterios

Speech speed: 198 words per minute
Speech length: 2261 words
Speech time: 684 secs

Margery Kraus

Speech speed: 181 words per minute
Speech length: 868 words
Speech time: 288 secs

Massimo Marioni

Speech speed: 156 words per minute
Speech length: 934 words
Speech time: 360 secs

Rebecca McLaughlin-Eastham

Speech speed: 184 words per minute
Speech length: 821 words
Speech time: 267 secs

Cyber Costs Reframed: The Human Costs of Cyber Insecurity

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Ryan Chilcote

The discussions revolved around several key topics related to cybercrime and AI. Firstly, the rising costs of combating cybercrime were a cause for concern. The former president of Estonia expressed worries about the escalating expenses in fighting cybercrime globally and specifically in his country. In Estonia, the budget for combating cybercrime has grown five-fold over the past five years. This highlights the financial strain that governments face in dealing with the ever-evolving nature of cyber threats.

Another area of discussion focused on the use of AI by attackers to create sophisticated, zero-day attacks. Zero-day attacks exploit previously unknown vulnerabilities and therefore have no prior fingerprint, making them difficult to detect and defend against. It was noted that attackers do not need to be cybersecurity experts to utilise AI in their attacks. New attacks using AI are being invented on a daily basis, posing a significant challenge to cybersecurity professionals and organisations.

To address the potential misuse of AI, there was a consensus that regulation is necessary. Notably, AI is considered an uncontrollable technology, and there are ongoing efforts by the UN and governments to find ethical ways to regulate it. The goal is to prevent malicious actors from harnessing AI for nefarious purposes, while still allowing for its beneficial applications.

However, regulating AI is not an easy task due to its fast-changing nature. AI technology evolves rapidly, and as a result, regulations need to be constantly updated to keep pace. There was expressed doubt about whether enough time exists to develop comprehensive AI regulations, as it took the European Union nine years to create GDPR regulations.

The need for international cooperation in addressing cybercrime was emphasised. It was highlighted that 40 countries have agreed not to pay ransom during cyber-attacks, showcasing a concerted effort to refuse ransom payments. This unity in refusing to pay ransoms aims to discourage cybercriminals and reduce their financial incentives.

One of the notable points of discussion was the practical implications and boundaries of banning ransom payments. Ryan Chilcote questioned whether a policy of banning ransom payments would also apply to individuals who are threatened with the release of sensitive personal information. This raised considerations about striking a balance between protecting individuals and preventing further harm caused by ransomware.

In conclusion, the discussions brought attention to the challenges posed by cybercrime, the use of AI in sophisticated attacks, the need for regulation to prevent AI misuse, the difficulties in regulating a fast-changing technology, and the importance of international cooperation to counter cyber threats. The rising costs of combating cybercrime were seen as a pressing concern, while the practical implications of banning ransom payments highlighted the complexities of finding effective solutions. The analysis shed light on the ongoing efforts to tackle cybercrime within the framework of peace, justice, and strong institutions.

Mohammad Abdulaziz Boarki

The analysis reveals that the healthcare, emerging technology, and oil sectors are highly susceptible to cyber attacks targeting high-value assets. The healthcare sector has become a prime target for ransomware attacks, disrupting surgeries and compromising patient data. Similarly, emerging technologies, such as IoT systems, are connected to wide networks, making them attractive targets for cyber attacks. Additionally, systems holding sensitive or valuable information, including those of government entities, are frequently targeted.

Countries with poor infrastructure face significant challenges in protecting their cyber space due to budgetary constraints and lack of resources. A global effort is needed to protect these countries from cyber threats. Awareness training and capability building in cyber space are crucial in enhancing cybersecurity. Adequate budgetary allocations are necessary to combat cybercrime and protect institutions and citizens. Cybersecurity is now one of the top three priorities for any country, and countries need to invest more in cybersecurity.

Regulating artificial intelligence (AI) is complex due to its fast-changing nature. However, it is important to establish and adapt regulations to ensure the ethical and safe use of AI. The decision to pay a ransom depends on the value and impact of the stolen data, and each country has the right to decide based on its national interest.

In conclusion, this analysis highlights the vulnerability of various sectors and systems to cyber attacks on high-value assets. The importance of global collaboration, awareness training, budgetary allocations, and investment in cybersecurity is emphasized. Adequate regulation of AI and thoughtful decision-making regarding ransom payments are crucial to ensuring cybersecurity. By addressing these issues, countries can protect their institutions, citizens, and national interests in the digital landscape.

Dan Cîmpean

Phones, tablets, and laptops are considered the most vulnerable devices to cyber attacks because they are in close proximity to humans. The aggressive digital transformation in recent years has resulted in the installation of numerous applications and tools on these devices, making them prime targets for malicious activities. These devices also contain a significant amount of data and are constantly used, further increasing their susceptibility to cyber threats. Protecting personal devices from such threats is crucial, as any negative impacts can have serious consequences on productivity, finances, and daily activities.

The healthcare sector is another area particularly vulnerable to cyber attacks. The consequences of such attacks can have a direct and harmful impact on human lives. There have been documented cases, such as a hospital in Germany being subjected to a ransomware attack, which resulted in a patient’s death. The potential disruption caused by cyber attacks on healthcare systems can render hospitals unable to handle patient cases, leading to tragic outcomes. Consequently, there is a need for greater investment and focus on improving the cybersecurity of healthcare systems. The healthcare sector, being relatively less mature from a cybersecurity perspective, requires increased financial resources to ensure the safety and well-being of patients and medical professionals. It is recommended that the cybersecurity of healthcare systems be given priority by national competent authorities.

Privacy protection, especially among young people, presents a significant challenge. While young people are often proficient in using digital technologies, they tend to overlook the regulatory landscape. However, it is noteworthy that young people also play a vital role in knowledge transfer to older generations when it comes to online safety. They are often the ones teaching their parents and grandparents how to behave safely online, as they possess more experience and understanding of digital technologies. Consequently, there is a call to invest more in educating young people about cybersecurity, given their proficiency and their potential to bring about a paradigm shift in the dissemination of digital knowledge.

Regulatory measures are crucial in combatting cybercrime; however, the ever-evolving nature of technology poses a constant challenge in enforcing effective measures. Cyber criminals exploit the vulnerabilities of technology, causing harm that is often difficult to prevent and mitigate. It is recognized that the education and resilience of regular internet users play a significant role in reducing cybercrime. With millions of users directly or indirectly needing protection, their behavior on the internet, as well as the resilience of critical infrastructures, become crucial factors in preventing cyber attacks. In order to achieve this, there is a need to improve the education of internet users and enhance their ability to respond effectively to potential threats.

Dealing with the ransomware phenomenon is an intricate issue that presents complex problems with no clear or effective solution at present. There are debates surrounding whether paying ransoms to cyber criminals should be prohibited or encouraged. It is acknowledged that paying ransoms can perpetuate the cybercrime economy; however, finding alternative solutions to tackle ransomware remains a challenge. There are also difficulties in cascading a decision not to pay ransoms down to the individual or organizational level, highlighting the complexities of addressing this issue.

In conclusion, protecting personal devices from cyber threats and ensuring the cybersecurity of critical sectors like healthcare is of paramount importance. Education and awareness, particularly among young people, play a crucial role in combating cybercrime. Regulatory measures need to be continually updated and enforced to keep up with the ever-evolving nature of technology. Additionally, proposals to deter cybercrime include banning ransom payments to discourage the growth of the cybercrime economy. Overall, a comprehensive approach that combines investment, education, regulation, and cooperation is essential for effectively addressing the challenges posed by cyber threats and protecting individuals, organizations, and society as a whole.

Dr. Ahmed Abdel Hafez

Cyber attacks have both direct and indirect impacts on humans, affecting individuals as well as digital services. Losing control over personal data, such as banking credentials and details that can be exploited through social engineering, can cause significant harm to individuals. Furthermore, attacks on digital services such as healthcare, intelligent transportation systems, and other emerging services being digitised can harm human beings directly or indirectly.

The psychological impact of cyber attacks and digital dependency is becoming prevalent. The fear of losing a mobile phone, known as “nomophobia,” is a psychological issue that is on the rise. In addition, issues such as cyber bullying cause harm to people, particularly vulnerable individuals like young girls.

The increasing dependency on mobile phones is a concern as well. People’s lives are now heavily reliant on their phones, which contain their bank details, personal information, and social accounts. Even the loss of battery life in a phone can cause stress in individuals.

Awareness plays a crucial role in combating cybercrime. Dr Hafez suggests that teaching people how to handle digital transformation safely is crucial and can reduce cyber attacks by 80 to 90 percent. This highlights the importance of educating individuals about cybersecurity risks and best practices.

Strict regulations and laws are necessary to control cybercrime. Dr Hafez believes in implementing strict rules and regulations that should be followed by individuals and government officials. In Egypt, for example, anti-cybercrime laws and data privacy laws have been enacted.

A Child Online Protection strategy is essential to help children access the internet safely, especially considering that 40% of the population in Egypt is under 18. This underscores the need to protect vulnerable individuals from the potential harms of the internet.

The role of artificial intelligence (AI) in cyber attacks is significant. AI can be used to invent new sophisticated attacks, including zero-day attacks, which complicates the task for cybersecurity professionals. Additionally, the scope of potential attackers has expanded with AI, as individuals do not need to be cybersecurity experts to use it.

The ethical use and control of AI are important considerations. Currently, AI is seen as an uncontrollable technology, leading governments and organizations like the United Nations to work on managing its use in an ethical manner.

Ransomware attacks pose a significant issue, with global losses reportedly reaching three trillion US dollars last year, according to Dr Hafez. Nations' efforts to control ransomware have therefore become crucial in mitigating the impact of these attacks.

Data has become the most important asset in the global economy, on par with oil. As such, responsible data management and protection are essential for economic sustainability.

Strong data backup control measures and international collaboration are necessary to effectively combat cybercrime. Dr Hafez emphasizes the importance of a three-to-one backup for data assets to prevent ransomware attacks. Furthermore, increased collaboration among nations is necessary since cybersecurity is a cross-border activity that requires cooperation and collaboration.
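
The "three-to-one backup" mentioned here appears to be a reference to the widely cited 3-2-1 rule: three copies of the data, on two different kinds of media, with one copy kept off-site. Purely as an illustration (the file paths are hypothetical), a routine check of that rule might look like the following Python sketch, which simply confirms that all three copies exist and have identical content.

```python
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Hypothetical locations for the three copies (local disk, second medium, off-site).
copies = {
    "primary":  Path("/data/records.db"),
    "nas":      Path("/mnt/nas/records.db"),
    "offsite":  Path("/mnt/offsite-sync/records.db"),
}

digests = {name: fingerprint(p) for name, p in copies.items() if p.exists()}
healthy = len(digests) == len(copies) and len(set(digests.values())) == 1
print("3-2-1 backup verified" if healthy else f"backup check failed: {digests}")
```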

Overall, cyber attacks and their various impacts on human beings are significant considerations in today’s digital world. From the direct impact on individuals to the societal implications of digital dependency, it is crucial to address these issues through awareness, regulation, protection strategies, and international collaboration.

Session transcript

Ryan Chilcote:
Dr. Ahmed Abdel Hafez, Chairman of the Executive Bureau, Egyptian Supreme Cybersecurity Council; Dan Cîmpean, Director, National Cybersecurity Directorate, Romania; Major General (Retired), Engineer, Mohammad Abdulaziz Boarki, Chief, National Cybersecurity Center (NCSC), Kuwait; Ryan Chilcote, Moderator, Master of Ceremonies, formerly of Bloomberg, CNN, CBS, PBS, and Fox News. It's so nice to see so many of you are still here. We must be doing something right at the Global Cybersecurity Forum. This panel, as you've probably seen in your programs, is called Cyber Cost Reframe. And the idea is we're used to measuring financial losses, economic losses, when it comes to cyber activity, cyber disruption, cyber attack, cyber crime. We're less used to, and perhaps less skilled at, talking about the direct human harm that can come from cyber disruption. So that's what we're going to do just now with my three esteemed panelists, who were just introduced, so we won't have to go through that again. Thank you so much for joining us. I like this topic because we can really take it where we want to. But we need to kind of nail some things down before we get into it. So Dan, let me do that with you. Let's start with where the harm can be done. In other words, what cyber-related systems are most vulnerable to malicious cyber activity when it comes to causing us humans harm?

Dan Cîmpean:
Thank you. Thank you so much, Ryan, for the question. Most intuitively are the devices, the systems that are the closest to our own person. The phones. Phones and tablets and laptops and so on. Simply because we saw it in the last years, thanks to the very aggressive digital transformation, we installed plenty of applications, plenty of tools. We have plenty of data on devices that are really, literally on our person. And they are the ones that influence and impact our daily life, our relations, our communication, our work, actually. So everything that impacts a device that I’m using on a daily basis, definitely it harms me in a variety of ways. Whether I lose productivity or I lose money or I lose time or I get impacted in a negative manner in the way I do my work and my activity.

Ryan Chilcote:
Thank you very much. Dr. Ahmed Abdel Hafez, I'm trying to figure out, Your Excellency, if Dan's point just now was kind of simple as a result of being obvious, or actually a bit profound. So if you could weigh in on that. And also, let's zoom out. Okay, phone. I think we all understand that our phone and losing control of the data on our phone can cause us trouble. If we zoom out, what kind of macro problems can we run into? Yeah.

Dr. Ahmed Abdel Hafez:
I would like first to thank Saudi Arabia for inviting me to this great event. Thank you very much for the hospitality and for the great event. First of all, any digital transformation is meant to serve the well-being of the human being. So any cyber attack will harm the human being, whether directly or indirectly. Following up on what my friend Dan said about phones, there is a psychological condition right now called nomophobia, no mobile phone phobia. The fear of losing your mobile phone, because your whole life is on your phone: your bank account, your credentials, your social media accounts, your WhatsApp, everything is on your phone. So if you lose your phone, or even if your phone is just running out of battery, you feel anxious about whether it is going to last or not. And there are many services, like healthcare, like ITS, intelligent transportation systems, and other emerging services, which are being digitized. All these services will be digitized, so they can be affected by cyber attacks, with a direct or indirect impact on the human being and on human well-being. So every cyber attack, whether its impact on the human is direct or not, will have a bad impact on human well-being and human life. Even in the societal environment itself, there is cyber bullying, cyber bullying through social engineering, abusing young girls or things like that. All these activities harm human beings. Thank you very much.

Ryan Chilcote:
And that term again was no phone?

Dr. Ahmed Abdel Hafez:
No mobile phone phobia. It's called nomophobia, no mobile phone phobia.

Ryan Chilcote:
I think there might be some people in the audience suffering from that right now.

Dr. Ahmed Abdel Hafez:
It's a condition that psychologists identified eight years ago. It's not a new one.

Ryan Chilcote:
Thank you very much, Your Excellency. Engineer Borki, we also just heard about health care. So if we, is health care a big concern when it comes to human harm?

Mohammad Abdulaziz Boarki:
First of all, salam alaikum wa rahmatullah wa barakatuh. Thank you to Saudi Arabia for having us here. And I have to congratulate Saudi Arabia on hosting the World Cup in the coming years. Second of all, to answer your question, as you said, it's a wide question. Health care has become, over the last few years, one of the highest-value assets for ransomware attacks, as well as the financial sector.

Ryan Chilcote:
Health care is close to financial?

Mohammad Abdulaziz Boarki:
Let's say health care was statistically the first high-value asset targeted by ransomware attackers, because it makes money and money is everything, and because they encrypt patient data, which disrupts the execution of surgeries in hospitals. That's why it became a high target. Now, statistically, the financial sector has also become one of the highest-value targets. Money has always been the highest asset for everything. And if we want to go wider than that, any high-value information that lies in a system makes that system a high-value asset. For example, your smartphone: if it doesn't hold any sensitive information or bank information, an attack on it won't cause much harm. But what lies inside the system, whether it's a smart system or an IoT, Internet of Things system attached to the wider network, is what makes it a high-value asset. And as emerging technology evolves and changes very fast, the high-value targets for attacks also change over time, depending on how important that system is. So, for example, as Dr. Hafez said, military systems have always been a high-value asset. Health care systems, and I can add the oil sector as well, have also become among the highest-value targets. So you cannot simply define whether something is a high-value asset this year or next year. It becomes a high-value asset when it becomes a target, and you will not be targeted unless you are an important entity or system, a high-value governmental target, let's say.

Ryan Chilcote:
Thank you. Thank you. Dan, can you give us an example of an attack on a health care system or a part of a health care system that caused direct human harm?

Dan Cîmpean:
Absolutely. As we all may know, about three years ago, in September 2020, there was what I think was the first ever documented human death due to a ransomware attack. It happened in Germany, in Dusseldorf, where, due to a ransomware attack on a hospital, one patient could not be treated by the doctors and had to be moved to another hospital. And the root cause of that patient's death was unfortunately assessed, was ruled, to be that particular ransomware attack. And let's just imagine that one hospital treating 1,000 patients every single day, due to a cyber attack, is not able to handle 1,000 patients a day but, I don't know, 100 or 200. The risk is gigantic. And honestly, no hospital manager, no authority can afford such a risk. And we as regular users should be aware that any disruption in this sector can produce a tragic impact on our lives.

Ryan Chilcote:
And how well are we prepared to deal with those kinds of attacks right now?

Dan Cîmpean:
I'm choosing my words carefully now. Unfortunately, I think there are plenty of challenges and risks over there. The healthcare sector in many, many countries is systematically not the most mature one from a cybersecurity perspective. And it's simply because there is a proliferation of very specialized technologies for healthcare, and also a proliferation of digital technologies that support the infrastructure of hospitals. Because of this, it's very difficult and very complex to have a, let's say, good security baseline for the sector as a whole. It's also one of the sectors that needs very, very high investment, because lives could be impacted, because patients are at risk in case something goes wrong. And I truly believe it's one of the sectors that should be systematically at the top of the agenda of the national cybersecurity competent authorities in terms of focus and investment.

Ryan Chilcote:
Yeah. Okay, so we’ve talked a little bit about the so-called attack surface, where these attacks can happen. Your Excellency, Dr. Hafez, if you could talk a little bit about, you know, how one measures the impact of these things, if it’s not financial, if there isn’t a… How do you… Because governments are good at dealing with problems that they can measure. And money is easy to measure. But what about, like, what we’ve just been talking about?

Dr. Ahmed Abdel Hafez:
Are you asking about the role of the government in understanding how cybercrime impacts the human being? I always say that awareness is a very important issue that all governments should take care of: raising people's awareness of how to deal with digital transformation in a safe manner. All governments around the world are moving towards digital transformation right now, to make people's lives easier. But at the same time, you have to teach them how to deal with digital transformation safely, through awareness, through regulations, through laws. If people know how to deal with digital transformation, with digital life, in every area, whether financial, healthcare, transportation, or any other service, in a safe manner, that will reduce cyber attacks by at least 80 to 90 percent. That protects them from being attacked personally. And if a person is an employee of an organization or a government, and their official credentials, their official email, for example, are attacked, the whole organization will be attacked; the mail server of the whole organization will be attacked. So awareness is the most important thing to help people deal with digital transformation safely. The second is to put in place very strict regulations and rules to be followed by the people and by officials in government. If we are talking about ordinary human beings, like children, women, the elderly, or people with disabilities, you have to teach them how to deal with digital transformation. For example, in Egypt, we are a big country where about 40 percent of the population is under 18, which by law means they are considered children. So right now we are developing a Child Online Protection strategy to help children benefit from using the internet, but in a safe manner, and using regulations to help people know their rights. The other thing is the law. We have many laws in Egypt now, anti-cybercrime laws and a data privacy law. But if we issue these laws and people do not know about them, they do not know that an activity may be criminalized, or they do not know how to claim their rights if they have been attacked. So, as I said, the three most critical things any nation should do to withstand these cyber attacks, which directly harm human beings, are awareness, regulation, and the law.

Ryan Chilcote:
Thank you very much. Engineer Borki, we heard there from His Excellency about how so much of the population in Egypt is under the age of 18. You can’t talk about young people without talking about privacy, young people sharing, for example, images of themselves amongst themselves. It’s quite common now, and then those can get in the wrong hands. How big of a problem is that, and how do you deal with it?

Mohammad Abdulaziz Boarki:
I want to elaborate on my colleagues' feedback. It's a great impact, a scary impact. One of the challenges now is not just about having technical regulations and policies; it can be about budget. A person or a country may not have enough budget, or the right budget, to actually implement regulations and policies for sizing and measuring cyber impacts. If you can't measure it, you can't manage it. So the impact on any country, individually or collectively, requires a collective approach and collective collaboration. And I believe, I suggest, since we now have a global medical organization for any upcoming pandemics, which can help poor countries that lack medical capacity, and there are poor countries with weak infrastructure that do not have the capabilities to protect their cyberspace, why not have a global effort that helps other countries protect themselves? Because this works both ways. For example, if I am country A, with high capabilities in cyber, and a neighboring country has a weakness in cyber, that could be a threat for me. So helping all the surrounding countries to have a good, let's say, executive plan for cyber is a must. And the impact is devastating; it could cost millions of dollars not to have the right strategy, clear objectives, and the clear pillars, as my colleague says: awareness, training, and building capabilities in cyber.

Ryan Chilcote:
And just because you mentioned the word budget there for a moment, we heard during the plenary session the former president of Estonia talking about how… she's concerned about the spiraling costs of dealing with this from a governmental perspective. Not exactly our topic here, but you are all coming from governments dealing with this issue. She mentioned that Estonia's budget for combating cybercrime has grown five-fold over the last five years. Of course, Estonia has a neighbor that is part of the reason why that budget has been going up. But is that an issue, attracting the necessary resources to deal with cybercrime in your country and in general for countries right now?

Mohammad Abdulaziz Boarki:
It is an issue. I mean, you have to believe that cyber could be devastating. It's now the fourth domain in the world: we have the physical domains, for example, the land, sea, and maritime domains. Now we have a cyberspace domain, and it's nothing less than those three physical domains and borders. So, if you believe that cyber could become a nightmare for any country, you will set up the right budget. But, and now I'm speaking about one of many challenges, there are some countries that still don't believe cyber is a threat.

Ryan Chilcote:
You wanna name names?

Mohammad Abdulaziz Boarki:
Until they have been hit.

Ryan Chilcote:
You wanna name the countries right now?

Mohammad Abdulaziz Boarki:
Many countries. So, I mean, if you believe that cyber can be power, and cyber can be a threat, the question is how you deal with it. And if I want to quote His Excellency Adel Al-Jubeir, he made a very important point: that cyber is now among the top three, or it is in the top three priorities, among the pillars of any country. It could hit your economy, it could hit your society, and it could hit your financial system. So this is something we need to actually invest in and take into consideration.

Ryan Chilcote:
Thank you. Dan, if I could bring it back, when we think about the attack surface, and we’re gonna move on from this and talk about collaboration in a moment, the issue of privacy and protecting your privacy, which we just started to kind of move into, particularly amongst young people, how big of a problem is it? And how do you deal with that?

Dan Cîmpean:
I think it’s a big challenge, honestly. And one of the root causes of being such a big challenge is that, especially the young generation, it’s by far better and more proficient than we were in using technologies. And something that we tend to forget is that they get their knowledge and their good practices from each other in the very first place. They tend to not look too seriously at regulatory landscape. Kids and young people, they don’t really read cybersecurity-related laws, and they get good practices in the way they find it more appealing and receiving it from each other. So we have to address, actually, this challenge, and also not to forget that, simply because they are more experienced in using digital technologies than older generations, we have a shift in the paradigm. So now kids and youngsters are teaching their parents and grandparents on how to behave safely, how to protect privacy, how to protect their data on the internet. So it’s something that we should look very, very careful at and, honestly, invest a bit more in the knowledge of this young generation to get them to help all of us to get more resilient and more secure in cyberspace.

Ryan Chilcote:
Your Excellency, how does, we were talking about this over the last day and a half, emerging, our favorite emerging technology, AI, how does that complicate threats when it comes to human harm?

Dr. Ahmed Abdel Hafez:
Well, just as we cybersecurity people benefit from using AI, the attacker also benefits from using AI to invent new attacks, zero-day attacks, which are sophisticated and very complicated to deal with. So AI has both sides, a good one and a bad one. On the good side, we cybersecurity people use it, for example, when you have very big data or a very big incident to analyze; using AI helps us accelerate that process. But on the other side, as I said, even if you are not a cybersecurity expert, if you are just a normal person who knows a little about AI, using the very well-known ChatGPT right now, you can ask it to make a new attack and it will do that for you. So AI is an uncontrollable technology for now, and all governments and the United Nations are currently looking at how to control or manage the use of AI in an ethical manner. Even in education, any student right now can write a report using AI. So AI helps the attacker very well to invent new, sophisticated attacks. As security people, we are suffering right now from zero-day attacks. A zero-day attack means an attack with no fingerprint, for example, so it uses something new, and we need to deal with new attacks every second. Every day right now there is a new attack using AI. So the range of attackers has increased with AI. As I said, you don't have to be a cybersecurity expert to be an attacker. And since it brings in a lot of money, a lot of people using AI are going to become attackers. So attacks will be more sophisticated, and it will be harder for us to withstand this activity involving AI.

Ryan Chilcote:
Thank you, Your Excellency. Engineer Boarki, they just had an AI summit in London, and we've heard the word 50 times in the last several sentences. How do you regulate, and I'm moving to the solution at the end of this conversation here, how do you regulate AI so that you don't have these kinds of problems? I think I'll rephrase your question. Can we actually regulate AI?

Mohammad Abdulaziz Boarki:
This is the main question. I don't think it's something constant. AI is also fast-changing, and it can be a powerful protection, and it can be a weakness and a threat; it depends on the way you use it. So regulating AI, I believe, is not an easy job. You have to constantly change your regulations and policies to keep up with fast-changing technology. AI has proven itself both ways: it has proven to have positive uses and it has proven to have negative uses. AI has been one of the ways attackers attack systems, and at the same time it has become a good solution in the medical sector, for example, for assisting surgeries around the world using 5G connectivity. So AI is a big topic, AI is a deep thought, and regulating AI, I believe, is not an easy job.

Ryan Chilcote:
Yeah. Okay. That’s a bit worrisome. This is my thought. Thank you. Dan, okay, so if we can’t, I mean, because we were listening to Jose Barroso, the former president of the European Commission, the other day talk about how it took the European Union nine years to come up with GDPR. That was a good thing because they got scale and we all use it now. It’s sort of like the global standard, but AI, I don’t know, maybe it’s another beast and we probably don’t have nine years. So what can, what should a nation do to control this problem of cyber crimes causing harm to people?

Dan Cîmpean:
I think, obviously, in the very first place we should have very good regulatory measures, which is something extremely difficult to put in place, for a very simple reason: technology will always be one step ahead of the regulatory environment. First the technology will come, then cybercrime, for example, will use and exploit the vulnerabilities of those technologies and will do harm, and then the national competent authorities, at the level of one country or a group of states, will have to come up with some measures. That's one big challenge. It's not very easy to align those measures, because if one country is very resilient, very strong in terms of regulatory measures and others are less mature, basically we don't fix the issue. Then we have to really, really invest a lot in educating the user. And just to give a simple example, we have to protect millions of users, honestly, either directly, in the way they act when they behave on the internet, in cyberspace, or indirectly, through the critical infrastructures that need to be resilient, available, and so on. So we have to really work on those dimensions. Regulatory measures, on the one hand, and this is not easy to put in place, especially when it comes to, for example, the ransomware phenomenon and the always debatable issue of whether we want to ban payment of ransoms or not. How should we tackle this? No one has a magic solution up to this moment. And, on the other hand, how to increase the education and resilience of regular users, who, if we put them together, become a gigantic attack surface that can be exploited by malicious actors. So what I truly believe is that we have a very, very, very serious challenge ahead of us and we have to focus really systematically on this.

Ryan Chilcote:
Let me pick up on the ransomware idea real quickly with you for a second, because just, I guess, last week, 40 countries came together to agree that they would not pay ransoms in an attack. Now, my assumption was that they were talking about the national level. So if the United States gets attacked and someone tries to extract a ransom from the US, I don't know if the US was a signatory to that agreement, then the US wouldn't pay it, just like the other countries. And so they're coming together to try and, well, you just mentioned basically banning people from paying ransoms. So for example, if someone sends you an email and they say, Ryan, we have some really sensitive information about you and we're gonna share it with the world, you're saying that you would ban me from paying those people to get my information back?

Dan Cîmpean:
The difficult challenge is how to cascade down decision that is taken at the national level. For example, my country doesn’t pay ransomware. Yeah, that I kind of get. To cascade it down to private actors, to industry organizations, and ideally cascade it down to the level of users. But how to create this mechanism, how to enforce it, that’s very, very complicated, because at the end of the day, users are autonomous, they behave in their own way, and it’s extremely difficult to enforce it, actually. My personal opinion is that we should attempt to ban ransomware payment across the board, simply because by paying a ransom, actually we encourage the phenomenon. We finance the cybercrime, actually.

Ryan Chilcote:
I wanna come over to your excellency in a moment, but real quickly, and Engineer Borki, I saw you shaking your head. Should countries ban their citizens from paying ransomware?

Mohammad Abdulaziz Boarki:
I don't think there is one correct answer here. It depends on how valuable, for example, the target of the attack is. My information is very valuable. My private information. You actually answered the question: it is about how valuable what the attacker has taken is, whether nation-wide or for an individual. So, for example, if someone stole your data, and the only copy of that data is on your smartphone, and you don't have a backup for it, would you negotiate? If it was cheap to get it back, maybe, yeah. If there is a way to get it back cheaply, of course you will, because this is your life and your smartphone. Let's take it to the next level. Now, if the information the attacker has taken and encrypted could cause a national threat or a disruption of services in this country, do you think we cannot negotiate? So, I mean, it depends. I don't think there can be one solution for every country; each country has the right to deal with it as they see fit, and I believe that if it's in the national interest, I don't think there is a problem with negotiating.

Ryan Chilcote:
Your Excellency, Dr. Hafez, I’m gonna give you.

Dr. Ahmed Abdel Hafez:
Let me add something to this point. On the issue of nations trying to control ransomware: losses from ransomware attacks increased last year to three trillion US dollars. So we are trying globally to control ransomware. But, as His Excellency said, data has now become the most important asset; it is becoming the oil of the world, the oil of the globe. So if an organization does not follow the controls, such as keeping a three-to-one backup of its data assets, then as a consequence it will end up paying the ransom, paying the money to get its data back.

Ryan Chilcote:
I’m gonna give you the last word. Sure. Are you satisfied with international collaboration to combat the cybercrime that can lead to human harm or just in general, cybercrime? Are you satisfied with the international collaboration we have now? And if you’re not, because this is GCF and we’re all about having a shared action plan and tangible results, give us one thing nations can do to collaborate better.

Dr. Ahmed Abdel Hafez:
If you're asking whether I am satisfied: no, I am not satisfied. I almost prefer the word collaboration rather than cooperation. In Arabic, collaboration is ta'adud and cooperation is ta'awni. Ta'awni means being shoulder to shoulder with another nation. Collaboration between nations will help all of us overcome cyber attacks while keeping the dignity, the classified data, of each nation. So cooperation or collaboration does not mean revealing your classified data to other nations. But we should collaborate regionally, in the Arab world and the Middle East, and internationally. Right now there are many efforts for all nations to collaborate, to come up with anti-cybercrime law, regional and international law. But it's very difficult: each region has its own mindset about data protection, data privacy, human rights. This is a real conflict. But if we do not collaborate, the attacker will succeed. You know what? All over the world, spending on cybersecurity is in the billions, but the losses are in the trillions. So we need to change our philosophy for dealing with cybersecurity, and one of the most important things is collaboration. Since cybersecurity is a cross-border activity, you cannot control it alone; you have to collaborate with all governments to agree upon certain controls, a certain framework, certain anti-cybercrime laws. So I'm not satisfied. We still need more effort on collaboration.

Ryan Chilcote:
Dr. Ahmed Abdel Hafez, thank you so much. Dan Cîmpean and Engineer Boarki, your excellencies, thank you very much for this conversation. We're out of time, but I learned a lot, and I hope you did as well. Please join me in giving a big round of applause for our esteemed panelists. Thank you. Thank you guys.

Dan Cîmpean

Speech speed

156 words per minute

Speech length

1220 words

Speech time

468 secs

Dr. Ahmed Abdel Hafez

Speech speed

174 words per minute

Speech length

1673 words

Speech time

577 secs

Mohammad Abdulaziz Boarki

Speech speed

152 words per minute

Speech length

1302 words

Speech time

512 secs

Ryan Chilcote

Speech speed

179 words per minute

Speech length

1383 words

Speech time

465 secs

Smoke & Mirrors: Social Engineering and Sophisticated Phishing

Full session report

Joy Chick

Phishing and social engineering attacks are prevalent across various industries, including healthcare, government, and finance, due to people’s busy schedules and lack of attention. These attacks have become the easiest way for criminals to obtain sensitive information and credentials. The increasing volume, scope, and sophistication of social engineering attacks are a concern, as attackers continue to evolve their strategies.

It is important to note that cyber attacks can happen to anyone, regardless of their level of technical knowledge. Therefore, individuals must remain vigilant and take necessary precautions to protect themselves and their information online.

The use of emerging technologies like Gen AI and machine learning by cyber criminals has enhanced phishing attacks. These technologies allow for automated and personalized campaigns that are difficult to detect and deceive people. This underscores the need for individuals to stay informed about the latest cyber threats and adopt robust security measures.

However, AI and Gen AI can also be used to enhance cybersecurity efforts. Companies like Microsoft employ AI to evaluate the security of user identities, devices, networks, and data. This technology can detect anomalies and breaches by analyzing vast amounts of information, while Gen AI automates these processes and reduces the burden on cybersecurity specialists.

To effectively combat social engineering attacks, individuals are advised to use phishing-resistant multi-factor authentication (MFA) and remain cautious of potential threats. However, it is important to recognise that MFA is not foolproof, as attackers have found tactics, such as SIM jacking and creating fake websites, to bypass these security measures. Maintaining a high level of vigilance is therefore essential.
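
To make the "second factor" concrete, below is a minimal sketch of an RFC 6238-style time-based one-time password (TOTP), the kind generated by an authenticator app rather than delivered over SMS. It illustrates why app-based codes sidestep SIM-swap interception (nothing travels over the phone network), while remaining vulnerable to fake login pages that simply ask the victim to type the code in. The secret shown is a made-up example, and the snippet is illustrative rather than a description of any particular vendor's implementation.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HMAC over the current 30-second counter, no SMS involved."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                          # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Example only: the app and the server derive the same code from a shared secret.
print(totp("JBSWY3DPEHPK3PXP"))
```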

The inconvenience of managing multiple passwords poses another challenge. Remembering different passwords for various accounts can be difficult and can lead to security risks. Password management solutions are necessary, and individuals should avoid reusing passwords and credentials across multiple accounts.
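
As an illustration of what a password manager does behind the scenes, the sketch below (assumed account names, Python standard library only) generates a distinct random password per account and shows how a service would store only a salted, slow hash rather than the password itself, so that one breached site does not expose the others.

```python
import hashlib
import os
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + "!@#$%^&*"

def new_password(length: int = 20) -> str:
    """Generate a unique random password, as a password manager would."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

def store(password: str) -> tuple[bytes, bytes]:
    """Service side: keep only a salted scrypt hash, never the plaintext."""
    salt = os.urandom(16)
    return salt, hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)

vault = {site: new_password() for site in ("mail", "bank", "shop")}  # hypothetical accounts
assert len(set(vault.values())) == len(vault)   # no password is reused across accounts
print({site: store(pw)[1].hex()[:16] for site, pw in vault.items()})
```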

Responsibility for online protection should not solely rest on users. Collaboration among industries, authorities, and society as a whole is crucial for implementing effective cybersecurity measures. Biometrics and device-based authentication methods, such as Fast Identity Online (FIDO), are increasingly being adopted to securely verify users’ identities.
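
The core idea behind FIDO-style authentication can be shown with a small sketch: the device holds a private key, the service stores only the matching public key, and each login is a signature over a fresh challenge bound to the expected origin, which is why a lookalike phishing site cannot replay it. This is a simplified illustration of the concept, not the actual WebAuthn protocol; it uses the widely available Python cryptography package, and the origin shown is hypothetical.

```python
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Registration: the device creates a key pair; the service stores only the public key.
device_key = ec.generate_private_key(ec.SECP256R1())
registered_public_key = device_key.public_key()

# Authentication: the service issues a one-time challenge for its own origin.
challenge = os.urandom(32)
expected_origin = b"https://login.example.com"      # hypothetical relying party

# The device signs the challenge bound to the origin it is actually talking to.
seen_origin = b"https://login.example.com"          # change to a lookalike and verification fails
signature = device_key.sign(seen_origin + challenge, ec.ECDSA(hashes.SHA256()))

try:
    registered_public_key.verify(signature, expected_origin + challenge,
                                 ec.ECDSA(hashes.SHA256()))
    print("sign-in accepted")
except InvalidSignature:
    print("sign-in rejected: origin or challenge mismatch")
```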

A zero-trust approach to identity verification and security is essential. This approach involves continuously verifying identities, granting minimal privileges, and assuming that breaches can occur, focusing on prompt detection and remediation.
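
A rough way to picture these zero-trust principles is as a policy function evaluated on every request: verify explicitly, grant the least privilege needed, and assume breach. The sketch below uses invented attribute names and thresholds purely for illustration; real conditional-access engines weigh many more signals.

```python
from dataclasses import dataclass

@dataclass
class SignInContext:
    user: str
    device_compliant: bool      # e.g. managed, patched, disk-encrypted
    usual_location: bool        # matches the user's normal sign-in geography
    risk_score: float           # 0.0 clean .. 1.0 almost certainly compromised

def decide(ctx: SignInContext) -> str:
    """Evaluate every sign-in as untrusted and grant only the minimum needed."""
    if ctx.risk_score >= 0.8:
        return "block_and_force_credential_reset"      # assume breach: contain first
    if not ctx.device_compliant or not ctx.usual_location or ctx.risk_score >= 0.3:
        return "require_phishing_resistant_mfa"        # verify explicitly
    return "grant_least_privilege_time_limited"        # narrow, expiring access

print(decide(SignInContext("lucy", device_compliant=True, usual_location=False, risk_score=0.2)))
```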

In the era of cloud services, protecting workload identities is crucial. As more customers transition to the cloud, safeguarding non-human identities becomes increasingly important. Streamlining and decentralising verifiable credentials are necessary to ensure robust protection.

AI has the potential to revolutionise the security industry by identifying anomalies, detecting breaches, and taking real-time action. It simplifies the work of cybersecurity professionals by reducing reliance on multiple tools and logs.
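
To illustrate the kind of anomaly detection described here, the following sketch trains an unsupervised model on synthetic sign-in features and flags an out-of-pattern login. The features, numbers, and thresholds are invented for the example, and it assumes NumPy and scikit-learn are installed; production systems analyse far richer signals at much larger scale.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# One row per sign-in: [hour_of_day, km_from_usual_location, failed_attempts_last_hour]
rng = np.random.default_rng(0)
normal_signins = np.column_stack([
    rng.normal(10, 2, 500),            # mostly daytime logins
    rng.normal(5, 3, 500).clip(0),     # close to the usual location
    rng.poisson(0.2, 500),             # almost no failed attempts
])
suspicious_signin = np.array([[3, 4200, 7]])   # 3 a.m., another continent, many failures

model = IsolationForest(contamination=0.01, random_state=0).fit(normal_signins)
print(model.predict(suspicious_signin))        # [-1] means flagged as anomalous
```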

Overall, security is a collaborative effort that requires the active participation of various stakeholders. By staying informed, adopting robust security measures, and fostering cooperation among industry players and societies, we can effectively combat the growing threat of cyber attacks and safeguard our digital ecosystem.

Moderator

In a recent discussion on the topics of smoke and mirrors, social engineering, and sophisticated phishing, Joy Chick, the President of Identity and Network Access at Microsoft, and Lucy Hedges, a technology journalist and TV presenter, explored the intricacies of cyber attacks and the necessary steps to protect against them. The discussion provided insights into the deceptive tactics employed by cyber criminals, including the use of smoke and mirrors to create illusions and misdirect attention. These tactics often result in successful social engineering attempts, where attackers manipulate individuals into revealing sensitive information or compromising security.

Both speakers stressed the critical importance of educating people about the various tactics employed in cyber attacks. By raising awareness and promoting digital literacy, individuals can become more vigilant and better equipped to identify and defend against deceptive strategies. Chick emphasised the need for organisations and individuals to invest in comprehensive cybersecurity training covering topics such as phishing awareness, safe browsing habits, and password hygiene.

Furthermore, the discussion highlighted the increasing sophistication of phishing techniques, noting that attackers are constantly evolving their methods to outsmart security measures. Traditional approaches to identifying phishing emails, like checking for spelling errors or suspicious links, are no longer sufficient. Cyber criminals have become adept at crafting highly convincing and targeted emails that are nearly indistinguishable from genuine communications. This necessitates the implementation of advanced security measures that go beyond traditional email filters and firewalls.

In conclusion, the discussion underscored that smoke and mirrors, social engineering, and sophisticated phishing are persistent threats that require continuous improvement in cybersecurity practices. Education and awareness are key to mitigating these risks, and organisations should prioritize implementing robust security measures to counter the evolving tactics employed by cyber criminals. By staying informed and proactive, individuals and businesses can enhance their defenses and safeguard their sensitive information from falling into the wrong hands.

Lucy Hedges

Social engineering and sophisticated phishing attacks are emerging as increasingly concerning threats to our digital society. These attacks exploit human vulnerabilities and security gaps and are executed by highly skilled perpetrators. It is worth noting that emerging technologies, such as Gen AI, are accelerating the innovation curve in these attacks.

To effectively defend against these threats, it is crucial to have a deep understanding of how social engineering and phishing attacks work and how they are evolving. These attacks are becoming more sophisticated, necessitating individuals and organizations to stay informed and updated on the latest tactics employed by cybercriminals. Without this knowledge, countering these threats becomes increasingly difficult.

In this context, Lucy Hedges implicitly praises Joy Chick, highlighting her authority in the security landscape and her exceptional leadership role in managing Microsoft’s Identity and Network Security Solutions. With oversight of the largest user base in the world, encompassing both consumers and commercial entities, Joy Chick’s leadership underscores the importance of expertise in combating security threats.

Lucy Hedges emphasizes the evolution of social engineering attacks over time, noting their increased intricacy and sophistication. It is crucial to recognize that cyber attacks can happen to anyone, regardless of their technological knowledge or industry of work. This serves as a reminder that no one is immune to such threats and that everyone must take precautions to protect themselves and their data.

In conclusion, the escalating threats of social engineering and sophisticated phishing attacks present a significant risk to our digital society. The evolving nature of these attacks calls for continuous education, awareness, and the adoption of advanced security measures. Strong leadership, exemplified by Joy Chick, plays a pivotal role in navigating and mitigating these risks. Cybersecurity is a collective effort that demands vigilance from individuals and organizations alike.

Session transcript

Moderator:
Smoke and Mirrors: Social Engineering and Sophisticated Phishing. Joy Chick, President, Identity and Network Access, Microsoft. Lucy Hedges, Moderator, Technology Journalist and TV Presenter.

Lucy Hedges:
Hello, hello. I hope we're all having a great event so far. Lots of insights and lots of inspiration to go home with after today. So we are here to talk about social engineering and sophisticated phishing. You know, these are the kinds of attacks that involve the use of deception by incredibly skilled perpetrators who are really adept when it comes to exploiting human vulnerabilities and security gaps to capitalize on trust and gain unauthorized access to sensitive information and systems. Now these kinds of attacks are moving at unprecedented speed, which is in no small part down to emerging technologies like Gen AI that are really accelerating the innovation curve when it comes to modern social engineering, which is ultimately escalating its threats to our digital society. So it's crucial, or critical even, for us to really understand the intricacies of these attacks, which are getting more sophisticated by the day, in order to really understand how to defend against them. Now I am joined by someone who is very well-versed in this area. Joy Chick is, I think it's fair to say, a force to be reckoned with in the security landscape. She runs Microsoft's Identity and Network Security Solutions, running the world's largest security systems across consumer and commercial, with over a billion enterprise users and almost a billion consumers on a monthly basis. So Joy, how are you?

Joy Chick:
Great. Thank you, Lucy. And good afternoon, ladies and gentlemen. It is actually my first time visiting the kingdom and it’s very much a great honor to be here.

Lucy Hedges:
Absolutely. Now we’ve got a lot to get through in a short space of time so I’m going to dive straight into my first question. So we’re going to start by defining the problem and the impact of this issue. So why are phishing and social engineering attacks such a big problem in cybersecurity?

Joy Chick:
Yeah, with any breaches, the most important thing that our criminals want to get is your credentials. Yes. And guess what? The easiest way to get credentials is through social engineering and phishing. That’s because it’s easy when we are busy. It’s easy for us when we’re not paying attention. Yes. You click on that email link and you get hacked. I think we talked over the break, like, geez, even us as security professionals, we get tricked sometimes. And when it happens, it really feels like it really breaches our trust, if you will. But it happens. And actually when it happens, it’s not just for consumers, it’s across the entire industry whether it’s healthcare, whether it’s government, critical infrastructures, financial industry. Yes. And the impact is devastating.

Lucy Hedges:
Yeah. I think there’s this misconception, isn’t there, that these kind of attacks happen to people that aren’t very clued up, they don’t work in tech, they don’t really know. But it can happen to anyone.

Joy Chick:
Anyone and to every one of us.

Lucy Hedges:
Hands up who’s been a victim of a phishing attack or have clicked on a nefarious link in the past. I know I have. I was busy, I was on the move and I clicked a link in WhatsApp and, you know, my phone got taken over. Exactly. It was really scary, a really scary time. You know, things are evolving so quickly at such an incredible pace. It really keeps us on our toes and especially you in your line of work. So how have social engineering attacks evolved over time and in what ways have they become increasingly sophisticated?

Joy Chick:
Yeah. And I want to say the sophistication is in volume, scope, and also just the scale, if you will. And, you know, from Microsoft, we see globally all the attacks that happen across our cloud services. Just some data points. In 2021, we saw almost 600 passwords get attacked every single second. And in 2022, that number doubled to over a thousand. And guess what? 2023 hasn't finished yet, and the number has already quadrupled to 4,000 passwords attacked every single second. So it is really that exponential scale, if you will. And also, at the same time, the criminals are getting very well funded. And, frankly, I would say that they're innovating at the same speed as our cybersecurity professionals, if you will. So they are really well organized, and many are backed by nation states and multinational criminal groups, if you will. And some of the patterns we see: you could say the old days, or the easiest way, is really just to send you an email, trick you to a website, and then you accidentally type your credentials and you get hacked. And that probably still remains the predominant, the primary attack vector. So we tell all our users, our customers, to turn on multi-factor authentication, which, by the way, multi-factor authentication is, in addition to a password, a second factor, you know, SMS, second-factor authentication. By itself, it really reduces attacks by 99.9%. However, the cyber criminals then continue to work around it. So one of the techniques is called MFA SIM jacking, because the majority of MFA is through SMS. So what the attackers do is they get in between your telephony and your phone. They intercept the SMS signals and then kind of reply to that multi-factor authentication prompt on your behalf. So that's something they are escalating. So then we said, hey, then we can talk about maybe doing phishing-resistant MFA, if you will. But the reality is, you know, I think, Lucy, you and I all get a lot of MFA prompts every day. Sometimes we just get fatigued and, frankly, confused. So what happens is you might accidentally approve one that was not intended. And then there are other methods. For example, the criminals can do something called adversary-in-the-middle phishing, which is they can fake a website that looks exactly like the real website, get you over there, and then steal your credentials through that method. And sometimes they can come across as some kind of official authority, like, hey, I come officially from tech support. So you thought you were being helped, but instead you are being hijacked.

Lucy Hedges:
Yeah, it really is unbelievable just how sophisticated and complex these attacks are. And like you say, these kinds of nefarious characters are moving at the same pace at which the industry is moving. And, you know, if these guys could only just apply this incredible knowledge to good, the world would be a much better place. But unfortunately, that's not how it works. You'd be out of a job, that's for sure. And I'd gladly be, if that's the case. Let's talk about Gen AI, because, you know, this is a massive talking point at the moment for various reasons. So how are cyber criminals leveraging emerging technologies like Gen AI and machine learning to really enhance these phishing attacks and create more convincing and targeted phishing emails and websites like you just discussed?

Joy Chick:
So I would say, you know, in the past, we probably, for those of us a little bit more sophisticated, we say, hey, maybe you can detect phishing email forms, like, you know, an email is poorly written with grammar mistakes. Or kind of in a form, you know, it is, you know, sort of massively produced, you know, so like, I don’t need this, right? So you kind of can filter some of that. Or the address looks a bit dodgy. Or the address looks a bit dodgy and all that. But now with Gen AI, they can improve the quality of the email. So, A, it’s a lot more compelling email. And frankly, they can also tailor that email. A, they can tailor to be more coming from, like, your work, you know, from people you know from work. Because they can actually use some of the AI to learn what’s your context. Yes. You know, so through that. They can also tailor to your own personal needs. Like, Lucy, if you like, you know, shopping or sort of specific website, they might tailor as if it comes from that specific website that’s tailored to your needs. So they have more context about you. So from that perspective, you know, I think it makes it a lot harder to detect it’s a phishing email. And frankly, a lot easier to trick people. Yes. And at the same time, also, Gen AI helps, you know, to generate these phishing email campaigns much faster. Yes. And the fact that you can, you know, using kind of natural language. So even for the, you know, attackers, they actually have to write less code. They have to write less scripts. And they, you know, Gen AI help them to automate the phishing campaign for what it’s worth. So, yes. So I think that’s why we see the, you know, the attack patterns that has been exponentially escalating over the years.

Lucy Hedges:
Yes. It’s almost enough to make us incredibly paranoid, isn’t it? Absolutely. Yes. And I think the rule of thumb here is to really always assume breach. I think sometimes that can be detrimental. You know, something good might come in and you’re like, I don’t trust that. And you don’t click it or you don’t get involved in it. And that can be detrimental to the user. But unfortunately, the sophistication of which these attacks are coming, it means that we always have to have our guard up. Absolutely. Yes. So let’s talk Gen AI for good. You know, we talked about the evil side, you know, the nefarious side. How can Gen AI, no, would Gen AI also, how can Gen AI help defend and protect in the cybersecurity space?

Joy Chick:
Yeah. I would say both AI and now Gen AI, if you will. So, you know, one of the things that, you know, at Microsoft, we’re really thinking about protecting our customer is you have to think about an end-to-end approach. Because, you know, it starts with identities, user identity and credentials. But, like, you know, you’re using the iPad. The device that you are on, whether the iPad can be trusted or not or it’s being compromised or not, the network we are on, whether the network is secure or whether the network is compromised. And, frankly, the application you access, the data you are trying to really try to protect. So we are really looking at what we call the digital estate of end-to-end for our customers. So from that perspective, as we’re looking through all the, you know, trillions of signals in our cloud services, we can really apply AI machine learning to detect what are the anomalies and how to then real-time, if you will, to help to, you know, help our customer to detect any breach and to remediate it quickly. And then with the Gen AI, what it helps is really to help us to automate a lot of this process as well as helping security professionals so that rather than they have to use different security tools, rather they have to understand the logs, then they can use more human natural languages to understand, hey, if Lucy is being compromised, why Lucy is compromised? So by simply asking that question, rather than have to be the detective to go through all the tools and find out what’s happening. So I think Gen AI really democratize in terms of skill set, skill set that’s required to be a cybersecurity specialist.

Lucy Hedges:
Yeah, yeah. And this is, I think it’s fair to say, quite relatively new territory for a lot of businesses. You know, Microsoft is obviously incredibly well-versed when it comes to this. But do you think there’s maybe a bit of a apprehension or, you know, this lack of knowledge and education that prevents companies from really benefiting from this technology that ultimately is going to affect, benefit their customers and benefit them as a business?

Joy Chick:
Yeah, like, you know, go back to the phishing campaign, if you will. And we always, you know, talk about education is important. Yes. But guess what, Lucy, just, you know, just, you know, admit it. Do you share your credentials across your user accounts? Maybe. Some of them. Some of them.

Lucy Hedges:
But you know, my to-do list is always, you know, switch, you know, on the iPhone, for example, it’s constantly telling you when you’re using multiple passwords. And I know it’s there. Right. But I, you know.

Joy Chick:
But it’s not convenient, right? Exactly. How many passwords do you want to remember?

Lucy Hedges:
I’ll do it later. I’ll do it later.

Joy Chick:
So, you know, we talk about, hey, don't reuse your password, don't use the same credentials for multiple accounts. And sometimes, even in this day and age, we still put a little password on a sticky note on our iPads or computers. I can't believe people still do that. That is crazy. Right. I don't do that, at least. Or we share credentials with friends because of some service we want to use. So these are some of the basics. But the reality is, we don't want the burden of protecting our users to be on the users. Right? They can have the education, but that's just not an excuse to say, hey, you got hacked because you didn't know. I think at the end of the day, we ask, why do we need passwords? Frankly, passwords are not magic. It's really about how to identify you, Lucy, as a unique person. And so we now look ahead and say, hey, what is a better way of doing that? So one of the industry standards is called Fast Identity Online, FIDO. It is a way to leverage biometrics, because your biometrics are uniquely you, Lucy, and then, in addition, something you have, like your iPad. So something you are plus something you have is a great way to identify Lucy as a unique person and serve as your credential, but in a way that is so user-friendly, because you do not have to remember a password at all. Some of the examples are Microsoft Windows Hello, if you will, and the Authenticator app. And now some of the newer inventions that we collaborate on across Apple, Google, Microsoft, and the industry are about passkey support. It is a phishing-resistant, passwordless method that can roam across trusted devices. And these are the things we're moving forward with as an industry so that we can help our customers and users be secure and prevent things like credential theft.

Lucy Hedges:
Yeah. And it really is about time that this stuff becomes more mainstream, more talked about. I was saying to you earlier, when we were having a chat, that about five or six years ago I wrote an article for the Metro newspaper, where I used to work, saying the password is dead. We were supposedly moving on from the password, and years later we're still using passwords. I will say that at least now we have more and more ways to accomplish that. But we still have a way to go as an industry. Yeah, absolutely. And of course, not everyone is up to date with these latest mitigation techniques. So what I want to ask you is: what role does education and awareness training, such as digital literacy initiatives, play in preventing social engineering attacks?

Joy Chick:
If I could tell all customers one thing they need to do, it is to turn on multi-factor authentication. Even where you still have legacy applications that use passwords, turn on MFA. By itself, it blocks 99.9% of account compromise attacks. So I think that's a great start, but I don't think it's enough. The next thing we tell our government and enterprise customers is to apply what we call real-time, risk-based conditional access. Basically, we're sitting here and I typically don't travel this far. So if, right at this moment, I sign in to my work account, there should at least be a policy validating: is Joy really trying to access work from this location at this time? That is what we call an anomaly. If we can apply these checks in real time, based on the user's identity, the location they're signing in from, the kind of application they're accessing, and all the other risk or condition factors, then we can really help protect our customers. And you talked earlier about zero trust. Yes. The key principles we always apply are: assume breach, verify explicitly, and apply least privilege. You only get the access you need, for the time you need it, for the resources you need. And you always assume breach, so that you can detect when it happens, remediate quickly, and reduce the blast radius, the impact, if you will. Oh, and then I would say we have talked a lot about human identities. But as our customers move to more and more cloud services, there are now more non-human identities than human identities. Wow. So how do we protect what we call workload identities? Think about all the services and microservices across the cloud; protecting them is equally, if not more, important. And the last thing I would say is that we still have too many identities. How can we move to a system with fewer identities, using techniques like decentralized digital identities and verifiable credentials, so that we have portable identities that are secure and work across all the different applications?
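
The sketch below illustrates the risk-based conditional access idea described here: combine a few sign-in signals into a risk estimate, then allow, step up to MFA, or block. It is not Microsoft Entra's actual policy engine; the signal shape, the weights and the thresholds are made up purely for illustration.

```typescript
// Minimal sketch of a risk-based conditional access decision.
// The signal shape, weights and thresholds are illustrative, not a real policy engine.
interface SignInContext {
  userRiskScore: number;        // 0..1, e.g. from leaked-credential intelligence
  locationIsUnusual: boolean;   // sign-in far from the user's normal locations
  deviceIsCompliant: boolean;   // managed, patched, not jailbroken
  appSensitivity: "low" | "medium" | "high";
}

type AccessDecision = "allow" | "require_mfa" | "block";

function evaluateConditionalAccess(ctx: SignInContext): AccessDecision {
  let risk = ctx.userRiskScore;
  if (ctx.locationIsUnusual) risk += 0.3;   // the "Joy signing in from an unexpected place" anomaly
  if (!ctx.deviceIsCompliant) risk += 0.3;
  if (ctx.appSensitivity === "high") risk += 0.2;

  // Zero-trust flavour: never allow silently unless risk is genuinely low,
  // step up to MFA for medium risk, block outright for high risk.
  if (risk >= 0.8) return "block";
  if (risk >= 0.3) return "require_mfa";
  return "allow";
}

// Example: an unusual location plus a high-sensitivity app forces a step-up.
const decision = evaluateConditionalAccess({
  userRiskScore: 0.1,
  locationIsUnusual: true,
  deviceIsCompliant: true,
  appSensitivity: "high",
});
console.log(decision); // "require_mfa" under these illustrative weights
```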

Lucy Hedges:
Absolutely. That’s the way forward. Yeah, absolutely. Now, just to quickly wrap up, my final question: what advice do you have for organizations trying to stay secure, in addition to all the amazing things you’ve already said on this stage? I’m sure there are a lot of people in the audience who want to know.

Joy Chick:
Yeah, I would say: AI, right? Do what I just said, and then really look into how AI can revolutionize this industry. AI can be scary, but at the same time it can be used to help secure all of us. So keep an open mind. And I would say security is a team sport. Yes. We have to do this together, as an industry and as a society.

Lucy Hedges:
Yeah. And what a brilliant sentiment to end this whistle-stop conversation on, Joy Chick. It has been an absolute pleasure. I did not doubt for a second that this conversation would be anything but insightful and inspirational. And it’s brilliant to hear from someone like yourself, who is such an impressive force in the world of cybersecurity. So I want to thank you very much. Thank you so much. Let’s give it up for my amazing panelist, Joy. Thank you. Thank you.

Joy Chick

Speech speed

180 words per minute

Speech length

2468 words

Speech time

823 secs

Lucy Hedges

Speech speed

221 words per minute

Speech length

1203 words

Speech time

326 secs

Moderator

Speech speed

77 words per minute

Speech length

27 words

Speech time

21 secs

Ready for Goodbyes?: Critical System Obsolescence

Table of contents

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Ben Miller

In the analysis, several speakers provided insights on various aspects of cybersecurity in relation to industrial control systems (ICS) and digital transformation. Dragos, represented by Ben Miller, is a notable company dedicated to protecting and securing ICS. Miller leads Dragos’ services team, which includes incident response and preparedness checks, demonstrating the company’s proactive approach.

The analysis highlights a shift in companies’ cybersecurity approach from solely relying on protection-based measures, like segmentation, to more proactive measures that involve creating visibility for threat detection. This change is needed as companies integrate more similar systems, increasing the attack surface. Outdated infrastructures, running on systems that reached end of life several years ago, are particularly vulnerable and require enhanced visibility.

The analysis emphasizes the need to combat obsolescence and vulnerabilities through implementing appropriate technology. Recent incidents, such as a case in which ransomware actors went undetected in a traffic control system for months, highlight the urgent need for improved defensive measures. Prevention alone is not enough, and visibility is crucial to understanding these environments.

Additionally, the analysis acknowledges that prevention in terms of security measures can eventually fail. It is crucial to create a defensible architecture with active system monitoring and capable personnel to respond to threats or incidents. Staff members should understand how to operate in an environment where they may be provided with incorrect information.

The analysis suggests that achieving a completely secure system is not a realistic goal due to the constant introduction of new technologies and capabilities by adversaries. Cybersecurity is an ongoing journey that requires continuous adaptation and improvement.

Collaboration between IT and OT is crucial in the context of cybersecurity. It is acknowledged that the life cycle and pace of change in IT and OT are significantly different. Conversations between the domains should focus on understanding the facility’s mission and working within constraints to avoid disruptions. IT disruptions to OT systems can cause downtime in revenue-generating assets, leading to tension between the two domains.

In conclusion, the analysis provides a comprehensive overview of cybersecurity in relation to industrial control systems and digital transformation. It highlights the proactive approach of companies like Dragos in protecting and securing ICS. The shift towards creating visibility for threat detection, combating obsolescence, and the importance of a defensible architecture with active system monitoring are emphasized. The analysis recognizes that achieving absolute security is not feasible and that cybersecurity is an ongoing journey. Collaboration between IT and OT is seen as crucial, focusing on understanding the facility’s mission and constraints to prevent disruptions.

Joshua Kennedy-White

The rapid pace of technological change leads to obsolescence as new technologies continuously replace older ones. Telecommunications, for instance, have moved from 3G to 4G and now to the latest 5G network, rendering previous generations obsolete. This highlights the constant need for adaptation to keep up with the ever-evolving landscape of technology.

Adaptability emerges as the best approach to embrace these changes. Being flexible and adaptive is crucial in navigating technological advancements. Surara, for instance, actively cultivates a culture of adaptability through research and development, training, and promoting workforce diversity. This helps prepare their employees to anticipate and embrace obsolescence.

Technology itself is a major driver of obsolescence. The introduction of new technologies like artificial intelligence (AI), 5G networks, quantum computing, and space technologies fuels rapid change. For example, the development of a new navigation system for airlines can make an entire fleet of aircraft obsolete. Similarly, the potential rise of driverless cars could make drivers themselves obsolete.

However, the biggest challenge in transitioning from legacy to modern technologies lies in people. Individuals are often resistant to change and may struggle to adapt to new technologies and ways of doing things. Despite being the largest asset of a company, human resources can be the pain point in the transition process. Overcoming this challenge requires effective training and change management strategies to facilitate successful adoption of new technologies.

The concept of absolute security is explored, suggesting that it is impossible to achieve complete security. The security vendor community’s obsession with achieving absolute security is questioned, as it is proposed that resilience and good enough security should be prioritised instead. This highlights the importance of finding a balance between security and usability in technology.

The expectations of consumers and the government also need to be recalibrated in response to technological changes. It is argued that the government does not always hold the responsibility to address every issue, and consumers should have a concept of resilience. Furthermore, the sudden criticality of modern services necessitates a revised understanding of their importance as critical infrastructure.

Strategic planning emerges as a crucial factor in successfully transitioning from legacy to next-generation technologies. Without a well-thought-out plan, organisations risk accumulating a plethora of technologies without a sense of security. To mitigate this, it is recommended to establish a shelf life for technology, adopt a modular architecture, and involve vendors in the upgrade processes. These strategic considerations can help facilitate a smooth and successful transition.

In conclusion, the constant change in technology drives obsolescence, necessitating adaptability to embrace these changes. Technology itself is the leading cause of obsolescence, and the transition from legacy to modern technologies can present challenges, particularly related to human resources. Achieving absolute security is deemed impossible, and instead, the focus should be on resilience and good enough security. The expectations of consumers and the government need to be adjusted, and strategic planning is crucial for a successful transition.

Major General Manjeet Singh

Obsolescence, the concept of something becoming outdated or no longer useful, has long been practised in military inventories, with certain percentages of outdated equipment maintained. However, the pace of technological advancements, user expectations, market forces, and security requirements have significantly accelerated obsolescence.

In response to this accelerated obsolescence, it is crucial to establish a cycle to effectively manage it while ensuring functionality and security. This means finding ways to address the challenges posed by rapidly changing technologies, evolving user needs, and the market-driven demand for up-to-date equipment.

One notable effort in mitigating the impact of obsolescence is being undertaken by Major General Manjeet Singh in India. India boasts a large population of approximately 800 million internet users and 1.3 billion phone users, resulting in a significant number of transactions, around 10 billion per month. Recognising the importance of minimising obsolescence in such an advanced and connected society, Major General Manjeet Singh is working towards finding effective strategies to manage and reduce the impact of obsolescence in India.

Furthermore, India is also making commendable strides in securing its cyberspace. They are actively addressing governance issues related to cyberspace, developing comprehensive crisis management plans, and creating resilient infrastructure. Additionally, India is taking measures to ensure disaster recovery and backup plans for data, emphasising the importance of network resilience.

The analysis reveals that obsolescence is not a new concept for militaries, with certain strategies like maintaining specific percentages of outdated equipment being employed. However, the increasing speed of technological progress, evolving user expectations, market dynamics, and security considerations present challenges that require proactive management of obsolescence. The case of India highlights how the country recognises the significance of addressing obsolescence in its technologically advanced society and is taking measures to both minimise its impact and secure its cyberspace.

Overall, the detailed summary highlights the various factors accelerating obsolescence and the importance of managing it effectively. It also underscores the efforts made by Major General Manjeet Singh in India, along with the country’s commitment to securing its cyberspace.

Dr. Yacine Djemaiel

The obsolescence of software and hardware components in critical infrastructure can pose significant threats to the services they provide. There is a strong dependency between the software and hardware for each component in most cases. When the hardware fails to respond after software updates, the process to replace such hardware is initiated. However, this process can be time-consuming and may lead to potential threats regarding critical infrastructure if not addressed promptly. This raises concerns about the need for up-to-date regulations and strategies for critical infrastructure.

From the Tunisian experience, it has been observed that targeted regulation is essential in addressing this issue. In 2023, Tunisia defined a new cybersecurity law, updating a previous law from 2004. Critical infrastructure has a dedicated chapter and a set of provisions that major companies must respect. This demonstrates the significance of up-to-date regulation and highlights the importance of having specific laws that govern critical infrastructure.

Regulatory guidelines for critical infrastructure are also crucial. Dr. Yacine Djemaiel emphasises the need for such guidelines to ensure that these infrastructures are maintained and updated in a timely manner. Including criteria against which the components of the infrastructure should be certified in the regulations can further enhance their effectiveness.

However, upgrading hardware or software for critical infrastructure can be challenging for government companies. It requires detailed planning and budgeting. The process of acquiring the necessary budget and carrying out the changes in compliance with regulations may be lengthy, causing delays in maintaining and improving the infrastructure. This issue underscores the need for more efficient solutions to reduce the time required for infrastructure replacement and upgrades.

Dr. Yacine Djemaiel advocates for reducing the time needed for updates, as it would make compliance with regulations more efficient. Faster replacement and upgrades can mitigate the risks posed by outdated infrastructure. By streamlining the process and making it more time-efficient, the potential threats to critical infrastructure can be reduced.

In conclusion, the obsolescence of software and hardware components in critical infrastructure poses significant threats to the services they provide. It is crucial to have up-to-date regulations and strategies to mitigate these risks. Regulatory guidelines, along with efficient infrastructure replacement and upgrade solutions, can help maintain and update critical infrastructures more effectively. By addressing these issues, the potential threats to critical infrastructure can be mitigated, ensuring the smooth and secure provision of essential services.

Rebecca McLaughlin-Eastham

This comprehensive analysis examines the level of preparedness and protection of companies and entities against obsolescence and vulnerabilities. It sheds light on the budget companies allocate for upgrades and resilience measures, questioning whether it is adequate. The analysis also explores the broader perspective of how well-protected or exposed entities are in the face of obsolescence.

One of the key points raised is the budget companies allocate for upgrades and resilience measures. This raises concerns about whether companies are sufficiently prepared for potential obsolescence and vulnerabilities. The analysis emphasizes the importance of investing in upgrades and resilient infrastructure to mitigate the risks associated with technological advancements and changing market dynamics.

Another significant point is the overall preparedness of entities when it comes to obsolescence. The analysis urges us to take a broader view and consider the extent to which entities have considered the implications of obsolescence and taken proactive measures to protect themselves. By doing so, entities can ensure their sustained viability and competitiveness in the face of rapidly evolving technologies and changing industry landscapes.

The analysis also notes the neutral sentiment surrounding this topic. While it does not provide a clear indication of stakeholders’ views, it signifies the importance of a balanced perspective when examining the level of preparedness and protection against obsolescence and vulnerabilities. It suggests that a well-rounded assessment is essential in identifying areas of improvement and developing strategies to address any gaps.

In conclusion, this analysis highlights the significance of preparedness and protection when it comes to obsolescence and vulnerabilities. It underscores the need for companies to allocate sufficient budget for upgrades and resilience measures, as well as the importance of taking a comprehensive approach to ensure entities are adequately protected against obsolescence. By addressing these issues, companies and entities can enhance their ability to adapt, thrive, and remain competitive in an ever-evolving business landscape.

Session transcript

Rebecca McLaughlin-Eastham:
Good afternoon, everybody. Nice to see you all again. I hope you are continuing to enjoy a fantastic first day of GCF 2023. It’s wonderful to be back on this hallowed stage with another fantastic panel. Our topic is obsolescence, the long or maybe the short goodbye; we shall have to debate and see. In today’s world, with such rapidly advancing technology, the life cycle of critical systems is becoming ever shorter. So what exposure, what challenges, what threats does that pose to organizations around the world today? And what can we do to traverse these waters and mitigate those dangers? I have all the answers in my learned friends to my left. You’ve had them introduced, but let me come to each of you individually first, just to set the scene. Tell me a bit about your role and your remit and what you bring to this conversation today. Major General Manjeet Singh, it’s wonderful to see you. Thank you for being here. How are you? Thank you.

Major General Manjeet Singh:
Thank you, everyone. At the very outset, let me thank the Global Cybersecurity Forum team for having invited us to speak on an important issue such as obsolescence. I also thank the moderator who has introduced us, and my fellow panelists for being here. Let’s hope we have a great discussion on the topic. My initial thought on obsolescence is that it’s not a new concept. It’s been practiced all over, including by militaries. They lay down certain percentages of what they maintain in their inventories: say, 30% of equipment that is obsolete or in the obsolescence phase, about 40% that is current, and 30% where the induction of modern technology or modern equipment happens. So a 30-40-30 concept. Some may practice a 20-60-20 concept, depending on various factors of technology, regulation, budgets, HR and other concerns. However, in light of technological advancements, the move from analog to digital, user aspirations, market-driven forces, our own aspirations and our security requirements, all of that has really sped up the way obsolescence is happening. So it has become a real challenge to manage that cycle of obsolescence. The bottom line is that we should be able to maintain functionality as well as security. We have to keep a very fine balance between the two and ensure that we have a cycle in which we are able to manage obsolescence.

Rebecca McLaughlin-Eastham:
Thank you so much. Ben, nice to see you again. Familiar face in Saudi Arabia. Hope you’re enjoying GCF 2023. Tell us a little bit more about what you do for those who might be unfamiliar.

Ben Miller:
Yeah, absolutely. It’s great to be back here, two years running. I work at Dragos. Dragos is focused, at the end of the day, on systems that are often obsolescent: we defend and secure industrial control systems, sometimes called operational technology. In my role at Dragos, I lead the services team: our incident response team, our assessments team, the teams that do preparedness checks against the defenses. So, in many ways, what I’m representing today is not so much Dragos as our customers at large and what we see at that ground level.

Rebecca McLaughlin-Eastham:
Thank you. Thank you. Dr. Yassine, nice to see you. How are you today? Thank you. Talk to me a little bit about, from your point of view, when it comes to Tunisia and the importance of not only core systems but obsolescence.

Dr. Yacine Djemaiel:
Yeah. This is a great issue that we should discuss carefully when we deal with critical infrastructure, because there are many factors to consider when we look carefully at the components of a critical infrastructure. In most cases we will find a dependency between the software and the hardware of each component. We update the software a first time, a second time, but at some point there is a limit, a point where we stop because the hardware can no longer keep up. That initiates the process of replacing the hardware so that we can continue providing the service needed from this critical infrastructure. This is an important point. The time between the moment when the hardware no longer provides the needed capabilities and its replacement may expose our critical infrastructure to a set of threats. And this is most dangerous, because we are providing critical services and, during that time, we are not able to provide them efficiently. It means something is missing; there are vulnerabilities in the system that may be exploited by an attacker to cause damage to the infrastructure. In this regard, from the Tunisian experience, we have tried to focus on a major component, which is regulation. In 2023 we defined a new cybersecurity law, since the previous law dated from 2004. In this new law we dedicated to critical infrastructure a chapter and a set of provisions that major companies must respect. This is very important, and it is the first step if we want to help companies comply with the requirements for critical infrastructure. So this is the first point that should be discussed here: regulation that is kept up to date, followed by a national strategy for critical infrastructure that is also kept up to date.

Rebecca McLaughlin-Eastham:
Thank you very much. Policy and regulation will definitely be discussed. Absolutely so important to our conversation. Joshua, let me come to you from the standpoint of Surara by STC. How are we currently positioned when it comes to obsolescence?

Joshua Kennedy-White:
Yeah, thank you very much, and thank you for having me. It’s my second time here, and I’ve been coming to the Kingdom since about 2005, which I think is a nice backdrop for thinking about how much has changed. When we talk about obsolescence, we normally think of legacy technology and how we adapt and change. It’s interesting that we’re having that conversation here at the Global Cyber Security Forum, in a quite new and modern country that doesn’t have a lot of existing legacy, perhaps less than others. I’m privileged to be an executive board member of Surara, a young company that we spun out of STC, the telco, with a young team that is addressing a lot of the problems, or the opportunities, if you like, that are emerging now in the Kingdom. When I think of the obsolescence question, I’d like to take a step back. If we were having this conversation 200 years ago and talking about critical infrastructure, the two things that would probably stand out would be a lighthouse and telegraph lines, two things that don’t really exist anymore, or maybe they do as tourist attractions. They existed for a long time, and technology didn’t have much of an effect on them. Lighthouses went from using wood to oil to electricity. Telegraph had Morse code and other refinements, but they generally didn’t change. Now we’re in an environment where the thing that fundamentally drives the obsolescence of critical infrastructure is technology, compressed into such a short space of time. Think of just a few things in business: telecommunications, where we now have 5G, which has made 4G obsolete, which made 3G obsolete; multi-core processors; the cloud. There is so much there. What does that mean? When I look at it from a Surara perspective, the ultimate question is: we know that the things we’re dealing with today are going to be obsolete tomorrow, so how do we plan around that? The best trait in evolution is to be adaptive, to be adaptable, to accept the things that are coming. From our perspective, that means not being too fixed in our ideas and having the flexibility to say we need to adapt, we need to change. That has to be pervasive throughout the organisation: as a culture, as an approach to R&D, as an approach to training, as diversity in the workforce. When I look at what we’re trying to achieve with Surara, I think that sits behind it. When I look at our numbers, the people, the projects they’re working on, I think in the background we’re preparing ourselves for a constantly changing world and for how we can help our business and government clients adapt to it.

Rebecca McLaughlin-Eastham:
What are the leading causes of obsolescence? Let’s take it back to basics. How do we make sure that they’re on our radar, that we’re aware of what we need to be fixing? Let me come to you, Joshua, first.

Joshua Kennedy-White:
I think the biggest one is technology. We’re now living in, I don’t know, is it the fourth or the fifth industrial revolution? The rate of change of ChatGPT and large language models is happening right now. When we look at the first industrial revolution with steam, and the later ones with electrification, automation and mechanisation, those things took decades to happen. We’re looking at things that are happening now in literally months. I think it is technological change, which poses so many challenges, and there are many, many more of the things we define as critical infrastructure. 
The regulations around them, I mean, look at AI. We haven’t even begun to get our heads around that. I used to work in government, and with all due respect to government, we’re not normally on the cusp of technology and the ability to regulate it. We tend to go through a cycle of making something illegal, then compulsory, then obsolete. These cycles happen. So the big one for me is technology: the pace of the change, the depth of the change, whether it’s space, quantum, AI or 5G. There are other things that sit behind that. We might bring in a new navigation system for airlines, which makes a whole fleet of aircraft obsolete. Or we might have driverless cars, which will probably make me, as a driver, obsolete. Traffic signals might become obsolete, or railway signals. I think that as we evolve to harness all the benefits of this next digital transformation, enabled by this amazing new technology that’s out there, it will create a wave of obsolescence. I don’t think that’s necessarily a bad thing, but it does pose many, many questions about how we’re going to secure it, how we’re going to regulate it, and so on, which we’re only just starting to think about.

Rebecca McLaughlin-Eastham:
Let’s talk about security and regulation, not least for a variety of sectors, because the impact is different, of course, across many different industries. In Tunisia, Dr. Yassine, what regulation do you want to see? What is it critical to put in place to make sure that there is a more manageable, seamless transition?

Dr. Yacine Djemaiel:
We deal with this in the regulation. When we focus on the content of the regulation regarding critical infrastructure, we find that there are restrictions that should be applied to this infrastructure, for example whether the components are certified against a set of criteria. We should keep these constraints in view and implement the needed replacements and updates in time in order to comply with the law. That is the first point that should be mentioned regarding this obsolescence. Another problem is related to the act of replacement itself. When we make a needed upgrade of hardware or software, especially in government companies, we need to plan and obtain the budget. The time required to plan the budget, obtain the needed amount and then make the change in order to be compliant with the law can be long, and that period is itself another risk for our infrastructure. So this is among the aspects that should be discussed, and we should find a solution that reduces this time so that we are able to make the needed changes efficiently.

Rebecca McLaughlin-Eastham:
Ben, when it comes to budgeting, when it comes to spending, protecting ourselves, making ourselves more resilient, sometimes the CAPEX is not there, or even the OPEX, as we were discussing backstage. So what level of preparedness and protection do companies and entities tend to have today? If you were to give us the broad view, how protected or how exposed are we when it comes to obsolescence and the vulnerabilities it causes?

Ben Miller:
Sure, yeah. I think the challenge within many critical infrastructure environments is around an evolving idea. Ten years ago it was: we’re air gapped, we’re not touching any other systems, we’re fine. Then it moved to: we’re segmented, so we’re protected. Now, in the age of digital transformation, we’re adding more systems that talk to each other, and they’re more homogeneous, so they look very similar from an attack surface perspective. We have this challenge now where we can’t just rely on prevention; it’s about getting in front of it. When prevention fails, what’s next? The old proverb is that chance favors the prepared. How do we get in front of an attack so that we have the right visibility to detect adversaries when they’re in the environment? Backstage we were talking about a recent case my team supported, ransomware-related, that affected a traffic control system. The attackers were in that environment for a matter of months, and it wasn’t until they deployed the ransomware that they were detected, which is pretty obvious at that point. But there is an opportunity there, if you’re deploying the right technology, to create that visibility. When you’re dealing with old technology, and by old technology I mean systems that went end of life seven or eight years ago, the mitigation is creating visibility and understanding what’s happening within those environments.
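
As a deliberately simplified sketch of the kind of visibility described here, the snippet below baselines which pairs of OT assets normally talk to each other and flags communications that fall outside that baseline. It is not any vendor's product logic; the flow-record shape, asset names and example baseline are hypothetical.

```typescript
// Simplified sketch of network visibility for an OT environment:
// learn which (source, destination, protocol) tuples normally occur,
// then flag live flows that fall outside that baseline.
interface FlowRecord {
  source: string;       // e.g. "hmi-01"
  destination: string;  // e.g. "plc-07"
  protocol: string;     // e.g. "modbus", "smb"
}

function buildBaseline(historicalFlows: FlowRecord[]): Set<string> {
  const baseline = new Set<string>();
  for (const f of historicalFlows) {
    baseline.add(`${f.source}->${f.destination}:${f.protocol}`);
  }
  return baseline;
}

function flagAnomalousFlows(baseline: Set<string>, liveFlows: FlowRecord[]): FlowRecord[] {
  return liveFlows.filter(
    f => !baseline.has(`${f.source}->${f.destination}:${f.protocol}`)
  );
}

// Example: a workstation suddenly speaking SMB to a controller it never touched
// before is exactly the kind of early signal that months-long dwell time hides.
const baseline = buildBaseline([
  { source: "hmi-01", destination: "plc-07", protocol: "modbus" },
]);
const alerts = flagAnomalousFlows(baseline, [
  { source: "eng-ws-03", destination: "plc-07", protocol: "smb" },
]);
console.log(alerts); // [{ source: "eng-ws-03", destination: "plc-07", protocol: "smb" }]
```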

Rebecca McLaughlin-Eastham:
It may be basic to observe, but the actors are moving faster than we are. The technology is moving faster than companies and even governments. So how do we bridge that gap? How do we stay one step ahead? You’ve given some solutions, but what would your key advice be to the entities, governments and companies in the room?

Ben Miller:
It really does come back to the idea that prevention eventually fails. So it’s not just about creating a strong architecture, but a defensible one. That means people who are actively monitoring the systems and able to respond, and setting the expectation that the operators and engineers know what to do if the system goes into a dangerous state, because at that point it’s a human safety issue, not just a corrupted database. There’s a degree of impact there that’s really important to understand. And the staff in that facility need to understand how to operate in an environment where they might be given the wrong information and could make the wrong choices because of it. That’s the leading edge in training and what we need to build towards.

Rebecca McLaughlin-Eastham:
Thank you. Major General, how are you minimizing the impact of obsolescence in India? What examples can you point to?

Major General Manjeet Singh:
India is a huge country with a huge cyberspace. If we look at the numbers, we have about 800 million internet users and 1.3 billion people using phones, a large proportion of them smartphones. So the interconnectedness is very heavy. If you look at the overall payments landscape, it’s 10 billion transactions happening every month, and they run into billions of rupees. So it’s a huge landscape. If I look at the resilience aspects, at the strategic level we are addressing it through policy and strategy. Then we have governance: governance of cyberspace is being addressed through suitable governance structures. We have a huge amount of infrastructure development and capacity-building programs. That’s at the strategic level. At the technical level, we are putting in place everything that contributes to resilience, whether it is the crisis management plan, resilient infrastructure, disaster recovery and backup plans for data, network resilience, the network time protocol, the DNS systems, or the safety and security of our submarine cables. All of that is being put in place. We are doing fairly well at securing our cyberspace, but it remains a work in progress.

Rebecca McLaughlin-Eastham:
Joshua, talk to me about the biggest pain point, transitioning from legacy to modern technologies and infrastructures and reinforcing those core infrastructure systems. Where is the weakness or what’s the biggest headache, if we can call it that?

Joshua Kennedy-White:
So I think, in a word, it’s people. You always hear that people are our greatest asset. They’re also incredibly hard to change; they’re hard to train, hard to find, hard to keep. I used to have a very large team with a lot of people. I’m sure I miss them individually, but in aggregate, less so. So the people piece is hard. But I just want to pick up on a theme that was touched on there, and Rebecca, you mentioned it with the word minimize. There seems to be, sometimes, maybe often, in the security vendor community, this obsession with making something absolutely safe. I can tell you, absolute security is absolutely impossible. If you think of that in the context of critical infrastructure, historically the government had more ownership of those assets, power stations and the like. Today, if I were to define critical infrastructure in my house, it’s probably Netflix, the Uber deliveries, Grab and all these other things. I think that poses a couple of questions. The first is: let’s not always assume that it’s the government’s fault and the government has to fix things. The other side is that, as a consumer of a service, whether it is provided by the government or not, maybe we do have to have an idea of resilience and of good enough. We managed perfectly well without these things, and now that they are suddenly embedded in our lives we act as if the world owes us a favor: why can’t I have Wi-Fi streaming on the airplane? So I think we have to recalibrate that discussion, and that’s a subtle political piece as well, of what we expect of our political leaders. Maybe I’m being kind because I used to be in that frame. But the key thing, I suppose, when I look at moving from legacy to the next generation is that, in the absence of a really good strategic plan, you end up doing tactical things and you amass a whole bunch of stuff. You feel secure because you’ve got one of everything, and that doesn’t really work. A better approach is to be able to say: this has a shelf life, it’s an interim solution, we’re planning to do something else; we’re going to have a modular architecture; we’re going to have a relationship with our vendors in which they are part of the upgrade process. There are a lot of people involved in legacy infrastructure in getting from where you are to where you need to be, and there are interesting contracts you can write with your technology providers; you can kick the question to them. I’ve been to multiple conferences where, if you’re someone trying to buy a solution, you’d be baffled: there are 4,000 things, all with a variation of shadow this, carbon that, trace this. The other part of it is that we always think it’s some super-sophisticated hacker, probably a criminal gang backed by a state. I can tell you, the lion’s share of these things are mistakes that people make. It goes back to the people issue: they’re not trained, they don’t understand it, they don’t know what they’re doing. So it’s a complex problem. I don’t think it’s going to be easily solved by perfect technology solutions. I think it’s about redundancy, resilience and a discussion with people. Of course, I would say that, as a service provider.

Rebecca McLaughlin-Eastham:
Of course, he would never say that as a service provider. I’ve got to bring in Ben here. It takes many people. It takes a village. Absolutely safe is absolutely impossible. Do you agree?

Ben Miller:
That it takes a village, or that it’s impossible? I think it depends on what your end goal is. If you’re focused on creating a robust, resilient, defensible system, then absolutely, that’s achievable. If it’s about preventing all attacks, being 100% bulletproof and secure, I don’t think that’s the reality we live in. And even if it were, it would be very transitory: hey, we reached this state, and then there’s a new technology, a new capability that the adversary deploys, and everything shifts again. A lot of our customers, as an example, focus on secure remote access, and I’ve seen adversaries take advantage of secure remote access appliances and equipment to actually gain unauthorized access. So it’s always a cat-and-mouse game, and it’s a journey, not a destination.

Rebecca McLaughlin-Eastham:
Speaking of cat and mouse or perhaps friction of a different kind, IT and OT. What’s the future? Never the twain shall meet. One will always outpace the other or have a disagreement, shall we say.

Ben Miller:
Yeah. In your last question you used a great phrase that stuck out: legacy. In many environments, the IT teams see all the equipment and software deployed at, pick your type of infrastructure, a refinery, a generation plant, green energy, and they call it legacy. But that plant was maybe built ten years ago. It’s not legacy; it’s that the pace of change is very different from IT. It’s not a phone. The life cycle there is entirely different, and it’s not about patching all of your systems all of the time, because that would put the facility into an outage. So that’s the friction, right? The plant side says: we’re actually generating the revenue for the business, why are you creating downtime when we’re operating and building the capacity the business needs? That tension exists. And I think as IT staff understand the mission of the facility and its constraints, and work within those constraints rather than trying to constrain a revenue-generating asset, that’s where the conversation needs to go.

Rebecca McLaughlin-Eastham:
I wish we had more time. We need to talk about collaboration as well, but sadly the clock has beaten us. But ladies and gentlemen, please join me in thanking my fantastic guests for their contribution today.

Ben Miller

Speech speed

155 words per minute

Speech length

897 words

Speech time

348 secs

Dr. Yacine Djemaiel

Speech speed

128 words per minute

Speech length

596 words

Speech time

280 secs

Joshua Kennedy-White

Speech speed

211 words per minute

Speech length

1537 words

Speech time

436 secs

Major General Manjeet Singh

Speech speed

127 words per minute

Speech length

506 words

Speech time

239 secs

Rebecca McLaughlin-Eastham

Speech speed

182 words per minute

Speech length

698 words

Speech time

230 secs

Safe Surfing: Understanding Child Online Activity

Table of contents

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Iain Drennan

The threat of child sexual abuse material online is growing and becoming more diverse. According to a global threat assessment published by the WeProtect Global Alliance, there has been an increase in such material appearing online. This includes the alarming trend of children being tricked into providing intimate images, which can have serious consequences. Additionally, there are concerns about the use of AI and deep fake technology to create intimidating images, further exacerbating the issue. The overall sentiment towards this issue is negative, highlighting the urgent need for action.

International action is required to address child sexual abuse online. Saudi Arabia’s initiation of a holistic framework to combat this issue is seen as a progressive step. The responsibility for child online safety lies with the global and national community, including the government and the private sector. Empowering children with tools and choices online is important, as is the need for user-friendly platforms with easy reporting systems to enable children to report any discomfort.

There is collaboration between the public and private sectors, with technology and software engineers engaging with governments and regulatory bodies. The aim is to establish high privacy and protection standards for child users. A collaborative and cross-sector response, including referring child protection issues to law enforcement, is essential to effectively address the problem.

However, funding for online child safety is inadequate and unevenly distributed. While there has been progress in legislation and regulations, with countries like Saudi Arabia, Nigeria, Singapore, the UK, Ireland, and Australia drafting laws to regulate the digital space, there is still room for improvement. It is hoped that the engagement of the global community with these difficult issues will lead to stronger measures for online safety.

In conclusion, the challenge of child sexual abuse material online requires urgent action. International cooperation, involvement from various stakeholders, and sufficient funding are crucial steps in safeguarding children online. Prevention measures should also be a focus in addressing this issue. While progress has been made in legislation and regulations, continued efforts and collaboration are necessary to ensure the online safety of children.

Moderator – Rebecca McLaughlin

During a panel discussion at the GCF 2023, experts convened to address the critical issue of protecting children in the online world. The focus was on the shared responsibility of ensuring children are well-educated, protected, and responsible digital citizens. The panel acknowledged the numerous existing threats and emerging challenges, particularly concerning AI and deepfakes.

The panel recognized that simply removing devices or disconnecting children from the internet is not a feasible solution. Instead, experts emphasized the need to effectively inform and protect children. Esteemed guests, including Dr. Maimouna Al-Khalil, Secretary General of The Family Affairs Council, Saudi Arabia, outlined the council’s work, shared reports, and discussed initiatives.

Iain Drennan, Executive Director of the WeProtect Global Alliance, presented their latest findings, specifically addressing the emerging threat of extortion. Dr. Yuhyun Park, Founder of the DQ Institute, provided insights into their work on holistic approaches to online safety and referenced the safety index.

Regarding policy and regulation, Dr. Al-Khalil stressed the importance of reinforcing efforts to protect children globally, particularly in Saudi Arabia. She highlighted the need for next steps, milestones, and regulations that can effectively safeguard children from potential harm. Dr. Al-Khalil emphasized the profound repercussions for children if appropriate measures are not implemented.

Iain Drennan acknowledged that although legislation and regulation are crucial, much responsibility lies with the children themselves. He underlined the importance of encouraging children to share information and express their fears, particularly if they are unaware of the real threats they may face online.

The panel also discussed the vital collaboration between the public and private sectors, including tech and software engineers, in creating safe and engaging online environments. They debated the level of communication and cooperation necessary to develop platforms that prioritize safety while still being entertaining and educational.

Dr. Park highlighted encouraging developments in both the public and private sectors, indicating progress towards a safer educational environment. He expressed hope in ongoing initiatives and the increasing dialogue and funding for development in this area.

Concerns were raised about funding, research, and data collection. The panel suggested allocating greater attention and resources to ensure the protection of children online, emphasizing that it should be a top priority for society as a whole.

Iain Drennan shared his concerns frankly, emphasizing the need for continuous vigilance and action. However, he also expressed hope for the future, acknowledging that child protection is a collective responsibility even for those without children.

Dr. Park echoed the importance of addressing societal taboos and encouraging open conversations, alongside increased investment in development. He acknowledged the progress made thus far but stressed the need to address tangible risks and maintain hope for the future.

Dr. Al-Khalil, as a parent and representative of the council, shared her concerns and hopes for the conversation surrounding child protection. She emphasized the need to move forward and increase awareness and education on this urgent matter.

Lastly, the moderator, Rebecca McLaughlin, recommended specific apps and protective tools to monitor children’s online activity and directed attendees to seek additional information from the respective agencies present.

Overall, the panel discussion highlighted the shared responsibility of protecting children online, emphasizing the need for ongoing collaboration, education, and regulation. It called for increased funding and attention from governments, the public, and the private sector to create a safer digital environment for children.

Dr. Yuhyun Park

The report titled “Persistent Cyber Pandemic” highlights a concerning trend in which 70% of children between the ages of 8 and 18 have consistently been exposed to at least one cyber risk for a period of seven years. This issue transcends regions and persists both before, during, and after the COVID-19 pandemic.

The report emphasizes that addressing cyber risk is not solely a children’s or family matter, but rather a persistent problem that requires the collective efforts of policy makers and industry leaders. This collective approach is crucial for effectively tackling cyber risks and ensuring the safety of children online. The report commends the approach taken by the Kingdom in addressing cyber risks and calls for its continued support.

Dr. Park, an expert in cybersecurity with 15 years of experience, emphasizes the importance of focusing on children’s issues in cybersecurity discussions. She argues that reducing the current cyber risk exposure of 70% among children should be a collective target, advocating for a decrease to at least 50%. To achieve this, she recommends reforms in the family, education, and technology sectors.

In the family and educational sectors, Dr. Park proposes implementing a digital skills framework and teaching responsible and ethical use of technology. She also highlights the need for ICT companies to prioritize safety by designing their products with user empowerment, age-appropriate measures, content moderation, and unified reporting systems in mind.

Furthermore, Dr. Park stresses the significance of policy and regulation in addressing cyber risks. She underlines the need for end-to-end safety measures, ranging from prevention to intervention and reporting. This underscores the importance of establishing comprehensive policies and regulations to safeguard children online.

Aside from the specific findings and recommendations, there are concerns regarding the impact of digital transformations, web developments, and online safety risks on children’s well-being and the security of their living environments. The dynamic nature of these advancements necessitates a mobilized effort to understand and address future risks, ensuring preparedness for potential challenges that may arise.

Overall, the report sheds light on the persistent and widespread nature of cyber risks faced by children, emphasizing the necessity of a collective approach involving policy makers, industry leaders, and the implementation of comprehensive reforms. It stresses the significance of prioritizing children’s issues in cybersecurity discussions and highlights the importance of policies, regulations, and safety measures to protect children online. Furthermore, it calls for ongoing efforts to anticipate and address future risks, aiming to create a safer digital landscape for children.

Dr. Maimoonah Alkhalil

Children in Saudi Arabia are actively participating in various online activities, with nearly 99% of them engaging in socializing, communication, and gaming. However, this increased involvement in the online world presents significant risks. Children are vulnerable to safety risks and exposure to inappropriate content, especially as boundaries between the virtual and physical worlds blur. Cyberbullying occurs both online and offline, further compounding the dangers associated with children’s online communication.

To address these concerns, Saudi Arabia has introduced the National Child Safety Online Framework. Developed with input from over 25 stakeholders, this framework will be overseen by the Family Affairs Council, responsible for its implementation, tracking, and reporting over a five-year period. The launch of this framework signifies a positive step in safeguarding children from the risks inherent in online activities.

The family also plays a crucial role in protecting children against online threats. Open conversations about these dangers are necessary, and parents need to be supportive and receptive when their child shares any online threats or discomfort they have experienced. Teachers also have a responsibility to raise awareness about online risks, helping students understand the various dangers that exist in the online world.

Efficient legislation and law enforcement are essential in tackling online threats. A well-defined system for reporting these threats, along with clear reporting channels and helplines, is necessary to support those affected. Additionally, a robust national infrastructure is required to effectively counter and address these challenges.

Funding is crucial for making progress in child online safety. It can be utilized to raise awareness through campaigns and develop tools that help children identify and manage online risks. Furthermore, a unified approach to measuring and assessing progress is key to ensuring effective intervention and evaluation.

Empowering children to handle potential online risks is crucial. Teaching assertiveness, resistance to peer pressure, and educating them on who to reach out to in case of danger are important aspects of enabling their safe navigation of the online world.

While concerns exist about the unknown and unexpected aspects of Artificial Intelligence (AI) in the future, it is important to remain vigilant and prepared. Plans are being implemented to address current challenges associated with AI and to ensure that children are adequately equipped to adapt and regulate their online experiences.

The family’s role is emphasized in adapting to future changes. Ongoing conversations and discussions, both nationally and internationally, are necessary to keep up with evolving trends and ensure the protection of children online. Preparing children, both in terms of their personality and their ability to regulate and face obstacles, is essential for their development.

Parents have a significant responsibility in safeguarding their children online. Actively seeking information and knowledge about online safety is crucial in ensuring their children’s well-being. It is imperative to disseminate awareness through various channels, equipping parents with the necessary information on parental controls, detecting signs of distress in their children, and encouraging positive online experiences.

In conclusion, while children in Saudi Arabia are heavily involved in online activities, there are risks associated with their online communication. The introduction of the National Child Safety Online Framework is a positive step towards addressing these concerns. The involvement of families, educators, legislation, and law enforcement is essential in creating a safe online environment for children. Funding, awareness campaigns, measurement, and assessment are crucial elements for ensuring progress in child online safety. Empowering children with the necessary skills and knowledge to handle online risks is essential, while also being prepared for the future challenges that AI may bring.

Session transcript

Moderator – Rebecca McLaughlin:
activity. Iain Drennan, Executive Director, WeProtect Global Alliance. Dr. Yuhyun Park, Founder, DQ Institute. Dr. Maimoonah Alkhalil, Secretary General, The Family Affairs Council, Saudi Arabia. Rebecca McLaughlin, ISTAM, Moderator, International TV Anchor, MC, and Media Trainer. Good morning, everybody. Nice to see you all again on day two of GCF 2023. We have a very important topic to discuss in this panel, and it is all our responsibility to listen up and to protect the children of the world online. How do we make sure that they are well educated, that they are protected, that they are responsible when they grow into digital citizens? There are so many present threats. There are so many emerging ones, not least with the advent of AI, with deep fakes, and many new challenges that we will talk about. It’s not as simple as removing their devices. We can’t unplug the internet. We can’t prevent progress. So how can we best inform and protect them? I have an esteemed panel of guests to help me drill down into these important topics. Thank you for joining me, one and all. Shukran. Let me start by asking you, Dr. Al-Khalil, tell us a little bit about the council, the work that you do, and also the latest findings, the reports and initiatives that

Dr. Maimoonah Alkhalil:
you will be launching. Thank you. Good morning, and I’m very happy to be with you today. The Family Affairs Council specializes in the family as a unit, in empowering its members, in instilling values, and in ensuring cohesion. Particularly, we’re interested in the best interest of women, children, and the elderly, and the family as a unit as a whole. We are very alarmed by the numbers that are coming out on child online safety and the risks that are involved with that, and so it is our responsibility to, first of all, study the current situation, understand what is going on at the national level, and then begin to plan ways in which we can address some of these risks. We know that Saudi children are online by percentages that are almost up to 99%. We know that they are very active, they are socializing online, they are communicating online, they are playing online, they are being entertained online, and so that is a reality that we need to face. In addressing the issues and the risks that come with online communication, we also know that there are risks to their safety, there are risks in the content that they are seeing, there are risks associated with who they communicate with, and we know that the line separating the virtual world and the actual world is slowly disappearing, and so what happens in terms of cyberbullying, for instance, that is occurring online, it is coming offline as well by the same harassers, and so in response to these risks that we have noticed, we are very happy to be launching next week at our family forum on November 12th the National Child Safety Online Framework, where we convened and had many debates and many discussions with over 25 stakeholders from the industry, from the government, from the civil society, where we came together and identified who the main stakeholders are, and identified the roles that we want for them to take on, and put that together into a five-year plan under this framework, and we will be launching it, and I extend an invitation to everyone here to join us next week, where we will be discussing how we can make sure we are implementing this plan, and the Family Affairs Council will be in charge of implementation and tracking and reporting.

Moderator – Rebecca McLaughlin:
Thank you very much. Ian, the important work that you do at WeProtect, talk to us about your latest findings, and also the new threats that are emerging, not least extortion.

Iain Drennan:
Thank you very much. So at WeProtect Global Alliance, we bring together experts from government, from the private sector, from civil society, intergovernmental organizations, to develop solutions to one of the most serious threats facing children online, child sexual abuse, and we heard His Excellency earlier today highlighting the risks posed by child sexual abuse material online, and the need for international action to address it. We published a global threat assessment last month. It’s on our website, it’s available in Arabic as well, and one of the key things we found was that the threat is growing, so we’re seeing an increase in material appearing online, and it’s diversifying. So an example is we’re seeing an increase in financial extortion, so where children, particularly targeting adolescent boys, are duped into providing intimate images of themselves, and then that’s then used to blackmail them, and the consequences of that are really serious. Boys have taken their own lives as a result of this, and now we’re also seeing AI coming in, so that image isn’t necessarily even of them. It could be a deep fake. So these are all issues that we have to address as policymakers, and I really applaud the initiative to launch a holistic framework to address this within the Saudi government. I think it’s incredibly positive and progressive work.

Moderator – Rebecca McLaughlin:
Thank you very much. Dr Park, from your work at the DQ Institute, a holistic approach is very much something that you believe in too, but talk to us about your latest findings, the safety index

Dr. Yuhyun Park:
findings that you have recently released. Thank you very much. It is an honor to share the stage together with two esteemed speakers. Last year on this stage, we announced the 2022 Child Online Safety Index, and this is our fourth publication on the Child Online Safety Index, which we titled Persistent Cyber Pandemic. We actually track exposure to cyber risk, including cyber bullying and sexual extortion and risky content and contact and so on, and what we found is that from 2017, 70% of children aged 8 to 18 have been experiencing at least one cyber risk, and this number has fluctuated a little bit, but it has stayed consistently at about 70% across seven years. Of course, there’s an increase in certain risks and a decrease in certain risks, but what we found is that across the regions, before, during and after COVID, this persistency exists. What does it mean? It is not just a children’s issue. It is not just an education issue. It’s not just a family issue. There is a persistent issue that we need to address together as policymakers and industry leaders as part of the big frameworks of cyber risk, so I really appreciate the Kingdom’s approach as a collective approach to address this issue, and we’d love to support the Kingdom in whatever way is needed.

Moderator – Rebecca McLaughlin:
Thank you. Well, Dr. Khalil, let me come back to you. In terms of policy and regulation, so important to really reinforce our efforts when it comes to protecting children around the world, not least in the Kingdom, what would you suggest are next steps, next important milestones and regulations when it comes to protecting them?

Dr. Maimoonah Alkhalil:
Well, I believe that the family plays a major role. We are surrounding these children. We need to have a very open conversation about these threats. We need to know who they are speaking to. We need them to feel that they are comfortable speaking to us about any threats, anything making them uncomfortable online. They need to know that we are their number one supporter and that we, without judgment, will help them. Now, that is in terms of the family surrounding them immediately, but the child has several spheres of existence, and so in this framework, there is a role for the education system. We need to have the same conversation in schools. Teachers need to be able to detect if there is any threat going on and need to include this awareness about what could happen and the risks that are online in their conversations and lessons. We also have to acknowledge that there is a mental health risk here connected with cyberbullying and with other threats online, and so health-wise as well, we need to be aware and be able to detect if there is any health issue that we need to address, and so reporting lines, helplines, need to be also playing their role here and making sure that we have very clear reporting systems. Once that is reported, we need to have the legislation in place as well, and we need to have law enforcement mobilization so that even the offenders, before the children, know that if anything of this is ongoing, there will be consequences, and so from this comes the holistic approach where we understand that a country alone perhaps cannot do much. We understand that this is a global problem. However, we can, nation by nation, at least put in place the infrastructure to be able to counter and address these challenges instantly

Moderator – Rebecca McLaughlin:
as the repercussions are quite profound. Thank you. Ian, as much as we can be prepared, as much as we can put legislation and regulation in place, a lot of it does rest with the child and their confidence, as we were saying, to share the information, to share their fears. How do we encourage them to do that, especially if they’re unaware of the real threats out there, not least

Iain Drennan:
with deep fakes, as you say, with AI? It’s a real challenge, and I think it’s really important to emphasize to the child that the burden is not on them. We should be, as a community, as a global community and as a national community, exactly as you said, say that we are there for you. We are there to help, and I think that goes to private sector companies. It goes to governments. Everyone has a role to play on this, but I think it’s really important that children have the tools and should feel empowered to make the choices that they need to when they’re online, so that I think it should be as user-friendly as possible, and that’s again for the private sector to design it right from the get-go so that it’s easy to report something that you feel uncomfortable with, so that you’re able to block someone easily, and I think just to keep that system in place around them, so that they’re able to take advantage of all the opportunities that there are online, but do it in a safe way. I suppose that’s my question.

Moderator – Rebecca McLaughlin:
What is the level of collaboration between the public and the private sector, and certainly with the tech and the software engineers? Are they speaking to governments? Are they speaking to bodies like yours about creating entertaining, edutaining, but safe, engaging environments

Iain Drennan:
online? I think there’s a lot of positive things we can point to. I think we can look at things like putting positive default settings in for child users, so that when they log in to use a service, that they have the highest levels of privacy, that they have the highest levels of protection, but also I think that there is that dialogue and connection, because there are some things like referring to law enforcement that only governments can do, so I think there needs to be a collaborative response, and it also needs to be a cross-sector response. Thank you.

Moderator – Rebecca McLaughlin:
Dr Park, what have you seen in terms of encouraging or positive developments, not least in the public and the private sector, that give you hope that we’re moving into a safer space, education-wise when it comes to initiatives, or even with those tech and software developers?

Dr. Yuhyun Park:
Well, well, well. This is a million-dollar question. Before going into that, I want to actually echo back to this morning’s session, where His Excellency mentioned sustainability, because I found it quite an interesting analogy that we wanted to bring up, because a lot of times the children’s issue is not at the center of the cybersecurity discussion. Why is that? I want to actually ask that before we can discuss the million things that the private sector, governments, families and education can do, but before we even go into that, why has the children’s issue been so neglected? Because I’ve been speaking about this topic for 15 years. I think it is the same for our co-speakers. And nothing has changed. Do you feel the frustrations, actually? Well, let’s ask them. Just to pause there. How many people here believe that it should be a top priority when we’re talking about cybersecurity? This is how it’s supposed to be. So I’d like to boldly suggest, you know, we have this 1.5 degrees Celsius. When we talk about climate action, we have set the goal, 1.5 degrees Celsius, no more will be permitted, right? Just like that. We have a 70% cyber risk exposure among children. Can we target at least 50%? Can we work together based on the research, based on the scientific approach? We work together, bring down this number to 50%. Can we work together on that? I think we need a collective approach and, at the same time, a collective target. And I’d like to suggest that, you know, GCF can take the leadership to make this happen. And that’s what I’d like to see as the first target that we want to have. What is the barrier? What do we need to remove to make that happen? Exactly. So we need to start with family, right? We need to ensure that there will be the right frameworks from education, from the Ministry of Education, to set the digital skills frameworks, starting with digital citizenship that teaches children to use AI and digital technology in a safe, responsible and ethical way. That’s for sure. That’s the minimum. Family, children, education, that’s number one. But at the same time, we need to ensure, like Iain just mentioned, we need to ensure that the ICT companies have the right frameworks to self-regulate their technology to be safe from the starting point. We just shared several functions, but we need to think about safety by design as the very core of their technology development, user empowerment, content moderation, age-appropriate measures. At the same time, and lastly, most importantly, is the unified reporting in their transparency reports. With the current transparency reports, if you compare the transparency report from Meta with the one from TikTok, can we make it a consistent measure so that we know who is responsible for what risk they’re permitting to happen on their platform? And lastly, we need to have the right policy and regulations. And what I was actually quite encouraged about is this whole public health-wide ecosystem that Saudi is building; it is end to end, from prevention to intervention and reporting and intervention again, and it has to be a virtuous loop that we have to create. We work so hard to get our technology infrastructures in place. Look at this Kingdom, and look at how we are actively using social media and digital media, which is great, but now we want to move into the next phase. We need to really think about sustainability, not just in the physical space, but also in the digital space.

Moderator – Rebecca McLaughlin:
Thank you very much. Doctor, let me come to you. Controversial question, but is there enough funding? Is there enough research? Is there enough data collection, enough development on every front when it comes to making sure that we ring fence and safeguard our children? Is there enough attention going into this important issue, which we all agree should be a top priority?

Dr. Maimoonah Alkhalil:
Very important question. From where I’m standing, no matter how much we do, I still feel like we need to do more. I believe that we need to prioritize child online safety when it comes to funding, and I implore all the entities to make that a priority. I think also that there are some very good opportunities, even in situations where there could be a lack of funding, to utilize and capitalize on other sources of expertise. I want to take this opportunity to thank UNICEF for helping us and providing the expertise required and the international expertise in coming up with this framework. There is a lot to leverage on, and I don’t believe that lack of funding or lack of opportunities should stop us from continuing to work, but I do believe that enough funding will go a long way, especially in awareness campaigns, and in bringing to the fore that although we will do everything in our might and in this framework to ensure a safe space for children, in the end, it is just the child facing that screen, and so we need to make sure we are putting all our might into their ability to identify risks, what to do when they do sense a risk, assertiveness, resistance to peer pressure, knowing who to speak to, knowing also that they have a role as a bystander and not to allow any cyberbullying to occur, and so there is so much to be done. Funding is key, research is key, measurement, I cannot agree more, is key, and a unified way of measuring and assessing progress is especially key, otherwise it will be very difficult to continue that cycle of intervention and initiatives followed by implementation and then evaluation and intervention and restarting that cycle again.

Moderator – Rebecca McLaughlin:
Thank you very much. Ian, talk to me about your greatest concerns, speaking frankly going forward, but also what gives you the greatest hope, because even if we don’t have children, we all know children, we have them in our family, so what do you see that we don’t?

Iain Drennan:
So I would like to pick up on the back of Dr Al-Khalil’s point about funding. I think there is funding there, but there is not enough, and it is not evenly spread. So during this week, I am here in Riyadh, I have a colleague in Nauru speaking with the Pacific Islands Law Enforcement Association, and I have another colleague who is in London at the AI Summit. What we were struck by is the shared experience and the shared appetite for engagement on online safety. It resonates around the world in these incredibly different places, but you could have a victim in Saudi Arabia, you could have a perpetrator in Ireland, and they could be using a software or service that’s headquartered in Korea. This is a problem where we can’t build a boundary around it on a national basis and say, right, we’ve got a perfect solution, because that’s not how the internet works. It’s not something where we can invest nationally and then expect to resolve the problem globally. So I think a concern is that there’s not enough funding out there, it’s not being directed enough towards prevention, so stopping the harm before it happens, and it’s not being shared evenly. I think in terms of grounds for hope, because I think that’s really important to finish on, we are seeing progress. So I think the very fact that I’m sitting here talking about this issue and seeing the words child sexual abuse appearing in large letters up on that screen, five, six years ago, I don’t think that would have happened. I don’t think I would have been able to do that. I think that we’re getting to grips with these issues that we face as a global community that are difficult, that are challenging, that are uncomfortable, that make me as a parent sometimes want to look away. But I think we owe it to children not to look away, to grip the opportunities that we have to make things better. And we’re seeing legislation happening all around the world. We’re seeing a framework right here in Saudi Arabia. We’re seeing legislation to regulate the digital world in countries as diverse as Nigeria, Singapore, the UK, Ireland, Australia. People are recognizing that we need to set a baseline here. We need to set a baseline for action. So I think we have seen progress. I agree with Yuhyun that we haven’t seen enough. But we’re seeing momentum building. And I think now is an opportunity to really leverage that to turn the tide.

Moderator – Rebecca McLaughlin:
Thank you very much. Dr. Park, removing the taboo, be it cultural, societal, even within families, getting that conversation and the dialogue flowing, as well as the funds into development in this area, is so important. As Ian says, we’ve made strides. Not far enough, but we’re getting there. Again, your biggest concern when it comes to the tangible risks and your greatest hope going forward?

Dr. Yuhyun Park:
Yes, we are gaining momentum on this, and we’d love to see more practices, more holistic practices, like those just described in the Kingdom. But at the same time, it is very important for us to notice that now Web 2 and Web 3 and the metaverse and generative AI, everything is just going to boom. What is it going to be like for our children? We don’t have an answer. So it is very difficult for us to stop the speed of technologies. But at the same time, we have to be mindful that these changes will change the dynamics of human life, especially starting with our children. So in that regard, I think it is very important for us to really focus and mobilize our effort to understand what risks are coming, because we have to be more ahead of the curve. That is what His Excellency talked about this morning. We can predict. We have enough smart people in this room, and also people who are actually working together. Think about what’s coming, new risks to our children in our living rooms. It’s not somebody else’s issue. It is my issue, your issue, our issue. So we need to be really more proactive, and that is the very important part, that we need to have more provocative discussions.

Moderator – Rebecca McLaughlin:
Thank you. So the final word to you as a parent but also in your capacity, of course, at the Council. What concerns you the most but also gives you the greatest hope of how our conversation will hopefully have moved on when we meet this time next year?

Dr. Maimoonah Alkhalil:
I guess what concerns me most is the unknown and the unexpected. Knowing the challenges now, we are putting what measures we can in place, and we are willing to follow through and make sure that all these plans are implemented. However, AI is something that we are honestly watching very closely, and that is why I’m talking about the role of family and about focusing on the child and keeping this conversation going nationally and internationally, and having these reports coming out annually where we can, as best we can, look ahead and prepare these children for a future. We honestly don’t know what it will look like, but we can prepare them as much as possible, personality-wise, in regulating and facing obstacles as they come. And therefore, I mean, I would like to end on a happy and optimistic note, but I do realize that we have a lot to do still.

Moderator – Rebecca McLaughlin:
And just as practical tools for those in the room who may be interested, where can people find out more information and are there specific apps or protective tools that they can use to help monitor their children’s activity? What would you recommend?

Dr. Maimoonah Alkhalil:
Yeah, I recommend seeking knowledge everywhere, honestly. There are wonderful reports coming out, there are short videos for parents to see, and as part of the framework, there will be a very big awareness campaign following the social and behavioral change approach. And so, as part of this framework, there will be a flood of awareness snippets so that parents know what kind of parental controls they should be aware of, how to approach an issue, and how to detect elusiveness or sadness or disinterest in family affairs in the child and approach it in a way that would allow the child to open up. And so I believe knowledge seeking is very important on all platforms, and we’re happy to use the Family Affairs platform as a source for this information.

Moderator – Rebecca McLaughlin:
Thank you so much. Your respective agencies have lots of information too, all of that can be found online. Ladies and gentlemen, my incredible panel, please thank them for their contribution here today.

Dr. Maimoonah Alkhalil

Speech speed

163 words per minute

Speech length

1416 words

Speech time

521 secs

Dr. Yuhyun Park

Speech speed

140 words per minute

Speech length

1152 words

Speech time

493 secs

Iain Drennan

Speech speed

156 words per minute

Speech length

1026 words

Speech time

394 secs

Moderator – Rebecca McLaughlin

Speech speed

175 words per minute

Speech length

866 words

Speech time

298 secs

Omnipresent Smart Wireless: Deploying Future Networks at Scale

Table of contents

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Nisha Pilai

The Cybersecurity Forum 2030 covered a range of topics relating to future networks and the associated cybersecurity challenges. Nisha Pillai, the session moderator, expressed the urgent need to prepare for future networks and address cybersecurity issues. It was argued that the emergence of 6G and other next-generation networks would significantly amplify cybersecurity challenges. The discussions highlighted concerns and the importance of critical evaluation.

Nisha Pillai also questioned the effectiveness of 5G and whether it had truly fulfilled its promises. Panelists were asked for their opinions on its achievements and limitations, leading to a thought-provoking debate. The potential of 6G networks to revolutionize the Internet of Things (IoT) was emphasized, with predictions that they would have a substantial impact on various sectors such as healthcare, smart cities, and energy.

Data management and personal data protection emerged as key concerns. The collection of large amounts of data for citizen services raised questions about how this information, particularly personal data, would be handled and safeguarded. Strengthening data protection measures and responsible data handling were highlighted as crucial.

The need for collaboration between the private and public sectors, particularly regarding cross-border data flow, was emphasized. Recognizing the global nature of data exchange, participants stressed the importance of cooperative efforts to effectively address cyber risks and ensure the smooth functioning of networks.

The significance of cybersecurity and standardization was also underscored. Participants, including Mr. Ben Amor, agreed on the need for standardization to mitigate potential risks associated with artificial intelligence services and applications. This highlighted the importance of establishing uniform cybersecurity protocols and practices.

Lastly, government support and international cooperation were identified as vital for expanding digital connectivity. With a significant portion of the global population still unconnected, there was a need for extensive efforts to improve digital connectivity. Mr. Ben Amor emphasized the role of international cooperation in addressing cyber risks and overcoming barriers.

In conclusion, the Cybersecurity Forum 2030 provided valuable insights into future networks, focusing on cybersecurity, the effectiveness of 5G, IoT, data management, collaboration, standardization, and government support. The discussions emphasized the need for proactive measures to address cyber risks and ensure responsible network development.

Bocar A. BA.

The analysis reveals important points about the development and deployment of 5G and 6G networks. One argument is that there is a misconception that 5G is an evolution of 4G, when it is actually a revolution. It is argued that 5G has been wrongly promoted as a consumer platform, when its main purpose is to serve enterprises, ports, airports, and vertical industries. This point is illustrated by the fact that each market has developed 5G with a different strategy.

The early deployment of 5G in GCC countries is seen as a positive development, showing their proactive approach to embracing innovation in connectivity.

Regulatory breakthroughs in connectivity are considered crucial for advancing broadband networks. Economic regulation directly impacts a nation’s GDP, and regulators in the Middle East are moving towards the 5th generation of regulation, recognizing its importance.

Sustainability and carbon neutrality are emphasized as crucial aspects of 6G technology development, potentially helping combat carbon emissions and reduce consumption.

The telecom industry is shifting its focus towards sustainability, considering environmental and social factors in addition to cost and profitability.

Challenges in terms of cybersecurity are expected to increase with the development of 6G networks. The need for enhanced cybersecurity is recognized.

Harmonization between stakeholders is essential for the successful deployment of 6G. Standardization, scalability, and interoperability are key factors in achieving harmonization.

The infrastructure of 5G, 5G advance, and 6G networks will lead to an increase in cross-border data transfer, posing major cybersecurity challenges.

The standardization of devices connected to 5G and beyond networks is a crucial issue due to the billions of assets that will be connected.

Effective governance involving governments, private sectors, and third parties is crucial for managing and regulating network infrastructure.

Telecommunication networks have demonstrated their resilience during the pandemic, supporting social and professional life, remote schooling, and withstanding a surge in demand.

Enhanced cybersecurity and user empowerment are emphasized with the introduction of 6G.

Operators’ investment is crucial in tackling the challenges brought by 6G networks, and incentives can motivate operators to invest more.

Government involvement in promoting cybersecurity and building capacity is encouraged.

Telecom operators play a significant role in providing secure networks and contributing to cyber safety.

There is a need to connect the unconnected portion of the world population, to reduce inequalities and promote inclusion.

In summary, the analysis highlights various important factors related to the development and deployment of 5G and 6G networks. These factors include understanding the true nature of 5G, the early deployment of 5G in GCC countries, regulatory breakthroughs, sustainability and carbon neutrality, the shift towards sustainability in the telecom industry, cybersecurity challenges, stakeholder harmonization, infrastructure implications, device standardization, governance, network resilience, user empowerment, investment, government involvement, telecom operators’ role in cybersecurity, and global connectivity. These insights provide valuable information for stakeholders involved in shaping the future of connectivity.

H.E. Kyriacos Kokkinos

The speaker’s assessment is that 5G technology has not fully delivered on its promise. While it is still in the development phase, there has been no large-scale deployment of 5G in the industrial sector and no revolutionary use case in the consumer market. This has led to negative sentiment towards 5G due to unfulfilled expectations. However, there have been some technical successes and ongoing developments in 5G.

On the other hand, 6G technology is anticipated to be the next major advancement in connectivity. It is expected to offer incredible speeds that are 100 times faster than 5G and work in conjunction with other technologies like satellite communication. The speakers highlight potential advancements in sectors such as education, healthcare, and business with the integration of 6G and advanced AI.

An ethical and responsible approach to 6G technology is emphasized to ensure its positive use and avoid potential negative consequences. The importance of collaboration in cybersecurity is also highlighted, along with the need for harmonization and regulation across stakeholders.

Furthermore, the overlap of human and machine reality with the advent of 6G and the metaverse is discussed. This raises questions about the humanization of machines and the mechanization of humans, with potential implications for society.

In conclusion, while 5G has not fully met expectations, it is still in development and has shown some technical successes. Anticipation for 6G technology is high, with expectations of groundbreaking advancements in various sectors. However, ethical considerations, collaboration in cybersecurity, and harmonization and regulation are crucial for responsible implementation. The potential overlap of human and machine reality poses intriguing possibilities and challenges that need to be addressed.

H.E. Eng. Mohamed Ben Amor

The analysis comprises multiple speakers discussing various topics related to breakthrough technologies. One key point highlighted is the positive impact of 5G. It is argued that 5G has brought significant advancements in terms of data speed, low latency, and high density. This breakthrough is seen as crucial for the development of new technologies that require these specific capabilities.

Additionally, the Internet of Things (IoT) is identified as a major connectivity breakthrough with wide-ranging applications in sectors such as energy, healthcare, and smart cities. The speakers emphasize that IoT has expanded the scope of Information and Communication Technology (ICT), making it a fantastic tool for improving the lives of citizens. The potential benefits of IoT for citizen well-being are highlighted, with the assertion that it has the ability to significantly enhance quality of life.

However, concerns are raised about the cybersecurity implications and data privacy issues associated with next-generation networks like 6G. The speakers express worries about the increasing collection of personal data and its management and protection. The large-scale collection of personal data poses significant cybersecurity risks and reinforces the need for effective data privacy regulations. It is strongly suggested that regulations play a crucial role in managing and protecting personal data.

The importance of international cooperation is stressed in order to address the risks and challenges posed by new-generation networks. The speakers highlight the role of cybersecurity bodies and advocate for a unified approach to tackling these challenges. Additionally, the urgency of standardization in digital technologies, particularly in the context of artificial intelligence (AI) and cybersecurity, is emphasized. The speakers argue that the high risks associated with AI services and applications could potentially endanger lives, underscoring the need for standardized protocols and practices.

Finally, the analysis concludes by asserting the need for regulations at national, regional, and international levels. The speakers argue that regulations are essential for safeguarding national and international security interests. The importance of creating a regulatory framework to govern data management usage and protect personal data is emphasized, aligning with the goal of achieving peace, justice, and strong institutions.

In summary, the analysis explores various aspects of breakthrough technologies. It highlights the positive impact of 5G and the transformative potential of IoT. The analysis also sheds light on concerns regarding cybersecurity and data privacy, emphasizing the need for effective regulations and international cooperation. The urgent need for standardization in digital technologies, particularly in the context of AI and cybersecurity, is acknowledged. Lastly, the analysis underscores the importance of regulation at various levels to ensure national and international security.

Session transcript

Nisha Pilai:
Kokkinos, former Minister, Research, Innovation and Digital Policy, Cyprus. Nisha Pillai, Moderator, International Moderator, former BBC Presenter. Excellencies, ladies and gentlemen, welcome to Deploying Future Networks at Scale. My name is Nisha Pillai and I’m delighted to be your moderator this morning and I’m especially delighted to take you into a journey, on a journey I should say, into the future. Are you ready? Buckle yourself up, it’s going to be quite a wild adventure. Why are we looking into the future, beyond 2030, right at the start of day two here at Cybersecurity Forum 2030? Well, I’ll tell you why. Because the future will be with us before we can blink. We’re talking about the networks like 6G and other next-gen networks. If you think we have cybersecurity issues right now, well, wait for the future. We ain’t seen nothing yet. Ladies and gentlemen, I’m delighted to say we have with us some extremely eminent panellists who’ve been doing the thinking on our behalf. Let me introduce you to them. We have His Excellency Kyriakos Kokkinos, former Minister of Research, Innovation and Digital Policy for Cyprus. Welcome. We have His Excellency Engineer Mohamed Ben Amor, Director General of the Arab ICT Organisation. Welcome, Mr Ben Amor. And finally, Mr Bocar Ba, CEO of the SAMENA Telecommunications Council. Now, gentlemen, before we begin, I’d like to put this question to you. Where have we got to right now? Let’s forget 6G right at the start. Has 5G delivered on its promise? Has 5G actually given us new use case scenarios? Or is it just a quicker version of 4G? Mr Kyriakos.

H.E. Kyriacos Kokkinos:
All right. The question is almost a closed question. The answer is no. It did not deliver on the promise yet. Is it gaining traction? Probably we’ve made the mistake to overemphasise and consider this revolutionary rather than evolutionary from 4G. But so far, for many reasons that we can discuss down the road, 5G is still in the making. So tell me quickly, two disappointments then. It’s just higher speed, but in the industrial sector, there are no 5G deployments at scale. And the use cases in the consumer market did not yet deliver anything that is considered to be revolutionary.

Nisha Pilai:
Okay. So that’s quite a sceptical response from our first speaker, Mr Kokkinos. I wonder what Your Excellency Mr Ben Amor thinks. Has 5G delivered on its promise, on its hype? Or is it just a quicker version of 4G? Good morning, everyone.

H.E. Eng. Mohamed Ben Amor:
Thank you, Nisha, for this question. 5G is a breakthrough. It has brought answers for the development of the new technologies that need a lot of data, speed, low latency, and big density. So coming from this point, yes, 5G brought the right answer and it’s a breakthrough.

Nisha Pilai:
Okay. So a much more positive response there from Mr Ben Amor. So I wonder what we’re going to see here from Mr Bokar. Maybe somewhere in the middle. What do you reckon, Mr Bokar?

Bocar A. BA.:
Thank you very much. And good morning, ladies and gentlemen. To answer your question, we have to look at it from a different perspective. Each market, each country has developed 5G according to a certain strategy. Now, when we look at the region, at the GCC countries, we deployed 5G in the early days, among the first in the world, if I remember correctly around 2019. Now, there is a misconception about the evolution from 4G to 5G. 5G in many countries in the world has been perceived as an evolution of 4G, which is absolutely wrong. 5G is not only an evolution, it’s a revolution in the sense that it’s an ecosystem. It is not a natural progression from 4 to 5. It’s an environment for connecting everyone, everything, everywhere. As such, 5G is an infrastructure to support development. So it is not meant initially for the consumer. It’s meant for the enterprise, ports, airports, and the vertical industries. Now, based on that, I’ll be very short. Many countries have promoted 5G as a consumer platform, a simple evolution from 4 to 5. The way we address it in Saudi Arabia, in Qatar, in Oman, in Bahrain, in the UAE, is as an infrastructure, an environment. So for the moment, we are on our path to deliver on expectations. It’s just a recent development.

Nisha Pilai:
Okay, thanks very much. So a much more nuanced answer there. It depends on where you are, basically, I think, is what Mr. Bokar is saying. So I’m going to put a question now to Mr. Kyriakos, since he was the most skeptical about what 5G has or has not delivered. Will 6G meet expectations? Will it be the revolution?

H.E. Kyriacos Kokkinos:
Okay. By saying no, it did not deliver up to the expectations, it doesn’t mean that 5G failed. On the contrary, it’s been a very successful deployment technically, and also there are some use cases that are still… Remember, 5G is just three years old. Indeed. And it’s also been affected by COVID. So, and we are working now on the bridge towards 6G. We’re talking about 5G advanced that will come in a couple of years. And 6G will come around 2030. So yes… At the earliest. I’m sorry, at the earliest, at the earliest. And then deployment, the real benefits will start probably 12, 15, 20 years down the road. So yes, we need to work on the standards. We need to work on what to expect and what not to expect from 6G. For sure, one thing for sure is that 6G will deliver speeds that are incredible. 100 times faster or more compared to 5G. That is one aspect. But it’s not just that. It’s much more than just high speed. In terms of new use cases, we will move from virtual reality to extended reality. We will see a network of networks, and also remember that 6G is just a wireless connectivity technology that will be working hand in hand, not in competition, with other connectivity technologies like satellite communication, space communication. So from a user and industry perspective, we need to examine what we can do to make the best use of this wireless connectivity.

Nisha Pilai:
Wireless connectivity, ladies and gentlemen, super, super, super fast. Speeds like we’ve never seen before. And potential that we’re only just beginning to think about. It’s still in its research phase. Thank you very much, Mr. Coquino.

H.E. Kyriacos Kokkinos:
This is an omni-channel approach.

Nisha Pilai:
An omni-channel approach. I love that phrase. We’ll examine it further in a minute or two. Let’s take a look, a very quick leap into the future now. And I’m going to ask all our panelists, what is the connectivity breakthrough that you’re most, most looking forward to and anticipating? And I’m going to start with Mr. Ben Amor, if I may. Can you give us one example of something you’re really looking forward to with these omni-channels?

H.E. Eng. Mohamed Ben Amor:
Yes, I would say that all of the new technologies are fascinating, but if you want me to speak about only one, let’s say the Internet of Things, because the Internet of Things has widened the scope of ICT to many other sectors, energy and oil, healthcare, smart cities. So I think that the Internet of Things, IoT, is a fantastic tool today for making the life of citizens easier, in the sense of the well-being of citizens.

Nisha Pilai:
So making the Internet of Things, which is already underway, into something which is all-pervasive and touches many different sectors, that is something that the future 6G networks are very likely to make into a reality. Thank you very much, Mr. Ben Amor. May I put the same question to you, Mr. Bokar? What is the connectivity breakthrough in the future that you’re most looking forward to?

Bocar A. BA.:
Well, interestingly, there are a few of them. It will not be fair to stick to one example. I would like you to stick to one example, please. When we talk about the broadband network, especially the new generation of Gs, we need to look at it from a different perspective because it calls for a collaboration between the different stakeholders across the value chain. One breakthrough is on the regulatory front. We have now most of our regulators, at least in the Middle East, attaining to what we call the G5 regulation, the fifth generation of regulation. Regulators have a tendency now, which is right, to provide economic regulation that will impact the GDP of the nation. That’s one. On the technological side, as mentioned by my colleague from Cyprus, 5G, advanced 5G or 5.5G and heading towards 6G. But the beauty here is we are starting to be very mindful about the carbon emission, reducing the consumption, and these are the possibilities offered by 6G. On the industry side, we used to work much more on optimizing the cost, profitability. Now, telecom operators, investors are looking into the sustainability, which is a different dimension. And the last one…

Nisha Pilai:
And is that more likely to be delivered with 6G? Sustainability issues, decarbonization…

Bocar A. BA.:
Sustainability, number one, from the investment perspective, but sustainability in terms of technology development, and as I was talking about, carbon neutrality. And the fourth point, which is very important, is anything we discuss needs unlocking access to capital. We need investment. We need financing. We need funding. And for that also, the capital market with the investors are coming with new financial instruments to be able to support the broadband development. So, four major breakthroughs, policymakers, regulators, technology development, the industry as a whole, and the multilateral development bank.

Nisha Pilai:
And bringing it all together is going to be key to the deployment of future networks, and I’d like to ask you, Mr. Kokkinos, your thoughts on that. What are the key advantages or idiosyncrasies of 6G that are going to affect its deployment on scale?

H.E. Kyriacos Kokkinos:
All right. I believe that we need to see these through the lenses of AI. One key difference… You need to see it through the lens of AI. Yes. Why? Because in parallel to this technology breakthrough of connectivity, we see advances of generative AI, especially since the developments of the last year. And I believe that seven years down the road, at best, probably later, when 6G will be deployed, AI will be at the stage where 6G will have AI in a native manner, both in terms of managing the technological infrastructure, as well as in terms of the use cases, the business models that will be invented through the use of 6G. So, we need to understand that, and that will affect the education sector, will affect the way we live, we transact, we communicate at the business, social and personal level. So, I believe that we will see a deployment of 6G at scale with native AI embedded into it that will facilitate, will ignite things unthinkable today. And another aspect which is not AI-related, probably, is the fact that we see a lot of developments in the healthcare sector. I believe that through 6G, we will be able to make our dream come true in terms of predictive and proactive healthcare. We might see sensors floating in our bloodstream. Yes, science fiction, but look 20 years back. Things that we live today were… science fiction back then. So I believe that 6G will ignite unthinkable use cases that will… It’s our responsibility to approach 6G in an ethical and responsible manner for positive use cases and not negative use cases.

Nisha Pilai:
Let’s hope you’re right, Mr Kokkinos. So I want to put the same question to you from a telecoms point of view, Mr Bokar. What is it about the distinguishing features of 6G that are going to determine its deployment on scale?

Bocar A. BA.:
Well, as mentioned, 6G is still in the lab. It will not be here before 2030. A lot is required between the different stakeholders. Standardization, scale, interoperability, and we will be having a lot of challenges in terms of cyber security because now the frontiers become much more blurred. It opens lots of…

Nisha Pilai:
What do you mean by that? The frontiers will be more blurred and therefore it will have cyber security implications?

Bocar A. BA.:
Well, let’s look at the network design and the architecture from 5G going forward. We are starting to have a generation of software-based infrastructure. So with 5G, 5G Advanced and 6G, we will see cloudification flourishing. As such, in terms of investment and network deployment, we have to address the issues of cross-border data transfer. We will have to address, I would say, GDPR, so the protection of data and of people. So we are coming to a stage where the challenges that we will be facing in terms of cyber security will be major. Now, cyber security is not only about technology. It’s less about technology, much more about strategy and mindset. This is something probably that we will be able to address. So the governance is extremely important between the different stakeholders. Government, private sector, and all the rest could fall into what we call the third party. One important aspect is the edge and the devices that will be connected to the network. We will be having billions of digital assets that might be compromised on the network. So we have intelligent networks from 5G onwards. The problem that we are facing is the standardization of the devices that will be connected to the network. We have endless examples.

Nisha Pilai:
Indeed. Mr. Ben Amor, can I ask you, what do you think the cyber security implications are going to be of next-gen networks like 6G, especially since data collection is going to happen on an unimaginably greater scale, as we just heard from Mr. Bokar? How can they be managed?

H.E. Eng. Mohamed Ben Amor:
Yes. As you said, all the new technologies are driven by data. Now we are collecting more and more data for each service that we are delivering for citizens, for people. So the implication regarding the cyber security issue is the data and principally the question of privacy, because we are collecting many data, many personal data, and there is a concern about how we will manage this data. And in my point of view, I think that it’s very important to create some kind of regulation regarding the data management usage and to protect mainly the personal data. This is the most important fear regarding the new technologies and the big amount of data that they are managing.

Nisha Pilai:
Indeed. So Mr. Kokkinos, we were just hearing from Mr. Bokar about the importance of governance. It’s going to be even more important, because you’re going to have data flowing across borders, for instance, more private sector, more public sector interaction. So can I ask you, what do you think it’s going to mean in terms of collaboration, in terms of bringing parties together? It sounds like there might be a geopolitical aspect.

H.E. Kyriacos Kokkinos:
Oh yes, absolutely there are geopolitical aspects as well. And in the history of connectivity, since the Marconi time in the early 1900s, until recently, unfortunately connectivity, I’m sorry, cyber has not been a team sport, and it should be. Throughout our journey of getting to a digitally connected world, we made two design errors. One is that cyber was always coming as an aftermath. An afterthought, yes. An afterthought. And the second is that, in our excitement to create this connected world, an always connected world, we forgot to include an off switch. We cannot get switched off. And 6G will not have an off switch either. So cyber is very important to come in a collaborative and proactive way rather than as an afterthought. From the beginning. From the beginning, because the risk is increasing exponentially, or the challenges, I wouldn’t say risk. And there are chapters on cyber security around the to-be connectivity of 6G that we did not get right, that we did not get structured. So collaboration, and this is the purpose of this, it’s among the key purposes of this event yesterday and today, is how do we get collective wisdom together? How do we collaborate across borders, because connectivity is boundless.

Nisha Pilai:
Are you saying that we need to begin to think about harmonization and regulation? Across different stakeholders and countries much earlier.

H.E. Kyriacos Kokkinos:
And at multiple levels. The technological layer is one thing, but remember, 6G is just one network of many networks that will be connected together. So we need to have a holistic architecture. We need to have a technological understanding how each one is impacting the other, how each transaction. And one of the key aspects that I wanted to touch, about two or three years ago, when Metaverse was announced, I started thinking, okay, are we talking about humanization of machines? Or mechanization of humans? And it was a kind of philosophical or ethical question to sociologists and psychologists. But now with 6G, that is becoming a reality, potentially. And we need to also touch it from the technology point of view, because the virtual reality will become a reality, and even an extended reality. So cyber is a team sport that we need to start working today, and through the standards, the architecture standards, at all levels, privacy, interoperability, et cetera, is important.

Nisha Pilai:
Hence this discussion here at the Global Cyber Security Forum, ladies and gentlemen. Mr. Bokar, can I ask you, from the telecoms point of view, how can telcos bring together their skills and their experiences to ensure that the deployment of the next gen, including 6G networks, tackles cyber security problems from the outset?

Bocar A. BA.:
I think it’s a great opportunity, and everybody in the room worldwide can attest that we have lived the pandemic, the health crisis, early 2020. And we have seen that our entire social professional life was supported by the robust telecommunication infrastructure that has been provided by the service providers, resilient, and our children keep going to school, connected to the hospitals. So the network infrastructure is extremely important, and we have to make sure that it remains resilient. To be able to do that, we need the support and cooperation from government to ensure that we keep having this resilience. 6G is coming with all the advantages presented by Kiryakos. We need more cyber security, but we need less cyber control. What do you mean by that? More cyber security, but less cyber control? More cyber security in terms of instrument to be able to provide trust to the network, but we need to empower the users. That’s what I meant by less cyber control. We need more security, less cyber control. That’s number one. Number two, with all these challenges brought by the new network of networks, operators need to invest much more, and they need to be incentivised. So, from the private sector point of view, investment, solid and robust network. From the government, we need government to champion that approach, to stimulate human investment, capacity building, awareness with the population. Telecom operators, last point, have an opportunity to provide an extremely valuable value proposition, because today they are dealing with the enterprise, they can provide an entire secure network, and we also have to be mindful about another specific aspect, which is cyber safety. Child online protection, because schools are being connected. Now, one last point, Nisha, if you allow me. We still have 36 per cent of the world population not connected. More or less about 2.6 billion people. Not connected at all. That is one of the SDGs’ objectives by 2030, to bring the whole planet online. So, by having 2.6 billion people connected to the current connectivity process, we are adding more challenges into the network, and we need to build trust. So, I believe, from my perspective, that telecom networks have a great role to play, provided that they are supported by the entire ecosystem.

Nisha Pilai:
Mr Bokar, thank you very much. Mr Ben Amour, I’m going to give you the final word. What do you think are the highest priority actions, from a security point of view, that will need to be undertaken to protect the environment of the new-gen, next-gen networks?

H.E. Eng. Mohamed Ben Amor:
Yes. Let me say first that we are shifting more and more from the traditional world to cyberspace. And in cyberspace today, new digital technologies, applications and services help a lot of people to make their lives easier and easier. But they create a lot of challenges, a lot of risks, and this is the role of cyber security bodies. That’s why I think that there is a need, first, for international cooperation, in order to have a shared and concerted point of view regarding the risks and challenges and the way to face them. There is also a very big need for standardization, because, you know, today with artificial intelligence services and applications, the risk is very important and the lives of people are in the hands of some people that might not be as good as we would want. So there is a need for standardization. And the third thing is regulation at the national, regional, and also international level.

Nisha Pilai:
Thank you very much, Mr. Ben Amor. So what I’ve taken away from this conversation is the importance of thinking about cyber security early and not as an afterthought, as we heard from Mr. Kokkinos; the importance of standardization, which we just heard from Mr. Ben Amor; and the importance of government and international support for growing new networks at an early stage. Because remember, ladies and gentlemen, at the same time we’re trying to expand the number of people who are in the connected world. A third of the global population isn’t yet. So while we’re bringing in all these new possibilities, such as virtual reality, we’re also bringing lots of global citizens into the potential of the digital world. So the challenges are great indeed. I found it really interesting; I hope you have too. Let’s put our hands together to thank our eminent panelists, Mr. Kyriacos, Mr. Ben Amor, and Mr. Bocar.

Bocar A. BA.

Speech speed

132 words per minute

Speech length

1248 words

Speech time

568 secs

H.E. Eng. Mohamed Ben Amor

Speech speed

118 words per minute

Speech length

439 words

Speech time

222 secs

H.E. Kyriacos Kokkinos

Speech speed

137 words per minute

Speech length

1088 words

Speech time

477 secs

Nisha Pilai

Speech speed

163 words per minute

Speech length

1183 words

Speech time

435 secs

Supply Chain Fortification: Safeguarding the Cyber Resilience of the Global Supply Chain

Table of contents

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Ryan Chilcote

Summary:

According to the global cybersecurity community, there is a strong belief that a major cyberattack is imminent. Michael’s comment hinted at the possibility of an upcoming cyberattack, further raising concerns. Cyber threats target both individuals and nations, indicating that no one is exempt from the potential dangers.

Ryan believes that nation-states pose a greater danger in terms of cyber threats compared to individual hackers. This reflects the increased sophistication and capabilities of nation-states in carrying out cyberattacks. It is crucial for nations to remain vigilant and enhance their cybersecurity measures to protect critical systems and infrastructure from cyber warfare.

The importance of focusing on the security of supply chains and collaboration is emphasized, particularly in relation to artificial intelligence (AI). Addressing the challenges associated with AI and supply chains requires collaborative efforts. The Global Cybersecurity Forum (GCF) recognizes the need for collective action in addressing these issues.

One potential pitfall related to AI is the inclusion of sensitive information in text transcripts. Anecdotal experiences have highlighted concerns about privacy and security when using AI transcription software. For example, the software transcribed the entire conversation, including parts before and after the call, and shared it with all participants. This raises significant questions about the protection of private and confidential information and the overall security of AI systems.
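To make the pitfall concrete, the following is a minimal sketch, not drawn from the session, of the kind of guardrail such an anecdote suggests: restricting an AI-generated transcript to the scheduled call window and scrubbing obvious identifiers before a tool auto-shares it. The segment format, field names, and redaction patterns are hypothetical.

```python
# Hypothetical guardrail sketch: trim an AI-generated transcript to the call window
# and redact common identifiers before it is distributed to participants.
import re
from datetime import datetime

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def sanitize_transcript(segments, call_start, call_end):
    """Keep only segments spoken during the call and redact obvious identifiers."""
    safe = []
    for seg in segments:                      # each: {"t": datetime, "speaker": str, "text": str}
        if not (call_start <= seg["t"] <= call_end):
            continue                          # drop pre-join and post-hangup chatter
        text = EMAIL.sub("[redacted email]", seg["text"])
        text = PHONE.sub("[redacted number]", text)
        safe.append({**seg, "text": text})
    return safe

segments = [
    {"t": datetime(2023, 11, 1, 9, 58), "speaker": "Host",
     "text": "Before they join: draft budget details, call me on +44 20 7946 0000."},
    {"t": datetime(2023, 11, 1, 10, 5), "speaker": "Host",
     "text": "Welcome everyone, let's start."},
]
print(sanitize_transcript(segments, datetime(2023, 11, 1, 10, 0), datetime(2023, 11, 1, 11, 0)))
```

The design point is simply that sharing should be an explicit, filtered step rather than the transcription tool’s default behaviour.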

In conclusion, the global cybersecurity community is increasingly concerned about the growing threat of cyberattacks. Strengthening cybersecurity measures and fostering international collaboration are crucial to mitigate these risks. It is also essential to address the potential pitfalls associated with AI, such as the exposure of sensitive information, to ensure privacy and security.

Dr. Saad Saleh Alaboodi

The disruption of the global supply chain in the cyber context is already an issue, with targeted attacks on various sectors. For example, the Shamoon attack on Aramco in 2012 had a profound impact on energy supplies. Additionally, the healthcare sector has been severely affected, as seen with the propagation of COVID-19. Furthermore, targeted attacks on the IT supply chain, such as the SolarWinds attack in 2020, pose significant challenges.

On a positive note, emerging technologies such as AI, quantum computing, and mobility tools are becoming increasingly fundamental to businesses and organizations. These technologies are no longer just plugins or interfaces, but offer opportunities for innovation and optimization.

However, the adoption of emerging technologies also presents risks. For instance, misuse of generative AI can lead to the creation of disinformation, with adverse consequences. Furthermore, disruptions and potential misuse in the adoption of these technologies must be carefully managed to prevent harm.

Business models that leverage emerging technologies, like robotics and drones for packaging and delivery, have the potential to drive significant innovation. As noted in the discussion, however, these benefits hold only if such models are implemented securely and under stable conditions; deployed insecurely, or in troubled times, they could have unprecedented consequences.

The adoption of emerging technologies also necessitates a shift in required skill sets and talent development. Decision-makers must be equipped to make decisions on a larger scale and at a higher speed in order to accommodate the influx of material brought into the decision-making process by emerging technologies.

To ensure supply chain security, international collaboration, robust regulations, and information sharing are crucial. Collaboration among “good guys” must be as efficient as that of “bad guys” to effectively counter cyber threats. It is also important to inject sovereignty in policy-making and industry to uphold supply chain security.

Securing the cyberspace is vital as more assets and items are being digitized and pushed from the physical space to the cyberspace. This shift towards securing the cyberspace leads to the security of the economy and the prosperity of nations. Some tech companies have already started the shift towards sovereignty, recognising its importance in securing the cyberspace.

Moreover, it is suggested that tech companies should focus on building sovereign versions of their technology and offerings, as this is seen as the future. The sovereign version of hyperscaler cloud services might soon become the default version, significantly impacting the global ICT markets.

Efficient integration between the physical and digital supply chain spaces is necessary for optimization in operational supplies, including cost, performance, and delivery. The intertwined relationship between different domains across the value chain can have catastrophic consequences in times of crisis. Therefore, there is a need to establish efficient integration between these two spaces to maximize benefits.

In conclusion, the disruption of the global supply chain due to cyber attacks is a pressing issue. While the adoption of emerging technologies presents opportunities for innovation, it also introduces risks that need to be vigilantly managed. Furthermore, ensuring supply chain security requires international collaboration, robust regulations, and information sharing. Securing the cyberspace is essential for the prosperity of nations, and tech companies should consider building sovereign versions of their technology. Efficient integration between physical and digital supply chain spaces is crucial for optimization and resilience.

Amin H. Nasser

The rapid digital transformation of our world has made us more vulnerable to cyberattacks, and the energy sector has become a prime target. Last year, approximately 97 zettabytes of data were generated globally, with a predicted increase to 175 zettabytes by 2025. This exponential growth in data provides cybercriminals with more opportunities to exploit vulnerabilities and gain unauthorized access to critical systems.
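As a quick back-of-envelope check of those figures (assuming decimal units, where 1 zettabyte = 10^21 bytes = 10^12 gigabytes), the quoted growth from 97 to 175 zettabytes also implies roughly a 22% annual increase:

```python
# Back-of-envelope check of the quoted data-volume figures and the implied growth rate.
ZB_2022, ZB_2025 = 97, 175            # zettabytes quoted for last year and for 2025

gb_2022 = ZB_2022 * 1e12              # 97 ZB = 97 trillion GB, as stated in the speech
cagr = (ZB_2025 / ZB_2022) ** (1 / 3) - 1

print(f"{ZB_2022} ZB = {gb_2022:.1e} GB")       # 9.7e+13 GB, i.e. 97 trillion
print(f"Implied growth: {cagr:.1%} per year")   # roughly 22% per year
```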

Aramco, a notable company in the energy sector, recognizes the importance of building resilience against cyberattacks. They have implemented a comprehensive defense strategy focused on safeguarding their operations. Aramco has established cybersecurity standards for all their service providers, creating a security-oriented ecosystem that strengthens their overall defense against cyber threats.

Artificial Intelligence (AI) is a powerful tool with enormous economic potential. Generative AI alone could contribute between $2.6 trillion and $4.4 trillion annually to the world economy. However, along with these economic benefits, AI also presents unique risks. To mitigate these risks, guidelines and controls have been established to promote the responsible and secure implementation of AI technologies.

Aramco’s commitment to cybersecurity is also reflected in their emphasis on continuous innovation and comprehensive cybersecurity measures. They believe that by actively pursuing innovative solutions and incorporating robust cybersecurity practices, they can ensure the safe and continuous supply of energy. The digital transformation of Aramco’s business has brought significant benefits, highlighting the importance of maintaining a secure digital ecosystem.

In conclusion, the rapid digital transformation has increased our vulnerability to cyberattacks, particularly in the energy sector. Aramco’s approach to building resilience through a comprehensive defense strategy and setting cybersecurity standards for service providers is commendable. It is crucial to guide the deployment of AI with strict guidelines and controls. Aramco’s focus on continuous innovation and comprehensive cybersecurity underscores its commitment to the safe and uninterrupted supply of energy.

Michael Ruiz

The analysis highlights several significant points related to cybersecurity and supply chain disruption. First, there is widespread belief among cybersecurity experts and business leaders that geopolitical instability could trigger a major cybersecurity supply chain disruption in the next two years. This consensus reflects a concern about the vulnerability of supply chains to global political tensions.

Furthermore, the global cybersecurity community predicts an imminent cyber attack, with particular focus on the threats posed by nation-states and evolving cybercriminal organizations. Nation-states are considered more dangerous due to their significant resources, while cybercriminal organizations have evolved from operating as individuals to working together as conglomerations or consortiums of bad actors.

To address these imminent threats, there is an urgent need to protect supply chains from cybersecurity threats and to enhance cybersecurity in Operational Technology (OT) networks, which are considered less mature than their IT counterparts. It is argued that best practices from both OT and IT need to be combined, and organizations must have a comprehensive view of their security posture. This entails ensuring visibility of all assets in the OT environment and bridging this information back to IT.
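As an illustration of that “comprehensive view”, the sketch below, with entirely hypothetical asset and field names, joins an OT asset inventory with IT records so that assets unknown to either side become visible in a single posture report:

```python
# Hypothetical sketch: merge OT discovery data with IT records into one posture view,
# flagging OT assets that IT does not know about (and keeping IT-only assets visible).
from dataclasses import dataclass

@dataclass
class Asset:
    asset_id: str
    source: str            # "OT" or "IT"
    firmware: str = "unknown"

ot_inventory = [Asset("plc-101", "OT", firmware="v2.3"), Asset("hmi-07", "OT")]
it_records = {"plc-101": {"patched": True}, "srv-db-01": {"patched": False}}

def unified_posture(ot_assets, it_data):
    """Return one dictionary describing every asset seen by either OT discovery or IT."""
    view = {a.asset_id: {"source": a.source,
                         "firmware": a.firmware,
                         "patched": it_data.get(a.asset_id, {}).get("patched", False),
                         "known_to_it": a.asset_id in it_data}
            for a in ot_assets}
    for asset_id, rec in it_data.items():      # IT-only assets still appear in the view
        view.setdefault(asset_id, {"source": "IT", **rec, "known_to_it": True})
    return view

for asset_id, info in unified_posture(ot_inventory, it_records).items():
    print(asset_id, info)
```

The design point is that OT discovery and IT records feed one report, rather than two views maintained separately.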

With the propagation of AI technology, new challenges in cybersecurity have emerged. It is cautioned that AI technology is often adopted earliest by bad actors to overcome security barriers. Consequently, there is an increasing need to stay alert to more sophisticated attacks resulting from AI.

The analysis also emphasizes the importance of policies for AI and cybersecurity, and the significance of public-private partnerships in developing such policies. It is recognized that policy-making lags behind innovation and that partnerships between corporations, governments, and global forums are crucial for finding effective solutions.

In addition, the analysis highlights the need for a multi-layered approach to cybersecurity, involving local, regional, and global efforts. It is argued that local regulations and solutions, along with regional strategies and global solutions, should work in tandem to address the complexities of cybersecurity. Notably, recent collaboration among 40 countries to refrain from paying ransom in malware cases demonstrates the importance of aligning strategies from a local to global level.

In conclusion, the analysis underscores the need to proactively address cybersecurity challenges posed by geopolitical instability, nation-state threats, and evolving cybercriminal organizations. It highlights the importance of protecting supply chains, enhancing cybersecurity in OT networks, leveraging best practices from both OT and IT, and adopting a comprehensive security posture. The potential risks associated with the proliferation of AI technology are also emphasized, as well as the necessity of developing policies and engaging in public-private partnerships to mitigate these risks. Finally, a multi-layered approach to cybersecurity at local, regional, and global levels is advocated for comprehensive and effective solutions.

Christophe Blassiau

The analysis explores the impact of emerging technologies on critical infrastructure and cybersecurity. One perspective suggests that major transformations and mega trends in critical infrastructure have the potential to bring about both opportunities and challenges. These transformations include an increase in decentralised energy production in homes, buildings, and cars, as well as the implementation of smart technologies like buildings and factories, leading to connectivity and data intelligence. Furthermore, the sustainability agenda promotes decarbonisation, which is another significant aspect of this transformation.

On the other hand, there is concern that such major transformations and mega trends put critical infrastructure at risk. Increased connectivity and data intelligence can create a major attack surface with vulnerabilities that attackers could exploit. The systemic approach of these transformations also raises the possibility of cascading risk, where an attack on one element of the infrastructure could have a domino effect, impacting other interconnected systems.

In the realm of supply chain and operational technology cybersecurity, emerging technologies are seen as reshaping dynamics. These technologies enable more automation, sustainable initiatives, and increased operational efficiency. The integration of operational technology (OT) and informational technology (IT) within the same environment is a significant development. However, challenges arise due to increased exposure of assets, demanding operational excellence and the need for a human-centric approach. Bridging the gap in terms of skills becomes crucial in addressing these challenges effectively.

The analysis also highlights the impact of artificial intelligence (AI) on various aspects. While AI has been used for data tracking, preventive maintenance, and advanced analytics, the advent of generative AI poses a major shock. The technology of generative AI was introduced without considering the potential risks, and there is a concern about the need for regulation and standardisation to ensure AI safety and security. The importance of regulatory measures to guard against impersonation, deepfake, and information manipulation is emphasised.

Collaboration in cybersecurity is deemed essential, as the current approach of assessing cybersecurity through security questionnaires is seen as inefficient. Furthermore, the analysis stresses the necessity of standards and frameworks in the field of cybersecurity. The need for a trust ecosystem in cybersecurity is also highlighted, with cybersecurity being based on the pillars of security, sovereignty, and survivability.
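To illustrate what a more efficient alternative to ad-hoc questionnaires could look like, here is a hedged sketch of a machine-readable supplier security attestation that every customer can validate in the same way; the schema and field names are invented purely for illustration and are not drawn from the session:

```python
# Illustrative sketch: a supplier publishes one structured attestation, and every
# customer validates it against the same (invented) schema instead of sending its
# own questionnaire.
REQUIRED_FIELDS = {
    "supplier": str,
    "iso27001_certified": bool,
    "secure_development_lifecycle": bool,
    "incident_contact": str,
    "last_assessment_year": int,
}

def validate_attestation(attestation: dict) -> list[str]:
    """Return a list of problems; an empty list means the attestation is acceptable."""
    problems = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in attestation:
            problems.append(f"missing field: {field}")
        elif not isinstance(attestation[field], expected_type):
            problems.append(f"wrong type for {field}")
    if attestation.get("last_assessment_year", 0) < 2022:
        problems.append("assessment is stale")
    return problems

example = {"supplier": "Acme OT Ltd", "iso27001_certified": True,
           "secure_development_lifecycle": True,
           "incident_contact": "csirt@acme.example", "last_assessment_year": 2023}
print(validate_attestation(example) or "attestation accepted")
```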

In terms of securing critical services during crises, efforts need to be taken to ensure that critical services can continue to operate even in times of crisis. Respecting data rights and intellectual property is deemed crucial, with the need to protect the data of every citizen and the intellectual property of every nation.

Overall, the analysis provides valuable insights into the impact of emerging technologies on critical infrastructure and cybersecurity. It underscores the importance of understanding the opportunities and challenges associated with these technologies, while also emphasising the need for regulatory measures, collaboration, and the protection of data rights and intellectual property.

Session transcript

Ryan Chilcote:
Now, sir, I mean, which is correct, first and last name, yeah? Well, I hope you all enjoyed that discussion. I certainly did. Very interesting, the role of education in tackling the problem of cybersecurity going forward. And we now move on with a very special guest. Please join me in welcoming the chief executive and president of Saudi Aramco, Mr. Amin Nasser.

Amin H. Nasser:
Your excellencies, distinguished guests, ladies and gentlemen, it is a pleasure to join you once again at the Global Cybersecurity Forum here in Riyadh. A lot has changed since last year, and there is much to speak about. In this era of hyper-connectivity and digitalization, new technologies are rapidly transforming how we work and get things done. Digitalization enables us to complete tasks in seconds that once took countless hours. The Internet of Things has turned every piece of equipment into a smart device. The software we use today provides us with real-time access to data to make better and faster decisions. To put these changes into perspective, last year the world generated approximately 97 zettabytes of data, which is equal to 97 trillion gigabytes. As the world continues to digitalize, the volume is predicted to reach 175 zettabytes by 2025. All of this can be a force for good, helping businesses to be faster and to better serve their customers. However, the rapid transformation we are witnessing has also made the world more vulnerable, with increasing risks of cyberattacks. While every industry faces threats, the energy sector in particular is an attractive target for those who want to do harm. We play a critical role in the lives of billions of people. We supply the products that the world economy needs to make modern life possible, enabling everything from transportation to manufacturing. Any large-scale disruption to the steady supply of energy would have an immediate and significant impact around the world. At Aramco, digitalization has made us more agile and has helped us to deliver energy more safely, efficiently, and sustainably. To safeguard against the risk of cyberattacks, we have implemented a defense strategy focused on building resilience throughout the entire ecosystem, because one weak link can hurt everyone. It is for this reason that we created a supply chain cybersecurity program, which established stringent cybersecurity standards for all service providers. Throughout the entire lifecycle of engagement, each entity we do business with must demonstrate that it upholds these cybersecurity standards and best practices. And to help extend cybersecurity capability across our affiliates in the kingdom and around the world, we have established Cyberani Solutions. This venture offers specialized cybersecurity services to help businesses protect their operations and data. We have also partnered with the Georgia Institute of Technology to create a Master of Science cybersecurity program with a cutting-edge curriculum. It has already produced 140 graduates with specialized cybersecurity expertise, with many more to come. At the same time, we know that cyber threats are rarely localized to any one organization or industry. Our collective security requires close collaboration between all stakeholders, regionally and globally. As part of that, we are a founding member of the World Economic Forum Center for Cybersecurity, which was established in 2018. We are also a strategic partner of the Global Cybersecurity Forum Institute and a founding partner of the new Operational Technology Cybersecurity Center of Excellence. Through this new center, we aim to shape the future of operational technology cybersecurity for any sector that uses industrial control systems. While we have made great progress with these and other initiatives in cybersecurity, there is another C word that we must be careful about, and that is complacency. It is absolutely critical that we keep our guard up.
That’s why we must carefully assess every current and new technology to identify whether there could be a potential pathway for hackers to breach our systems, and address any vulnerabilities before the technology is deployed. This approach enables us to harness the powerful potential of new digital innovations while mitigating their risks. Which brings me to my next point: the power of AI. It’s new, it is exciting, and it is a game-changer for many industries, including energy. With generative AI tools now part of daily life for hundreds of millions of people, the economic potential is truly astounding. According to one recent study, generative AI could add between $2.6 trillion and $4.4 trillion annually to the world economy. But as with all major innovations, it has its own unique risks, and some governments and businesses are taking a cautious risk-management approach to the use of generative AI. As we consider these powerful new tools, it is important that we assess them as carefully as we have every other technology. The kingdom has already established robust AI controls and guidelines to advance AI capabilities in a safe, secure, and responsible way. Moving forward, further collaboration between all stakeholders can help to establish international standards and best practices that keep pace with the rapid development of AI. This forum is a great opportunity to carry on that work. At Aramco, we believe that continuous innovation backed by comprehensive cybersecurity measures is critical to our future. Our digital transformation has brought vital benefits to our business. As we continue to adopt new technologies, we will uphold our commitment to cybersecurity and safely supply the world with the energy it needs today, tomorrow, and long into the future. Thank you.

Ryan Chilcote:
Well, good morning again. You don’t need me to tell you that securing global supply chains in the context of our digitized world is of unparalleled importance. It’s actually quite easy to understand how one problem at one link in what have become, and are increasingly becoming, extraordinarily long and complicated supply chains can have devastating consequences down the road. So, in the next 35 minutes, we are going to delve into the vulnerabilities in global supply chains and, because we’re at the Global Cybersecurity Forum and we’re focused on how to collaborate and solve problems, how we address them. We have some extraordinary panelists for this conversation. Let me begin with the CEO of SITE, Dr. Saad Alaboodi. Thank you very much. Michael Ruiz, Vice President and General Manager for Cyber Innovation at Honeywell. And Schneider’s Christophe Blassiau, in charge of Cyber and Product Security. Thank you all for joining us. Michael, if I could start with you. Obviously, we’re here to talk about how we increase the resilience of global supply chains in the context of our increasingly digitized world. So, set the stage for us. How big a problem, how dangerous is the situation we’re in today in the world when it comes to that job?

Michael Ruiz:
Absolutely. It’s a huge problem today. When I think about this, I tend to think about it in two major buckets: nation-state bad actors that are looking to move a political agenda, and cyber criminals. As we’ve done surveys in this area, 93% of cybersecurity experts and 86% of business leaders surveyed believed that, because of the geopolitical instability in the world today, we are on the precipice of a major cybersecurity supply chain disruption within the next two years. And we need to do something in order to be able to protect those supply chains.

Ryan Chilcote:
Extraordinary. Christophe, I want to come to you. But before I do that, let’s just do a sound check. Can everyone hear in the room? At the back of the room, are you hearing okay? Everyone hearing all right? Fantastic. So, we shall continue. Okay. So, Christophe, let’s put that problem in the context of global supply chains. Obviously, it’s extraordinary that we have near unanimity in the cybersecurity community that we’re staring down the barrel of a very big gun and looking at a major disruption in the next couple of years. So, what does that look like in the context of global supply chains? And what other vulnerabilities should we be on the lookout for?

Christophe Blassiau:
So, if you think about critical infrastructure, or maybe the energy sector more particularly, there are massive transformations and mega trends. First, it is becoming increasingly decentralized. Think about it: every home, every building can produce its own energy. Think about it: every car will charge itself as it goes electric. So, decentralization is a big trend. The second one is digitization. Everything is becoming smart: smart buildings, smart cars, smart factories. This connectivity to get data, to get intelligence, is really transforming the landscape as well. And the third one is our sustainability agenda: tackling climate change is pushing for decarbonization too. So, these three mega trends are really pushing us to connect everywhere and to have data for more intelligence, and that is creating a major attack surface with a lot of open doors for attackers, in a very systemic way, with cascading effects, with cascading risk. So, this is really putting more pressure on critical infrastructure and putting critical infrastructure at risk.

Ryan Chilcote:
Thank you. Saad, if I might turn to you. So we’ve heard a little bit about the future and the trajectory. To what extent are disruptions of the global supply chain in the cyber context already an issue?

Dr. Saad Saleh Alaboodi:
Thank you, Ryan. It’s great to reconnect with our friends, partners, and guests from all over the world here in Riyadh again as part of the GCF. If you look at recent years, I believe the world has gone through different scenarios of multiple hits that are impacting the global supply chain in one way or another. Starting with the targeted attacks on global energy supplies: one famous attack was the Shamoon attack targeting the IT infrastructure of Aramco in 2012. This was followed by another attack targeting the OT infrastructure of Aramco at the Rabigh facility in 2017, and another hit, which employed emerging technologies, in particular drones, targeting the energy supply in 2019 at the Abqaiq and Khurais facilities. So if you look at these three attacks, they targeted the global energy supplies from IT to OT to emerging technologies. Another category of hits comes from a different industry, healthcare. We’ve seen how COVID-19 in early 2020 propagated in a way that had never been anticipated, impacting all other industries, not only healthcare, and impacting both ends of the spectrum of the supply chain, the suppliers and manufacturers as well as the consumers. Another hit in the same year was the targeted attack on the IT supply chain: the famous SolarWinds attack of 2020, which put more than 30,000 entities from both the public and private sectors globally, in all industries, at risk of data exposure, with more than 18,000 entities confirmed to have installed the malware. After that, the fourth category is the economic slowdown that we have seen over the past three years, which is impacting, in particular, the production and supply of goods. And the last hit is the unfortunate scene that we see in geopolitics today. In the aftermath of all these issues is the adverse social impact, especially on our young generations. So I believe if we look at all these hits, they are, in one way or another, either the cause or the consequence of global supply chain issues. And that means that the global community needs to do something much more, something better, when it comes to sharing and charting the priorities. I believe the announcement made today establishing the GCF Institute Center of Excellence for addressing the challenges of OT technologies is a very important move, because OT technologies today exist in all the critical grids of the global supply chain.

Ryan Chilcote:
Very interesting. Thank you very much for that. Michael, I have two quick questions for you. First off, if you could expand on your initial comment there, where you told us that the global cybersecurity community is pretty convinced we’re going to get an attack very soon. And you said that, you know, when you think about cybersecurity, you think about two different entities, individuals and nation-states. I guess, and maybe everyone here in the room would agree with this, that when we talk about this, the bigger danger out of those two groups is nation-states. Is that right?

Michael Ruiz:
When we think about this problem, I mean, nation-states are absolutely, you know, the more dangerous or concerning component, because they have significant resources that they can bring to bear to prosecute their missions and accomplish what they want to do. But we’re also seeing cybercriminal organizations that are starting to band together. So cybercriminals are no longer individuals; they’re really conglomerations or consortiums of bad actors working together. And these are not mutually exclusive groups. In a lot of cases, we’ll start to see nation-states also act like cybercriminals, in that they’re looking to create funds, to increase their war chest in order to be able to go after, you know, larger targets.

Ryan Chilcote:
And we heard Saad there give us some great examples of attacks that we’ve already seen. I wanted to ask you, in the context of security of energy supply, I immediately thought in the United States of the Colonial Pipeline, which was disrupted, cyber disrupted, a few years back. What does that incident tell us about the threats? Was that an issue of operational technology, an interruption of that?

Michael Ruiz:
Yeah, I think when we look at things like Colonial Pipeline, what we see really are IT systems being affected at this point. We still haven’t seen any major, significant OT cybersecurity incident. But imagine the difference between being able to shut off a billing system, which shuts off a pipeline and creates disruption, versus actually going ahead and trying to weaponize that system: shutting off a pump at one location or a valve at one location, increasing pressure, and now having a pipeline explosion that becomes both an environmental catastrophe and a supply chain disruption. That is a huge problem we really need to be thinking about. And OT systems today, by and large, are way behind their IT counterparts from a cybersecurity perspective; the level of maturity just isn’t there. One alarming statistic that I’ve seen is that in OT networks, an intruder will spend 200 days observing the network before taking any action. Imagine a situation in an IT network where we would have a bad actor working unobstructed for 200 days. That would be unconscionable to us today. But that is what we’re seeing in the OT cyberspace.

Ryan Chilcote:
Extraordinary. And, Saad, it was so interesting to hear about the Center of Excellence that the Institute is going to be running, which will really be a leader when it comes to operational technology cybersecurity. Christophe, if I could turn to you, since we’re now on the subject of emerging technologies: how do you see emerging technologies reshaping the dynamics of the global supply chain and operational technology cybersecurity?

Christophe Blassiau:
So, here, it’s about connecting assets. At the end of the day, it’s bringing assets under visibility, getting data, getting intelligence for operational efficiency, for more automation, for the sustainability agenda. So, here, it’s not only an OT space or an IT space. I don’t like to oppose the two, because, at the end of the day, in an operational environment you have both OT and IT in the same environment. It’s connected to the cloud, the intelligence is at the edge, and we’re sending data to the cloud to get insight, analytics, predictive maintenance, efficiency. And here there is really a gap, and I’m happy that we have initiatives to bridge this gap in terms of skills, because we need people in charge of cyber, but also in charge of the operational environment. And these attacks, as Saad was mentioning, are becoming kinetic, becoming physical, and that pushes every government, every nation, to protect its citizens and critical infrastructure with a national security strategy. So, these emerging technologies are a great opportunity for more visibility on every asset that we want to connect to data and to the cloud, but at the same time they bring challenges, both in terms of exposure, which means we need operational excellence there, and in terms of needing a human-centric approach in this operational environment, with the right skills to be able to manage them. And that’s really the challenge that we have.

Michael Ruiz:
What I would add: Christophe makes a great point. We need to think of IT and OT at the same time, and the convergence of the two. That’s what success looks like at the end of the day. However, OT environments operate fundamentally differently; they work on a different model. If I have a petrochemical plant, I can’t just reboot it if I have a problem. I can’t isolate a network. I’m in the middle of a control process and I have to go through it. I have to maintain accessibility to all of the control systems, and so the problems in OT and IT are somewhat different in how you can resolve them or find resolution to them. But we do need to find that convergence between the two spaces and be able to take the best practices from both environments and bring them together. I think we still have the challenge that OT cybersecurity, or OT networks in general, are just at a far lower level of maturity today than their IT counterparts, and there’s a lot of work that needs to be done in order to bring that to bear, starting with having full visibility of all the assets sitting in the OT environment and being able to bridge that information back to the IT space, so you can have one comprehensive view of the entire security posture for that organization.

Ryan Chilcote:
The accounting is just not there yet. It’s not. Saad.

Dr. Saad Saleh Alaboodi:
Well, I believe that emerging technologies have been overwhelming all of us in this room today. The thrilling combination of AI, quantum computing, and mobility is increasingly empowering the world in a way that we’ve never seen before, and it’s becoming like the brain and muscles for driving disruption in businesses across both public and private sector organizations. And I believe that in the very near future these technologies will be infrastructure technologies, as opposed to plugins or interfaces as we see today. However, the adoption of these technologies will push more assets from the physical space to the cyberspace, creating so many opportunities for innovation and optimization, but at the same time leading to sometimes devastating risks and scenarios. If we look at an example of the risks of generative AI: with good intentions, it can be the hallucination of the algorithms and analytics; with bad intentions, it can be the fabrication of a truth, true disinformation. So imagine the consequences. And if we look at another example, the Amazon model, we’ve seen live experiments of Amazon deploying robotics on the shop floor for packaging and using drones for delivery. So imagine the world today with an expanded view of this model, where clients and consumers browse the internet and marketplaces to make a purchase, all the way to employing drones and robotics for the packaging and delivery. That’s a tremendous opportunity for innovation, but only if done securely and in times of peace. Otherwise, it will create unprecedented consequences if done insecurely or in bad times. Another angle on the adoption of emerging technologies, and I think His Excellency the Minister of Education, Al-Benyan, shed some light on this, is related to the paradigm shift in the skill sets that we need today. I believe decision-makers today need to be prepared to make decisions at scale and at speed at the same time. And the reason is very simple: emerging technologies are bringing so much material into the decision-making process. So we need to cope with this level of complexity in understanding the data and advanced technologies. And that will be, I believe, also very impactful on the way we do skill set development and talent development today.

Ryan Chilcote:
Thank you very much. Michael, we heard Saad there talking about everyone’s favorite emerging technology. What challenges do AI and generative AI, which we’ve already been talking a bit about here this morning, present in the context of the global supply chain?

Michael Ruiz:
Well, look, I love AI. My academic career was in advanced analytics and evolutionary computation, genetic algorithms. So this is an area of passion for me, and I’m amazed at how this technology, which was in labs and sitting in innovation centers within organizations, has now propagated into the world; the explosion of AI is amazing. It also keeps me up at night as a person who worries about my clients’ OT environments. These kinds of technologies are often adopted earliest by bad actors, trying to overcome a very steep barrier that stands in their way. And I think we’re going to see some incredibly sophisticated attacks come out, and many more attacks. So I think we’ll continue to see more and more attacks on a year-over-year basis; the level of sophistication is going to increase, and there are going to be more coordinated attacks, because the level of planning that a generative AI model allows you is pretty amazing.

Ryan Chilcote:
Thank you, Michael. Christophe, what would you add to that? And I guess if we think about solutions, which we want to get into now, is there an opportunity with AI in addressing some of these cybersecurity challenges, if we kind of turn things on their head?

Christophe Blassiau:
Yeah, we need to add a positive note to the AI threat, of course, hoping that we are as fast as the bad guys at coping with new technologies and defending ourselves. At the same time, AI is both new and not new. We have been using AI to track data, to do preventive maintenance, to do advanced analytics. What is really new is the shock of generative AI this year. We have been saying for the last 10 years that before you introduce a technology to the market or into an environment, you check the risk first. And for once, this year, we just put the technology out there without having any clue about the risk it can pose. So that’s an interesting play, and now we are running behind the topic. But it’s a major shock because, of course, it will transform every company on very standard elements like customer relationships, R&D, and coding. If you code, and Michael, you were saying that you code, the coding experience will be very different now with generative AI and GPT or Bard or others. But the key point with this innovation, with this opportunity for us to defend our critical infrastructure and to develop new technology, is this: my hope, and hope will not be enough, is that regulation will come in strong here. We see the EU AI Act, and we saw just two days ago the U.S. Executive Order on AI safety and security from Mr. Biden. So it needs to come with guardrails, and innovation needs to obey these guardrails on security and privacy, protecting the data in every environment, because it’s not only personal data; it’s about sensitive operational data that we care about. It is also about taking care of impersonation, deepfakes, and information manipulation. We see a rise of perfect phishing, which brings us back to awareness: we need to raise the bar so that every one of our employees and citizens is aware. And we need to guardrail this topic with regulation, hoping also that regulation will be harmonized between countries. So a lot of hope, and at the same time certainly an opportunity to create tech champions in every location of the world, because AI adoption will be very different here and there, and there is a sovereignty agenda in every region as well. Let’s make sure the sovereignty agenda does not stand in opposition to the benefits of AI everywhere on the planet.

Ryan Chilcote:
I’ll let you in on a little secret. I discovered some of the perils of AI: I use some software that transcribes my Zoom calls. It just takes the voice and turns it into text, and also backfills it with information and summarizes the call. What I didn’t realize is that it transcribes all of the text even before the other participants have joined the call, and after the call is over it sends it to them. So we are learning as we are living when it comes to AI, which is very dangerous. So let’s talk about the way forward, Saad. What do you think are the most important areas for us to focus on when it comes to ensuring the security of supply chains? And the big focus here at GCF is how we can collaborate to address these problems before they blow up in our faces.

Dr. Saad Saleh Alaboodi:
I think, taking all these discussions and different views in mind, I believe that for the way forward we need to inject sovereignty into both streams: the stream of policymakers and the stream of industry players. On the policymakers’ front, I believe we need more robust regulations, international collaboration, and information sharing. It’s an unfortunate truth today that the bad guys are sometimes more efficient at information sharing and collaboration than the good guys. And on the industry players’ front, I believe sovereignty is becoming an innate need for securing the cyberspace. As we see today, more economic assets and items are becoming digitized and are being pushed from the physical space to the cyberspace, so the security of the cyberspace leads to the security of the economy and therefore the prosperity of nations. Although these domains and notions are distinct, their impact on each other is very intertwined. So I believe embedding sovereignty is the new logical step in the evolution of technology, all the way from the inception and design of technologies to development and operations, and that will lead to operational control and data provenance. And for tech companies, I believe the new logical mindset is to inject sovereignty into their solutions as well. Now, some companies have started this shift already. Cisco, for example, a few weeks ago announced their move towards establishing regions in other nations to cater for some of the regulatory aspects when it comes to security. Amazon last week announced the establishment of a European sovereign hyperscaler cloud for the European Union, which is separated from the Amazon public cloud. Another example is the Microsoft announcement in early October, where they said that they will make a sovereign version of their hyperscaler cloud public and available to other nations by the end of this year. So clearly there is a need for a mindset shift to take companies toward sovereignty. And if you reflect on the example of cloud computing: today the global cloud market is about 10 per cent of the global ICT market, so that’s around six hundred billion dollars out of the six trillion dollars globally for the ICT market. And I believe the market share between the sovereign version and the public version of any tech offering is a zero-sum game, so what is changing is actually the distribution of the market share between the sovereign and the public versions. In light of this, I believe very soon we will start seeing the sovereign version of the hyperscaler cloud become the default version at the expense of the public version, and the impact of this on tech companies is very obvious. I believe the way forward for tech companies is to inject sovereignty into their offerings, because that’s the way to sustain their businesses and maintain their market shares outside their home countries. At the end of the day, this is a win-win for both ends of the spectrum of the supply chain: the technology vendors will maintain their business, and on the consumer side, they will have a much better offering when it comes to the mitigation of risk and sovereignty.

Ryan Chilcote:
And just very briefly, a clarification for me that I wanted to get to earlier but left aside. We heard the former foreign secretary of India, Ambassador Shyam Saran, talking about how what we’re really trying to do is wrap our analog minds around complex problems in the digital space. So, just very briefly, what’s the difference between solving a problem in the physical, real-world, analog supply chain space and in the digital space?

Dr. Saad Saleh Alaboodi:
Well, I think the challenges and opportunities in the global supply chain when it comes to these two spaces are very interesting. In good times, when both spaces are functioning well, there are tremendous opportunities for optimization of operational supplies with respect to cost, performance, and delivery. This is a result of the gained benefits of the integration between these two domains, and it also depends on the reliability of the building blocks of the global supply chain grids, starting with the energy grid as the fuel for the rest of the value chain, all the way to the electricity grid, the logistical grid, and, last but not least, the data grids as well. However, in bad times, imagine this intertwined relationship between these different domains across the value chain: the impact can be catastrophic. So I believe that’s the big distinction between the physical space and the cyberspace, because at the end of the day a few lines of code traversing the cyberspace, moving multinationally without being checked by border customs, can cripple the grids of critical supply chains in other places, and sometimes maybe in other jurisdictions.

Ryan Chilcote:
Thank you. Michael, a question for you. We heard Christophe there talking about how he hopes that we’re faster at solving the problems that AI can present, from a cybersecurity perspective, than AI itself. So is that actually a real problem? Because if you think back to our plenary session, we heard Jose Manuel Barroso talking about how it took the European Union nine years to agree on GDPR. So they got that beautiful scale of, whatever it is, 1.2 billion people, but it took a long time to get the standards. Do we have the standards in place when it comes to AI, and is the solution, as we heard from the former president of Estonia, that the private sector gets us there, because maybe governments can’t?

Michael Ruiz:
We definitely don’t have the policies today. I mean, policy lags innovation every day, all the time. And I think it’s forums like this, like the Global Cybersecurity Forum, and public-private partnerships that are going to be crucial for us to be able to create the level of policy needed. I absolutely agree that we need to bridge the gap between our analog laws and our digital implementations or innovations so that they come together. As the President of Estonia said, we need to just agree that all the things that are wrong in the analog world are also wrong in the cyber world, and then we need to move forward from that perspective. So I think there’s a road to get there with better, more informed policy. The challenge we run into is that we have to do it through these public-private partnerships; we have to bring corporations, governments, and globally interconnected forums like this together to be able to solve those kinds of problems.

Ryan Chilcote:
Christophe?

Christophe Blassiau:
Yeah. As we say, you are not alone. You are not alone with your security posture; you are not alone with your suppliers, with your customers, with the cyber agencies, with critical infrastructure. So I’m really advocating for more symmetry in that domain, meaning that we should collaborate with our suppliers in the way we collaborate with our customers. There is a very inefficient way of working, or of assessing cyber security, these days, which is to send a security questionnaire. This is still happening, okay, and everybody is answering cyber security questionnaires for multiple customers, multiple suppliers, et cetera. So there is really a call for collaboration horizontally in the supply chain, and at the same time for clear roles and responsibilities for technology vendors and technology suppliers, and responsibility also for the operators of the operational environments that use this technology. With digitization, as we saw before, and as we also saw during the pandemic, environments that should be locked down behind closed gates are just open for attackers. So that’s the first thing: there is opportunity in the technology play, as we saw. I had one of my engineers show me that it was possible to reverse engineer a firmware or software with AI, which was at the same time really interesting and super scary. So I think we need to balance these two things together and innovate with this in mind, so as not to make a big mistake. And at the end of the day, that’s why I value this forum, and I thank GCF and NCA for that. This is a perfect example of where we need to collaborate on standards and on frameworks, to really harmonize. I see there is a session this afternoon on harmonizing standards, to speak the same language, because when it comes to the catastrophic attacks or incidents that we were mentioning, there will be a worldwide response to such an attack. We cannot hide behind a regulation, or standard A and standard B, et cetera. And at the end of the day, it’s about trust in the ecosystem between suppliers, so as to source technology securely, operators, customers, and agencies, really resting on three pillars. One is security: of course, we need to mitigate the risk of such cyberattacks on critical infrastructure. The second one is sovereignty, because we need to protect the data of every citizen and we also need to protect the intellectual property of every nation. And the third one, the third S, is survivability, or resilience: whenever things happen, and things will happen, or are happening already, we need to make sure that critical services are able to operate even in times of crisis. So having plan B ready before it happens is mandatory for all of us.

Ryan Chilcote:
Michael, I’m going to give you the last word. Square the circle for me here, because we have somewhat competing views. We heard from our panelists this morning that they’re concerned about our fragmenting, polarized world getting in the way of international collaboration when it comes to cyber security, and yet we have a real-world example: just in the last 48 hours, 40 countries came together and agreed not to pay ransom in cases of malware. So how optimistic are you? And because we’re about solutions, and we’ve just got 30 seconds, or I’ll give you a couple of extra seconds: what’s the practical thing we can do to forge this collaboration, and what should we be working on?

Michael Ruiz:
Look, I think it starts with the fact that we have to recognize that there are layers to this problem. You need local regulations and local solutions, you need regionalized solutions, and then you need global solutions, and they all come together and interplay at some level. I think what we’ve done is that we kind of believe we can operate as one global society, and I think that’s great; to the degree that we can make that happen, that’s wonderful. But I think we also have to have regional strategies and regional solutions, and local strategies and local solutions. The problem is too big to try to tackle all at once, and working both from the bottom up and the top down allows us to align in the middle and get to a better end state at some point in the future.

Ryan Chilcote:
All right. Michael Ruiz, Vice President and General Manager for Cyber Innovation at Honeywell; Schneider’s Christophe Blassiau, in charge of Cyber and Product Security; and the CEO of SITE, Dr. Saad Alaboodi. Thank you very much. Please join me in giving a big round of applause for our panelists. Thank you.

Amin H. Nasser

Speech speed

110 words per minute

Speech length

920 words

Speech time

504 secs

Christophe Blassiau

Speech speed

164 words per minute

Speech length

1441 words

Speech time

527 secs

Dr. Saad Saleh Alaboodi

Speech speed

180 words per minute

Speech length

1741 words

Speech time

581 secs

Michael Ruiz

Speech speed

206 words per minute

Speech length

1257 words

Speech time

366 secs

Ryan Chilcote

Speech speed

155 words per minute

Speech length

1438 words

Speech time

555 secs

Hello from the CyberVerse: Maximizing the Benefits of Future Technologies

Table of contents

Disclaimer: It should be noted that the reporting, analysis and chatbot answers are generated automatically by DiploGPT from the official UN transcripts and, in case of just-in-time reporting, the audiovisual recordings on UN Web TV. The accuracy and completeness of the resources and results can therefore not be guaranteed.

Full session report

Ahmed Al-Isawi

The analysis of the provided data highlights several key points regarding the concerns and potential of emerging technologies. One notable concern raised is the potential impact of hacking in a highly digital world. Ahmed Al-Isawi, a renowned expert, voices his worries about the security risks associated with hacking. Having started as a hacker himself, he understands hackers’ potential, and he currently holds responsibility for security in a digital city being developed in Neom.

Another concern is the need to foster innovation. It is emphasized that fostering a team that can innovate is crucial. A NASA study is mentioned, which indicates that children as young as five years old possess a 95% capability to innovate. Additionally, inculcating the right skills, knowledge, and values in teams is highlighted as being vital in cultivating a culture of innovation.

The role of Artificial Intelligence (AI) in enhancing cybersecurity is also discussed. It is stated that AI can be used to monitor a large supply chain, detect anomalies, and respond to them within a limited timespan. However, it is important to note that the traditional methods and solutions may not suffice in solving modern cybersecurity problems. Ahmed Al-Isawi argues that if organisations continue to rely solely on traditional methods, they will fail to fully exploit the potential of AI in securing their systems.

The application of AI in breaking traditional boundaries is presented as a positive aspect: employed innovatively, AI has the potential to overcome traditional limitations. Moreover, the shrinking turnaround time for detecting and reacting to cybersecurity incidents is highlighted, indicating that humans alone cannot cope with such short timeframes and that AI can play a significant role in addressing this challenge.
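As a toy illustration of that turnaround argument (the data, metric, and threshold below are invented, and real deployments would use far more robust baselines), even a simple statistical check can flag a telemetry spike within the same hour it occurs, rather than after a manual review cycle:

```python
# Toy sketch: flag unusual spikes in hourly telemetry (e.g. failed-login counts)
# so an automated response can start within minutes. Data and threshold are invented.
from statistics import mean, stdev

def flag_anomalies(counts, threshold=2.5):
    """Return indices whose value deviates from the mean by more than `threshold` sigma."""
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:
        return []
    return [i for i, c in enumerate(counts) if abs(c - mu) / sigma > threshold]

hourly_failed_logins = [12, 9, 11, 10, 13, 8, 11, 240, 12, 10]   # one obvious spike
print(flag_anomalies(hourly_failed_logins))                       # -> [7]
```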

The metaverse, a virtual space, is explored in terms of its cybersecurity challenges and potential benefits. One notable challenge is the issue of user protection, as observed in the case of Second Life, an early example of a metaverse that faced problems with bullying and harassment. However, there is also optimism regarding the potential use of decentralised digital identities to improve behaviour in the metaverse. It is proposed that having people identified in the digital world may lead to better behaviour.

The importance of interdisciplinary cooperation and involving more than just cybersecurity experts in protecting the metaverse is emphasised. Authorities such as the police are suggested to contribute to maintaining order in the digital space.

Advancements in education through the use of the metaverse are highlighted. It is suggested that the metaverse enables school experiments to be conducted in a safe virtual environment and may lead to cost reduction for schools.

Regarding regulatory frameworks, it is argued that current regulations may not be sufficient to protect emerging technologies such as AI. The asymmetric nature of emerging technologies, combined with the expectation that around 60% of employees will soon be using AI in their work, raises concerns about the lack of policies to regulate its use.

Another concern raised is the potential for AI to produce faked or hallucinated information, especially with the development of generative AI. As a result, the need for AI to provide transparency and explain its processes is stressed.
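
One pragmatic response to the hallucinated-source problem, sketched below with only the Python standard library, is to check that any source an AI answer cites actually exists before trusting the claim. The URLs and the idea of extracting citations from model output are hypothetical, and reachability alone does not validate a claim; it merely filters out sources that were invented outright.

```python
import urllib.request
import urllib.error

def source_exists(url: str, timeout: float = 5.0) -> bool:
    """Return True if the cited URL responds; False if it cannot be reached.

    Existence alone does not prove the claim - it only filters out sources
    the model may have fabricated.
    """
    request = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return response.status < 400
    except (urllib.error.URLError, ValueError):
        return False

# Hypothetical citations pulled out of a model's answer.
cited_urls = [
    "https://www.nist.gov/",           # real site, should resolve
    "https://example.invalid/report",  # fabricated-looking, should fail
]
for url in cited_urls:
    status = "reachable" if source_exists(url) else "unreachable - treat claim with suspicion"
    print(f"{url}: {status}")
```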

It is noted that while regulations are important, they alone will not solve everything in the context of preserving values in an uncontrolled metaverse. Other factors such as education, parenting, and cultural and religious values are deemed necessary for value preservation.

The human element within the digital ecosystem is identified as crucial in preserving values. Humans are often considered the weakest link in a digital ecosystem, and education and parenting are seen as vital in addressing this issue.

Lastly, the significance of open-source development and public accessibility in advancing technology is highlighted. It is suggested that open-source contributions and public exploration of technology can help accelerate advancements, as closed-door development has been slowing down progress.

In conclusion, the analysis sheds light on various concerns and potentials related to emerging technologies. It underlines the need for heightened cybersecurity measures, fostering innovation, and acknowledging the role of AI in enhancing security. Moreover, it highlights the challenges and benefits of the metaverse, the need for updated regulatory frameworks, and the importance of the human element and open-source development in the digital ecosystem. Overall, this analysis provides valuable insights into the complex landscape of emerging technologies.

Adam Russell

During the cybersecurity discussion, the speakers addressed several key topics. They first highlighted the increasing complexity of transactions and data storage worldwide. With more transactions occurring daily and a growing volume of data being stored, the need for robust security platforms and tools is increasing.

The participants also expressed concern over the persistent threat of attackers finding ways to penetrate networks, even with advanced security measures in place. They specifically mentioned the introduction of ransomware as a method employed by attackers. Despite advancements in cybersecurity, attackers are still able to exploit vulnerabilities and gain unauthorized access to systems.

To combat these threats, organisations are increasingly turning to artificial intelligence (AI). AI is being used to quickly gain context on adversaries and reduce the time it takes to detect potential cyber attacks. By leveraging AI technologies, organizations can enhance their ability to identify and prevent these threats.

The emergence of quantum computing was another significant topic of discussion. Although quantum computing brings various benefits, it also introduces cybersecurity risks. However, the speakers stressed that at present, quantum computing does not pose a threat to encryption systems. Nevertheless, they highlighted the importance of exploring post-quantum cryptography as an opportunity to address these future risks.
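
For a concrete feel of what exploring post-quantum cryptography can look like, the sketch below performs a key-encapsulation handshake using the open-source liboqs-python bindings (the `oqs` module). The library, the algorithm name string and the method names are assumptions based on that project's published examples and may differ across versions; this is an exploratory sketch, not a deployment recommendation.

```python
# Assumes the liboqs-python bindings are installed; supported algorithm names
# vary by liboqs version (e.g. "Kyber512" in older releases, "ML-KEM-512"
# after NIST standardisation).
import oqs

KEM_ALG = "Kyber512"  # illustrative choice, not a recommendation

with oqs.KeyEncapsulation(KEM_ALG) as receiver, oqs.KeyEncapsulation(KEM_ALG) as sender:
    # Receiver publishes a post-quantum public key.
    public_key = receiver.generate_keypair()

    # Sender encapsulates a shared secret against that public key.
    ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # Receiver decapsulates the ciphertext to recover the same secret.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

    assert shared_secret_sender == shared_secret_receiver
    print("Shared secret established with a post-quantum KEM")
```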

The importance of collaboration and teamwork in strengthening cybersecurity was also emphasized. Participants acknowledged that different facets need to work together, as everyone brings their unique expertise to the table. By collaborating, stakeholders can bolster the technology and its security, ensuring a more robust defense against cyber threats.

In virtual spaces, regulation and safety measures were discussed. Speakers underscored the need for flexible, ecosystem-specific policies to ensure safety while promoting innovation. They cited the example of Second Life, which successfully implemented user-friendly regulations to safeguard users and encourage innovation. The notion of a “metaverse of metaverses” was also introduced, highlighting the existence of diverse ecosystems with their specific safety measures.

Regulation was seen as crucial for the security of critical systems and the safety of users. However, the speakers cautioned against rushing into extensive regulation on top of artificial intelligence (AI) and large language models. They expressed concern that excessive regulation could impede technology adoption and hinder a country’s ability to harness its potential.

The importance of partnerships and international cooperation in combating global cyber threats was emphasized. The participants cited ongoing efforts to combat child safety issues, tackle ransomware attacks, and establish public-private partnerships with companies that host substantial amounts of data. Collaboration was viewed as key to addressing the evolving landscape of cyber threats effectively.

In conclusion, the discussion on cybersecurity highlighted the challenges and opportunities brought forth by the increasing complexity of transactions, data storage, and emerging technologies. The participants emphasized the need for robust security measures, including the use of AI and exploration of post-quantum cryptography. Collaboration, regulation, and partnerships were viewed as vital tools in fortifying cybersecurity and safeguarding critical systems and user safety.

Moderator – Lucy Hedges

During the discussions, the speakers delved into the complexities of emerging technologies, focusing on AI, cybersecurity, and the virtual world. They acknowledged that we have barely scratched the surface of AI’s potential benefits and detriments; although AI has existed for a long time, it is only now gaining mainstream attention.

One of the main points raised was the need to find a balance in how AI is used due to its potential impact, both beneficial and detrimental. The speakers noted that the full extent of AI’s benefits and dangers is still not fully known. This highlights the importance of carefully considering and managing the deployment of AI technologies to harness its potential advantages whilst mitigating the potential risks.

The discussions also highlighted the significance of teamwork in innovation and effective cybersecurity. A diverse team with different skills and perspectives fosters innovation and strengthens technology security. By collaborating and working together, different facets of a team contribute to building a more robust environment for enhancing technology security.

While AI can be effectively leveraged to enhance cybersecurity, it was also acknowledged that emerging technologies, specifically AI, present significant cybersecurity challenges. The rapid advancement and complexity of AI technology create new vulnerabilities that must be addressed to ensure the security of digital systems and infrastructure.

The negative aspects of the virtual world were also discussed, particularly experiences of harassment and bullying in platforms like Second Life. It was argued that there is a lack of preventative measures and punitive actions in place to address such behaviors. Thus, there is a need for regulation to prevent and punish bad behavior in the virtual world, ensuring a safer online environment.

Additionally, the discussions highlighted the intertwining of digital and physical lives, emphasizing the need to regulate these experiences. As digital lives become increasingly connected with the physical world, effective regulations must be put in place to protect individuals and maintain peace, justice, and strong institutions in both realms.

The importance of developing emerging technologies in the public domain was another noteworthy point raised. By allowing everyone to “play with” and experience these technologies through open source support, there can be faster knowledge generation and advancement than with traditional research approaches. This aligns with the goal of accelerating progress and knowledge sharing in the field of emerging technologies.

Overall, the discussions were neutral to positive in sentiment, with recognition of the potential benefits and challenges associated with emerging technologies. The speakers encouraged finding a balance, fostering teamwork, addressing cybersecurity challenges, regulating the virtual world, and promoting the development of emerging technologies in the public domain. These discussions shed light on the intricacies and complexities surrounding these topics, urging stakeholders to approach these technologies with caution and responsibility.

Chante Maurio

The analysis provides a comprehensive overview of various perspectives on the benefits and challenges of AI and emerging technologies. One key finding is that AI technology advancement has both advantages and drawbacks. On the positive side, it provides the opportunity to process large sets of data and use them in predictive ways. However, there are concerns that this advancement also allows bad actors to be trained at a faster rate. Furthermore, it enables less skilled individuals to build capabilities that they would not have otherwise been able to acquire.

In terms of emerging technologies, the analysis highlights the challenges they pose not only in terms of technological advancements but also in talent acquisition. To overcome these challenges, some argue for the utilization of AI to substitute for certain analysts and upskill existing ones. This approach is seen as a way to address the talent gap in this rapidly evolving field.

Education and proper educational programs emerge as crucial factors for the success of the global economy in mitigating the risks associated with emerging technologies. It is believed that these programs can help individuals and organizations navigate the complexities of this evolving landscape and ensure the development of necessary skills. Additionally, global harmonization of regulations is seen as vital for preventing issues of equity and competition that can arise from uneven adoption and control of emerging technologies.

The timing of introducing frameworks, standards, and regulations is also deemed critical. If introduced too soon, regulations may hinder technology’s potential. Thus, it is recommended to carefully consider the best time for implementing regulations to strike a balance between innovation and regulation.

Ethical considerations are viewed as an important aspect of tech regulation, and it is suggested that they should be managed alongside the implementation of technology. Technicians must not overlook the ethical dimensions while focusing solely on technical requirements. This recognition highlights the need for an inclusive and comprehensive approach to tech regulation.

In terms of cybersecurity, the analysis emphasizes the importance of education and training. Numerous resources, such as technical documents and standards offered by the National Institute of Standards and Technology, free cybersecurity training provided by organizations like the Global Cyber Alliance and the Cyber Readiness Institute, and training and certificate programs offered by testing and certification organizations, can facilitate education and training in this field.

The analysis also recognizes the significance of communities, forums, and the exchange of ideas. These aspects are seen as essential for collective learning and the development of innovative solutions in response to emerging technologies.

The importance of introducing frameworks and standards at the right time into an ecosystem is underscored. While baseline standards are required, the adoption of these standards remains somewhat fragmented. It is acknowledged that certain additions and deviations from the standards may have purpose and necessity, but they should be mapped back to the baseline to ensure coherence and interoperability.
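
A lightweight way to keep such additions and deviations traceable, sketched below, is to record every extended requirement together with the baseline control it maps back to, so gaps are immediately visible. The requirement identifiers and baseline references are entirely hypothetical; real profiles layered on standards such as IEC 62443 would be maintained in dedicated tooling.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Requirement:
    req_id: str                  # hypothetical local identifier
    text: str                    # short description of the addition or deviation
    baseline_ref: Optional[str]  # baseline control it maps back to, if any

# Hypothetical profile of additions layered on top of a baseline standard.
profile = [
    Requirement("PROFILE-01", "Log retention extended to 24 months", "BASE-AU-11"),
    Requirement("PROFILE-02", "Local data-residency attestation", None),
    Requirement("PROFILE-03", "Quarterly supplier key rotation", "BASE-SC-12"),
]

# Make the mapping, and any gaps in it, readily visible.
for req in profile:
    target = req.baseline_ref or "NO BASELINE MAPPING - needs review"
    print(f"{req.req_id}: {req.text} -> {target}")

unmapped = [r.req_id for r in profile if r.baseline_ref is None]
if unmapped:
    print("Unmapped requirements to resolve:", ", ".join(unmapped))
```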

Finally, the analysis highlights the importance of international collaboration in aligning standards. Organizations such as IEC, ISO, and ISA are commended for providing forums that facilitate collaboration and cooperation in developing and aligning standards.

Overall, the analysis reveals that while AI and emerging technologies bring about numerous opportunities, they also pose challenges that require careful consideration. Ensuring proper education, timely regulations, ethical considerations, cybersecurity training, community building, and international collaboration are identified as critical factors in navigating the evolving landscape of these technologies.

Session transcript

Moderator – Lucy Hedges:
Hello from the cyberverse, maximizing the benefits of future technologies. Adam Russell, Vice President, Cloud Security, Oracle. Ahmed Al-Isawi, Director of Cybersecurity Governance, Risk and Compliance, GRC, NEOM. Chante Maurio, Vice President and General Manager, Identity Management and Security, UL Solutions. Lucy Hedges, Moderator, Technology Journalist and TV Presenter. All right, well that’s the introductions out of the way. Hello everybody, hope we’re all good and the energy levels are still high even though we are getting towards the end of the day. So we are about to dive into why it’s so critical to understand and act upon the implications of a digital future in order to prepare for it from a cyber security perspective. And ultimately lay the foundations of a stable and secure cyberspace for future generations. Now I have a brilliant bunch of esteemed panelists to my left who are very well versed to talk in this area and divulge their expertise while navigating through the current progress of emerging technologies like quantum computing, AI and the metaverse for example. And why we need to develop mechanisms and policies to maximize the benefits and opportunities presented by these future technologies. So first things first, hi guys, how are you? Adam, Chante and Ahmed, all the way over there, hi. So we’ve got 35 minutes to talk about quite a complex topic so I’m just going to dive in with my first question to get things going. So I think a great place to start would be to paint a bigger picture, to really contextualize the conversation. So how has the cyber security landscape evolved with the advent of emerging technologies like quantum computing, AI and the metaverse? Adam, I’m going to throw that to you first.

Adam Russell:
You’ve heard a lot of these topics today from the advent of AI, how we can use AI for security purposes and some of the risks with AI overall around safety. Overall I think the world’s growing more and more complex. You’re seeing more transactions on a daily basis, more data being stored throughout the world. And yet we’re growing more of our security platform and tools globally. We’re introducing new techniques to prevent hackers from attacking our data, introducing ransomware across the landscape. But the data shows that we’re continually enabling attackers into our networks, unfortunately. So with the advent of AI, what we’re seeing is ultimately the attackers and what my organization is ultimately doing from a security operations perspective is utilizing AI for gaining context quickly on adversaries and bringing down that meantime to detection, ultimately. And we’re introducing those into our tool sets and then ultimately leveraging them to protect customers globally.

Moderator – Lucy Hedges:
Yeah, it really is quite fascinating just how fast everything’s evolved. Companies like Oracle really have to be at the top of their game in order to not only help yourselves but help customers and businesses as well. Do you guys have anything to add to that question?

Chante Maurio:
Lucy, maybe, well first of all thanks to NCA and GCF for having me here today. And maybe just to add, with any technology advancement there’s positives. And as you spoke to, there’s the opportunity to go through large sets of data and use it in predictive ways. There’s also the challenges that come with that on the other hand. And with AI, it enables the bad actors to be trained up faster. It takes maybe a less skilled individual and allows them to build capabilities that they would otherwise not have been able to. So it increases the threat in many ways.

Moderator – Lucy Hedges:
Go on, Ahmed.

Ahmed Al-Isawi:
First of all, assalamu alaikum wa rahmatullah wa barakatuh and really thankful for being here. Thanks for NCA and the Global Cyber Security Forum for making this happen. And actually, to be honest, I’m terrified. Yeah? I’m really terrified. Why? Because I started as a hacker. And I know exactly how hackers are thinking and how devastating they can be. If we live in, for example, a city that everything around us is digital, we have sensory around us, speaking things about us and about our personal lives. And for example, I’m thinking about the line being developed in Neom. I’m responsible over there with the rest of the team led by Mr. Al-Masferhizi over there and the rest of my colleagues. We have the responsibility of protecting this future and protecting the livability of the residents who will come there, the companies that will come, the business that will come there. So actually, I’m terrified because I’m standing on the front end of all the advances that human science and innovation came to. How really we can take it further and protect the future for this? How can I protect the future of my children when AI helped them in solving, for example, their homeworks? Are they getting the right education, for example? Are they really developing their skills while the AI itself is helping them? How originality are we keeping in the future generations? Things like that. But actually, I’m terrified, a little bit terrified.

Moderator – Lucy Hedges:
Yeah, I think you have a right to be terrified. AI is something that’s been around for an incredibly long time, but it’s only really being pushed into the mainstream, I’d say now. I think it might be fair to say that. And it’s incredibly complex. We’re still really only scratching the surface of how we’re going to benefit from this technology. Is it going to be detrimental to us? Is it going to be incredibly beneficial? And what is the best way to really balance that? So with that, what would you say are the most important cybersecurity challenges of these emerging technologies? And there’s a lot of emerging technologies, but is there anything that stands out to you guys?

Adam Russell:
I think ultimately, right now, we have a true opportunity to forward look on post-quantum crypto. That is an emerging topic that’s been discussed for the last 10 years. But we’re finally coming to a reality where a large percentage of the tech companies globally are introducing quantum computing, leveraging qubits. And there’s a lot of energy in the cybersecurity spectrum through NIST in the U.S., as well as organizations globally in Germany, as well as in the kingdom here, looking at mechanisms to safeguard their data and their encrypted data against threats. NIST just recently announced the selection of PQC algorithms on signing and digital signatures, as well as the encryption mechanisms under the hood, such as Kyber. And although quantum computing isn’t breaking our encryption today, it’s nice to see that it’s not a fear-mongering effect in the cybersecurity spectrum. We’re actually taking this as an opportunity rather than as a challenge.

Moderator – Lucy Hedges:
Yeah, so you’re seeing things moving in the right direction right now. I can see you nodding away, Shantae. Have you got anything to add to that?

Chante Maurio:
Maybe to supplement it, you’re talking about the technology aspect of the challenges. And when we think about these emerging technologies, in addition to the technology challenges, there’s the people challenge. There’s the talent challenges that we all have in the marketplace. So finding ways to overcome the talent challenge, whether it’s using AI to substitute for some of the analysts and upskilling the analysts. We talked about that in an earlier session today. Or whether it’s putting together proper educational programs. It’s going to be very important for the success of the global economy in coming back against some of the risks that are created.

Ahmed Al-Isawi:
I can expand to what my colleague panelists shared here. Maybe the challenge for us as we are in the leadership right now is how can we lead others to innovation and beyond innovation. We need to look at the problem not only from technology but also from other dimensions. Like how us as leadership can we lead others. There’s a very interesting study. I think it was conducted by NASA itself. They tried to understand how much innovation, how much percentage of innovation in different age categories. For example, they found in the children of age 5, they have 95% capability to innovate new things. For example, in age 31, I think, it became much less, like 5%. I’m at age 44. I’m wondering how much innovation I can bring to the table. Maybe my responsibility and the challenge on me right now is how can I foster a team that innovates. How can I multiply this within my team? How can I drive them through this journey? Of course, I believe whatever challenges will come in front of us, if we have the right skills, knowledge and values in our team, in the thinking of our team, in the design thinking, for example, for the future, we can embed cyber security from the initial stages of the ideation itself and then drive it through. One of the things we are discussing in the big projects being developed in NEOM, what’s the right ontology of things? How can we bring security in the ontology of things themselves? There’s many challenges, but maybe if I can conclude with this, the challenge on us as the leaders of today to build the future, to lead to the future.

Moderator – Lucy Hedges:
I think a key word just to pull out of what you just said there is team. This is a collaborative effort. You feel that compared to maybe the younger generation, you don’t feel as innovative, you’re not as innovating as much, but you are really strong when it comes to leadership. You’ve got so many different facets working together. That’s what helps build a stronger environment in order to bolster this technology and in order to make everything and all these emerging technologies more secure. We can’t do things by ourselves. We all bring different things to the table. I just want to take it back and go back to talking about AI for a sec. How can AI be effectively leveraged to enhance cyber security and what are the considerations for AI-driven threat detection and response?

Adam Russell:
I think AI has been discussed quite a bit, Lucy, throughout today’s conference. Yes, it’s an underlying theme. But I don’t think it’s a general panacea that we should all fear. It’s a tool that we can leverage to better protect our networks as well as our people and data. You’re seeing the advent of AI being applied to security operations as well as ultimately, as an example, the supply chain security aspect. At least in the context of many startups globally, what we’re seeing is they’re looking at mechanisms to perform semantic analysis as well as detect adversaries that are pushing back doors into the most popular third-party libraries. They’re leveraging that social network graph on vector databases as well as ultimately the database backends and your normalized relational databases. You can use large-scale language models to detect when there’s a single developer developing on log4j, as an example, or a developer sitting in country X that your particular country doesn’t trust anymore. It gives us the ability to better understand our supply chain that today is more complex than ever and make decisions on that in a real-time basis. Then you can apply that into your build system, so you can evict actors or say, I no longer want to take a dependency on that particular third-party library. AI is a powerful tool in that context, but it’s also a threat intelligence component on the supply chain in this context.

Moderator – Lucy Hedges:
It’s like AI versus AI, isn’t it? Like I said, it’s this collaborative effort. You use it to enhance current infrastructure. Anyone got anything to add before I move on?

Ahmed Al-Isawi:
Definitely, AI has a big role to enhance the cybersecurity, but I think the biggest limit is our imagination. How can we imagine doing… Traditionally, we need big space for a SOC. I think through Metaverse, for example, and AI, it can be something different, something more innovative, like breaking the traditional boundaries. One of the things we always say is that if we keep trying to solve modern problems using traditional methods and solutions, whether it’s governance, whether it’s policy, whether even the idea of the solution itself, we will never be able to secure the AI nor use the AI itself. But just to name a few examples, like if we have this huge supply chain and we need not only to monitor ourselves internally and our digital cyberspace, we need to keep an eye on even the supply chain, supplying to us the technology goods or whatever kind of product or service. If we can use AI to pick up anything that’s happening there and directly reacting to that, we have only a matter of minutes before something bad happens. Even the turnaround time for detection and for reacting to the incident is becoming much, much shorter. We have a much shorter window. I think for the recent studies, it’s around eight minutes just as a window to react. As a human, we cannot. We have to use AI in that.

Moderator – Lucy Hedges:
That’s such a great example of just the power of artificial intelligence and how it can really enhance businesses. You mentioned the metaverse, which segues me in nicely to my next question. Of course, the metaverse is this virtual interconnected digital universe and it presents a host of quite unique cybersecurity challenges with the convergence of all these various technologies that are living within this digital world. What unique cybersecurity challenges does the metaverse present? What security measures are necessary to protect users and their data in these virtual environments? Who wants to go?

Ahmed Al-Isawi:
One of the earliest metaverses is called Second Life. It’s a game. I think, sadly, and you will be very lucky if you pass the first five seconds without being bullied or harassed in Second Life. This is one of the challenges. As a cybersecurity, we can protect the infrastructure, but who will protect the individuals inside that digital world? Human-to-human interaction is something very important, in my opinion. We can use the emerging technology to protect the emerging technology itself, like the idea of using decentralized digital identities inside the metaverse itself. Once this person is identified, of course, will behave much better than without identity, I think. But still, not only cybersecurity experts should work to protect this new technology. Even other domains should also contribute to protecting, like the police, for example. For example, making sure that the morals of people interacting inside this digital space is at a good level.

Moderator – Lucy Hedges:
Well, that’s just it. I think in the real world, obviously, we know when we do things wrong. We’ve got police, we’ve got law enforcement in place, and punishments for people that do bad. But I think in the virtual world, there’s nothing there at the moment. You can be bullied, you can be beaten up, all types of things can happen. And sometimes we brush it under the carpet. It’s digital, but it’s like, no. Eventually, our digital lives are merging with the physical, and these experiences really need to be regulated, and there needs to be some kind of enforcement in place that’s going to punish people or reprimand them for the negative behavior that they enforce in these environments.

Adam Russell:
I think, at least me personally, I grew up in the advent of the internet where it allowed innovation, a lot of freedom, a lot of independence to hack, like my fellow panelists. I grew up as a hacker in the underground. I know when things can go awry, but I think there’s further discussions today around cyberpaths. Ultimately, we all need a little bit of exploration and independence to innovate and look for opportunities. to build new tools on top of protecting our citizens within the metaverse. So to take an example on Second Life, Second Life became extremely important and ultimately popular because it enabled hacking, it enabled that conversation without a domination of regulation within that space. It was almost like a tit for tat game within the metaverse within Second Life where they built in regulation, they enabled users where they felt safe in different ecosystems. And if users broke that trust, they are cordoned off into an area that they could ultimately have some freedom that wouldn’t disrupt other users. And so within the metaverse, I think there’s not gonna be a one size fits all within a safety domain. You’re gonna have to look at a metaverse of metaverses, so to speak. And it’s gonna give users choice and opportunities for innovation. And you’re seeing that, at least in the context of social media, there’s an explosion of social media networks. And I think we’re gonna see this within the metaverse as well.

Moderator – Lucy Hedges:
Yeah, yeah, absolutely. Anything to add?

Ahmed Al-Isawi:
One? No, go please. Metaverse has its negative things, but also it’s contributing, I think, very greatly to the advancements. Like, for example, having or doing school experiments in a metaverse. Imagine that. This will not only cut costs on schools and education, but it will also give everybody or students the chance to have this experience in a very safe environment. Like how chemicals will react with each other. Imagine this in schools. I think this is a very fantastic idea.

Moderator – Lucy Hedges:
Would’ve made my education a lot more exciting, that’s for sure. Yeah. So Shante, my next question’s for you. What are the potential challenges that might arise from the uneven adoption and control of emerging technologies, particularly when it comes to cross-border contexts?

Chante Maurio:
So interesting question. Really interesting word that’s in the question. So when I hear control in the context of cross-border scenarios, I think regulations. And as we all know, regulations lag adoption in many cases. And so when there’s uneven adoption and uneven control of emerging technologies, the potential exists to create market confusion for companies as they are trying to navigate various market requirements around the globe. And as they’re trying to navigate these market requirements, it can create an issue of equity of access for particular citizens and geographies as well, while also creating competitive imbalances for companies and countries as well. So the implications of an uneven adoption really run the gamut and they stretch from the citizens to the companies from a commercial perspective as well as to the countries as well. In the absence of regulations, I’m always a proponent of frameworks and standards. We have a wide variety of frameworks and standards today in the cybersecurity market. And there’s really been a waterfall. I think we were speaking about it before in the green room, really a waterfall of regulations just in the last 12 to 18 months in this space. But because of this, UL Solutions, the company that I work for, we’ve long been a supporter of harmonization. And so global harmonization is really what’s going to allow companies to navigate the adoption of standards around the world and allow for access by the citizens so that there aren’t the equity concerns or the competition issues in the future.

Moderator – Lucy Hedges:
Yeah, absolutely. Well, while we’re in the realm of regulation, what are the key considerations for governments in developing adaptable regulatory frameworks that really foster innovation while addressing concerns related to data privacy, cybersecurity, job displacement? Ahmed, I’m gonna throw that at you. Given your insights in cybersecurity, GRC and regulatory frameworks, it’d be great to hear from you on this.

Ahmed Al-Isawi:
So I think there’s a fundamental challenge because regulations usually work well in a symmetric world. Like establishing the baseline that every infrastructure or organization should operate on. But the problem is, fundamentally, cyberspace and emerging technology are so destructive. They are making the world, they are pushing the world to asymmetric nature. How can you protect something that is a flying object, like a drone? There’s no cable from here to the drone, it’s all waves in the air. How can you protect that, for example? And I think having more than 19 years of experience just in crafting policy and frameworks and trying to use automation as much as possible, I think there’s no framework as of now that can protect this emerging technology. We are learning as we go, we’re trying to protect as we go. But one of the issues, for example, one of the studies expects that next year, around 60% of the employees will start using AI in their work, if not already. Are we ready from policy for this? Should we allow our employees to use AI? What are the implications of that? Everything is going into that, and how can we protect this data? How can we enable our employees to benefit from this powerhouse of AI tools? And I think we should start from now exploring this area. We used to say, bring your own device, but next year we’ll say, bring your own AI. Think about that. Everybody’s using AI in their digital phones or smartphones, or even in their work. How can we regulate that? How can we protect the data? How much data can we allow our employees to put in AI? Of course, for me, I always feel much safer if I have more control, enforcement around that. But still, there’s also an issue that, do you know that AI itself, or generative AI itself, can hallucinate, can fake information for you? And this is really proven. If you take a fake website, and you give it to ChatGPT, and you ask a question, you will receive an answer. Even if the website does not exist, AI hallucinates, it’s a hallucination. And this is very, very dangerous, because if we become dependent on AI, how can we judge whether the AI itself is reliable? And this brings us to the point of, AI explains to us the steps of solving the problem or generating the answer itself. This is what I think.

Moderator – Lucy Hedges:
Yeah, and do you think it’s down to the convoluted and multi-layered nature of all these emerging technologies as to why we’re not really nailing this regulatory framework yet? Kind of to your point, it’s kind of not a one-size-fits-all, but regulation really needs to happen, because the technology’s moving at such a fast pace. We need to make sure that we’re protected in all senses of the word.

Adam Russell:
I’ll keep it short, and I’ll pass it to you. I think regulation should occur for the security of our critical systems, as well as the safety of our users. But we should be wary about how fast we wanna push regulation on top of AI, our large language models. I think right now we’re just in the midst of innovation. Innovation. And if we start pushing regulation too quickly, you’re going to end up in a world where there’s going to be that lack of adoption or lack of ability for a country to ultimately intake that technology. And there’s gonna be that varied approach ultimately. So at least what we’re doing at Oracle is we’re giving enough platform tools for our enterprise users globally to make the decisions on how they wanna regulate the data that they have in their ecosystem, giving them flexibility. Because flexibility’s ultimately the key that we wanna provide.

Moderator – Lucy Hedges:
Yeah, absolutely.

Chante Maurio:
So I mean, I think maybe two comments. One, on what you just said. I think there’s a reason, there’s many reasons, right? There’s many reasons that regulations lag, innovation and adoption. And one of those reasons is because if you put regulations in place, and the ecosystem understands that too soon, then you’ll stop innovation and you’ll stop the technology from reaching its potential and its capability. And so what the overall ecosystem around technology creation and adoption has to think about is what is the right time? What is the right time to introduce framework standards regulations into an environment? And we talk a lot, now to Matt’s comment earlier, we talk a lot about the technology, the technical requirements that are going to exist in regulation. And there’s also this ethical piece that maybe us as technicians don’t think about all the time. I know you do, because I’ve heard you mention it many times in today’s conversation, which is wonderful. But we do need to think about how we are going to manage through, and I know that there’s many big brains thinking around this already, but how do we manage through all of the ethical considerations as well?

Moderator – Lucy Hedges:
Yeah, go on, Ahmed.

Ahmed Al-Isawi:
I think, you know, regulations will not solve everything, because there’s a value dimension that we need to preserve. You know, especially for our kids when they go into an uncontrolled metaverse. It all comes back to us as human beings as having values, cultural values, or religious values, et cetera. Now, how can we, you know, values cannot be regulated by frameworks. It’s something that goes back to the education, parenting, for example. So we have to solve this problem from multi-dimensions, not only from regulations. We can secure servers, no problem about, but how can we secure the human? You know, we always say that humans are the weakest link. I don’t like that, but literally, we have to work on the human element in this ecosystem.

Moderator – Lucy Hedges:
Yeah, so to your point about, you know, education is education that needs to happen, what steps, or what better steps can be taken to educate and train individuals and organizations on cybersecurity best practices, and all the unique threats that come along with all these emerging technologies? What can be done in this department, do you think?

Chante Maurio:
I think one really fortunate thing about new and emerging areas is the people, and the individuals that are attracted to it. So the people that are attracted are people that are individuals that are very curious, lifelong learners, individuals that are constantly seeking knowledge, problem solvers, individuals that like to solve problems, and like to solve puzzles, which leads to the innovation that you talked about earlier. And so that’s wonderful. Those are critical aspects of education, because at the end of the day, we are all responsible for the ownership of our own learning and development journey. And so I think that first and foremost is important to note. Once you have individuals in place, and once individuals are attracted to a particular domain that are knowledge seekers, that are lifelong learners, that are problem solvers, there’s a number of resources available to support these individuals and organizations to better understand cybersecurity, for example. Using that as an example, in the US, the National Institute of Standards and Technologies offers a number of technical documents and standards, such as the NIST Cybersecurity Framework and the Cyber Essentials Toolkit to support it. If we zoom out and we think more from the global environment, you have the Global Cyber Alliance and the Cyber Readiness Institute that offer free cybersecurity training for small businesses, addressing some of that equity conversation that was alluded to earlier. And then testing, certification, inspection organizations like UL Solutions and the International Society of Automation offer trainings, personnel qualifications, certificate programs to support. And then honestly, forums like the one we’re in today, these are fantastic, right? They build communities, they allow us to exchange ideas, and they allow us to learn and grow together. And so these forums are also incredibly important.

Moderator – Lucy Hedges:
Yeah, absolutely, I agree. You guys up to add anything? I think she nailed it. No, we’re nearly out of time, guys. So I’m just gonna, to wrap up our conversation, my final question to you is, how can international cooperation and collaboration, back to this key theme of collaboration that we’ve been talking about, how can this help establish common cybersecurity standards and norms in the face of global challenges that are posed by all these emerging technologies? So final thoughts.

Adam Russell:
I think ultimately, we need to strengthen more and more of our partnerships. I think we’re doing that today through combating child safety on the internet, as an example. I think we’re doing a lot already. We could do even more on stopping ransomware attacks. So we’re setting these standards, but I think it extends beyond just the standard body. We need to start looking at partnerships. A large percentage of private enterprises are hosting a large percentage of our data, such as the Googles of the world, even Oracle as an example. It creates that opportunity for public-private partnership, and I just want to thank GCF and NCA for allowing us to participate, and I’m looking forward to future collaboration.

Moderator – Lucy Hedges:
Yeah, absolutely. This is a brilliant example of that international collaboration that we’re just talking about. Shante, would you like to add anything?

Chante Maurio:
I think I spoke earlier about the importance of frameworks and standards introduced at the right time into an ecosystem, and once they’re introduced, what’s important is that we begin to see baseline standards that kind of come bubble to the top, and we’re beginning to see that, for example, in industrial IoT with IEC 62443, and in consumer IoT with EN 303645, and so adoption, though, of those baseline standards is still somewhat fragmented, and at the same time, when they are adopted, there’s additions and there’s deviations that are created along with them. Those additions and deviations are absolutely necessary. There’s reasons, and there’s a purpose for them. What we need to be able to do is then map those back to the baseline so it’s readily visible how these anomaly requirements actually map back, and so there’s organizations out there that can support that, IEC, ISO, ISA. Those organizations, those standards bodies create great forums for the collaboration and the cooperation to really align around those, and I think that’s going to be critical moving forward.

Moderator – Lucy Hedges:
Yeah, absolutely, and last but not least, Ahmed, let’s hear from you.

Ahmed Al-Isawi:
Yeah, maybe I can contribute to my esteemed panelists here. Like, you know, this is a new domain, but it’s really in close doors, or behind closed doors. It’s not a public domain knowledge. It’s not well supported by open source, so we have very, very limitation in this area, so maybe if it can be supported by open source being also provided to the public so everybody can play with it, can experience it. We reached to this point only after long journey of trial and error. If this advanced technologies, emerging technologies being developed behind closed doors, I understand the potential and the intellectual property behind that, but the real advancement is when everybody can play with it, can practice it, and this will generate knowledge lots faster, more faster than the traditional research.

Moderator – Lucy Hedges:
Yeah, absolutely, and on that note, I’d like to thank my knowledgeable panelists. Thank you so much for taking the time for sharing your insights and expertise. Please, round of applause for these guys. They did an amazing job. Thank you. You know, there’s still, we’re moving in the right direction. So much incredible things are happening in this space, but still a lot of work to be done, but the bonus, the positive takeaway is that we are moving in the right direction, and that can only be a good thing. So, thank you once again, all right, and thank you.

Adam Russell

Speech speed

150 words per minute

Speech length

1196 words

Speech time

477 secs

Ahmed Al-Isawi

Speech speed

145 words per minute

Speech length

1888 words

Speech time

781 secs

Chante Maurio

Speech speed

158 words per minute

Speech length

1310 words

Speech time

496 secs

Moderator – Lucy Hedges

Speech speed

210 words per minute

Speech length

1487 words

Speech time

425 secs