Security Unfiltered

The Future of AI and Cyber Resilience with Mathieu Gorge

January 08, 2024 Joe South Episode 137

Discover the intricate dance of cybersecurity and compliance in a world where geopolitical fragmentation is the stage, and Mathieu Gorge, CEO of VigiTrust, is our guide. With the finesse of a seasoned expert, Mathieu navigates through the complexities of maintaining continuous compliance amidst the shifting sands of cyber threats. We delve into a realm where the distinction between the compromised and the unscathed is paper-thin, exploring the sobering implications for companies ensnared in the webs of intellectual property wars. This is a conversation that goes beyond the surface, revealing the stark realities of operating within contentious digital territories.

Join us as we merge the worlds of business strategy and cybersecurity, showcasing how digital skirmishes often foreshadow physical battles. As we dissect the trends reshaping the security culture within organizations, Mathieu illuminates the surprising involvement of the private sector in managing critical infrastructure. We tackle the personal responsibility that comes with the surge of connected devices and the burgeoning industry of hacking as a service, emphasizing the need for continued education to safeguard our digital footprints. Here, the importance of personal vigilance becomes as clear as day.

Looking ahead, we confront the future of AI security, unearthing the challenges in workforce planning as we anticipate the evolution of skillsets and the rise of AI governance. The conversation then pivots to an examination of the robustness of our critical infrastructure in the face of cyber onslaughts, with a spotlight on Europe's pioneering Digital Operational Resilience Act (DORA). Mathieu leaves us with a treasure trove of insights into ransomware intricacies and the criticality of proactive resilience, steering listeners toward resources that bridge the gap between cybersecurity concerns and corporate leadership.


Speaker 1:

How's it going, Mathieu? It's really good to have you back on the podcast here. It's been, I mean, it's been probably 18 months since you were last on the podcast. I'm really excited, you know, for what we have in store today.

Speaker 2:

Yeah, great. Thank you very much for having me back. Time flies when you're having fun, I guess yeah.

Speaker 1:

Yeah, it doesn't seem like it was that long ago, but it's just one thing after the next, you know, in our lives, where it's traveling and just constant, constant go on different topics.

Speaker 2:

Yeah, indeed, and 2023 has been particularly busy on many, many fronts, everywhere.

Speaker 1:

Yeah, absolutely. So, you know, before we dive into the 2023 recap, why don't you tell my audience who you are, what your expertise is and all that good information? Because maybe there are some new listeners that haven't heard you before, and I just want to make sure that everyone knows who you are and what you provide in the field.

Speaker 2:

Yeah, sure. So my name is Mathieu Gorge. I'm the founder and CEO of VigiTrust. We're a software provider of GRC solutions, and we've got an award-winning solution called VigiOne that allows you to prepare for, validate and manage continuous compliance with about a hundred security frameworks worldwide, specifically anything that has to do with data privacy, information governance and compliance. So, as you can imagine, all the usual suspects: PCI, HIPAA, GDPR, NIST, ISO and so on. I've been in cybersecurity for longer than I care to admit, probably about 25 years. I started when cyber was not called cyber; it was called network security, and then it became content security, internet security, data security, privacy, and now we're in the era of global compliance and global security.

Speaker 2:

I'm involved with a number of security think tanks, including the VigiTrust Global Advisory Board, which is a not-for-profit think tank with about 1,350 members from 30 countries, and we talk about what's happening in the industry under Chatham House rules. One of the things I would say straight off is that my view is that, if you work in security and you do your job correctly, nobody knows your name, but if something goes wrong, you become public enemy number one very quickly within the organization, and it carries a stigma in the industry moving forward. And I think that, as a community, we need to look after each other, and we need to make sure that we share best practices, not just by saying this is what you should do, but also by saying, you know what, this is where I made mistakes, I'm going to share that with you so that you don't have to make my mistakes, and hopefully you will share your mistakes with me so I don't have to make them.

Speaker 1:

Yeah, that's a really good point, and I think that kind of experience is often overlooked. I used to work for a company that was bringing in a new VP of security, and he had recently been at, I think, two or three other places back to back that were breached, and they were big, huge breaches, like the Target breach and a couple other ones, like, I think, Home Depot. It could have been a very unfortunate situation, right? Like this guy just came into the role, they get breached, and it's pinned on him and whatnot. But everyone internally was like, ooh, are we sure we want to hire that person? And the only person that actually stood up for him was the senior director that would be reporting to him, who said, well, why wouldn't we want that experience in house? We haven't been breached; he's gone through it, he knows what happens, what can happen, and how to handle that situation.

Speaker 2:

Yeah, and you know, the reality is that there are only two types of companies out there: the ones that have been breached and the ones that don't know they've been breached. And it's nearly better to understand that you have been breached, so you can better the systems, the processes, the security awareness, the culture of security, so that you can make that an ongoing journey. I always say that security is a journey and not a destination. So by the time you reach compliance with regulation one, two, three or XYZ, your ecosystem has evolved. You know, maybe people have left, maybe you've acquired a business, maybe there's a new system that has been rolled out, and so your risk surface changes all the time.

Speaker 2:

And so when you say that we are secure right now, you might be secure for a millisecond, but everything is dynamic, and so you need to work with people that understand that, and somebody that's already dealt with a breach will most likely understand it better than somebody that hasn't. That doesn't mean that the scales are not equal. I'm just saying that having to deal with a breach from a PR perspective, a technology perspective, a legal perspective, an internal perspective is something that, unless you've lived it, is difficult to grasp. Now, you can get amazing training for it, and you will be much better at dealing with it if you've had the training. But unfortunately, you won't really grasp it until it happens to you, if that makes sense.

Speaker 1:

Yeah, that's a really good point. You bring up that there's two kinds of companies: companies that know they've been breached and ones that don't know they've been breached yet. Right, I do have an interesting question, and it's probably a loaded question. When companies have subsidiaries or, let's say, branches in China or more adversarial countries, do you assume that they're already breached, and that it's more of an internal breach at that point? Does that make sense? Because with China and Russia, just to throw out a couple adversarial countries, how they typically operate is that when you operate in their country, they own all the IP that you create there. So do you see it that way, or not necessarily?

Speaker 2:

So I don't necessarily think that they've been breached or spied on, but what I would say is that you need to understand your ecosystem, and what I mean by your ecosystem is anything that's behind your firewalls, your hybrid workforce, your applications, your third parties, fourth parties, anybody that interacts with your systems, even from time to time, even sporadically. And if some of those subsidiaries or branches or people are based in a country that is at risk, then you need to run a tabletop exercise as to what it would mean if you could no longer get to that data, if you could no longer get the physical assets, the hardware, for instance, back into your own country, if you could no longer talk to the local regulator. Because what might happen, and that happened specifically with Russia and Ukraine, is that from one day to the next it suddenly became super difficult to get your data, and even if you have a backup of your data, because of nationalization of Western assets in Russia, for instance, you will never get that data back, and you can be sure that at that stage that data is going to be analyzed. There are currently about 10-plus really bad conflicts worldwide, and of course you hear about Russia and Ukraine, and you hear about Israel and Gaza, but there are a few others that don't really make the headlines the same way. You need to map out where you do business and the impact on your business, and I think that's where the boards are really starting to wake up, suddenly, in 2023 and into 2024: you can't just assume that because you do business in a country right now, and it's all solid, there are the right policies and the right backups and so on, that you don't have to actually plan for the worst. And I believe that, as humans, we are generally optimists, sometimes too much, and it's a case of understanding: what am I ready to lose? Have I trained for that? Because, as I said, if you've trained for it, even if you haven't really experienced it, if you've done a tabletop exercise as to, oh, from tomorrow onwards we can no longer do business in Taiwan, you know what that means and you've prepared for it. And that leads me to the World Economic Forum's Global Cybersecurity Outlook 2023 report, where, essentially, they list all of the top risks that organizations need to deal with, and the first risk is not the advent of AI, it's not the rise in security breaches, it's geopolitical fragmentation. So, in other words, what's happening in countries where you may or may not do business will actually impact your business.

Speaker 2:

Again, I go back to Russia invading Ukraine. You can see a huge rise in ransomware attacks coming from Russia into countries that are openly supporting Ukraine. It's reasonably well documented that there were a number of attacks on Ukrainian critical infrastructure assets in the nine months leading up to the physical attack, and then that kind of dropped off about two weeks before the physical attack, and now it's back up. And so, as an industry and as threat intelligence teams, we are monitoring the countries that are getting the most attacks right now, because it could be, it's not guaranteed, but it could be, a sign of physical attacks to come.

Speaker 2:

And I think that you know I'm not telling anyone to forget about privacy and so on, absolutely not. You need to continue working on that. But I do think that right now is a good time to go back to basics and to say what is my ecosystem? What am I protecting? What am I willing to lose in 2024? What can I absolutely not afford to lose in 2024? And that will drive your threat intelligence and your protection strategy into the new year, I guess.

Speaker 1:

Yeah, it makes a lot of sense. You know, I actually took part in a tabletop exercise before, and those are extremely good at identifying the areas of improvement. It's really interesting. They'll come up with a different scenario and you have to work through it, and everyone on the call has a role. I've seen it from both ends: one where everyone knew what they were doing, and then the other side where no one really knew what they were doing, and in that tabletop exercise the company was breached for an entire week before security even knew about it. It's a really good tool that organizations should, and typically do, use to identify those gaps, and it's really important to shore them up once you identify them. Now, looking back at 2023, what were some of your top items, I guess, that happened in 2023 that you think may be setting the stage for 2024?

Speaker 2:

Well, from a technical perspective, the rise of ransomware attacks, and the scaling in the number of attacks against CEOs and the C-suite from a social engineering perspective; that was extremely visible. What we saw as well is a number of key executives being prosecuted and, in very limited cases, being jailed for not doing the right thing with regards to privacy, and that can be a game changer. We saw a number of new regulations coming out in specific areas, and we saw obviously the advent of NIS2 in Europe with regards to critical infrastructure protection. We saw a number of new data privacy regulations in, I think, about six states in the US, which is good, but they're not all exactly going in the same direction. So I think we're unfortunately still a long way away from a federal equivalent of GDPR. We obviously saw the Ukrainian-Russian conflict going on and the impact of that.

Speaker 2:

We now have the conflict between Israel and Gaza and, ironically, a lot of cybersecurity funding comes out of Israel every year, not just to the US, but also to Europe and to Asia, and that funding is probably going to slow down, meaning less money to invest in cyber, also meaning more attacks on Israel.

Speaker 2:

Also meaning potentially another equivalent of shadow IT coming out of Gaza and people supporting Gaza.

Speaker 2:

So it's a very dynamic environment. But we also saw a rise in that idea of security culture, and that is mentioned as well in the World Economic Forum report, where we see more and more business people trying to engage with security and compliance people to understand what they can and cannot do, and for them to work together. And then we go back to what we were saying at the beginning, that if you work in security, nobody wants to talk to you, and, generally speaking, it's because you're either telling the business, no, you can't do this because you're going to put us outside of compliance or you're going to increase our risk surface beyond what we can accept, or you're going to the board and saying, hey, the business wants me to do that, can I get another million dollars to make it happen securely? So it's a difficult one. But now we're seeing that trend where more and more business people are talking to security and compliance, and we're going to see a bit more of that in 2024 and into the next two to three years.

Speaker 1:

Yeah, there's a lot to unpack there. One of the things that you brought up previously, and that I have talked about, is the fact that now we're seeing a lot of digital attacks or cyber warfare attacks before kinetic attacks ever take place. Do you see that ramping up at all? Because I feel like there should almost be, you know, a watch group that is saying, oh, we're seeing an increase in specialized attacks in Europe or wherever it might be, and kind of putting out a watch on that. Because I feel like everyone in cybersecurity is aware of that, they understand it and they know the implications of it. But it's much more difficult to get people outside of cybersecurity to fully grasp the concept of, oh, they're going to take down my phone network before they send troops in, right?

Speaker 2:

Well, so you know, the issue with critical infrastructure assets is that as citizens, as Joe Public, we believe that this is the responsibility of the government, and what we do not understand is that, depending on the survey you look at, but generally speaking, between 70 and 80% of critical infrastructure assets, like electricity, power, food, transportation and so on, are actually owned and operated by the commercial sector, by private companies, and the part that is actually managed purely by the government is, generally speaking, only the army and the police systems. Because even hospitals, and specifically in the US, you know, half of the hospital systems are actually private systems; not so much in Europe, but parts are still private. And so what you want to do is bring up the awareness level with Joe Public that everything starts with them. Which actually leads me on to, I suppose, another issue here, and we are seeing this more and more over the last few years: that concept of your own critical infrastructure. So right now, most of you have three, four connected devices on you, you know, a smartwatch and maybe a personal cell phone, a business cell phone and an iPad or whatever, and that's before you even get into your car, which is completely connected, and then you get to your house and so on. And so if I educate you, either as the industry and/or the government, as to the value of that, I can say, well, if you take care and if you are careful, nobody's going to be able to drive by and order whatever they want by hacking into your phone that is linked to your fridge, which has a system that allows you to connect to Walmart or wherever to replenish everything, so that now I can buy different things and get them sent to my home instead of yours. And that is the problem, but it's not life-threatening. But let's say I hack into your HVAC system, your air conditioning system. It depends on where you live, but if you live in Michigan in the middle of summer and you can't get cool air, or in the middle of winter and you can't get heating, that will become critical. And so I think what we need to do is this: if people do that at home, they're more likely to pay attention at work, and vice versa. So it needs to be a continuous cycle of educating them on both sides.

Speaker 2:

I do think that, again, it goes back to that idea of your risk surface. So my risk surface before, when I was out walking, was just social engineering: my watch was not connected, I didn't have a cell phone, or my cell phone was so dumb that you couldn't even hack into it. Now I'm like a walking attack surface, and everywhere I go it keeps growing. So I need to train people to understand: hey, do I really need that connected wallet? Do I really need this? Do I really need that? Do the benefits outweigh the risks?

Speaker 2:

So if I have a connected wallet and I lose it, I can connect to it. That's great, I can see where it is. But if, for some reason, there are only default settings on it, I might be able to connect to your wallet, and then, once you're home, I use the wallet to piggyback onto your computers, and then to piggyback onto your VPNs that go from your computer to your workplace. You can see where I'm going, and it's not that far-fetched, to be honest. I mean, I'm not that technical and I think I could do a demonstration reasonably easily. Not that I would do that, by the way.

Speaker 1:

Yeah, it's actually a lot easier than what people would assume, in my opinion. You know, I'm not a hacker by any means, and I could absolutely pull something off like that, especially in 2023, where these exploits and packages are kind of already pre-made and you just kind of find the right one and get in.

Speaker 2:

You raised a good point. You know, 2023 has seen a huge increase in hacking as a service, where you go to the deep web, and it's not even digging too deep, and you can buy a kit where you create your own ransomware or your own DDoS, and you literally configure it the same way you configure your iPhone. And you know, for some of them, they actually have a customer service line where they provide better customer service than normal companies. And so I think that the level of skill that an attacker needs to have keeps going down whilst the attack surface keeps going up, and you can see that's creating a huge vacuum. As an industry, we need to work together, and I applaud all of the work that's been done in 2023 around teaching kids how to code, teaching them cybersecurity, or the sense of security, from primary school up to college, because if we don't catch them now, they're gonna be our next security people or our next head of IT or head of database in like five years or 10 years, and they're just gonna be walking targets with my name on them, you know, at the back, my company name. And so I don't want that to happen. So not only do I have a duty, but there's definitely something in it for me to do that. Which actually leads me on to another point.

Speaker 2:

I'm in the process of writing my second book around the life of CSOs, but not, generally speaking, around the certifications that they have. I asked them all the same 15 questions in the same order, about work-life balance and the threats that they see out there. And one very interesting question that I asked them is: do you think we are creating the right succession plan for when you get out of the industry? Either you're gonna go do something else, or, you know, some of you have been in IT or in cyber for 20 years and maybe you're gonna want to retire. How do we extract the level of experience that you have so that we can document it and pass it on to new people, and are we actually creating people with the right skills? Is the curriculum out there too outdated for the new threats? And I see a divide. So I set out to do a hundred interviews. I'm about three quarters of the way into it right now, but I see a divide.

Speaker 2:

Some people say, no, actually we are doing the right thing. Others are saying, I don't think we are, and I mostly don't think that we have the ability to pass on our knowledge, which I think is an interesting point. Because if you think about it, you know, people that became network security managers in and around the early 2000s would have had maybe five to 10 years' experience in IT already. So these people are all coming up to retirement in the next five to 10 years. So we're gonna have that cliff of skills going down. I'm not saying that new people don't have skills.

Speaker 2:

Some of them absolutely do, and in fact they're probably faster at some things than those of us who've been around longer. You know, we take time to process, but we understand the value of process, whereas younger generations want everything faster, because they never grew up with the idea of, you know, waiting for a file to download. That's unknown to them. So why would you wait five days to do the right thing, to find out where the breach came from? You have a hunch, you go after it, and in going after it quickly you actually destroy all the legal evidence that an older person would have found within 10 days but would have been able to use. I think we have a bit of a challenge there in the next five to 10 years.

Speaker 1:

Yeah, everyone always talks about the talent shortage or the talent gap, and not a lot of people are bringing up the fact that a lot of the people that are in leadership roles, or have been very experienced in their jobs for the past 10, 15 years, are all retiring fairly soon. You know, I actually got brought on at a company to replace someone as their security expert who had been at the company for 25-plus years. They were retiring in the next six or nine months, something like that, and that knowledge dump that we had to go through, I mean, it was every day for nine months. Why did you make this choice? What was this situation? Who did you work with on this? Who do you trust within the organization? All of those sorts of things, you know.

Speaker 1:

And now, this company that I came and worked for, they had a very forward-thinking view. They were very good at thinking ahead, planning ahead, and so things like that were always on their roadmap: who's retiring when, what skill sets do we have to pick up, what skill sets do we need to augment and replace, and things like that. But not every organization is thinking like that. That's a huge challenge that's gonna be coming up very shortly. It's almost like a different problem from the talent shortage that we already have.

Speaker 2:

Yeah, and I think, you know, there are a lot of talented people that are coming onto the market. That's not exactly the problem. The problem is: did we give them, as an industry, the right pointers so that they can either learn what we need right now or have a basis that's good enough that they can be molded into what we need? Because obviously there's no point in creating an expert in forensics if we have enough people in forensics. But equally, if you know forensics really well, you'll be able to add value in incident response, in purple teaming and so on, so you'll be able to reshape your knowledge. But I think the worry is more about whether we are creating people that have too narrow a scope, where that scope is valid today but may not be valid tomorrow, and will they manage to retrain? I'll give you an example, very topical.

Speaker 2:

Let's go back to 2018 for just one second. In 2018, GDPR was enacted, and so, overnight, millions of people were GDPR experts. They just added that to their LinkedIn or their resumes or whatever. Today, everybody is an AI expert and, more worryingly, a lot of people are AI security experts. So that's great. At least there's an interest.

Speaker 2:

But the challenge is not really just in AI security, as in securing the code and securing the LLMs and so on, because there's emerging technology on that.

Speaker 2:

It's about AI governance, and there are very few real AI governance courses out there that allow you to grasp the real risks and the way to embrace AI in a manner that allows you to govern the process and deal with issues. And so what I wouldn't want to see is tens of thousands of AI cyber experts being born over the next 12 months who are actually not trained the right way and actually add no value, but who think that they're going to be able to get jobs, because they're probably not going to be able to get the jobs they want, and they may actually not add value, or not add as much value.

Speaker 2:

So I think it's really important for us, as the industry, to work with third-level universities, to go and do guest lectures. I do guest lectures for various universities. It allows me to keep my finger on the pulse, to understand how younger people think, what they want to learn, the questions that they ask and so on, as opposed to saying oh, the next big thing is AI risk management. Maybe it is, maybe it isn't. I mean, the thing with AI is we don't exactly know as an industry, and anybody that tells you they know, take it with a pinch of salt, because it's such a fast-moving target that we don't exactly know just yet.

Speaker 1:

Yeah, that's a really good point that you bring up. It's a lot easier to add these key terms to your LinkedIn or to your resume than it is to actually build the skills that are needed to fulfill that AI security title. I feel like the only people they're harming is themselves. Because they get a job, maybe they fool someone at the job because they know a little bit more than the person interviewing them does, so they get the job, and then they fail at it. It's just one failure after the next, and they're constantly trying to play catch-up, especially in an advanced area like AI security, which really isn't even defined right now. One, we don't know where AI is going. Two, AI security is something that we're just starting to talk about now.

Speaker 2:

Right, and I think it's great to have an interest in AI. It's great to understand ChatGPT, but AI is not ChatGPT. It's way bigger than that. I think that right now there's good expertise in the market around the data that you can feed AI and the risks that you take, and how to mitigate those risks and how to classify the data and maybe have a filter and train people and so on. But in terms of the full architecture of AI, the coding that goes into the AI, the AI coding that goes into your standard code, and how to keep track of that and actually manage that process, it's still early days.

Speaker 2:

Now, that said, in the last two years there have been about 35 new AI-related regulations and standards that came out. There's been stuff like, for instance, the EU AI Act. There's been other things coming out from the industry, and it reminds me of the beginning of the cybersecurity industry where, believe it or not, back in 2005, there were a lot of industry standards that came out. Some of them were driven by vendors, some of them were driven by associations and so on, and we're seeing that right now. But you have to remember that today, according to the UCF, the Unified Compliance Framework, there are about four and a half thousand regulations around privacy, data and security, but the reality is they all dial back to about 20, and when you look at those 20, they really dial back to ISO, NIST, CIS, GDPR, potentially PCI as a restricted one, and a few on the software security side. So it's very likely that we will have the same with regards to AI. So I would keep a watch on that if I were interested in working in risk management for AI.
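To make that "dial back" idea concrete, here is a minimal sketch of how a GRC tool can treat many regulations as pointers into a much smaller set of core control families, so evidence collected once can satisfy several mandates. The specific mappings below are illustrative assumptions, not the UCF's actual crosswalk or VigiTrust's implementation.

```python
# Illustrative sketch only: mapping many regulations onto a small set of core
# control families, so one piece of evidence can satisfy several mandates.
# The mappings below are made up for illustration; real crosswalks (e.g. the
# UCF's) are far more detailed.

CORE_CONTROLS = {
    "access_control":  ["ISO 27001 A.9", "NIST 800-53 AC"],
    "encryption":      ["ISO 27001 A.10", "NIST 800-53 SC", "PCI DSS Req 3"],
    "breach_response": ["NIST 800-53 IR", "GDPR Art. 33"],
}

# Each regulation "dials back" to a subset of the core control families.
REGULATION_MAP = {
    "HIPAA Security Rule": {"access_control", "encryption", "breach_response"},
    "CCPA":                {"access_control", "breach_response"},
    "PCI DSS":             {"access_control", "encryption"},
}

def shared_controls(reg_a: str, reg_b: str) -> set[str]:
    """Core control families two regulations have in common."""
    return REGULATION_MAP[reg_a] & REGULATION_MAP[reg_b]

if __name__ == "__main__":
    overlap = shared_controls("HIPAA Security Rule", "PCI DSS")
    print("Assess once, reuse for both audits:", sorted(overlap))
```

The point is purely structural: the intersection of two regulations' control sets is what you only need to assess once, which is what makes continuous compliance against a hundred frameworks tractable.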

Speaker 1:

So what do you think are some key areas that are going to be really booming, that people need to pay attention to in 2024?

Speaker 2:

I definitely think we're going to see some attacks on personal infrastructure. So there are already vendors coming out with ways to help you secure your infrastructure at home, all of your stuff that's connected. I think we're going to continue to see ransomware. I have absolutely no doubt there are going to be a few new zero-day attacks every year; that's what happens. We are seeing, as always, attacks on government, but mostly on financial institutions. It's also interesting to see what's happening in the UK with regards to PSD3 and everything that has to do with authentication, strong authentication and identification, so I would suspect there's going to be continued investment in that. I think we're also going to see ridiculous things being connected. I heard that example the other day of a vacuum cleaner, completely connected, that actually goes and vacuums on a regular basis but also maps out your property. So now you know that Mathieu has a two-bedroom or three-bedroom apartment on one floor or two floors. Can you imagine where this is going?

Speaker 2:

I do believe that a number of attacks are going to be automated, but I also believe that a number of counterattacks are going to be automated using AI. That's the good side of AI. That's good because, whilst a system is able to deal with the noise, the actual analysts can deal with the real attacks, or the attacks that require more thinking. We are going to see more regulation, of course. We're going to see some new AI regulation.
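As a rough illustration of that division of labour between automation and analysts, here is a minimal triage sketch; the alert fields, scores and thresholds are hypothetical rather than taken from any particular SIEM or detection product.

```python
# Illustrative sketch only: automation clears low-risk noise so analysts see
# the alerts that need human judgement. Fields, scores and thresholds are
# hypothetical, not from any real SIEM or detection product.

from dataclasses import dataclass

@dataclass
class Alert:
    source: str
    description: str
    risk_score: float  # 0.0 (benign) .. 1.0 (critical), e.g. from a model

AUTO_CLOSE_THRESHOLD = 0.3  # below this, treat as noise
ESCALATE_THRESHOLD = 0.7    # at or above this, page an analyst immediately

def triage(alerts: list[Alert]) -> dict[str, list[Alert]]:
    """Split alerts into auto-closed noise, an analyst queue, and escalations."""
    buckets = {"auto_closed": [], "analyst_queue": [], "escalated": []}
    for alert in alerts:
        if alert.risk_score < AUTO_CLOSE_THRESHOLD:
            buckets["auto_closed"].append(alert)
        elif alert.risk_score >= ESCALATE_THRESHOLD:
            buckets["escalated"].append(alert)
        else:
            buckets["analyst_queue"].append(alert)
    return buckets

if __name__ == "__main__":
    demo = [
        Alert("edr", "Known-good admin script executed", 0.10),
        Alert("waf", "SQL injection pattern from new IP", 0.85),
        Alert("idp", "Impossible-travel login", 0.55),
    ]
    for bucket, items in triage(demo).items():
        print(bucket, [a.description for a in items])
```

Everything above the threshold still reaches a human; the automation only absorbs the volume.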

Speaker 2:

We're going to see some updates to EU GDPR. There's a chance that the UK is going to lose its adequacy, because UK GDPR is currently recognized as being equivalent to European GDPR, but the ICO, the Information Commissioner's Office, has already taken steps to go in a different direction than the European Data Protection Board. So if they lose their adequacy, that will mean that, from an EU perspective, transferring data to the UK will be the same as transferring it to the US or Mexico or Australia, and you can see the evolution of that. So are we going to see a digital Pearl Harbor, like we all think might happen at some stage? I don't know that 2024 is the right year for that, but I do believe that the geopolitical fragmentation is not going to go away and we're just going to have to learn how to deal with it. So I wouldn't be surprised if people offering red teaming and purple teaming and tabletop exercises make a fortune in 2024. And it probably wouldn't be a bad thing for the industry.

Speaker 1:

So what are some areas where our current AI policy and governance is lacking? Because I feel like this field is advancing pretty rapidly and, per usual, the governance of it and the policy behind it are lagging behind. So what are some areas where we need to pick up the pace?

Speaker 2:

Well, there are a number of best practices and checklists that are available. So the IAPP, the International Association of Privacy Professionals, came out this year with a very good document that has, I think, about 65 keywords and key topics that you need to look at in your AI initiatives from a technical and a policy and a training perspective. So you're going to see more of that. There are, as I said, a number of vendors coming out with interesting technology about how to make sure that whatever you do using AI doesn't actually impact the generic code of your software. So I think we're going to see some more of that. I think what we need is something like an OWASP Top 10 and a SANS Top 25 for AI, and it's coming. I think there are a few out there that are just industry-driven, but there are going to be some more. I would urge people to try and grasp the idea of AI governance. There are some very good AI governance forums coming out right now. I spend a lot of time attending those events, and I'm fascinated by the convergence of the two, cybersecurity and AI, trying to meet somewhere in the middle. It's an interesting thing to watch, because cybersecurity, at this stage, is very much: there's a risk or there isn't a risk, we can mitigate the risk or we can't, because we understand it reasonably well; but we don't really understand AI. I think another thing to keep in mind is, if you're familiar with the Cloud Security Alliance, the CSA, they are basically saying that protecting your AI systems and infrastructure will follow a similar trajectory to what we've learned about the cloud.

Speaker 2:

Initially, everybody was saying, well, I'm not moving to the cloud, it's too dangerous, I don't know what's there. Then, eventually, you have no choice but to move some critical elements of what you do to the cloud. But now there's good practice, there are ways to protect it, there's continuous compliance. I think the CSA says that AI is going to follow a similar path, and they may well be right on that. We're not going to be able to not embrace AI, but we need the right structure that organizations can use. If you think about it, very small organizations or mid-sized organizations are well able to embrace the cloud now because there's so much expertise out there. That's where we need to get to with AI, or at least with mainstream AI. I'm not talking about Terminator and that type of stuff. I'm talking about what we're trying to do right now, which is to use AI to automate the mundane and other tasks where our time would be better used doing something else.

Speaker 1:

Yeah, that makes a lot of sense. You brought up the prospect of a potential cyber Pearl Harbor, or something like that. From my perspective, when I think about that, I'm thinking of an attack that is very large in scale, that changes the world forever in a very tangible way. Is that how you see it? What do you think it would take for something like that to happen, like a power grid going down for a month? What does that look like to you?

Speaker 2:

Pretty dark, actually, excuse the pun, but it could happen depending on where you're based. I get up in the morning and I'm happy to be alive, and I'm happy that I have electricity and I have water and so on, and I don't want this to change. And so I believe that we've also gone from just pure critical infrastructure protection to critical infrastructure resilience. You look at DORA, for instance, for the banking industry in Europe, the Digital Operational Resilience Act, if I got that right. But anyway, DORA is all about making sure your critical systems are resilient. So will they go down for a day? No problem, it'll be a pain, but it's okay. For a week? It'll be a major pain, but it'll be okay. For a month? That will have societal effects, that will have issues with, potentially, after a while, riots and social unrest and so on, and so we can't really afford that. So it's interesting, that idea.

Speaker 2:

You know, we all understand now that we need to protect the critical infrastructure of cities, of nations. Now we need to understand that we need to protect our own critical infrastructure, because it's a backdoor to the rest. But we need to talk about resilience, right? How do I make my way of living resilient? How do I make my way of doing business resilient? If my e-commerce site goes down, am I out of business? Am I 50% out of business, and for how long? And how long can I sustain that?
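To make that e-commerce resilience question concrete, here is a minimal sketch of a health-check-driven failover that deliberately falls back to a secondary site in a different location; the URLs and region names are placeholders, and a real deployment would normally do this at the DNS or load-balancer layer rather than in application code.

```python
# Illustrative sketch only: health-check the primary region and fail over to a
# secondary in a *different* location. URLs and region names are placeholders;
# a production setup would normally use DNS, anycast or a load balancer
# rather than application code.

import urllib.error
import urllib.request

ENDPOINTS = [
    ("eu-west (primary)",   "https://shop-eu.example.com/healthz"),
    ("us-east (secondary)", "https://shop-us.example.com/healthz"),
]

def is_healthy(url: str, timeout: float = 3.0) -> bool:
    """True if the endpoint answers HTTP 200 within the timeout."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, TimeoutError):
        return False

def pick_active_region() -> str:
    """Return the first healthy region, or flag that incident response is needed."""
    for region, url in ENDPOINTS:
        if is_healthy(url):
            return region
    return "no healthy region: invoke incident response / static fallback page"

if __name__ == "__main__":
    print("Serving traffic from:", pick_active_region())
```

The design point is that the secondary has to live somewhere else: failing over to the same location protects you against nothing that takes the whole site down.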

Speaker 2:

And, by the way, that is why some organizations decide to pay ransoms. You should never pay a ransom by default, because if you pay it, you may not get the right information back, or the key back; it may not work. But also, you advertise yourself as somebody who's going to pay, so you're going to remain a target, right? But the reality is, some companies are like, ah, do you know what, in the grand scheme of things we're better off paying. But with critical assets, you can't always think like that, you know. So I think we need to move towards resilience. We've spent enough years developing good risk assessment methodologies and looking at all of that. Now we need to get to the next level: how do I make this a continuous, proactive thing, and how do I make my ecosystem resilient, and my staff resilient, and myself resilient?

Speaker 1:

Hmm, yeah, you bring up a really interesting point, and that is something that I myself see as being often overlooked: the resilience factor of deploying this revenue-generating application that is generating, I don't know, a million dollars a day. Well, what happens if that web app goes down? Do we have HA set up? Is it failing over to the same location? Because if it's failing over to the same location, it's probably not a good idea. All of these things are often overlooked or put on the back burner as something we'll get to eventually. Well, in the meantime, while eventually is coming, you can have an attack that takes it down completely, and it's like, oh, that thing that we said we were going to get to eventually never came, because it was already at risk. Well, you know, Mathieu, we're coming to the end of our time here, unfortunately, but before I let you go, how about you tell my audience where they can find you and where they can find VigiTrust if they want to learn more?

Speaker 2:

Yeah, sure. So, first of all, thanks again for the opportunity to talk to you today. You can find information about VigiTrust at vigitrust.com, that's V-I-G-I-T-R-U-S-T dot com, and you can find information about myself at MathieuGorge.com, in one word. I've also published a book called The Cyber Elephant in the Boardroom, published by ForbesBooks and a best seller on Amazon; you'll find it on Amazon, and it's all about translating cyber risk into business risk, primarily for non-technical people. And, of course, I'm very easy to find on LinkedIn, and I actually love networking. I love meeting people from the industry. There is not a day that I don't learn something new about cyber, and I've been at it for 25 years. It's a great industry that way.

Speaker 1:

Awesome. Well, thanks, Mathieu. I really appreciate you coming on, and I hope everyone listening enjoyed this episode. See you, everyone.

Geopolitical Fragmentation and Cybersecurity Impact
Cybersecurity Trends and Succession Planning
Challenges and Future of AI Security
The Importance of Critical Infrastructure Resilience