Security Unfiltered

Unlocking the Secrets of Effective Insider Threat Management With Joe Payne From Code42

March 25, 2024 Joe South Episode 148

From the fizz of soda pop to the buzz of cybersecurity – witness Joe's extraordinary journey through the tech landscape. Our latest podcast episode takes you through a narrative of unexpected career leaps and the critical role of intelligence in the digital age. Joe, a former soft drink enthusiast turned cybersecurity guru, shares tales from his early tech escapades to his pivotal role at iDefense, where his collaboration with government agencies like the NSA painted a picture of a complex and ever-evolving cyber battleground.

Unravel the twisted tale of Anthony Levandowski's controversial departure from Google, highlighting the precarious nature of intellectual property and insider threats. This episode peels back the layers of corporate espionage, discussing how companies like Code42 are at the vanguard of detecting unsanctioned data movement. We tackle the ethical quandaries posed by the race for innovation, where the lines between ownership and fair competition often blur. The discussion showcases the intricate work of safeguarding expansive AI datasets and the careful balance of advancing technology while maintaining integrity.

Navigating the murky waters of data security, we dissect the nuanced approach required for managing insider threats. Balancing an investigative eye with the necessity for swift action, our conversation sheds light on how companies toe the line between education and enforcement. As we discuss the limitations of traditional DLP systems and the emerging prominence of cloud monitoring, you'll gain insight into how organizations are adapting to the challenges of ensuring data remains in the right hands – all while contending with the unforeseen risks AI platforms like ChatGPT might present to data security. Join us for an exploration into the strategies that fortify companies against the tides of insider threats.

Support the Show.

Affiliate Links:
NordVPN: https://go.nordvpn.net/aff_c?offer_id=15&aff_id=87753&url_id=902


Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today

Speaker 1:

How's it going? Joe, it's great to finally get you on the podcast. I feel like we've been trying to get this thing scheduled for quite some time now, but with our schedules being so hectic, I'm glad that we could finally get on.

Speaker 2:

I think you and I spend a lot of time with the hairdresser, and so getting our appointments all lined up around the podcast, that was probably the difficulty here. Yeah.

Speaker 1:

Yeah, and this head of hair, you know, it doesn't maintain itself.

Speaker 2:

Yeah.

Speaker 1:

So, Joe, why don't we start with your background, right? How you got into IT, how you got into the security side, what made you want to go down this route? Was there something that you maybe discovered earlier on in your life that took you down this path, when you look back on it? What was that like?

Speaker 2:

My story is a little different from probably a lot of your guests', because my first job out of school was actually in the soft drink business. I was working for Coca-Cola in marketing and I was having a lot of fun: soft drinks, commercials, all kinds of crazy stuff. But I remember at the end of one year, we had a great year and we grew 6%, and I was like, hmm, maybe a different industry grows a little faster than beverages. I was always interested in tech, and this was the mid-90s, and tech wasn't mainstream back then, if you think about it. There was no social media. There was very little internet. Very few people had email addresses. When I first joined tech, and I'll tell you a funny story about myself, my first email address was Joe and Karrosh, which is my wife's name, at AOL.com. At the time we thought, well, every family would just have one email address, like you have one physical address, and so we had a shared email address. So that's how far back I go in tech.

Speaker 2:

But then I got interested in security in the early 2000s. It was an emerging space, an exciting space, and I was the CEO of a company called eSecurity that made the first SIEM products way back in the day. And what really turned me on to security was when I worked at iDefense. At iDefense we were doing threat intel, and our customers were all of the three-letter agencies in the government and all the largest banks in the world, and at the time I also got a security clearance. And when you get connected to the three-letter agencies and you're looking at all of the things that are happening in cybersecurity, it's incredibly exciting. And iDefense was a lot of fun because we had white hat hackers. We were tearing apart software around the globe with our white hat team, just trying to get ahead of the bad guys. So it was a lot of fun back then, and I've been interested in security ever since.

Speaker 1:

Yeah, it's interesting working with the government. Earlier on in my career I worked for a very small company that did Enhanced 911 software, and the government was a huge consumer of our product, and somehow I finagled my way into leading the entire tech side for our government clients. It really does open up your mind to what's possible. And I feel like it's different for contractors than for government employees, because the government employees are typically very siloed, and as a contractor you can come in and say, no, I need to know what's going on over here. I know you normally don't tell people, but this impacts how I provide this product, and so I need to know these different pieces. And so you learn probably way more than what you should actually learn, in some cases.

Speaker 1:

And it's always interesting, because I was going to a facility for maybe a year or two and I never knew what they did. I mean, you're in the middle of the mountains in West Virginia, nothing else is around; a town was built around this thing to support contractors coming in, and I had no clue what we were doing. And one day, I guess, my handler, that's what I called him, my handler at the facility, he was like, Joe, do you not know what we do here? I said, I have no clue. I literally have no clue. And he pointed to these giant satellite dishes on the premises and told me one of the small things that they did. And I'm like, my God, where am I?

Speaker 2:

Well, Joe, you just hit on a couple of great points. First, I'll say my area of expertise now in cyber is insider threat and insider risk, and you just identified a couple of things about why it's so important. Because, to your point, contractors in particular get a lot of knowledge around things, and so contractors end up being the biggest risks on the federal side. But the other side, I think, is just as important to touch on, and I often have to talk to the people on my teams about this, and that is that the United States government is, in my opinion, the best in the world in cyber. I mean the best in the world.

Speaker 2:

What's interesting about that is that when most people think of government employees, they think of people that work in agencies and the stereotypes: oh, these government employees, they don't work that hard, they just shuffle along, it takes a long time to get the government to do anything, all these stories, et cetera. But what I learned in working with our intel community and our law enforcement community around cyber is that we literally have the best people in the world: at the NSA in particular, at the FBI, a little bit at the CIA, in the military. We have the best people in the world in cyber; nobody's better than us. And that level of understanding and respect is really important for all of us that work in the cyber community, so I'm glad you brought that up. It's something that I think we can be really proud of, and it's something, honestly, that keeps us very safe as a country.

Speaker 1:

Yeah, it's very interesting that you bring that up, because sometimes, from the outsider perspective, and I'm talking about people that aren't even in cybersecurity, it looks to them like we're so far behind everyone else, like we're not capable, like there's too much bureaucracy. And I will say this: there probably is too much bureaucracy. But at the same time, I know the caliber of our security professionals, and we need those rules, because if we didn't have those rules and you just pointed us towards China and said, hey, there's the power grid, go have fun, yeah, it's going down, and they wouldn't be able to recover from it. But that's the kind of capability you want to tie into oversight and really control, right?

Speaker 1:

And it's fascinating that you bring up that we have that talent pool there, because I actually had someone on, maybe two years ago at this point, and the episode was never released because the military asked me not to release it, and I do not want the United States government showing up at my door for any reason. He talked about how the training that he was going through was a two-year program for one of the three-letter agencies. It was a two-year program, and he said, literally every single day, at any point in time, you can be cut. If you write an inefficient line of code, if you do something that you shouldn't have done, if you write it in the wrong language, whatever it might be, you can be cut right there. Those are the standards.

Speaker 1:

And he said only a very small number of people actually make it through, and you typically have like a six-to-12-month ramp-up time before you even go to the school. And those standards come into play in the real world, because now you know what the bar is, you know what you're expected to do and what level you're able to perform at, day in and day out. I've always felt like we've had the best cyber capabilities, for sure. Just look at the things that have leaked, like EternalBlue, and how many zero days came out of EternalBlue, right? And that's 10 years ago at this point.

Speaker 2:

Well, it's interesting. First of all, I will tell you that in working with the NSA, I interacted as much with the lawyers at the NSA as with the cyber experts. That's how careful they are at following our laws. They're not doing things outside the law; they're very, very, very careful. But you mentioned data sharing and things like that.

Speaker 2:

All the leaks we've had in the US have really happened because the intelligence community created tools to share intelligence broadly, and there are over a million people that have security clearances and access to some of our nation's most sensitive information. The Jack Teixeira leak recently happened because it just takes one of those million people to use that information the wrong way. So the price that we're paying right now with those leaks is that we give a lot of access to a lot of people because we want to share that data, and one bad actor out of a million can cause that damage. So it is an interesting balance.

Speaker 2:

I know there's a big back-and-forth in the intel community about that at all times, but honestly, I don't want to dwell on it. I just point that out for your listeners: if you're young and you're not sure about these things, just know that the people protecting us are as serious and as good as anyone in the world. That's good to know, and it's also important to respect.

Speaker 1:

Yeah, absolutely. To dive into the insider threat a little bit more, do you think the Snowden leaks kind of set the stage for all the other leaks to follow? How big was that, how meaningful was that, from a quasi-insider, expert-knowledge perspective? You probably have a little bit more knowledge on that leak than most people, right? What was that like?

Speaker 2:

That was a devastating leak. I mean, that was a devastating leak for the United States. Some people lost their lives over that leak, because there was so much information that ended up in hands it shouldn't have ended up in. And I think it was also devastating for Booz Allen and their reputation, because he was a Booz Allen contractor, and it was so basic. I mean, all external drives were supposed to have been blocked, and they just hadn't gotten to that facility in Hawaii to put those controls in place. So it was easily preventable; if they had kept with policy, it would have been prevented. Look, here's the thing about insider threat and insider leaks: we're always going to have them, and so you have to really put controls in place. But, back to this conversation about sharing data, you've got to balance that with the need to share and collaborate. And if you step outside the government for just a second, we can go into corporate, and probably most of your listeners are corporate.

Speaker 2:

I think what surprises people is how pervasively data is moving out of organizations today. We all might say, oh, how could so-and-so be so stupid? But it's happening in your own backyard. Our company specializes in helping organizations with insider threats, and when we first come in, what surprises our clients is that the amount of data moving out of the organization is incredible. Literally every single person that leaves your organization for a new job takes critical data. Literally every single one. 60% of people admit they took data from their last job and are using it in their current job, and that's just the people that admit it, so it's probably more like 90%. What's interesting right now is that, with the whole phenomenon of AI and the datasets around AI, people are finally starting to pay attention and say, oh, wait a minute, do I need to be worried about my source code leaving, or my large datasets leaving, or my customer lists, et cetera? So we're seeing a real awakening in people because of the focus on AI datasets. But, to be fair, that's just the tip of the iceberg for what's actually leaving most organizations. We see a lot of source code, customer lists, HR data; we see that all the time. So it's an area I'm glad is finally getting some attention.

Speaker 2:

We can talk about different stories around that, how it's happening and why, but on the corporate side, the breach that really woke everybody up in the last few years is Anthony Levandowski, who took data from Waymo, which was all their self-driving car data. Just like Snowden, he took a whole trove of information. In the last week of his time at Google, or Waymo, Levandowski downloaded all that data, just took it on an external hard drive, and then used it for a year before they realized it.

Speaker 2:

And they only realized it because one of the suppliers accidentally emailed Waymo and said, hey, we have a confirmation on this part. And the people at Waymo were like, wait, that's our part, but we didn't order this. It turns out he had literally taken the exact schematics to his new company and was ordering that part. So it was all discovered accidentally by the folks at Google. That's sort of the Snowden case of the corporate side. And what's interesting in the end, for your listeners, is he went to prison for that, and he was serving time in jail when, in the last days of his presidency, President Trump pardoned Anthony Levandowski, with little to no explanation. So it's an interesting story that kind of keeps going.

Speaker 1:

Yeah, that's really fascinating. I don't think I knew about that. Is Waymo still around after that?

Speaker 2:

Yeah, Waymo's still around. It was a massive issue between Waymo and Uber, because it looks as though Uber basically encouraged Levandowski to take that data and stand up his own company, and then Uber bought that company six months later. So Anthony Levandowski was at Uber when they discovered this, and Uber was working on all the self-driving car stuff as well. It was a long lawsuit; they ended up settling in the end, but after they settled the lawsuit, the criminal trial came. It was a big deal. Some people think it's one of the reasons Travis lost his job; it was all around the same time. So it was very controversial at the time.

Speaker 1:

And that's really fascinating, because how does that even come about? You're working for Google on this project, and a competitor, a direct competitor, must have approached him at some point during his tenure at Google with this idea. Maybe I'm just not in the insider threat space, but that's a kind of corporate espionage I never would have thought of. I never would have imagined that a competitor would go to that length. I could see them buying the company, right? But for the idea to come into place for this guy to take the code from a company, start a new one, develop with it, try to order the proprietary parts for what he was using at Google, I mean, that's just a different level.

Speaker 2:

It happens every day. We had a new client just a couple of weeks ago at Code42 where an employee had left and stood up his own company with their source code. The way it happens is, people that are smart and create new things at all of our companies, a lot of them believe: I created this design, so I'm going to take it with me to my next job. And that's not the way intellectual property works. When you join a company, you sign a document that says anything I create for this company, the company owns. And we see this with source code every day. Right now, most software developers have their own GitHub or GitLab accounts and their own repos. Go talk to a developer friend: hey, do you have your own repo? Oh yeah, I keep a repo of the projects I've been working on my whole career.

Speaker 2:

Okay, well, those projects were for companies, so that's the company's source code. And the thinking is, it's not like I took it; I left a copy, the company's using that copy, I just took a copy of the source code with me. We see that time and time again, and it's interesting, and this ties back to the AI stuff I was talking about.

Speaker 2:

The way most developers take source code is kind of interesting, and very straightforward. Essentially, every day the developer pulls their code down from the corporate repo, and that corporate repo is locked up tight: you've got to have the right certificates on your machine to get into it, sometimes multi-factor authentication; you're not getting into that repo unless you're authorized. Most companies are really good at that now. Well, once the developer takes the code down to their local machine, that's when they start writing code. They've got their test cases running, they've got a bunch of dependencies, et cetera; they write code and then they check it back into the corporate repo. In order to check it back in, you use git commands to push that code; it's called a git push. You push that code back up to the repo. What a lot of developers do is, at the same time, they'll do a git push to their own personal repo. And if they're sitting at home and they're off the corporate network, which is where developers are these days, most organizations are completely blind to that git push to the personal repo. We built technology at Code42 so we could see that, and once we did, we started seeing so much source code exfiltration; we were literally like, oh my gosh, we didn't even know this was happening. At the same time, the same capability also sees when you move large datasets, because you also use git commands to move large datasets. So we started seeing large datasets getting moved around, and that's where the AI comes into play, with people moving large datasets to train LLMs for AI usage. We're the only company that has this capability out there today; we launched it about two years ago, and the timing was just lucky. We weren't anticipating the AI and large language model boom, but again, we see this kind of huge movement with it.
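
To make that dual-push mechanic concrete, here's a minimal sketch of how a monitoring tool might flag a repo whose push remotes point outside corporate infrastructure. This is illustrative only, not Code42's actual implementation; the `git.corp.example.com` allow-list and the function names are hypothetical.

```python
import subprocess
from urllib.parse import urlparse

# Hypothetical allow-list: pushes to these hosts are considered sanctioned.
CORPORATE_GIT_HOSTS = {"git.corp.example.com"}

def push_hosts(repo_path: str) -> dict[str, str]:
    """Map each configured git remote name to the host it pushes to."""
    out = subprocess.run(
        ["git", "-C", repo_path, "remote", "-v"],
        capture_output=True, text=True, check=True,
    ).stdout
    hosts = {}
    for line in out.splitlines():
        # Each line looks like: "origin  https://git.corp.example.com/app.git (push)"
        name, url, kind = line.split()
        if kind != "(push)":
            continue
        if url.startswith("git@"):            # SSH form: git@host:org/repo.git
            hosts[name] = url.split("@", 1)[1].split(":", 1)[0]
        else:                                 # http(s):// or ssh:// form
            hosts[name] = urlparse(url).hostname or ""
    return hosts

def personal_remotes(repo_path: str) -> list[str]:
    """Remote names that push somewhere other than the corporate hosts."""
    return [name for name, host in push_hosts(repo_path).items()
            if host not in CORPORATE_GIT_HOSTS]

if __name__ == "__main__":
    flagged = personal_remotes(".")
    if flagged:
        print(f"Remotes pushing outside corporate hosts: {flagged}")
```

In the spirit of the conversation, a hit here would be treated as a signal to educate the developer, not to block the push.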

Speaker 2:

So, back to the Anthony Levandowski story: yes, he took his designs, yes, he took his files. I mean, he went to prison for this, so he obviously had to pay the price. But in his mind, he was probably like, well, this is fine, this is my stuff, I built this. Except it was built by a whole team of people, and the damage done to the people you leave behind as an insider, when you take the work that everybody did and then go take it somewhere else, is real.

Speaker 2:

And part of what we often do as an insider risk company is help educate the organization: hey, here's what you're not allowed to do, and here's why. Why is it important? Is it just the company being mean, asking for its information back? People often don't understand that the impact they have is on their old peers, and that's the biggest impact that occurs with insider risk.

Speaker 1:

Hmm, yeah. I feel like it's almost a miss in the brain or something like that. Because from my perspective, I'm in IT, right, I work a day job in cybersecurity. There's a lot of potentially proprietary information that I've taken down in notes in my OneNote. But there's a lot of things that I'll learn for the very first time and take note of, and it'll be non-proprietary information, right? It's a configuration in AWS or whatever it might be, and I'll take note of it so that I remember it. And when I leave the company, if I don't know it by heart, I'll just take that little piece of information and put it into my personal OneNote. But it's not proprietary, it's not anything that's customer data or company data or anything like that. And for me to do the mental gymnastics to rationalize taking proprietary information from a company, I mean, I don't want to go to prison.

Speaker 2:

And it doesn't usually end up in prison. I think part of what's going on here is that most organizations today have some form of data protection, but our ability to exfiltrate data from our companies has expanded so rapidly in the last five or six years that people just don't think anyone is paying attention, or they don't think any of the controls work. And it's probably more the former than the latter, because if they really thought anyone was watching, they'd feel guilty about it and they wouldn't do it, which is good. But most companies today just can't see it. If you upload a document to Dropbox from your house, when you're not on the company network, or you just Gmail it to yourself, those are the most common ways that people take data, and they're free, and most people are like, well, no, I don't think my company sees that, and I can still print and all that kind of stuff. So our philosophy as a company, we coined the term "secure the collaboration culture," and the collaboration culture is how we all live today. We use Slack, we use Teams, we use Google Drive and OneDrive. We have all these tools, like Salesforce, that are cloud-based and let us share information, and our workflows are shared, and we can collaborate on presentations together, and that's awesome; we're so much more productive as organizations, right? So the key in security is not to get in the way of that, but to just put a little bit of security around it, so that people don't feel like, hey, nobody's watching, nobody cares. One of the ways that we do that is, when a good employee, a normal employee, working in their job, does something they're not supposed to do, the system automatically sends them a video. It says: hey, we noticed that you uploaded a file to Dropbox; we don't use Dropbox as our sharing platform here at company X, so please use the sanctioned tool. What that does is it puts the employee on notice that somebody's actually paying attention, but it does it in a very nice, polite way. And if you let people know that you're watching, most people won't steal from you, and most people behave the right way.
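
As a sketch of that "video nudge" flow, assuming a hypothetical stream of file-movement events and an allow-list of sanctioned destinations (all of the names below are made up for illustration):

```python
from dataclasses import dataclass

# Illustrative policy: destinations the company sanctions for sharing.
TRUSTED_DESTINATIONS = {"corp-sharepoint", "corp-gdrive", "salesforce"}

@dataclass
class FileEvent:
    user: str
    filename: str
    destination: str  # e.g. "dropbox-personal", "gmail", "corp-gdrive"

def send_training_video(user: str, destination: str) -> None:
    # Stand-in for a mail/LMS integration; printing simulates delivery.
    print(f"To {user}: we noticed you moved a file to {destination}. "
          "That's not our sanctioned sharing platform - please watch this short video.")

def handle_event(event: FileEvent) -> None:
    """First-line response is education, not blocking."""
    if event.destination not in TRUSTED_DESTINATIONS:
        send_training_video(event.user, event.destination)

handle_event(FileEvent("joe", "roadmap.pptx", "dropbox-personal"))
```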

Speaker 2:

I'll give you an example. Last week I was doing a presentation in San Francisco, and I got there with my laptop at the conference and they were like, oh, you can't use your laptop, you have to use ours, and we need your presentation now. And the only way I could get it done was with a thumb drive. So I put my thumb drive in, using adapters of course, because Macs don't have that port anymore, and got them the presentation about one minute before my kickoff, so it was fast. And I did my presentation that afternoon.

Speaker 2:

And wouldn't you know, that afternoon I got an email from my security team with the little video attached that said: you used a thumb drive, which here at Code42 is a risky way to exfiltrate data; please watch this video. And I had to watch the video all the way through; I couldn't just tab away and check the box that I'd watched it, all that kind of stuff. So that's the controls we put in place working: they nudge you, like, don't do that. And just by that, now I know that if I were to leave the company and move a bunch of data, they would notice and they would pay attention. That kind of thing really reinforces the behavior and keeps people from actually doing something.

Speaker 1:

So that's a great reinforcement of a learned behavior, right? But how do you stop that developer from uploading to their own personal Git repository? Because I actually don't know.

Speaker 2:

Yes. Well, from a technical standpoint, you don't want to try to block that. Here's the thing: if you get in the way of a software developer as a security person, one of you is losing their job, and I'm telling you, it's not the developer. You do not want to get in the way of the 300 software developers you've got at your company, or 500, or 1,000, or whatever, and the same is true for slowing down their machines. If you slow down your developers' machines, there will be hell to pay; the CTO is going to go to the CEO and complain. And you also don't want to get in the way of them using git commands; that's how they do their job every day. So the approach we take is not to stop that, but to monitor it and to course-correct behavior.

Speaker 2:

So the first time a developer moves something to an unsanctioned open source project, which is something we see a lot too: a company says, hey, we contribute to 70 open source projects, but it turns out that when we actually start monitoring, we see code moving to hundreds of open source projects; it's a very regular thing we see at Code42. When we work with our clients, the first thing you do is send that video that says, hey, don't do that. The second time the developer does it, you have a conversation. And then, if it happens again, you go to their boss and say, hey, your developer is doing X, Y and Z. But what you don't do is overreact and say: fire that developer, shut down their machine, cut off all developers from pushing code around with git commands. That's exactly the wrong response for a security team to have.
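
That graduated ladder (video, then a conversation, then the manager) can be sketched as a simple per-user offense counter. A real system would persist this state and enrich it with context, but the shape is the same; the user and policy names here are invented:

```python
from collections import defaultdict

# Offense counts per (user, policy); in practice this lives in a database.
offenses: defaultdict[tuple[str, str], int] = defaultdict(int)

def respond(user: str, policy: str) -> str:
    """Escalate gradually instead of reaching for the sledgehammer."""
    offenses[(user, policy)] += 1
    count = offenses[(user, policy)]
    if count == 1:
        return "send corrective training video"
    if count == 2:
        return "security has a conversation with the user"
    return "escalate to the user's manager"

for _ in range(3):
    print(respond("dev42", "push-to-unsanctioned-repo"))
# -> video first, then a conversation, then the manager
```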

Speaker 2:

That's one of the big differences between insider threat and external threat. When you have a piece of malware come in, the first thing you do is kill it with the sledgehammer: you isolate that machine, you disconnect it, you quarantine it. That is not what you do with insider threat, because you really want to investigate and understand what the person is doing. Why are they using Dropbox? Why did Joe use a thumb drive? And it turns out Joe used a thumb drive because he was giving a presentation and had to in order to get it done, even though it's against our policy. We understand why he made that decision in that situation, and we don't come down on Joe with a sledgehammer, right? Because it's always a bad actor when malware gets in, but it's almost always a colleague when we're talking about insider threat.

Speaker 2:

The threat is fast-moving and propagating when it's from the outside. But insider risk doesn't spread: just because you, Joe, move some files around doesn't mean somebody else is going to move some files around. That's not how it works. You have time to investigate and to understand what's happening. When it's malware, the security team can handle it on its own every time. But when it's an insider, you're going to need HR, you might need legal; you're going to need a little bit of help to do that. So these are very different situations.

Speaker 2:

It's never accidental when you get malware in.

Speaker 2:

It's always intentional.

Speaker 2:

It's sometimes accidental with insider risk; in fact, it's about 50% of the time with an insider risk event.

Speaker 2:

It was just an accident. And you never need to educate the hackers, the bad guys; that's a waste of your time and money. But you always want to educate your own employees, because that brings down the number of events you deal with over time. And then the last thing, and I think it's the interesting one: you can get fired for not reacting fast enough to a malware threat in your environment, but with insider risk, you can get fired for reacting too fast. It's a totally different mindset. So one of the things we always work with our clients on, if you're in security, is that the folks that are running CrowdStrike and looking at all the external activity are not necessarily the same people you want running your internal program. They can be, but you have to turn your brain around and understand which problem you're working on, external or internal, because your response and the way you handle it are so very, very different.

Speaker 1:

Yeah, that's a really good point that I was going to bring up. It sounds like you almost need two separate teams, because to that normal security team, everything looks like a nail when you're used to being the sledgehammer. I'll tell you right now, if I saw something like an insider threat or whatever, I would have immediately, without question, quarantined the machine and started doing a forensics investigation into the guy or whoever it is. I would immediately go the nuclear route: okay, we're going to make sure this person can't do anything.

Speaker 2:

Unfortunately, that's an exact story from my customer base. One of my customers' CISOs called me and said: Joe, I'm really angry with you. Your team handed my team a gun and didn't teach us how to use it. The very first incident they had was with a senior vice president at the organization, and the security team did exactly what you described: they quarantined the machine, took the machine, and then they interrogated the senior vice president, and it was hugely embarrassing for that executive. But it turns out the executive was just actually doing their job, and there was a little bit of a misunderstanding about why they were supposed to be sending information to that particular vendor. The security team said, oh, that's an untrusted vendor. Had they handled it the right way, they would have just said: oh great, we've learned from that, and we'll add that vendor to our trusted list. But instead they quarantined the machine, they isolated it, and then they interrogated the person. So, yeah, that happens. And part of what we've learned is that when we're onboarding customers, we have to have a lot of conversations. The funny thing is, most of our customers are security folks who've worked in this their whole careers, and they say, we know all this stuff, we're very familiar with it. And we have to tell them: listen to us just a little bit, because the methodology is so different.

Speaker 2:

So many people in security would like to think, let me just block all this exfiltration, and if we just block it, then we don't have to worry about it. And it doesn't really work with insiders, right? So you're going to block external Gmail? Okay, how are your employees going to work? Are they going to carry a second machine around with them to do it? How will that go over? Recruiting is tough today; let's see if that plays out well in terms of your employees. They'll just find a Gmail or a Proton Mail workaround and start going around you. Water always finds a way. So we tell our customers:

Speaker 2:

Look, in most of the organizations we work with, your employees are going to have a Gmail account. They might have a Dropbox account because they do something with their church or their school soccer team, things like that. The important thing is that you watch for important company data moving to those places, and that you course-correct people if you see that happening. You always have the information available to you if it's necessary to crack down or whatever, but usually what you just need to do is course-correct and let them operate the way they need to, and collaborate and work together. You cannot get in the way, because if you start putting all these rules on people's machines, it slows down the machines for everybody, and it tends not to be a great way to do things.

Speaker 1:

Yeah, it reminds me of a time a little bit earlier in my career when my manager was walking me through our data loss prevention solution, and he was saying how nothing is going to leave that we don't know about, that it's perfectly locked down and everything like that.

Speaker 1:

And with my security mind, I immediately, and I did this subconsciously, right, I didn't even realize I did it at the time, took it as a challenge. Like, you think I can't get around this? I'm not the admin on this, I don't own the tool, but you think I can't get around this? And I immediately started discovering ways of getting around it. And when he didn't believe me, I showed him right there. I was like, well, I just did it. It's on my personal cell phone now. What are we doing, you know? It's a different mentality that security people have, because I feel like when you tell us that we can't do something, we immediately make sure that we can.

Speaker 2:

Well, it's funny, because as comprehensive as our product is today, the thing about traditional DLP is there's so much it doesn't see. The way we talk about it is: with traditional DLP, you write specific rules to cover the things you want to cover, and so as long as you know what all the risks are, you'll be fine.

Speaker 2:

Of course, you don't know all the different ways that people can exfiltrate data and move data around, and so those tend not to be very effective solutions. In fact, when we sell today, 85% of our customers are greenfield, meaning they don't have anything, because people gave up on DLP 15 years ago. They said these systems don't work, they keep us from collaborating, et cetera. So we call the space insider risk management today; Microsoft calls it the same thing. The idea is really: hey, we will block people from doing things only in rare circumstances. Mostly we focus on monitoring, and then scoring things in the cloud, so we can see: where did the data come from, who had it, where did it go, what do we know about that destination, is this the first time this has happened? We can do all that kind of analysis in the cloud, and do it efficiently, so we don't slow people's endpoints down. Then we can score it and tell security: hey, you need to pay attention to Joe, and in particular, when Joe is leaving the company, that's the biggest risk you have. We can look back 90 days and say, hey, 40 days ago Joe took a lot of information. And when people leave, we're very careful in the same way you'd expect: we make sure we get our badges back so they can't physically get into the building, and we're very careful to collect the laptop so they can't get on our networks. But only 10% of companies ask people about their data. Only 10% ask on the way out: hey, did you take any data?
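
A toy version of that departing-employee lookback: score 90 days of a user's file-movement events, weighting untrusted destinations more heavily. The event log, weights, and numbers are invented for illustration; the point is the shape of the query, not the values.

```python
from datetime import datetime, timedelta

# Toy event log: (user, timestamp, destination, megabytes moved).
EVENTS = [
    ("joe", datetime(2024, 2, 10), "personal-github", 900),
    ("joe", datetime(2024, 3, 1), "corp-gdrive", 15),
    ("ann", datetime(2024, 3, 2), "gmail", 2),
]

# Invented weights: corporate destinations score zero, risky ones score high.
DESTINATION_WEIGHT = {"personal-github": 5, "gmail": 3, "corp-gdrive": 0}

def departure_risk(user: str, departure: datetime, days: int = 90) -> int:
    """Sum weighted data movement over the lookback window before departure."""
    start = departure - timedelta(days=days)
    return sum(
        DESTINATION_WEIGHT.get(dest, 1) * mb
        for who, when, dest, mb in EVENTS
        if who == user and start <= when <= departure
    )

# Run as part of offboarding: a high score means review before the exit date.
print(departure_risk("joe", datetime(2024, 3, 25)))  # 4500 -> investigate
```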

Speaker 2:

So one of the things that we have is a process that says: go back and look at every single person who's departing the company and just double-check that they haven't taken any data. Because if you think about it, while they're in the process of leaving, before they've left, it's really easy to get it back at that point. And I know that sounds like, well, you'll never get that data back. But look, here's how you get it back. You basically tell them: we need it back, you need to delete all copies, and we need to be confident that you've done that. How are we confident? You're going to sign an attestation that you did it, and if you don't follow the full process, we're going to notify your next employer that you're in possession of our data. If we notify your next employer that you're in possession of our data, that actually puts them on the hook legally for it. And it sends a pretty strong message to them: hey, the new person you're hiring

Speaker 2:

is in the process of stealing data from their last company; do you really want to hire that person? So what most people come to understand very quickly is that they're putting their next job at risk if they don't return the data. With Anthony Levandowski, they didn't notice for a year after he left. By that time, he had taken all that data and was using it; all that information was being used across Uber, in this case, and so unwinding that was really complicated. If they had found it two days after he left and figured it out, it would have been a completely different situation. He still would have taken his mind and his ideas and his talent to the next company, and maybe created something from that knowledge and done some different things. But he wouldn't have had his actual designs, source code, intellectual property, et cetera.

Speaker 1:

With the evolution of AI and LLMs, are you seeing this as a new, I guess, attack avenue or attack path for insider threat? Because I'm thinking about it from just the simple ChatGPT side of it. Most of the time, the company isn't going to block ChatGPT. You can just upload whatever you want to ChatGPT, and one, you don't really know what they're doing with that data on the back end, so that's obviously a huge risk. But at the same time, you could save that conversation and then go and access it from your personal computer. Are you seeing newer attack paths that you haven't seen before come up?

Speaker 2:

Yeah, let's talk about how that actually happens. First, a reminder: most of the people that are moving data out of the company and putting it into large language models like ChatGPT, what are they doing? They're just doing their jobs. They're literally trying to do their jobs. They're not putting data up into ChatGPT so that they can steal it later; they're just like, hey, I'd like to get an answer to this. So one, that whole video education part that we talked about is super important, right? Just remember why they're doing what they're doing: can we educate them and help them not do that? That'll solve most of the problem, because in most cases they weren't actually trying to steal data, and they're certainly not trying to steal data via ChatGPT. How would that benefit them?

Speaker 2:

I think what's been newly interesting is the board attention. About seven months ago, when the supply chain attacks started happening, some of the board members would ask security, hey, how are we on our supply chain? But nothing has been quite as loud as boards of directors specifically asking security leaders: are we covered for AI? Like, how do we keep our data from ending up in ChatGPT? It's one of those things that's been in the press so much that board members, who usually aren't very technical and aren't very security-savvy, boy, do they worry about this issue. The hype cycle is really high, and so they're freaked out. So security people are hearing a lot of, are we covered for that? On the one hand, it's one of those times in security to say, hey, give us a little more money and some resources, because the board really cares about this. But the actual threat, again, is relatively easily mitigated and controlled by controlling, from a network security standpoint, what people can access. For example, we have our own internal ChatGPT instance, so it's our data, and we restrict corporate access to the public ChatGPT. As I mentioned earlier, we also watch for large file uploads as well. One last example, sort of to your point: our CTO did an experiment and just asked ChatGPT to tell us what the product roadmap plan is for our competitor.

Speaker 2:

And ChatGPT spit out all of their stuff. Somebody over there had uploaded enough information, I guess while they were working on their product plan and how they were creating it, and we got the inside of a competitor's product plan just by asking ChatGPT. So yes, the threat is real.

Speaker 2:

It is usually accidental, so you've got to handle it knowing that it's usually accidental, and boards of directors are paying a lot of attention to it right now.

Speaker 2:

So it's something you really need to pay attention to, even though I would say the bigger threat to most organizations right now is source code exfiltration by your own software engineers. You think, hey, a hacker has gotten to our source code; that's bad, because they can look for vulnerabilities in our source code. But that actor has a hard time taking that source code and making a product. The person that wrote the code knows exactly what it does, how it's written, and what business problem it's solving, because they wrote it. That's the biggest risk to your source code today, and keeping those people in line and in check is a much bigger issue. I think you'll start to hear a little bit more about this in the coming months; there are a few big cases happening, and it's probably a bigger risk than what's happening on the AI side.

Speaker 1:

Wow, that product roadmap, that is something. You know, the person that uploaded it was probably just trying to get ChatGPT to write it better. I mean, I do that with everything.

Speaker 2:

I mean, I know people that do that with all their emails to executives, right? And again, I get it, you're not malicious; you're not trying to hurt the company when you do it. So, because of that, it's one of those things that can be largely solved through education, but you have to take it on, you have to address it, and then you have to reinforce it as well. Most of us, all of us, go through training. Every year we have to do these mandatory trainings, and none of us pay attention to them.

Speaker 2:

We all push 2x on the video; that's how we do it. What we found really works is the timing: the moment you do something wrong, within a few hours, that's when we send the video, because you think, oh, I did do something wrong, I did upload something to ChatGPT, I shouldn't have done that, and they noticed.

Speaker 2:

The timing of those things is just so important. So I would say that right now, insider threat in AI is an education problem and a course-correction problem. Because a lot of people will still be like, look, I know I'm not supposed to do it, but this email to my CEO needs to be really good, so I'm going to put it in there and ask for better language and stuff. Well, if you know you might get caught doing that, then you're less likely to do it. But if you don't think anybody's paying attention, then you're going to continue to do it. So of course the education is important, and also the enforcement of that education, but that's different than just hitting someone with a sledgehammer.

Speaker 1:

Yeah, you're playing more into the psychology of the person, right? How do we change that thought process, how do we change how they view this? Which is a better approach than the sledgehammer that I would have used.

Speaker 2:

And Joe, you pointed this out. It's so different than most cybersecurity.

Speaker 1:

Yeah, it's completely opposite.

Speaker 2:

Right, we should go talk to Fancy Bear and tell them what they're doing is unethical. You know what I mean? There's nobody at CrowdStrike sitting around worried about educating the adversary. One of the things where we always get raised eyebrows from security people is, in security we say, look, we don't want anybody to know our methods and practices, right? We don't want anybody to know that stuff. It's different with insider risk at companies.

Speaker 2:

We actually want to tell people: hey, we're monitoring. We don't care how you do your job, we're not tracking your productivity; we're just monitoring when you move data to untrusted locations, and we want you to know that we pay attention to that. At first, you tell security people that, and they say, no, you don't tell people that, you want to catch them. But the rest of security is more the attack stuff. With insiders, you just don't want them to do it; that's what it comes down to. Whereas trying to educate the hackers of the world, the crime syndicates who are really running these things today, is a waste of time. We see it when they hack hospitals and literally cause people life-and-death problems, and they don't care. So clearly we're wasting our breath if we try to educate the bad guys.

Speaker 1:

Yeah, it's a great point.

Speaker 1:

You know, recently, and I'm not sure if it was in the news or anything like that, I don't watch the news, but Lurie Children's Hospital was attacked.

Speaker 1:

And it wasn't like a small attack; they took them offline. This is a children's hospital that treats only the sickest of sick kids, that doesn't even charge families if they can't afford the treatments. I had a very personal experience with that: my sister was a patient there, and there was no way my family would ever have been able to shell out the half a million dollars to get her treated for the next three years or whatever it was, and they didn't send us a bill or anything like that. That was a huge weight lifted off of us. And now these attackers go in and knock them offline. It's like, guys, at least hit an adult hospital, you know what I'm saying? A children's hospital is just the most heartless thing you can do. But these attackers, they don't care. They know.

Speaker 2:

They know. And again, that's very different from your insiders. I mean, insiders do care; they work at your companies. There are a few people doing real espionage at organizations, but they're few and far between. So yes, we're going to help you find those too, but the most important thing is to help you reduce the number of events you see that are non-malicious, that are accidental, that just require a little bit of course correction, so you can spend your time on the others.

Speaker 2:

You know, this reminds me of my first job, early in my career. I was the manager of a small retail store, and I had like seven people working for me, and somebody was taking from the till, and it came as a complete surprise to me. And my boss, the owner, said: look, Joe, 10% of people will never steal from you, and 10% of people will always steal from you. And 80% of people will steal from you if you give them the opportunity. So your job is to set up the systems and the processes and the controls so that the 80% don't steal from you, and then you go focus on figuring out who the 10% that will always steal from you are. And that's retail. If you ask a retailer where most of their leakage comes from, it's mostly not people coming in from the outside; it's their employees taking it.

Speaker 2:

And I think the information services businesses and professional services businesses have been slow to realize that. I tell that story often now, because what we try to help our clients do is get that 80% figured out, so that they can focus on the 10% that are going to take things no matter what. And catching that is really important, because those people probably have malicious intent with what they're going to do with the data.

Speaker 1:

Hmm. Well, Joe, unfortunately we're at the top of the hour here; we're at the end of our time. I feel like we could keep talking about this forever. It was a fantastic conversation, and I really appreciate you coming on. Before I let you go, how about you tell my audience where they can reach out to you if they want to, and where they can find your company?

Speaker 2:

Yes, so we're Code42, and that's code42.com, and you can find me at Payne, Joe, and you can email me at Joe Payne, that's P-A-Y-N-E, at code42.com, and I'm happy to talk to any of you. And we try to give back to the community, so we have a research report coming out, I think in a few days, called the Data Exposure Report. We do it every year, so look for that; there's lots of information in there on insider threat and insider risk.

Speaker 1:

Awesome. Well, thanks everyone. I hope you enjoyed this episode.

Cybersecurity Expertise and Government Collaboration
Insider Risk and Data Protection
Managing Insider Threats in Security
Data Loss Prevention and Insider Threat
Internal Threats and Data Security