Security Unfiltered

Unlocking the Secrets of Future Technologies with David Holtzman

Joe South Episode 181

Send us a text

Cybersecurity enthusiasts and curious minds alike are in for a treat with our conversation featuring the esteemed David Holtzman, a cybersecurity virtuoso whose journey will both inspire and educate. Discover why the real mastery in cybersecurity transcends formal certifications, as David shares his formative experiences from high school to the present. His story emphasizes the critical role of curiosity and problem-solving skills, offering valuable insights for those venturing into this ever-evolving field. You'll gain a fresh perspective on what it truly takes to be an expert in cybersecurity as David and I explore the foundational knowledge and mindset essential for success.

Shift gears with us as we explore a captivating transition from the allure of traditional luxury sports cars to the groundbreaking innovation of Tesla vehicles. Our discourse navigates the shift in skills from mechanical to digital and unravels the challenges of maintaining vintage cars versus embracing new-age technology. Dive into the quirks of modern tech frustrations, such as dealing with printers, against a backdrop of fascinating anecdotes from my NSA experience. This segment paints a vivid picture of how technology has transformed both personal passions and professional landscapes over the decades.

Rounding out our conversation, we tackle the future of AI-driven cybersecurity and the pressing concerns around AI security and privacy. As we weave through the complex tapestry of intelligence work from the past to the potential of decentralized systems, we consider the implications of centralization and the innovations of Web3. Witness the exciting possibilities of blockchain in enhancing network security, and reflect on the importance of interdisciplinary skills that prepare professionals for the unknowns of the tech world. This episode promises to enlighten, challenge, and inspire those seeking to understand the nuanced world of cybersecurity and technology's future.

Support the show

Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today

Speaker 1:

How's it going, David? It's great to finally get you on the podcast. I think we've been planning this thing for quite a while, you know, back in 2024. And now you're the first episode of 2025, because I burned myself out and had to take like six weeks off.

Speaker 2:

I am honored. [Inaudible] feelings about that, actually. And so there are so many people who bill themselves as cybersecurity experts (you're probably having some on the show), and some of them, when you drill down on why they think they are, it's because they took a bunch of Microsoft certification classes, right? So that's virtually useless in any real-world scenario, because the really bad stuff is stuff that people have never seen before, and no amount of Microsoft licensing or certification is gonna prepare you for some wacky denial-of-service attack that nobody's ever seen. That's just basically having a brain and being able to think through it. So the best cybersecurity people are people who are not formally trained to be cybersecurity people.

Speaker 1:

Yeah, yeah, no, that's very true. You know, it's always interesting when I bring people on, and I try to find fantastic guests, you know, overly qualified, very experienced people like yourself. And every once in a while this happens. This happened, you know, maybe in the middle of last year, where someone came on and I started to get a little bit technical, because I'm technical. I'm in the weeds, you know. I wake up and I get into log files, I'm figuring out what's going on, I'm reverse engineering systems and whatnot all day long, and I look up and it's 6 PM, that sort of thing. And we started getting a little bit technical, and I immediately reached their technical, quote-unquote, expertise limit. And then I push a little bit farther and come to find out, oh, you're not technical at all, you kind of stumbled your way into this thing and someone promoted you early, and that's what happened, right? Which is frustrating for myself.

Speaker 2:

I deal with this all the time, given the kind of things I do and where I live. I've been asked by some VC firms and other firms to do vetting of people, and about 80% of the people who call themselves cybersecurity experts are actually lawyers. And no, I'm serious. They're people at a law firm who were involved in one case involving some aspect of cybersecurity, and now they're an expert. It's like, you know, I go into CVS and buy a bottle of aspirin and now I'm a doctor. It doesn't help our profession, because it downplays it. It makes it look like it's easy to be this.

Speaker 1:

Yeah, yeah, I mean, you know, when someone is trying to get into cybersecurity and they're reaching out to me, asking for advice and whatnot, really the very first thing (and some people that I've had on, I said this to, and they took offense to it and were appalled by how I approach it) is I try to convince people to not get into security. Right? Because if I can convince you, just through words, to not get into cybersecurity, you're not going to be successful in this field. Because you have to have a curiosity that can never be satisfied. And you need to be the expert, not just an expert in security. You need to know networking pretty well. You need to know system design pretty well. You need to know, you know, the different processes and services that are talking behind the scenes on your Windows device, right, what they're linked up to and where they're actually configured, and all that sort of stuff.

Speaker 2:

I've been doing this literally for half a century, in some form or fashion, with computers. The thing is, when I look at a problem, I think of it at multiple levels, and sometimes it's literally at the bit level, and I'm thinking, okay, what's on the heap, what's in the stack, what are the bits? And without even articulating it, this helps me do things like, I repaired a remote-control fireplace the other night because I just knew what was wrong with it without having ever touched the thing. And this is a skill set that I'm sure you have, I have, and many people don't. You cannot teach this, and it's some kind of weird survival trait that I don't think people recognize for what it is. But you can throw anything at me that's a computer-related thing and I can figure out what's wrong with it in a couple of minutes.

Speaker 1:

Yeah. Well, David, you know, we kind of just dove into the deep end here without you talking about your background, so why don't we backpedal a little bit and talk about, you know, how you got into IT. What was that start like? What intrigued you about the field that kind of propelled you into this world that you're in now?

Speaker 2:

Okay, well, I mean, we have to go back a ways for this. I went to high school in Pennsylvania in the mid-70s, early 70s even, and we actually had a computer. It was an IBM 1130 or something like that, and it had punch cards, if you've ever used those, and you had to mark-sense the cards and then run them through. And if you did assembly programming, which is what you normally had to do, there were 16 switches on the CPU, and you would configure the switches for a binary number, and you'd hit the button, and that was one machine instruction. And then you would do that for the entire program you just wrote. So that's, like, days to do that. So I was intrigued by it, and then I sort of let it go for a while.
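For anyone who has never touched a front panel, the entry process he describes can be sketched roughly like this. This is a hypothetical illustration in Python, not the real IBM 1130 toolchain, and the instruction word below is made up:

```python
# Hypothetical sketch of front-panel program entry: each 16-bit
# instruction word is expressed as 16 on/off switch positions,
# entered one word at a time. The example word is invented.

def switch_positions(word):
    """Return the 16 switch settings (most significant bit first)
    needed to enter one 16-bit instruction word."""
    return [(word >> bit) & 1 for bit in range(15, -1, -1)]

# One made-up instruction word, shown as the switches you'd flip:
word = 0b1010000000000001
print(switch_positions(word))

# Entering an N-instruction program means repeating this N times,
# pressing the load button after each word, hence "days" of work.
```

The point of the sketch is the tedium: a few hundred instructions means a few hundred rounds of flipping sixteen switches and pressing a button.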

Speaker 2:

I got a degree in philosophy, taught some symbolic logic and other things, and then, through a bizarre set of circumstances, I ended up being an intelligence agent for a number of years. I had to go in the military, I did, and because of my test scores they trained me to be a cryptographer. They sent me to Russian school for a couple of years, and I ended up on submarines. So I did that during the height of the Cold War, and it was fun. I actually had a really good time. It's not like anybody got killed, you know. You didn't have to worry about bombs blowing up a Jeep or something. The Cold War was a very different kind of thing. But that was the first time I'd seen computers play a part in the real world, because submarines at the time were heavily computerized, I mean, given what was there at the time. The computers were called UYK-20s, U-Y-K, and I think it was a DEC computer. And they used to send us out, I always thought this was hilarious, they used to send us out with a repair kit, and it was a big brown plastic case, and if you opened it, the only thing it had in it was a rubber mallet. And they said, you'll never fix anything. Just start whacking the crap out of the boards and something will go back in place. And you know what, it did. I had to do it like three times. So I mean, that's kind of the early days.

Speaker 2:

I did some other work. I went to NSA, I worked at the Cosmonaut program, and then I got some more degrees from UMBC in computer science with a math concentration, did grad work at Hopkins. And then I was at a crossroads, because I either stayed as a professional intelligence agent or I went into this fledgling world of computers in the early 80s, and it was a no-brainer for me. I mean, I knew where things were going, and I got out and started programming in a number of different languages, and within a few years I was designing systems. I ended up running research at Booz Allen Hamilton, the consulting firm. Then IBM hired me to be the chief scientist for the Internet Information Group, which is all their Internet-related software, basically. Not networking, but anything above that. And that was pretty cool, actually. At the time I'd never played in the big leagues like that before.

Speaker 2:

I got a lot of job offers, this was like 95, 96, and I decided the one I wanted was this little company in Herndon, Virginia. It was an 8(a) firm, and it was called Network Solutions, and the only thing they had going for them is they had a locked contract with the National Science Foundation to basically run the internet. It was called a cooperative agreement. So they ran the whole domain name system, all the root servers, TCP/IP address allocation for North America, and CDPD, the cellular data network. So I came in as CTO, and then I ended up running all that. So that was pretty cool.

Speaker 2:

And I got to deal with crisis after crisis, because from 96, 97 on, that's basically the dot-com bubble. So all of a sudden people actually gave a damn what was going on in the internet, and up until then they didn't. It was a curiosity. In the early 90s it was like labs, and by the end of the 90s it was billions, tens of billions. So I went through that. My company went public, I did an IPO and a couple of secondaries, and I was running all this stuff during Y2K. I was on President Clinton's task force representing the internet during Y2K, and that's a whole story right there.

Speaker 2:

I didn't like where some things were headed, so I left and started writing books on privacy and wrote for a couple of magazines. Nobody cared really that much about it at the time. It was seen as some kind of weird conservative thing, and the liberals didn't want to have anything to do with it, because privacy seemed to run smack into First Amendment issues, and so my natural constituency were people I didn't actually want to deal with. So then I got into some other things. The story's almost over here. Sorry it's taking so long. No worries.

Speaker 2:

I did politics. I was the CTO for Senator Bayh when he ran for president, for two years. That was actually a paid gig in Arkansas. And then I was the head of security for General Wesley Clark when he ran for president, and so I got some other exposure. And at this point I was pretty cynical about almost everything. And the thing I was cynical about was that the people who should understand what was going on did not understand what was going on, and this was huge. I mean, I knew where things were going. What we're seeing today with cybersecurity, for instance, data breaches, the writing was on the wall for that 20 years ago, and it's here now. Anyway, we can talk about that.

Speaker 2:

So I started traveling the world. I hit 85 countries in a couple of years, and then I came back and started working with very early blockchain companies, all in Europe, because none of them wanted to work in the United States. They were terrified of the Securities and Exchange Commission, especially when they were doing ICOs for tokens. I mean, it's still not clear how US tax law treats that stuff. So I worked with a number of those companies, and I'm still working with a couple, and then I got into post-quantum encryption. So now I'm doing sort of Web3, decentralized security with post-quantum encryption. So that's kind of a long story. Wow, that is.

Speaker 1:

I mean, that's really fascinating, you know, just where this field has taken you. I mean, did you ever think that you would, you know, be on President Clinton's task force? Like, starting all those years ago, did you ever have that in mind as even being a possibility?

Speaker 2:

So the truth is, I was a single parent, I was raising five children, I couldn't even afford daycare, and I'm sitting on all this stock in a company that might go public. So that was very good motivation for me. And finally, when the stock did do well, I mean, I didn't get rich, but my kids all went to college and I bought a Porsche.

Speaker 1:

So what kind of Porsche?

Speaker 2:

I got a 911. Okay, which I'm now feeling really embarrassed about, because I sold it and bought a Tesla, and I really love it. Yeah, I know, I love my Tesla, and I just feel like such a…

Speaker 1:

We bought two Teslas in 2024, and I love them. I absolutely love the car. I recently bought myself a Model X, and I've wanted that car since it was announced, right? Like, I just love everything about it. But I couldn't imagine selling a Porsche, even for a Tesla. I would just have both.

Speaker 2:

Well, you know, I live in the city. Parking spaces are at a premium. I actually have two spaces, which is like two more than most people have. So when my wife and I bought this house, that was one of the reasons we bought it. But we have an SUV too, and the Porsche was just sucking up money, and every time something happened, it was thousands of dollars. Oh yeah. I mean everything. Cigarette lighter? Three thousand dollars. Yep. So I got tired of paying it. The dealer here sucks, and they could never get parts, especially during the pandemic. So anyway, that's why I did it.

Speaker 2:

But I got to drive it for 20-some years. It's an incredible machine.

Speaker 1:

Oh yeah, yeah. I just sold my Audi S5, and it was my first sports car. What a fun vehicle. But when it breaks, man, when something goes wrong on that car... I mean, like you said, I just got to the point where I assumed, you know, I'm going in for an oil change and they're going to find three thousand dollars' worth of stuff that's broken that I don't even know about. Oh, I'm sure.

Speaker 2:

So, you know, going off that, going back to the other thing I said: when I was a kid growing up, this is, like, the pre-psychedelic era, going through the Beatles and all that, my friends who were good with mechanical stuff were highly in demand. Women liked them, guys liked them. They could change your spark plugs. You didn't have to go into the gas station; they would go, yeah, it's your timing, and they would get in there and fix it. They could fix TV sets, they could fix washing machines. Guess what? You can't fix a goddamn thing today.

Speaker 2:

So now it's the person with the skills I was just talking about. I used to go to dinner parties with people a lot younger than me, and I would have an iPod and a CD, and I would say, hey, I'll give 20 bucks to anybody who can take the songs off this CD and put them on this iPod. Nobody ever knew how to do that. And to this day, when I deal with politicians and multi-hundred-millionaire VIPs, they don't know how to do anything either. They all have, like, eight-year-old, nine-year-old nephews who do the work for them. Like printers. Configuring a printer is still way too hard. Way too hard, yeah. And it should be easy, and if you're lucky it will be, and if it doesn't configure in the first two minutes, you're in for a bumpy ride.

Speaker 1:

I hate printers. I really do. I really hate having them. I only have one because my wife is an early childhood teacher, and so she has to print a whole lot. So we have a very robust printer, and it just doesn't work very fluently with a Windows PC and MacBook laptops, and you have to reinstall the driver all the time, and it's so, so dumb. But go ahead.

Speaker 1:

Yeah, I was going to ask you what your time was like at the NSA. You know, I've had other people on from various agencies, CIA, NSA, DIA, and they all tell me roughly the same thing. And I have a good friend who's in the military, and he said that if I ever do make it into the NSA, that first month, when you're being read into 80 or 90 percent of what you need to be read into, the capability side of it is kind of just going to blow your mind, right? Like, you wouldn't even realize, oh, you can use that for this thing over here. Well, I'm wondering, did you have that same kind of experience back then? Because you were really, I mean, at the beginning of this digital era, right?

Speaker 1:

I mean, it hadn't really even started. You were at the very foundation of it. Was that experience true for you as well, or what was that like?

Speaker 2:

Well, when I got to NSA, it was the early 80s, and they had a couple of supercomputers, really expensive ones, Crays, Cray-2s is what they were, and we didn't have access. Nobody had access to them. So they had, like, PCs. They were like 8086 machines or something. So if I wanted software, I had to write it. People used to come to me, and I would write a Turbo Pascal program to do some intelligence thing, because you couldn't bring stuff in from the outside, and so that was kind of fun. And when I was in the submarines, I had some of the deepest security clearances you can get. I mean, things that are still classified, where only like 30 people in the world could read the material. There is stuff like that. But in the end, in CIA, what that typically means is they have an asset, a human asset, like Putin's hairdresser. I'm just making this up; if he gets killed tomorrow, I'm going to feel really bad. But let's say Putin's hairdresser gets turned, you know, happens all the time. That would be very, very carefully protected, because they're going to shoot him in the head if they find out.

Speaker 2:

For NSA, it's almost identical to what hacking is. In fact, now it is hacking. It's basically zero days. Before there was even a term "zero day," NSA was looking for zero days. They were looking for defects, bugs, some kind of malfunction in any mechanical or electronic device that they could turn into an acquisition capability. That's why, and this stuff's not classified anymore, I think, but that's why they were doing things like bouncing laser beams off windows, so you could hear what was being said in the room. And there are crazier things than that. So the secrets in NSA were mostly that kind of stuff. And then a bunch of stuff that's boring to most people, like what frequency a satellite downlinks on. You know what I mean? Most people couldn't really give a damn and wouldn't even understand if you told them. But if the Russians got it, it would be a big deal, right?

Speaker 1:

Yeah, that is, that's really fascinating.

Speaker 1:

You know, you talk about having that clearance, and only 30 people in the world are even allowed to read that document. I always wonder how the level up from that even works, right? Because I'm just trying to think of, you know, least-privilege permissions. From my perspective, if I want to give someone else access to a system, or whatever it might be, it doesn't matter the sensitivity of that system, I have to have access to that system, right? In some way, shape, or form, I have to have that access. And so it's just interesting to me how agencies deal with that, because obviously you don't want everyone knowing, you know, nuclear secrets, or whatever that might be, and you have to really tightly control that information. It's just fascinating for me to think about how you would do it, even with a physical person, right? Like, how do you control that? How do you monitor what they're doing, and that sort of thing?

Speaker 2:

Ten years from now, nobody's going to be doing that. Maybe six or seven years, nobody's going to be doing that. And the reason is because both defense and offense are going to shift over to AI-driven systems, because they move much faster than human beings. So if an AI is running some kind of denial-of-service attack or some kind of penetration hit on your network, it can make like a million hits on every single address in your network, just like that. So no human being will even see it coming, let alone stop it. So you need to have some kind of AI-driven defensive system.

Speaker 2:

On the other end, that's one of the reasons I'm working with a company or two that's doing Web3 decentralized stuff, because I think the biggest damage that's been done in security in the last 20 years is deferring things to centralized companies, and that's where all the breaches happen. They're service providers. You look at Equifax or SolarWinds, you look at any of those, and it's never the company with the name on it that's responsible. It's some idiot third party that they hired to do credit card processing or something, and they got hacked. And then it happened with AWS too. So that's the hole. In the future, when it moves into an AI-driven system, those holes will go away. Hmm.

Speaker 1:

Yeah, you know, I always talk about planning for the future on the podcast, and you seem like someone that thinks into the future, right, and then starts working towards it immediately. Because it's like, hey, if we're going into a post-quantum world, like we are, I need to be experienced with it, I need to have some level of expertise with it, otherwise in 10 years I'm going to be obsolete and I won't be able to do anything, right? How do you determine where things are going, where to spend your time, what to really focus on? Because for myself, right, 10 years ago I knew I wanted to get into cloud security, and now I've been in cloud security for a while, and now I'm shifting gears, getting a PhD in how to secure satellite communications in a post-quantum world. That's a good one. Using zero-trust principles, right?

Speaker 2:

Good yeah.

Speaker 1:

So I'm also someone that looks towards the future and then acts on it and says, well, what's going to challenge me? What's going to make me grow? And those are typically the most rewarding, probably most arduous tasks, right? How do you approach it?

Speaker 2:

Well, I have some old friends who are very, very senior tech people, and sometimes we talk. I just had a long call with an old friend of mine yesterday who used to be the chief scientist at Amazon in the early days. In fact, I had a fellowship there at the time. And we had this futuristic talk, and we both were kind of laughing about it, because we both see very similar things coming in two years, five years, 10 years. I mean, there's nothing we can do about it. And I found a long time ago that if you invest in the future, you will go broke so fast, because I tried this, because I always saw what was coming, and I was almost always right. But you can't act just because you know something's coming. Like 3D printing.

Speaker 2:

I saw that coming years before it happened. So when the 3D printing companies came up, I said, oh, I'm going to buy stock in this stuff. Well, guess what? I was right about the industry, wrong about the companies. And that's the kind of stuff that happens.

Speaker 2:

But futurism, that's another word. I sometimes call myself that, but many people who call themselves futurists are frauds. I mean, just flat out. They're like televangelists, like that level of fraud. And when you talk to many of these people, they have a marketing background. They're not people like you and I, who could, you know, in a pinch, dig into a router and try to figure out what's going on. I haven't done that stuff in years, but I could still do it. They're not like that. And, you know, that goes back again to the theme that I didn't know I had here, that these skills are changing and they're going to be less useful. Like, you were saying about cybersecurity, and you give people a test question to see if they're serious.

Speaker 2:

I try to talk people out of going into computer science, and I've been doing that for seven or eight years. I often give talks to grad schools, and they get angry, usually because they're, like, you know, one year away from getting their doctorate in computer science.

Speaker 2:

The argument I have is that computer science, today and tomorrow, will be mostly algorithm development, and there are only so many algorithms that you need people to develop, and it's a very small subset of the number of people running around today with graduate degrees in computer science. So most people who call themselves computer people or technologists, they're kind of, you know, not to be offensive, but they're kind of webmasters. They put up a website, they know how to do some Java, JavaScript. They know what JSON is, maybe. I mean, they know stuff, but it's very, very narrow. It's not the way things used to be, where you had to know all of this stuff. It's like the mechanic guy who can do your spark plugs. It wasn't just General Motors. He had to work with Fords and Chryslers and everything else, because the principles were the same.

Speaker 1:

So yeah, that is a really good point. You know, I don't even know what they would get a PhD in computer science in, like, what does that even look like? Because in your bachelor's you're learning, you know, the bits, and hexadecimal, you're learning C++ and all that sort of stuff. And I didn't get my bachelor's in that area. I actually got my bachelor's in criminal justice, and, you know, wanted to go the federal agency route, and I kind of stumbled into it and found it to be a lot more interesting in some ways. But what does that even look like for a PhD in computer science?

Speaker 2:

Yeah, I think your point's a good one. I never thought about that. Basically, everything you need to know about computer science you can get as an undergraduate, right? That's kind of what you're saying, and that's absolutely true. The stuff that paid off for me in the long run was stuff like knowing how to build a compiler. So I took a couple of grad-level classes in that, and I did build compilers, but they were natural language compilers. So you can apply that technology to many other things, if you understand what that technology is. Like, I was a Lisp programmer for a while, if you know anything about Lisp. Lisp was the language for AI for many years, but it's a crazy programming style. It's all recursion, I mean, all of it, that's what it does. So you have to understand recursion or you cannot possibly program in it. Those programmers are pretty much gone now, but that was a skill I had to learn from school.
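To make that all-recursion style concrete (my own sketch, written in Python for readability rather than Lisp): every loop becomes a base case plus a self-call on the rest of the list, the way Lisp's head/tail (car/cdr) decomposition forces you to think.

```python
# Sketch of Lisp-style, all-recursion list processing: no loops,
# just a base case and a self-call. Lisp's car/cdr (head/tail)
# decomposition is mimicked here with indexing and slicing.

def total(xs):
    """Sum a list the Lisp way: empty list -> 0, else head + total(tail)."""
    if not xs:                       # base case, like (null? xs)
        return 0
    return xs[0] + total(xs[1:])     # car plus the recursion on cdr

def squares(xs):
    """Map a function over a list recursively instead of with a loop."""
    if not xs:
        return []
    return [xs[0] ** 2] + squares(xs[1:])

print(total([1, 2, 3, 4]))   # 10
print(squares([1, 2, 3]))    # [1, 4, 9]
```

Nothing here iterates; each function either hits the empty-list base case or calls itself on the tail, which is the habit of mind the recursion comment is pointing at.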

Speaker 1:

Huh, I guess it makes sense. Yeah, just thinking about it from an education perspective, right, it makes sense to get that undergrad degree in computer science if you're going to go down that path, and then it probably makes more sense to take these onesie-twosie classes on developing core technology, rather than even going down the path of getting a full master's. Just get those courses, get that skill, and then build from there. Because those are skills that you can really build off of, right, and it'll transform into something else where you're using it with AI and building a model.

Speaker 2:

Well, every once in a while, because of the kind of stuff I do, I run into hybrid people. Typically younger people who like computers, people that have an undergraduate degree in computer science and then get a law degree. I've run into half a dozen doctors who started off as IT people and then went to medical school. And these people are killers, because they can do stuff none of their colleagues can do. So when they get out into that world, the legal world, the medical world, everybody relies on them for anything that looks like a computer. And I'm talking, like, litigation. Is the hospital going to buy a new $150 million automated surgical robot arm? Well, let's ask Joe, because he's got the computer science degree. Although you said you didn't, but even so. So I mean, I think that's very powerful. I don't see the specialization requirements anymore.

Speaker 1:

Yeah, that's actually very true. You know, I think this is kind of how I approached it. When I was getting started, I wanted to learn as much as I possibly could about everything. There wasn't a specific technology that I wanted to focus on, or a specific domain, or anything like that. And so I got experience, you know, with WAFs, and then regular firewalls and EDR systems. I have experience with all of the big EDR systems.

Speaker 1:

You know, a lot of people will say, I only know CrowdStrike, or I only know X EDR, right? I have a full spectrum of experience across almost every single domain in security. And then I went through and decided, okay, I'm going to specialize in cloud security. And now I'm kind of taking a step back, and I'm upskilling on the PhD side with post-quantum encryption on satellites, two things that have so many different facets to them that I've never touched before, while I'm also going back in my career and getting more broad, getting more generalized, and specializing in a few niche areas, but still building a stronger overall base of experience. Because, you know, something I'll learn in network security, or with a WAF, or whatever it might be, will benefit me in vulnerability management, and it'll benefit me in other areas.

Speaker 2:

Knowing the concepts will pay off throughout your entire lifetime, far more than memorizing tables or something like that. Understanding the concepts is really, really important, and I think that gives people survivability in the marketplace. So, something else to consider: when I went to college and got my first degree, in the late 70s, there was no computer science degree. You couldn't get one. You had to get a math degree. Carnegie Mellon, I've got a couple of friends who went there. They got math degrees and then they ended up being computer programmers, but that's all they could do. So we change job titles, especially in this country, every seven or eight or ten years. Look at what a lot of people do now in LA and New York and Chicago. If you go to a room of millennials, you know, at a bar or something, and say, what do you do, I guarantee you at least a third of them have professions that I may not even know what they are, and they did not exist 10 years ago. I'm an SEO specialist. Okay, well, what is that? I mean, I know what it is, I'm exaggerating, but most people my age wouldn't, and it's because the professions have changed. So, if you want to be futuristic for a while, put your hat on and think five, six, seven, ten years out. What kind of professions are we going to look at? Well, I bet a lot of them are going to have the word AI in them, and they're not going to be building AIs. They're going to be training AIs, or they're going to be servants to AIs. So when the AI needs like a cup of coffee or something, metaphorically, that's what you'll do, because they don't need us to do anything like this. They need us to feed them data, but they've already eaten all the data.
OpenAI announced, I think a week or two ago, that they've now looked at every single piece of data that they could possibly look at, and they're now building systems that generate false data that they can use for training the rest of the systems. Sounds goofy, but what that is, is machines that are now training themselves.

Speaker 2:

I mean, look at programming. I mean OpenAI, like the ChatGPT stuff. I'm sure you've tried to write programs with it. Everybody has. Oh yeah, they're not bad. I mean, they're not the most clever thing I've ever seen, but they work, they compile, and they do the thing they're supposed to do. So, not to get too spiritual here or anything, but you take that idea of technology and then you put drones and robots and Tesla on top of it. My Tesla has a summon feature that I am terrified to use. I tried it once in the middle of DC, and we were like two blocks away, and I hit the button, and then it comes barreling down Connecticut Avenue with nobody at the wheel. And this isn't Waymo, this is like a car driving itself with no real particular direction in mind. So when you start looking at that, I mean, what are we looking at? We're kind of looking at Skynet.

Speaker 1:

Yeah, yeah, that's a really good point. Where do you think AI security fits into the development of AI? I know that we talked about that offensive and defensive component, but when we're talking about models, it's a little bit different, right? Because you almost have to, you know, it's like you have to monitor what the model is consuming.

Speaker 1:

And you know, this is the thing, I don't know how else to explain it, right: you wouldn't want the model to look at Nazi Germany and say, that is good, that's something that we want to propagate. But you don't want to keep that information from the model either, right? And so you get into a weird dilemma.

Speaker 2:

Do you remember there was an AI that Microsoft did about 10 years ago that they had to pull off the market because it became a racist?

Speaker 1:

Yeah. You know what, I'm just looking it up while we're here. Yeah, Tay. T-A-Y.

Speaker 2:

So it did exactly what you're saying. They fed it a bunch of stuff and they didn't really constrain what it ate on the websites, and it hit a bunch of white supremacist sites, and then basically it was saying Sieg Heil, and it was saying a whole bunch of anti-Semitic stuff, and it used the N-word with people in casual conversations. So Microsoft shot it in the head and they never revived it again. Wow. That's not programming. That is its interpretation of the data that was input to it.

Speaker 1:

Right, right. So that's kind of where I think AI security comes into play, right, where it's more about monitoring what the model is consuming and trying to figure out. See, I always view it, and people at NVIDIA argue with me on this, as like a hierarchical AI model system, where you have an overarching AI model that you want people to consume and interact with, and then that AI model is fed by other models that are looking at specific topics. So it's almost like each of those models gets specialized into a certain area, like maybe world history, or European history, or, you know, sports, right, the finance industry. And when a model reaches a certain level of maturity, it starts feeding that upper-level model the information, for users like us to start querying it and building different things from. I think that might be the only way to do it. But again, you know, NVIDIA, those geniuses over there, they argue with me that that's not a great way.
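The hierarchical arrangement described above can be sketched in a few lines: a top-level model routes each query to a specialist sub-model matured on its own domain, falling back to a generalist otherwise. This is only an illustration of the routing idea under assumed names; the specialist functions and keyword table are hypothetical stand-ins, not any real system's design.

```python
# Toy sketch of a hierarchical model system: specialist "models" (here,
# plain functions) serve specific domains, and a top-level router picks
# one per query. All names and routing rules are illustrative only.

SPECIALISTS = {
    "history": lambda q: f"[history model] {q}",
    "finance": lambda q: f"[finance model] {q}",
    "sports":  lambda q: f"[sports model] {q}",
}

# Crude keyword routing table standing in for a learned router.
KEYWORDS = {
    "empire": "history", "treaty": "history",
    "stock": "finance",  "market": "finance",
    "goal": "sports",    "score": "sports",
}

def top_level_model(query: str) -> str:
    """Route the query to a specialist, falling back to a generalist."""
    for word, domain in KEYWORDS.items():
        if word in query.lower():
            return SPECIALISTS[domain](query)
    return f"[generalist model] {query}"

print(top_level_model("How did the stock market open?"))
print(top_level_model("Tell me a joke"))
```

In a real system the router would itself be a model and the specialists would be fine-tuned networks, but the division of labor is the same.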

Speaker 2:

Yeah, I would argue that too, actually. I mean, part of the problem here, and here's another weird comment that I don't think a lot of people make: we have hit, for the first time in computer lifetimes, the point where you can no longer backchain why a computer had an answer. We could always do that before. It might take a while, but if somebody said, God damn it, why did the computer do that, why did it shoot down that airplane? You know, a week later somebody's going to tell you why. That day is gone.

Speaker 2:

With generative AI systems, it is completely impossible to backchain those guys and get like a stack dump and find out exactly why they did what they did. And that's pervasive across this industry. Now, that was like the racist bot at Microsoft. They knew what it was, because they went back and looked at the websites that it was looking at.

Speaker 2:

But in the future, there's already so many of them, how would you know? And it's not like they have to look at whitesupremacist.com to pull that in. They can just go to Twitter, or X, or any of a number of other ones, and they can find all of that crap in free speech forums. So, from a cybersecurity viewpoint, to go back to your question, I don't think you can look at the input. I think you have to look at the output. So I think cybersecurity for AIs is going to be like you have an attack dog, and it's been trained, and you're walking around with it on a leash to make sure it doesn't bite anybody. And that is what AI security is going to be like, because you won't know why it's doing it, and there will not be a human-understandable correlation of causality between reading this post on X and deciding to use the N-word in a forum. You just won't know. So you just have to wait until it screws up, and then you have to roll up a newspaper and hit it on the nose. Huh.
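The "leash" idea above, guarding the output rather than auditing the input, can be sketched as a wrapper that screens every response before it reaches the user. This is a minimal sketch of the concept only; the toy model, the blocklist, and the withheld-response message are all hypothetical placeholders, not any vendor's actual moderation API.

```python
# Output-side AI moderation sketch: instead of vetting training data
# (the input), wrap the generator and filter what it emits (the output).

BLOCKED_TERMS = {"slur_a", "slur_b"}  # stand-in for a real moderation list

def moderate(generate):
    """Wrap a text-generating function with an output filter."""
    def guarded(prompt: str) -> str:
        response = generate(prompt)
        # The leash: inspect the output, not the model's internals.
        if any(term in response.lower() for term in BLOCKED_TERMS):
            return "[response withheld by output filter]"
        return response
    return guarded

# A toy "model" standing in for an opaque generative system.
def toy_model(prompt: str) -> str:
    return "echo: " + prompt

safe_model = moderate(toy_model)
print(safe_model("hello"))           # passes through unchanged
print(safe_model("say slur_a now"))  # caught at the output stage
```

Real output moderation would use classifiers rather than string matching, but the architectural point stands: the filter needs no visibility into why the model produced what it produced.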

Speaker 1:

That's, it's fascinating. I feel like we could go for another hour just talking about any of the 10 topics we just dove into. You know, unfortunately we're almost at the end of our time, but I really want to dive into the stuff that you're working on now. You were discussing building, or working on, you know, Web3 and post-quantum. So talk to me a little bit about that, because I don't want to butcher it, and this can get pretty complex.

Speaker 2:

No, actually, I wanted to do that. So I'm working with a company called Naoris. It started in Portugal, but it's a global company, and it's a Web3 company. The founder developed some really cool security approaches, where you have these little lightweight processes that can be very quickly ported to any device, you know, routers, computers, whatever, and even IoT devices. And then, when there's a possible attack, or some suspicious-looking packets start coming in, instead of going out to like Microsoft or CrowdStrike or something and saying, is this okay? What it does is, it has a blockchain attached to it and it takes a vote, and not just its own computers, but other networks that are in, like, the big meta network. So some computer in Berlin, you know, will vote on this based on their profiles, like virus profiles, right. So there'll be that kind of thing, and it works really, really well.

Speaker 2:

And the demo we've been using is we have a robot arm, and we hit it with an attack, put a virus on it. Then, when we do it again with our system running, it deflects the virus and won't accept it as input. And so now we've added post-quantum onto that. The attractive part about this system, to me at least, is it's decentralized. So if you're a company and you buy a system like this and you run it and something goes wrong, it's your IT guy's fault, it's not Microsoft, and I think that's very empowering. That's interesting.

Speaker 1:

We spent the whole time talking about how that control, or that empowerment, is going away from us and more towards technology, or these thousand-pound gorillas in the industry, and it's interesting how this brings that ownership back to us, almost, in some ways.

Speaker 2:

Well, I think I mean I'm kind of an anarchist at heart really. You would never know from my background, but whenever I see things getting too institutionalized it gets my hackles up. And the government never bothered me because I'd been in the government and the government's fundamentally incompetent, no matter who's president, and they can't really do things. They say they're going to do things, but it takes them like a decade to do almost anything. The thing to worry about is guys like Zuckerberg, you know, and those people, the billionaires, that are not stupid, that have lots of assets. Elon Musk is probably a better example, because he'll do almost anything, potentially, if it suits his interest. I worry about those. So the more we take our technology out of these people's hands, the better off we are.

Speaker 1:

Yeah, well, it also enables us to maintain our own privacy right, which has been something that you know doesn't really exist.

Speaker 2:

I wrote a couple of books on this. I wrote a book called Privacy Lost; it's still on Amazon, it's been there for 14 years. I predicted a lot of this stuff, and I think that the trick to privacy is you have to accept the idea that the old definition of privacy is irrelevant. Privacy is not binary, and baby boomers and Gen X people talk about it that way. They go, oh, I lost my privacy, oh, I got my privacy back. It's not virginity, it's not like that. It's not binary. Privacy is like uptime on a network, you know, 99.9999. It's like four nines, or three nines, or two nines. That's what privacy is, and you have to expend a certain amount of energy and time and money to achieve each granularity level of privacy. But people don't want to spend that money, because they think they're entitled to it anyway. So that's going to be a problem too.
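The "nines" analogy above comes from network availability, where each extra nine is disproportionately harder to buy. A quick computation makes the scale concrete: translating each level of availability into allowed downtime per year.

```python
# Each "nine" of availability cuts permitted downtime by a factor of 10,
# which is why every additional nine costs so much more to achieve.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes(nines: int) -> float:
    """Minutes of downtime per year permitted at N nines of availability."""
    availability = 1 - 10 ** -nines   # 2 nines -> 0.99, 3 -> 0.999, ...
    return MINUTES_PER_YEAR * (1 - availability)

for n in range(2, 6):
    print(f"{n} nines: {downtime_minutes(n):8.2f} minutes/year")
```

Two nines allows about 5,256 minutes (over three days) of exposure per year; five nines allows about five minutes. Read "downtime" as "privacy leakage" and the analogy in the conversation maps directly: each tighter guarantee demands an order of magnitude more effort.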

Speaker 1:

Yeah, yeah, that is definitely going to be a problem. I feel like, to some extent, that whole debate, that whole talk, has almost been pushed to the back burner in some ways. You know, I always remember the first time I went to Germany for a study abroad in college. When I was on the plane, the news had just broken from the Snowden leaks that we were spying on Germany, right. So when I get off the plane, I have a connecting flight to make to Berlin. I'm in Germany, in Dusseldorf, right, and I'm going through customs, and this guy doesn't want to stamp my passport. And I'm sitting here like, hey man, I have a flight in 20 minutes. I have to run across an airport in Germany. I don't know where I'm going to catch this flight, and hopefully I get the right one, right.

Speaker 1:

And so I started to argue with him, right, and eventually, because there were no TVs around me or anything like that, he just stamped it. His boss came over and stamped it. And by the time I get to my gate, I sit down for five minutes, and I see, America spying on Germany since whatever year. And I was like, that's not good for me, because now I just came here and I yelled at that guy, and they're probably looking at me a different way now. But I mean, of course we were.

Speaker 2:

Everybody spies on everybody. And you know, it's like this TikTok thing, which is absolutely ludicrous. I mean, it's not ludicrous to think that TikTok is gathering personal information; it's ludicrous to think they aren't, and in fact, I would be shocked if they weren't doing that. And guess what? I bet Meta does it, and Instagram and Facebook, and I bet Elon Musk does it with X, and I bet every one of these social media platforms does it. Microsoft does it. Surely you've noticed this: software you used to buy outright, like Microsoft Office or Adobe Photoshop, they have switched to these serialized license models, which require a lot more information from you. So not only do they want the money, they also want the information.

Speaker 1:

Yeah, well, these products, you know, can be free to some extent because we're the product. You know, they're taking our data and selling it to whatever broker, and it's a mess. And I don't know how we come back from this without having something like Web3 widely deployed, widely accepted, and building from there.

Speaker 2:

That's why I'm interested in Web3. I mean, my basic meta thought on this is, I think individuals need to be armed with cyber weapons. Like, when I was at Network Solutions, I was running a thing called the internet, which is the DNS system and other stuff, and I had to defend against the first, as far as I know, institutional denial-of-service attacks. Big ones. No one had ever seen one before, and they were really stupid, and anybody today could have stopped it. They were just smurfing on some broadcast address. But we had to decide what to do, because there was no precedent and no policy. And I made the decision: let's find out the IP address, and let's blow them out of the water. And we did, and that was my approach. If somebody did that to us, I would find out what network they were at, and I would blow their network out of the water.

Speaker 2:

And then I didn't have to worry about smurf attacks. I don't think you could even do that now, but yeah.
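For context on the attack just described: a smurf attack spoofs the victim's IP as the source of ICMP echo requests sent to a network's broadcast address, so every host on that network replies to the victim at once. The telltale sign, pings addressed to a broadcast address, can be checked with only the standard library. This is a simplified educational sketch; the packet fields here are plain arguments, not real captures.

```python
# Detecting the smurf-attack signature: ICMP traffic whose destination
# is a network's broadcast address. Real detection would parse live
# packets; here the fields are passed in directly for illustration.

import ipaddress

def is_smurf_candidate(dst_ip: str, network_cidr: str, proto: str) -> bool:
    """Flag ICMP traffic aimed at the network's broadcast address."""
    net = ipaddress.ip_network(network_cidr, strict=False)
    return (
        proto == "icmp"
        and ipaddress.ip_address(dst_ip) == net.broadcast_address
    )

print(is_smurf_candidate("192.168.1.255", "192.168.1.0/24", "icmp"))  # broadcast ping
print(is_smurf_candidate("192.168.1.10",  "192.168.1.0/24", "icmp"))  # ordinary ping
```

Modern routers drop directed-broadcast traffic by default (per RFC 2644), which is why, as noted in the conversation, the original attack would be trivial to stop today.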

Speaker 1:

It's like Battleship. Yeah, it's like playing Battleship.

Speaker 2:

It absolutely is, and that's how things are with the Chinese in the US right now too, right.

Speaker 1:

Well, david, you know we're at the top of our time. I try to be very conscious of my guest's time. You know, when I say it's an hour, it's an hour. But I mean this conversation has been very fascinating, very engaging, and I definitely want to have you back on.

Speaker 2:

Yeah, well, thank you, this has been a fantastic time. I've enjoyed it too. And I looked at some of your other episodes, the guests you've had on, and I mean, the stuff you're doing is really relevant right now.

Speaker 1:

Yeah, I try to be. You know, I don't want to put out dated information. I want the podcast to actually have value and show value to my listeners. Well, David, before I let you go, how about you tell my audience where they can find you, and where they can find the company you're doing this great work with, if they wanted to learn more? The company is called Naoris.

Speaker 2:

It's a Portuguese word. It's N-A-O-R-I-S dot com, naoris.com. You can find me at davidholtzman.com, or globalpov.com is another website I use, and my email address is on there. If anybody wants to reach out, I'm pretty accessible.

Speaker 1:

Awesome, awesome. Well, thanks, David, for coming on. I'm definitely going to have to have you back on. And thanks, everyone, for listening. I hope you really enjoyed this episode, and more to come in 2025. Thanks.