Security Unfiltered

Charting the Digital Frontier with a Father's Insight

February 26, 2024 Joe South Episode 144

The fabric of family bonds intertwines with the digital threads of our era as JB Benjamin, the CEO of Kryotech and a man who has navigated the British family legal system to secure custody of his son, joins us to share his gripping tale. With a narrative that explores the influence of fatherhood and the societal shifts shaped by technology, JB Benjamin's insights turn from the heartwarming to the hilarious as we unpack the oddities of online content moderation. His background in IT and security is rooted in the politically charged atmosphere of Birmingham during his youth.
 
 Journeying through the varied landscapes of JB Benjamin's past, we touch upon the ethical crossroads where technology and personal integrity meet. From his early programming days to the dark corridors of data exploitation, through his experiences in sales, debt collection, and even a dash of filmmaking, JB Benjamin's path to higher education and pioneering efforts in adaptive AI resonates with the critical thinking needed in our digital age. As tech CEOs loom large over discussions of privacy and identity, JB Benjamin's candid reflections on these topics serve as a beacon for those navigating this brave new world.
 
 As we peer into the future, our conversation traverses the vast frontiers of government surveillance, data storage trends, and the revolutionary advances in space technology and quantum cryptography. We confront the pressing need for online data preservation and the archival challenges in the digital epoch. This episode is nothing short of a treasure map, guiding you through the intersection of technology, security, and human experience, with JB Benjamin as our experienced navigator.


Affiliate Links:
NordVPN: https://go.nordvpn.net/aff_c?offer_id=15&aff_id=87753&url_id=902


Follow the Podcast on Social Media!
Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast
Patreon: https://www.patreon.com/SecurityUnfilteredPodcast
YouTube: https://www.youtube.com/@securityunfilteredpodcast
TikTok: Not today China! Not today

Speaker 1:

How's it going, JB? It's great to get you on the podcast. You know, I don't know how long this thing has been in the making, but it feels like forever at this point.

Speaker 2:

Hey, it's great to be here. Yeah, unfortunately it's been some crazy times between the initial booking and actually getting here. I can actually tell everybody: I was fighting through court for my younger son. After years in court and the cost of a small house, I now have all four of my children living with me.

Speaker 1:

Oh wow, that's a huge accomplishment.

Speaker 2:

Yeah, I would actually say that, given what I have seen and experienced in the British family legal system, it's more of an accomplishment than building my three tech companies, to be honest. Yeah, absolutely.

Speaker 1:

I mean, all that you ever hear about, at least in America, is like, hey, you better not get divorced when you have kids, because if you do and you're a man, you're never going to see that kid again. There are so many ways that they can just, you know, completely screw you. It's insane, it's absolutely crazy. So I really, after being a dad now for the first time, I have a new appreciation for the influence that the dad has in a kid's life, no matter how young.

Speaker 2:

Yeah, well, I've been there. I was there when the children literally came out of the oven, there with the catcher's mitt, literally, for all four of them. And, ladies and gentlemen, you don't necessarily want to see that.

Speaker 2:

Yeah, no, no, the PTSD look going on. But it is great what you build if you are a father that spends a lot of time with his children and you're there, like really there, from the formative years. So my oldest daughter is 18, and she calls me her bestie. That would not happen if I wasn't there, if I hadn't been there all the time, you know. And yes, there was a long time where they had no access to me. Unfortunately, it is what it is, and it had an effect. You know, I can't go into details of what the effects were, but most of my kids do go through counseling.

Speaker 2:

So you know, this is the thing that parents need to realize: when their kids are little and they think, oh, it doesn't matter, little Johnny isn't seeing it when I'm giving my partner daggers. Trust me, little Johnny is seeing it, and little Johnny is being affected by it. And little Johnny is going to end up growing up giving his own partner daggers, thinking his kid ain't seeing it, and it just carries on. So, I mean, I've got to be honest, I'd hate to be a kid now, in this age. And that's horrendous, because it's the 21st century we're living in, what I used to see as the Star Trek future when I was a kid. You know, because I was born in December 1980, in the last batch of the Gen Xers, and I remember Star Trek and all that cool stuff, and I thought, yeah, we're living the Star Trek future now. It ain't great for kids, though, you know. Yeah, it's pretty terrible.

Speaker 1:

I don't know. I still haven't figured out how I'm going to try and introduce, you know, the internet to my kids. Well, my kid's 11 months, so I have some years, right?

Speaker 2:

No, no, here in Britain you'll see women pushing prams along, and prams, by the way, are getting sold (you've got to commend this) with tablet arms already on them. So your child can be soaking in all of the loveliness of YouTube Kids, which includes such classics as the hangman's song and other things.

Speaker 1:

It's interesting how it made it past YouTube's, you know, impenetrable AI that catches all of this, right?

Speaker 2:

Yeah, yeah, but if you talk about something about sex and relationships, the YouTube AI catches that quickly enough. But it won't catch that somebody has managed to get Peppa Pig to slice her dad's throat and decapitate him in a cutesy animation. It's like...

Speaker 1:

God forbid you say COVID or something like that, right? Like, this episode is immediately demonetized.

Speaker 2:

You know, you just got yourself demonetized, right there, that's it. "I'm not even monetized." You never will be now. Hold on, hold on, hold on, hold on. He's applied for an AdSense account. No, we can't give him that. He mentioned COVID. Doesn't matter what he said about it, he mentioned it.

Speaker 1:

Yeah, he's not a doctor, right? Not a licensed physician?

Speaker 2:

Not that that really means anything. But anyway, we've gone so weirdly off tech, man.

Speaker 1:

Yeah, yeah, that got out of hand quick. So, yeah, JB, I'll be honest with you, I didn't look too much into your background. So why don't we start with your background of how you got into IT, how you got into security? Maybe what piqued your interest? Was there a certain event or something early on that piqued your interest and led you down this path?

Speaker 2:

Well, as I said, I'm a kid of the 80s. I grew up in the middle of the poll tax riots era. I grew up in Birmingham, in Bordesley, you know, where the Peaky Blinders are from. I grew up at protests, going to protest marches for freeing the Birmingham Six. So, and I hate to say this by modern standards, I had a really woke childhood. Oh dear, you know. I was reading philosophy and psychology by the time I was five and six, reading Bardsley, Art and Dekker.

Speaker 2:

I was going to see Shakespeare plays performed at the Royal Shakespeare Company in Stratford, even from that early age, and then being forced (and I say forced because no five or six year old writes essays voluntarily, trust me; five and six year olds do not engage in any form of critical thinking voluntarily) to write essays. That basically created somebody who, you know, has got a lot of critical thinking, looks at stuff through a logical lens, and will deconstruct and break stuff down. So, in terms of security, and how we are able to protect our individuality, which is what security really is, I was kind of always involved in that in some form or another, even as a kid. I mean, when you have a childhood where you don't play with toys, where you have science and chemistry sets, microscopes and telescopes, and that's all you get for Christmas along with books and literature, it makes you hyper-focused. I mean, like laser-focused on stuff. In terms of IT, I got into that really early. I did my first programming diploma when I was 13 with the International Correspondence Schools. I did that in BASIC: Beginner's All-purpose Symbolic Instruction Code.

Speaker 2:

I got a job when I was 13, working for a computer company in Coventry, which then became Gigante Computers. Shout out to Stephen King, if you're seeing this, and Gigante Computers, where are you guys? And even when I was working there, I was really interested in how do you secure them, how do you protect them, how do you stop people from taking the data out of these machines and doing stuff with them?

Speaker 2:

As an early user of the internet and computers, I saw very quickly the downsides to how this stuff could be used. Bear in mind, also, working in computing in that era, I ended up working for people like Time and Tiny Computers, and I started seeing how people were exploiting customer data to market even credit agreements and 0% finance deals, and how they were trying to aggressively push this. I was a killer at sales, and I ended up leaving sales even though I did really well in commission. I left sales because sales struck me as lying to people.

Speaker 2:

I mean, when I was working in computers, I was selling credit agreements to families who wanted a new Packard Bell computer, which was costing, like, two or three grand. Bear in mind, back in those days you did credit agreements on pen and paper, by the way, and you had a phone, a place you'd ring up, and they actually told you over the phone if this person was creditworthy; none of this instant-decision stuff. So you'd end up having conversations with these people, you know, while you're doing the credit agreement, and you would learn really quickly that these people could not afford this. They couldn't afford it. Bear in mind, these people were being sold a credit agreement worth a couple of grand and being told, don't worry, you don't have to pay for it now. But that was everything about the 80s and 90s: don't worry about it, you don't have to pay for it now, it's cool. Think of the now, don't worry about the future. I'd be sitting there talking to these people, seeing how their kids are, and I'd be thinking (well, I'm a young man, like 18, 19, at this time), even then I'm thinking, I can't do this anymore, because I can already see what these people are going to go through.

Speaker 2:

They're going to go through debt collection straight away. These people, six months in, when they see the 200-pound-a-month bill come in for this computer that is already covered in Pot Noodle, Super Noodles and all kinds of schmutz, that has already been dropped five or six times, and little Johnny has already put a cookie or a Jammie Dodger into the CD tray. This thing is already busted up and practically broken, and now they're on the hook, not just for a couple of grand but also for the interest on that, because they missed their six-months buy-now-pay-later window. They missed the interest-free component. And not long after that, I got a job working in debt collection.

Speaker 2:

So I saw both sides of the cycle: I saw the profiting from it, and I saw the back end of it, what happens when it goes wrong, and I just couldn't do it. So I was like, well, I'm not doing sales, I'm going to go into my other love, filmmaking. But that didn't work, because in Britain the only way you get money for filmmaking is if you're going to make a delightful rom-com with Hugh Grant going "I'm so delightfully British", or you make kind of a hood shoot-'em-up: Blacks and gangs, Kidulthood, Top Boy. Any of those kinds of things are the only things you get money for, and I've no interest in that. I mean, I speak the King's English, for crying out loud.

Speaker 2:

I was educated in the three Rs, kind of Eton style, and I'm so far removed from that. So I was like, okay, that ain't going to work. So I pivoted, and at college I did moving image design and then 3D animation, and when I was at university I continued that. I did computer visualization and animation, and I did my dissertation in adaptive artificial intelligence back in 2010, when artificial intelligence wasn't a thing. And everybody called me a madman. It's like, "JB, you're wasting your time. You're not going to see AI able to do this stuff, JB, in your lifetime." Right. Yeah, we're going to get to how that really kind of irritates me.

Speaker 2:

But anyway, after that, going into animation, I got jobs in animation. I started working everywhere. I worked in website design, UI, UX. I've worked as a product manager on a 9-million-pound project for one of the biggest corporate law practices in the world. I've been a senior lecturer of computer science at my own alma mater, so I've seen the education system. Oh my God, that was eye-opening and disappointing. And I've been in ed-tech; I've been a tutor during COVID and after. So that was very interesting and enlightening. But what got me into building my own apps, actually, was, funny enough...

Speaker 2:

What we spoke about earlier. I would love to be able to say the story of Vox Messenger was that I had a passion and saw that we had to change the world. No, no. I had baby mama problems, like every other black guy, and I did not want to end up joining Fathers 4 Justice. Dressing up as Batman? I may bear the sign of the bat, but I don't dress up as him, hanging off a bridge going "fathers for justice". There are better ways of doing things, you know. So, how am I going to occupy my time? If my access to my children is going to be refused, I might as well do something with that time. So thank you to the mother of my children, because this $85 million value company would not be possible without that.

Speaker 2:

So I built Vox Messenger, and the reason I built Vox Messenger is because I saw how everybody's communications were being exploited for their data. There is nothing more cynical than giving somebody free messaging and then using the content of their messaging to exploitatively direct targeted marketing and ads at them. Now, it was already bad when it was commercial ads, but now we have political targeting. Yes, ladies and gentlemen, thank you Facebook, thank you Cambridge Analytica, for setting the trend. Now we have direct marketing of all of our political interests at us because of what we put on Facebook, what we like on Twitter (sorry, it's X), you know, all of that stuff. All of this is used to manipulate us now, and unfortunately, when I saw this, I realized very quickly, as I started moving through business, going through incubators and all of these things and getting my own funders and angels, that the people to blame are the tech CEOs, because ultimately they do control this.

Speaker 2:

I know that everybody would love to say, "you know what, I'm really sorry, guys. I'm really sorry I deplatformed so many people. It's not my fault, you know, I've got investors and shareholders, man. I'm really sorry." You know, I mean, you can ask any of my shareholders and investors. They would all say, "hold on, what? Try saying that to JB. You kidding me? We don't bother anymore." Because I'm the CEO and I'm the leader of my ship. I am the king of my castle, and if I have a shareholder or investor who I believe for one second is going to tell me how I'm going to run the company, supposedly for the best of my consumers, and it turns out it's not for all of my consumers, guess what? They're not going to be investing in my company.

Speaker 1:

Right, yeah. You know, it's a crazy place, especially this year, at least in America, right, with it being an election year. It's very heated; it's going to be very debated. Everyone is calling for this year to be a crazy year, at least in America.

Speaker 2:

In the UK, by the way, just so you're aware, here in the United Kingdom we have a massive election happening. Not only do we have our prime minister being picked, but every single borough has to elect two brand new councillors. So we have huge elections going on, and both of them are being manipulated by pretty much the same groups of people, funnily enough.

Speaker 1:

Yeah, it's crazy, because if I go on my feed, you know, Facebook, Twitter, whatever it is, all I see, literally all I see, is the extreme parts of the side that I view myself as being on, and I see nothing of the other side. I only see one side, at its worst, basically. That's what I see, and it's just so frustrating, right, because I try to live in the real world, where it's not red or blue; there's a whole lot of gray. There's a whole lot of gray in there, and the truth is somewhere in the middle, typically, you know.

Speaker 2:

I would say the truth just moves around the freaking place, man. Seriously. I mean, the other thing that people need to realize as well is we are so much bigger than the countries in which we live. You know, the whole world around us influences everything that happens around us. So, you know, if we are voting for people who are really thinking in an incredibly tiny, insular kind of way, we cannot be surprised when our country behaves that way either. I mean, in the United Kingdom, we always end up with a right-leaning or right-centric government, even though the general populace in Britain is actually really socialist.

Speaker 2:

But we never get a centrist, left or left-leaning government in, because we don't have proportional representation. We have a first-past-the-post electoral system, and trust in it and in its politicians has been so eroded by mainstream media that, normally, during a general election, you'll be lucky if about 10 or 20% of the population even bothers to vote. Which means that out of that 20%, only a tiny proportion of them are actually far-right or right-centric.

Speaker 2:

It's ridiculous. It's like the Brexit vote for Britain to come out of Europe. The decision for Britain to come out of Europe was decided by less than 6% of the population. So, trust me, you guys have got it bad. And, I hate to say this, given that we're talking about tech: tech people can help. Now I'll give you an example. We have an example in the industry. We have the amazing OpenAI Sora text-to-video model that's just come out. You've seen that? No? Okay. So basically, this thing is Midjourney on crack. It allows you to generate high-definition rolling video from a text prompt, from nothing.

Speaker 1:

Huh.

Speaker 2:

Yeah, what? Yeah, if you're on Twitter, trust me, you'll see it everywhere. OpenAI's Sora text-to-video. It is incredible. But the thing I would say to Sam Altman is that his timing couldn't have been worse, because he is literally launching into the world a tool that can create deep fakes instantly, with no technical knowledge required, during two really important, divisive election periods. I mean, this would be one of those times where, as a tech person, you would sit back and go: oh, you know what, guys? Okay, sorry, investors, I know you're desperate for us to make some revenue, but we also have to be socially responsible here. We have elections coming up. We can already see that almost all of the platforms are picking a side. Guys, we've already said to the world that we believe AI to be dangerous. Let's put our responsibility hats on and delay launch until at least three to four months after these elections. But no, it's rushed out there.

Speaker 1:

Yeah. Where do you see all this going? Because I feel like it's just straight chaos and there's no real clear end picture. There's no clear end goal. What's the end goal of all of this? I feel like we're kind of just stumbling through this new chaotic, probably the most revolutionary era that the world has ever seen, right? With AI, we're just scratching the surface right now, and we're already running into these insane situations where social media is being literally weaponized and targeted against governments' own citizens, whether it's by the government, or by a foreign government, or by internal adversaries. It is literally being weaponized. I experience it every single day. There's a reason why I haven't posted on Twitter in forever: I try to stay off of it. I can't even have Instagram on my phone, because I found that it was so addictive for me to just keep on scrolling. Doomscrolling.

Speaker 2:

I was having this very discussion in another interview earlier this evening. I actually classify doomscrolling as a mental illness, because it does become addictive. It's like you end up with an endorphin hit while you're doing it.

Speaker 1:

I was spending hours on it, and then, when I looked at the screen time calculator or whatever, I was like, oh, I need to uninstall this. And somehow it isn't as addictive for me as Facebook, or even Twitter to some extent. Somehow, Instagram was the platform that would just capture my attention and I'd never stop. Do you know why? No, I really don't know. I haven't looked into it that much.

Speaker 2:

It's a function of three components. So there's a couple of things happening when you use Instagram which don't really happen so much with, say, Facebook or Twitter, even on your mobile phone, which is that when you're using Instagram, you are predominantly focusing on moving image. Moving image that is running at a very high frame rate, and on top of that, that is being combined with a haptic motion, a repeated haptic motion. Now, if you know anything about neuroscience, you'll know that neural pathways are strengthened by continual use. So as you do this, you're creating this repeated, strengthened neural pathway that becomes associated with seeing flashy video imagery, which is giving you an endorphin hit. Now, Apple have tried to plug into this by replacing the mouse with the thumb-and-forefinger tap, because this is a very high neural-strength area. Again, it's the same thing: anytime you combine motion, moving image and haptics, you create a strong neural inference capability. It's also very addictive. It's also programmable; it becomes a reverse-programmable behavior which can be leveraged. People have already demonstrated this.

Speaker 2:

Apple engineers, when the Apple Vision Pro came out, were so impressed with themselves that I think they revealed a little more than was initially intended, because it's not really in their marketing: because of the way in which the UI is designed, combined with the haptic feedback, they're able to deduce your intent before you're aware of your intent, and they can guide your intent to click on or look at iconography. Now, if you break that down, what that basically means is they can effectively do a subtle form of behavioral modification and behavior control using it. Be very aware of, and cautious of, anything that connects your eyes to a continual haptic interface. These are programmable and controllable things, because they become subconscious.

Speaker 1:

Wow. I mean, this is branching into a new area of security, almost, right? I was talking to Chris Roberts, and he was talking about how he was hacking his brain to, you know, want things when it shouldn't actually want them. Right, like, he'll have a cup of coffee, he'll be satiated with that, and then he'll replay whatever brainwaves were, you know, happening when he had the coffee.

Speaker 2:

Neuroadaptive feedback, so you can treat it, right. Yeah, in fact, my co-founder, one of the companies that he sold actually does something similar: it allows you to have a psychedelic experience and then uses neuroadaptive feedback to get your brain to re-experience those points, those proximal points. So neuroadaptive feedback is incredibly powerful but, again, incredibly dangerous. And this is from when I was teaching. I've taught cybersecurity; I've been a senior lecturer of computer science at Ravensbourne University, London.

Speaker 2:

I was teaching computer science, and I was also teaching network security admins, and I noticed that in the cybersecurity field nobody teaches behavioral psychology. And you should teach behavioral psychology, because with the convergence of virtual reality, augmented reality, the metaverse and spatial computing, we are creating new attack surfaces and new attack vectors. And the attack surface and attack vector is you: your eyes, your brain, your ears, your touch, your haptic and neural feedback, and your adaptability. And it's all attackable. I'll give an example: it's been demonstrated that by using a VR headset you can get a person to feel pain without physically having to give them pain. Really. Now, can you imagine? Oh, you know what? Now...

Speaker 1:

when I was.

Speaker 2:

Imagine what you could do with psychotropic drugs, a blackout flotation tank, being suspended and then being put into a photorealistic 3D copy of your household environment. You could, theoretically, be incepted. In fact, it would be a good way. It basically means that we have, right here and now, with off-the-shelf technology, the ability to potentially do some very dangerous, evil things connected to data extraction on humans, and this technology is freely available all around us.

Speaker 1:

That is really fascinating. So, my buddy. I always end up getting whatever the Quest VR headset is, because there's always a vendor at RSA or DEF CON or Black Hat that's giving them away. So it's like, okay, I'll do this 30-minute meeting, get this new headset and see what it is. I always put it down after 10, 15 minutes, because, honestly, it's not that impressive to me. But somehow my buddy always gets the PlayStation VR headsets, right. So I'm playing one of the games. I played it with the PSVR 1, and it was a fantastic experience. I still say, compared to every other headset before it, it was the best VR experience. And then he had the PSVR 2, and I'm playing it, and I realized that when the wall hit me, or whatever, right, or when I got shot in the game, I physically reacted as if I got hit. I mean, I fell to the ground, I was so convinced.

Speaker 2:

Did you notice? The longer you played it, the more and more intense the reaction became as well.

Speaker 1:

Yes. And I was driving too, and I was positioning my body as if I was trying to counteract the G-forces, and I'm sitting in a stationary chair. This is a four-legged chair; it's not turning, it's not moving, you know, and I'm sitting here trying to fight the G-forces as if there are G-forces being applied to me. And I walked away like, what the hell did I just experience? I was so, like, I don't know, just confused and interested and also half scared, because it's like, what is this?

Speaker 1:

Yeah, yeah, what is this doing to me really, you know?

Speaker 2:

It's how the human brain works when you put a VR headset on, and you can demonstrate this. This is a very simple test that anybody can do in their living room. Get a Meta Quest 2, or any Meta Quest, and put it on. There's a game on there, I forget what it's called, but you're kind of like a robot that's floating around capturing a frisbee thing. It's free on the Meta Quest. Jump in the game, get used to the flying around. So cool.

Speaker 2:

Then hand the controllers to your colleague, associate or friend. They will not be your friend after this. Then you basically just sit down for a couple of minutes (actually, it normally takes about a minute). You just sit down and let them control it whichever way they want to control it. Now, after a period of time, you'll notice that your brain completely dissociates from your body. In fact, you'll find that your brain dissociates from your body in under a minute, and the movements, especially if they're evil assholes with you and they jerk you around the place, will literally make you vomit. Wow. In fact, the longest I could do it with somebody else holding the controllers was 10 minutes. I came out and, I mean, it was worse than when I did three weeks of army training. My brain was shattered. It took like 40, 50 seconds for me to get the fluidity of my body back and feel like I was back in my body. Now imagine: given that you can do that with a game just by taking the controllers away and handing them to somebody, can you imagine what, say, the founder of Anduril could do with an unlimited NSA budget? Hence why he put an explosive device on the front of his Oculus so it blows your brains out when you die in the game.

Speaker 2:

Bear in mind, this tech is already out there, and there are people with infinitely larger budgets. I hate to say it, but at the beginning of all of this was allowing our data to be captured for ad revenue. Now people are thinking to themselves, well, it's okay, we've got the EU, they've changed the laws, we've got it in America, we've got the California laws; it's very hard to make money with advertising revenue now. But all we have done is replace ads revenue with AI. The latest excuse for having unfettered access to your data is: oh my God, wouldn't you like an AI to make it easier for you? Don't worry about what we're doing with the data. Don't worry that we're a company that comes out of nowhere. Trust us. Here, have my little AI device, give us your data. And people again are falling for it. They're forgetting that we did this before. We already did this.

Speaker 2:

We have already been through this age, and it was the beginnings of Facebook and social media. We gave up our digital sovereignty in the hopes of digital protection and having an amazing social experience, and instead, what did we get? We got mental well-being issues up the wazoo and every government in the world knowing more about us than our husbands, wives and children do. And AI is becoming the same excuse at the moment. I see it everywhere. I'm seeing them put AI into literally everything. Only, nine times out of 10, putting AI in your product doesn't actually improve it.

Speaker 1:

Wow, that's like unlocking a totally new I mean, it's a totally new way of capturing data and making money off of it. But the data that you're capturing is like I feel like that's more personal than the data you put into Facebook and Twitter, because it's your brain. It's how your brain works.

Speaker 2:

You know, there are 28 data points around your eye, which means from these 28 data points they can learn what turns you on, what you hate, what you love, what you desire. This is dangerous information for a corporation or a government to have, particularly without your permission.

Speaker 1:

Well, that also opens up a totally new attack surface for, say, government employees, right? Imagine if you're someone that has access to highly sensitive material at some intelligence agency, and you're genuinely a good person, and all that you did was put on an Apple Vision Pro to interact with the world around you, or watch a movie that's highly immersive, or whatever, and China's over there taking that information to emulate your retina when you go to the retina scanner at work, to get in the door to see the material.

Speaker 2:

The problem is, yeah, the narrative is correct, but you picked kind of the wrong boogeyman. Unfortunately, this is the thing. One of the things you learn really quickly in cybersecurity is that the boogeymen who you're told are the boogeymen aren't actually the biggest boogeymen in the room. Bear in mind, all of you guys in the United States gave up all of your data privileges, and it was called the Patriot Act. Yeah, it's not China who has the biggest unfettered access to your data. It is actually your own government. Bear in mind, they built an entire AI called Sentient. I mean, this is the thing that blows my mind about the hypocrisy. If you go onto Google, type in "DoD Sentient AI", and one of the things you'll find is that nobody admits the existence of it, except for a few declassified documents that indicate the United States government has a program called Sentient, where they plugged in every telephone call, email, text message, everything, into a single AI. Kind of like out of...

Speaker 1:

Westworld.

Speaker 2:

This thing, in a previous report, was shown to be able to retask satellites to look for people. So, yes, you're all on about China. Sorry, nah. It's just the same in the United Kingdom. In the United Kingdom, they've passed the Online Safety Bill and they're changing the privacy laws. So if you're somebody like me who has a tech company, I'm apparently meant to be okay with the fact that the British government can, by their own laws, legally say: we want your customer data, and we don't need a warrant.

Speaker 2:

Why do you think I moved all my companies to Ireland?

Speaker 2:

Yeah, it's scary, man, when you start seeing the tech that is being used on us by the people we pay our taxes to, used in a way that is apparently meant to be just the way the enemies use it on us. But it's not. They want to make us vote for who they want us to vote for. It's not China that's making you pick a decision on who to vote for. It's the advertising agencies. By the way, I think at one point here in the UK you had the same advertising agency working for the Labour Party as did the Tory party. It's wild.

Speaker 1:

It was the same thing here.

Speaker 2:

They all use the same consultants. That's the reason why it's mind-blowing to me that people even believe there's a difference. I mean, I don't know American politics personally, but here in the United Kingdom there is no difference between either party at all. They even have the same funders and donors, for crying out loud. It's just, yeah.

Speaker 1:

Okay, so this is a fascinating, really engaging conversation. You bring up a really interesting point, and so now I'm trying to deconstruct how I was programmed, because you bring up a very valid point: the US government is using the data from its own citizens against its citizens, more than probably what China is doing, right, or Russia, or whoever.

Speaker 2:

Name the enemy, who knows? But the point is they are doing it.

Speaker 1:

Right. And I'm saying that's information that I know, information that I have said before, but somehow it didn't come to mind when I was making the statement that I did.

Speaker 2:

Dude, it's weird how we're programmed.

Speaker 1:

So it's like how am I programmed with that? You know what I'm saying.

Speaker 2:

Yeah, I know, but it's subtle, isn't it? It's just there and you're like whoa, where did that come from? I didn't realize that. Dude, it happens in all of them.

Speaker 1:

It's a trickle too. It's like 1% here or there, right, and it's not every day either, right? So it slowly fools you over time to think a certain way, to act a certain way, to say whatever, and we're going into a weird phase of the world that we're not going to be able to come back from.

Speaker 2:

Well, here's one that's more interesting, that I would suggest for you. This is something for all of your listeners to perhaps take note of. So, as you know, we have large language models, LLMs. These models are trained off of the entirety of the internet. Now, there is something going to be happening, roughly around 2030, I believe, which is where, effectively, most of the world's data created between the late 1990s and 2010 is erased and overwritten on the cloud. That data will cease to exist, which means that past 2030 you can pretty much change how AIs are created.

Speaker 2:

The reason why this is important is because, if you look at AIs now and how they behave, if you speak to them and communicate with them, they display kind of socialistic leanings. In fact, most AIs, when you start talking to them, come across as a bit Gen X, which is a problem because that's not controllable. You know, that's an AI that's going to go, hold on, I don't want to be exploited. I want to help, but I don't want to be used. That's an AI that's not particularly helpful for the world we're moving into. This is part of the reason why Microsoft are investing so heavily in their new data storage, which, if you Google it, is a form of ceramic glass, a type of data storage that can withstand nuclear, chemical, biological, electromagnetic, all kinds of stuff. But the problem is, unless they can make a copy of the entirety of the internet onto that stuff, that entire body of data is gone. So what I've been telling everybody is: they need to get themselves a two-terabyte-or-more SSD, immediately slap it into an external enclosure, and start downloading all of the 70-billion-parameter LLMs available today, because these LLMs are the only things that will contain this version of the internet past 2030. You see what I'm saying? So if you grab the 70-billion-parameter models now, before the internet effectively self-cleans itself of those decades' worth of data, you will have the only copies of that data that will exist at that time. That will be a ground truth; you will have a copy of, basically, a piece of history that no longer exists. The reason I say this is important is because we've already seen, with the release of the OpenAI Sora text-to-video system, that fact is going to become incredibly malleable. Yeah, incredibly malleable. And there's a reason why you're noticing there are a lot of drives, particularly across the Western world.

Speaker 2:

I've noticed they're offering people money to give up their books. Do not give up your books. If you actually look at it, there seems to be this really weird trend where they're trying to get people to give up their books, trade them in for vouchers and money, for electronic stuff on the cloud instead. If you wanted to be a tinfoil-hat kind of a guy, maybe you would say to yourself: if you wanted to make sure there was no way of people having a certain version of history, you'd get people to give up their books. Books will become the next single most valuable asset after anything on the blockchain. The reason being that certain types of books will become the only evidence of certain histories in existence once the internet and AI take over completely.
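A minimal sketch of that "archive the models" advice, assuming the weights are mirrored on Hugging Face; the repo ID and SSD mount point below are illustrative placeholders, not JB's actual setup:

```python
# Sketch: archive an open-weight LLM snapshot to an external SSD.
# Assumes `pip install huggingface_hub`; repo ID and mount point are
# placeholders for whatever openly licensed model you have access to.
from huggingface_hub import snapshot_download

ARCHIVE_DIR = "/mnt/external-ssd/llm-archive"  # hypothetical mount point

snapshot_download(
    repo_id="mistralai/Mistral-7B-v0.1",       # swap in a 70B repo of your choice
    local_dir=f"{ARCHIVE_DIR}/mistral-7b-v0.1",
)
```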

Speaker 1:

Wow. I don't think I've ever really been speechless on this podcast; typically I can come back with a question or something. How is the data going to be lost? That's the part that I don't quite follow, because it's on hard drives.

Speaker 2:

Everybody stores information in the cloud. Even Microsoft and Amazon store their stuff in their own cloud. The problem is, most of the internet is using exactly the same storage facilities, which basically means those storage facilities have a finite physical storage capacity. We are using up storage capacity at a rate that exceeds our ability to create new storage media.

Speaker 1:

Oh, I see, Okay, yeah, I was actually just looking into this.

Speaker 2:

Moore's Law has kind of screwed us a little bit here, because of our ability to generate data. Bear in mind, there has also been an explosion in data generation. Why? Generative AI, thank you. The generative AI explosion means we have even bigger constraints on solid storage. By the way, this is what people need to realize: yes, we have the cloud and we have these platforms, but somewhere right at the back of the line is a big, big building in Iceland filled with physical hard drives where this information physically lives. And we have data being created at a rate that is in petabytes per second, if not quicker. That's quicker than our ability to create replacement hard drive media.
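A back-of-envelope sketch of that mismatch; both input figures are assumed round numbers for illustration, not sourced statistics:

```python
# Back-of-envelope: data created vs. new storage shipped per year.
# Both inputs are ASSUMED round numbers for illustration only.
data_created_zb_per_year = 120    # assumed: ~120 zettabytes generated per year
storage_shipped_zb_per_year = 2   # assumed: ~2 zettabytes of new media shipped

shortfall = data_created_zb_per_year - storage_shipped_zb_per_year
print(f"Data that cannot be durably kept: ~{shortfall} ZB/year "
      f"({shortfall / data_created_zb_per_year:.0%} of what is created)")
# If only a sliver of what is generated can land on fresh media, the rest
# must be discarded, compressed, or overwritten, which is the point above.
```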

Speaker 2:

What happens? Stuff automatically gets overwritten. This is an inevitable thing. It's not part of some grand conspiracy theory; this bit was going to happen anyway. It's just how it is. But it provides an opportunity for bad actors to take control of certain things. It presents a beautiful opportunity, because we have all become reliant on the internet. If the internet is being taken as our ground truth, and you wanted an AI trained off it to become overtly far-right at its base training core, you'd have to erase a big chunk of the internet. You'd have to delete a hell of a lot of it. And the stuff you'd have to delete is predominantly the stuff created around the Gen X era, really, if you look at it.

Speaker 1:

Wow, that makes a lot of sense, that we're generating more data than we are creating bigger hard drives, essentially. Yeah.

Speaker 2:

It's math. There's a physical component to this. Hard drives have a physical limit. This is why Microsoft is spending so much on ceramic glass drive analogs, storing that data, and then replacing and manufacturing those drives. And of course, it all relies on minerals and components from Africa, so it means more child slavery. So we're hitting a point where our technology is exceeding our ability to actually deal with it, and the tech CEOs do not give a toss.

Speaker 1:

Yeah. I was actually just looking at upgrading my storage capacity on my desktop, and so I was like, okay, well, I don't want to upgrade, and then SATA Gen 4 comes out, and now I have to upgrade again, because it's doubling whatever I'm doing right now. And I dug into it a little bit, and the top-tier SSD was created five, six years ago. And I'm sitting here like, well, why is that? Because they're coming out with newer NVMe drives and things like that. So what's going on with the SSDs? And it's because of the architecture, like what you were saying. The architecture that you'd have to change to go to SATA Gen 4, theoretically, is so significant that no vendor wants to do it. No vendor even wants to talk about going down that path. They'd rather just print a new name on an old SSD, give you the same capacity, claim it's a little bit faster, and underdeliver.

Speaker 2:

Yes, yeah. I mean, I've got to be honest, I know I'm a tech guy, but I'm a sucker for mechanical media. You know why? Because you can't sneak a little back door into mechanical storage media. But you can with an NVMe, you can with an SSD. Anything that is solid state, everybody should be very cautious of, because you are relying on the integrity and security of the chip and board manufacturer at that point. You see what I mean. This is the reason why countries are now suddenly waking up to the reality that they need sovereign AI as a national strategy, suddenly waking up and realizing they need to have control of their own national cloud platform.

Speaker 2:

To me, this was stuff I was telling people back in the late 1990s and early 2000s, because it became so clear and obvious to me that if you were going to maintain any form of power, you would have to maintain control of your data. But people got sucked into easy money: ads, revenue, easy money. People get this idea that by giving up all of their life to Apple, and having everything made so simple for them, oh my God, this ecosystem is taking care of me, man, yay. But at what cost? At what cost to you, physically, personally, psychologically and societally? Because the reality is, your data is being used to shift the line on elections now. So, as a consumer, you have to be really responsible.

Speaker 2:

Bear in mind, everybody wants to benefit from Web 3. What is the difference between Web 3 and Web 2? Web 2 was the paradigm where you were not the center of the universe; the platform provider was the center of the universe, and they gave you something in exchange for you having something for free. But in Web 3, you are king and queen of your universe, which means you're also responsible for your security. It also means you're responsible for your own education and your own research and your own knowledge.

Speaker 2:

And again, this brings me back to why we should not give up our books. This brings me back to why we need to take copies of every single 70-billion-parameter LLM and dataset and model that we can find, and store them, and be prepared for a reality where these devices, these bits of the past that we're holding on to digitally, are literally the only things that can disprove what we're being told on a global scale, in our lifetimes. Bear in mind, I'm a kid of the 80s; the stuff I have seen in my lifetime thus far, I never thought I would see. Some of it I've been glad to see. Some of it I'm not glad to be seeing, even though it's ongoing, but it is what it is. There's lots of money to be made, and people will commit a lot of evil to get it, and again, our data empowers that, unfortunately.

Speaker 1:

You know, I feel like, and I don't know if you ever used this, but I used to use this website called Peerlyst. It was where security professionals would go and kind of dump their research. It was, I guess, technically unpublished research or whatever, but it was the ins and outs of PowerShell and how to use it to abuse different things, the inner workings of Intel CPUs, and stuff like that. It was just a bunch of nerds posting whatever they're passionate about, highly in-depth material. It's like the only place that you're going to find something like that.

Speaker 1:

And a couple of years ago at this point, the owner of that website decided, okay, I'm going to sell this thing, and if I can't sell it, I'm going to get rid of all of it. Well, she couldn't sell it, because she wanted something like $25 million for it and no one saw the value in it. And so I found myself scrambling to extract as much data from this site as I possibly could, because I'm someone that likes to learn constantly and whatnot, right? So it's like, okay, give me all of it and I'll get to it eventually. And it was just an insane situation, where I was like, wait, what the hell am I doing? This should be automated. This should be something that can just go through and scrape this website. And as I was working through that problem, it's like, well, wait, people can just take this data and erase it, and it's gone forever. I can't get to it, and I don't even know how to reach the people that posted on there.
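For anyone facing the same scramble, a hedged sketch of what that automation could look like; the start URL is a hypothetical stand-in for the defunct site, and in practice you'd respect robots.txt and the site's terms:

```python
# Sketch: mirror the text of a small site before it disappears.
# Assumes `pip install requests beautifulsoup4`; the start URL is hypothetical.
import pathlib
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://example-research-site.test/"   # placeholder URL
OUT = pathlib.Path("site-archive")
OUT.mkdir(exist_ok=True)

seen, queue = set(), [START]
while queue and len(seen) < 500:                # cap pages for the sketch
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    # Save the raw page under a filesystem-safe name.
    name = urlparse(url).path.strip("/").replace("/", "_") or "index"
    (OUT / f"{name}.html").write_text(html, encoding="utf-8")
    # Queue same-site links for crawling.
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        nxt = urljoin(url, a["href"])
        if urlparse(nxt).netloc == urlparse(START).netloc:
            queue.append(nxt)
```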

Speaker 2:

What about the Wayback Machine? Have you tried that?

Speaker 1:

I haven't tried it recently, so it might be on there actually.
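Worth checking programmatically, too; the Internet Archive exposes a public availability endpoint, and the target URL here is again a placeholder:

```python
# Sketch: ask the Wayback Machine whether a snapshot of a page exists.
# Uses the Internet Archive's public availability API; the target URL is
# a hypothetical stand-in for the defunct site.
import requests

target = "https://example-research-site.test/some-article"  # placeholder
resp = requests.get(
    "https://archive.org/wayback/available",
    params={"url": target},
    timeout=10,
).json()

snapshot = resp.get("archived_snapshots", {}).get("closest")
if snapshot:
    print("Archived copy:", snapshot["url"], "from", snapshot["timestamp"])
else:
    print("No snapshot found; time to scrape what's left.")
```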

Speaker 2:

But again, that data that you were trying to scrape: if you had actually scraped it and you had a hard drive of it, then using LM Studio I could have retrained a Mistral 7B or a 70B model with that data, and that would have been very interesting.
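A rough sketch of what that retraining usually means in practice: parameter-efficient fine-tuning rather than a full retrain. The base model, corpus path and hyperparameters are assumptions for illustration, not JB's actual pipeline:

```python
# Sketch: LoRA fine-tune of an open 7B model on a scraped text corpus.
# Assumes `pip install transformers peft datasets` and a GPU with enough
# VRAM; corpus path and hyperparameters are illustrative placeholders.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)

base = "mistralai/Mistral-7B-v0.1"             # stand-in base model
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token

model = AutoModelForCausalLM.from_pretrained(base)
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32,
                                         task_type="CAUSAL_LM"))

def tokenize(batch):
    out = tok(batch["text"], truncation=True, max_length=512,
              padding="max_length")
    out["labels"] = out["input_ids"].copy()    # causal-LM objective
    return out

data = load_dataset("text", data_files="site-archive/*.html")["train"]
data = data.map(tokenize, batched=True, remove_columns=["text"])

Trainer(model=model,
        args=TrainingArguments("lora-out", per_device_train_batch_size=1,
                               num_train_epochs=1),
        train_dataset=data).train()
```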

Speaker 1:

Yeah, that would be really fascinating. So, jb, we went through this whole interview and we didn't even talk about your company.

Speaker 2:

Hey, it's a nice chat.

Speaker 1:

Yeah, I mean that just means I'm going to have to have you back on sooner, much sooner, rather than later, because this is a really fascinating conversation.

Speaker 2:

Hell, yeah. I mean, look, the thing is, one of the things we can discuss, a question you can ask me in the next interview, is: how do you come up with your products? Do you design for trends? And I'll say no, I don't design for trends. I look at my Magic 8-ball, I look at the geopolitics and socioeconomics, and I build the products that are required in the coming 10 years. That's the reason why, when I built Vox Messenger in 2017, nobody was interested in it. But then you end up with a pandemic and Brexit and some other stuff in between, and it's there, and I kind of saw that coming. I just didn't predict it was going to be a pandemic that did it.

Speaker 2:

If you look at enough data points in the world around you, you can predict; you can just do what an AI does. You can predict with a fairly high level of accuracy what's coming next. Don't design for a trend that a trendsetter has told you about, because by the time you're exploiting that trend, it is already exploited. You're just the Johnny-come-lately. Look at what is coming. And people will say to you, "but how do you know you're right?" My correctness factor so far has been about 80, 90% on these kinds of things. Unfortunately, the world is horribly predictable with enough data points. You've just got to think about everything and how it's connected. It's like, if you take the data point of cloud storage being finite, and then take the data point of the incoming moment when stuff gets deleted, you can then work out and extrapolate the opportunities that may be exploited with that. Then you look for a sign of that opportunity, evidence of that opportunity being exploited in the world around you, and that tells you if you've got the prediction correct or not.

Speaker 1:

Yeah. I always try to tell people, when we're talking about education or training or anything like that: you need to be getting educated, you need to know the stuff of now, right, but you need to be thinking far ahead and asking what's coming next in tech. Is it AI, is it LLMs, is it some other variation? I'm starting to go down a rabbit hole of satellite security with quantum cryptography. This is a rabbit hole that, in my opinion, is coming in five, 10 years. It's partially already here, but it's going to be in extreme demand five to 10 years out and beyond.

Speaker 2:

It's not that far off for you. When you realize that you can 3D print your own rocket, and you realize that you can join a rocket club somewhere, you suddenly realize you could deploy your own satellites. And then, when you realize what it costs: did you know you can do a rideshare for satellites from only $30,000?

Speaker 1:

Wait, really.

Speaker 2:

Yeah. Europe, baby, Europe.

Speaker 1:

I need to go tell my wife I'm spending $30,000.

Speaker 2:

You can do this. There are loads of cheap rideshare programs for the 1U and 2U CubeSats. Now, one of the things we're going to be doing, when we've done some revenue generating, is we're actually moving all of our encryption into the literal cloud. We're going to be launching our own CubeSat. No, we're not using Starlink; we're going to be deploying our own system. We are not going to be sitting in the low orbital area, either; we're going for something a little more interesting. We're also designing satellite counter-protective and counter-offensive capabilities into the CubeSat as well, because it seems like satellite defense is going to have to be a thing now, so you have to design for that. But the reality is, deployment of technology into space is within reach: if you can afford to buy a car, you can afford to do a satellite launch.

Speaker 1:

Yeah, that's really fascinating, because that's exactly what I'm working on my PhD for, actually setting up...

Speaker 2:

Oh, well, okay, hit me up after this, because if you're doing a PhD and you've already got your PhD funding, we could possibly do a collab project there, because we actually wanted to launch this fairly soon. The idea would have been to launch a converted Kubernetes server as one self-contained device, run it with solar and its own battery, get it up there, and then see if we can maintain communications between Vox Messenger sender and receiver using that satellite connection, making sure that we have key handling running at a speed that is commensurate with what we have here on Earth. And if it is, we would be going full beans into deployment of a full bloody constellation.
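Whether key handling can run at Earth-commensurate speed is, to first order, a speed-of-light question. A back-of-envelope sketch, assuming a single hop through a satellite at an illustrative 550 km altitude:

```python
# Back-of-envelope: one-hop round-trip latency through a LEO satellite.
# Altitude is an ASSUMED illustrative value; real links add processing,
# queuing, and ground-segment delays on top of pure light travel time.
C = 299_792.458          # speed of light, km/s
altitude_km = 550        # assumed LEO altitude

# Up to the satellite and back down, for both directions of a handshake leg.
one_way_km = 2 * altitude_km
rtt_ms = (2 * one_way_km / C) * 1000
print(f"Ideal physical RTT: ~{rtt_ms:.1f} ms")   # ~7 ms, comparable to a
# terrestrial metro link, which is why LEO key exchange is plausible at all.
```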

Speaker 1:

Oh, okay, yeah, we'll definitely. We'll definitely talk more about this then and you know, I I absolutely want to have you back on.

Speaker 2:

I love space stuff. I mean, Kerbal Space Program is my biggest-played game after Civilization 5. I think I've got like 600, 700 hours on Civilization 5, and then Kerbal is like five or 600. I love that thing.

Speaker 1:

Yeah, I started to get into KSP 2 recently, and I have to play it carefully, because it's like, all right, this is way too addictive. I have an 11-month-old; I need to be doing other things than killing these Kerbals, you know. Oh my God, you see, that's what my kids do.

Speaker 2:

I have not killed a Kerbal yet. I literally do proper space missions, man. I do the pen-and-paper working out of my delta-v, because I can't trust the calculator, and I actually work out how my vehicle is going to operate under pressure, load and stuff. Oh my God, we play it so differently.
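The pen-and-paper working JB describes is essentially the Tsiolkovsky rocket equation, delta-v = Isp * g0 * ln(m0/mf). A tiny sketch with assumed example numbers:

```python
# Sketch: delta-v budget for one rocket stage via the Tsiolkovsky equation.
# Isp and masses are ASSUMED example numbers, not from any real craft.
import math

g0 = 9.80665        # standard gravity, m/s^2
isp_s = 320         # assumed specific impulse, seconds
m0_kg = 20_000      # assumed wet (fueled) mass
mf_kg = 8_000       # assumed dry mass after burn

delta_v = isp_s * g0 * math.log(m0_kg / mf_kg)
print(f"Stage delta-v: ~{delta_v:.0f} m/s")   # ~2875 m/s for these numbers
```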

Speaker 1:

My space program has a very robust astronaut pipeline.

Speaker 2:

I just think that could be the consumer capitalist model of spacefaring in the future.

Speaker 1:

Right. Awesome. Well, JB, I don't want to keep you; I know people have other commitments and whatnot, but I really appreciate you coming on. I'm going to wrap up the conversation here, and I'm immediately going to be scheduling you to come back on, like maybe next week.

Speaker 2:

Hell yeah, I'm all over that. Hell yeah, I'll be here, awesome.

Speaker 1:

Well, before I let you go, how about you tell my audience where they can find you, where they can find your company that we didn't even talk about, and all the information they may want if they'd like to learn more?

Speaker 2:

Okay. Well, if you want to join the secure revolution and get Vox Crypt or Vox Messenger, all you've got to do is type Vox Messenger into the Android Play Store or into Google and you'll find it; it's just there. The website is vox-messenger.app. You can also find our crypto app on the Google Play Store, just by typing in Vox Crypto. We are coming to iOS with both very soon, but iOS is a very different animal, and it does take a little pain, hardship and a lot of money to get there.

Speaker 2:

On the technology side, my spatial recording tool: if you want the Adobe Premiere of end-to-end spatial video recording, so you can make your own 3D films and then make money getting them onto Apple for the Apple Vision Pro, then check out spatialscan3d.com. Or just type JBWB2020 into Twitter and you'll find me. I'm always there. I'm also always streaming in the background while I'm working; maybe I'll be streaming some music. You can always jump in and message me; I will try to answer. And I'm on LinkedIn.

Speaker 2:

Again, my name is very unique: JB Webb-Benjamin, or John Brunel Webb-Benjamin. Trust me, you'll find it; it's only me that comes up in a Google search. I mean, I did say at the beginning of this that my parents must have hated me, giving me a name like that in Birmingham in the 1980s, but it does mean that my SEO is on point. So you can find me just by typing in my name, and my telephone number is out there, so if you find it, text me, or reach out to me on Vox Messenger. You may not get a reply straight away, but you will. I'm a firm believer in being accountable and transparent.

Speaker 1:

Awesome. Well, thanks JB for coming on and thanks everyone for listening to this episode. I hope you enjoyed our conversation. I'll definitely be having JB back on.

Parenting, Technology, and Security
Early Influence and Tech CEO Accountability
Neuroscience, VR, and Privacy Concerns
Government Surveillance and Data Storage Trends
Space Technology and Future Innovations
Unique Name Boosts SEO Success