Security Unfiltered

From Apple’s Inside to a New Kind of Phone: Privacy, Free Speech, and Building a Third Platform

Joe South Episode 206


We trade last‑minute schedules and kid chaos for a deep dive into how modern phones leak data, why “Ask App Not to Track” isn’t enforcement, and what a third platform built for privacy and free speech looks like. Joe shares his Apple-to-Unplugged journey, the Raxxis findings, and practical features that make privacy usable.

• zero‑to‑one background from Nomi acquisition to Apple services
• motivation for a third platform beyond Apple and Google
• Raxxis test revealing 3,400 sessions and 210,000 packets in one hour
• third‑party data brokers, pattern‑of‑life risks, Fourth Amendment gaps
• layered threat model from passive tracking to seizure and signals
• emergency reset, false PIN wipe, and hardware battery cut‑off
• first‑party vs third‑party privacy and ecosystem incentives
• “Ask App Not to Track” as preference vs permission
• Time Away to reduce engagement and regain attention
• firewall, USB data blocking, 2G limits, Bluetooth controls
• camouflaged VPN and operational noise in repressive networks
• app compatibility layer and broader app sourcing without Google
• clear business model: hardware and subscriptions, no data sale




Support the show

Follow the Podcast on Social Media!

Tesla Referral Code: https://ts.la/joseph675128

YouTube: https://www.youtube.com/@securityunfilteredpodcast

Instagram: https://www.instagram.com/secunfpodcast/
Twitter: https://twitter.com/SecUnfPodcast

SPEAKER_01:

How's it going, Joe? It's great to get you on the podcast. You know, we put this thing together pretty last minute, and I was surprised that it actually worked out. Usually with last-minute podcasts there are a million things that happen that just take over and bump it out even further. So I'm glad to be able to get you on.

SPEAKER_00:

Thank you so much, Joe. I appreciate it. And yeah, the timing worked out great. I happen to have been traveling a ton recently, but we have a nice window of quiet today here back at home. So I'm really glad it lined up.

SPEAKER_01:

Yeah, I have quiet for maybe three more hours until my little terrorists come home from daycare. It's always just crazy as soon as they come home. And how old are they? I've got a two-and-a-half-year-old and then a four-month-old. So the four-month-old, you know, crazy, but the two-and-a-half-year-old... I'm in a similar boat.

SPEAKER_00:

So our youngest is one, we have a two-year-old, those are two girls, and then we have four boys, eight, nine, ten, eleven. So I understand.

SPEAKER_01:

Jeez, you have six kids? That is... I do, yeah.

SPEAKER_00:

We're blessed. We're blessed.

SPEAKER_01:

Man. That is hard. I'm at two right now and I'm questioning if I want to go for three.

SPEAKER_00:

Like, just go for it. Stack 'em. Keep going. Stack 'em up. It's great. Especially as they get older, they start taking care of their siblings. It becomes more manageable.

SPEAKER_01:

Yeah. Yeah, because I'm in the middle of my PhD right now, and I'm just like, I need two hours to do this. And then two hours turns into four hours, and I'm up till two or three a.m. doing it, and then I'm up early with the kids. It's just crazy.

SPEAKER_00:

Yes. I do not promise you will get any sleep with more kids.

SPEAKER_01:

Yeah.

SPEAKER_00:

That will not happen.

SPEAKER_01:

I mean, I don't really need it, I guess, you know. I haven't really slept for so long now.

SPEAKER_00:

It's like that's sort of the boat I'm in. Yeah.

SPEAKER_01:

Yeah. With the first one, I thought it would be a giant problem. And it was in the beginning, but after like three weeks, I got so used to only getting four hours of sleep that, literally, if I got to bed at like 10 p.m., I would be up wide awake at 2 or 3 a.m., totally fine, have my entire day, take a quick nap, and then continue on, right? It was so crazy how quickly I adjusted, but it was brutal for those two, three weeks. It's so terrible.

SPEAKER_00:

Yeah, there's a transition cost, and then it normalizes, and then I think I'll probably be in that zone for, you know, another 15 years. And so it goes.

SPEAKER_01:

Yeah, yeah, no, for sure. Well, Joe, I want to start you off with telling your background. How did you get into the space? Did you start in IT, did you start in security, or did you start somewhere else and kind of, you know, pivot over, right? Just kind of follow the winding road of life, and here you are. So what does that background look like?

SPEAKER_00:

Sure. My professional career started actually running a family business in the advertising production space, which was really exciting. We innovated a lot of live, real-time, human emotion capture-based 3D animation technology. It was a great small business in New York City where my dad and I had these two companies together. He had started it and I had a chance to get involved in it. And that went really well. We grew that business a lot. It basically used 3D animation to replace drawings to plan out large-budget commercials. So we focused a lot on VFX and storytelling and quick productions for clients to gauge commercial ideas. That was great. In 2013, 14, I wanted to transition out of this family business. I'd also seen that the advertising market was consolidating and price pressure was hitting the production space hard. At that time, a friend of mine and I started talking about a new idea for a video app. A lot of the skills I'd learned working with teams to build technology for the animation company, we turned into building a video app for iPhone. That was called Nomi. And Apple acquired that in 2015. So that's how my journey at Apple started. My first meeting at Apple was pitching Jony Ive on my app in the design studio. I was very excited. It was sort of this entrepreneur's dream, right? Where the curtain opens and they're like, we want to buy your company. I was very excited. But that kicked off a career in consumer tech at Apple, where I led a special projects division in the services group. I was sort of one of the zero-to-one guys at Apple. I would be tasked with new product ideas. Sometimes that was leadership saying we need a solution for this. Sometimes it was me saying, hey, we should do this. But I led a cross-functional team that innovated a number of new products that shipped on iPhone and iPad and also dealt a lot with other platforms. We touched a lot of the Apple ecosystem, but my focus there really refined over time to on-device intelligence while retaining customer privacy. I was very excited about being able to design experiences that enabled great customer experiences without allowing Apple or any developer to see customer data from the device. So privacy was a major focus of mine. When I saw Erik Prince, who is one of the founders of the company, talk about the UP Phone and Unplugged on a podcast last summer, I was already very invested in the privacy space at Apple. And I had, over time, increasingly felt like we really need a third platform. Apple and Google are great at what they do. But I think in a lot of ways, although they're different companies, they're very similar, and their business models are symbiotic. Apple is in many ways a leader in privacy, but it also sells real estate to Google for a lot of money to be the default search engine on iPhone. So I felt the need for a new platform that focused on customer privacy completely, not partially, as well as a third platform that was less willing to control what information could be seen by customers on the platform. This is another area I'd become increasingly concerned about: the US having really two ways of experiencing the internet, and both of them sort of agreeing on what can and can't be said.
And I think for a lot of people, whether it was COVID stuff or other things that at certain times were politically sensitive and then turned out to be obviously true and something we've all accepted, that felt like another reason for a third platform. So when I saw Eric talking about Unplugged, I was very interested. I actually initiated contact. I reached out. My initial approach was just as a supporter, sort of like, hey, listen, I'm on the inside of this. I think what you're doing is great. I actually had a little constructive feedback about how they could think about things, nothing proprietary or whatever, but that just began a relationship. And the more I leaned in and the more I learned, really the more concerned I became about the status quo of the two dominant platforms and the more optimistic I became about the prospects for a new, alternative, independent platform. So that brought me to the final decision, which I made last May. This was a very challenging decision for me. I mentioned I have six kids. Apple was very good to me and it was a hard call. But in the end, for many, many reasons coming together, it felt like absolutely the right thing to do. I will say the timing was sort of serendipitous, because the very day that I had this kind of yes-or-no conversation with Eric was the very day that I had gotten some really good news on something I'd been focusing on for years at work, and it was like, yay, great news. Wow, I'm about to walk away from this. But that's how God lined it up. So here we are.

SPEAKER_01:

Yeah. That's a fascinating journey, right? When you started that first company, I mean, looking back on it, did you even have the mentality of like, hey, maybe I can sell this thing and, you know, make X amount of money? It's kind of just how life happens, you know? I don't know how else to put it. You take the leap of faith, and then life happens, right? And you kind of end up in this position where you're at Apple with, I mean, a dream job, right?

SPEAKER_00:

This is totally correct. Yes, that is a good way to describe how it happened. You know, I personally have an understanding about this. I'm a Christian, and I believe God had this all written out for me. I also can say I had zero visibility into what the plan was. And I will say that another thing that has been clarified for me is that almost every positive turn in the journey I've described was on the heels of some disaster, right? I just told you the CliffsNotes version, and it sounded smooth, but it really wasn't. A lot of people quote Steve Jobs on a bunch of things. There's one quote of his that I really hold close, where a reporter asked him what the secret ingredient of a great product is. And he starts by saying how not to do it, which is that people get this notion that if they have an idea, they can just go tell people to figure it out and it'll just work out. But that's not how products happen. Products happen by endless iteration of individuals who discover, bouncing off of failure, what the thing really wants to be. And I believe that's true for a product. I think it's also been true for my career, which has not been conventional at all. I didn't finish college, I have no official credentials of any kind. What I have been really blessed with is incredible people to work with who've taught me a lot, a lot of good breaks, and a lot of bad breaks that we've pushed through and turned into opportunities, both in the previous company and in the journey within Apple. I mean, the journey within Apple was, you know, great news from one executive, then the next day it would get shot down, and, oh, we've got to come back and regroup. It was a lot of that. And it continues now, with a different problem set, but one that I think is very important and very well timed. This issue of concern with data privacy, with free speech, with not having a lot of alternatives for these products that we spend our whole day on is, I believe, increasingly important to more and more Americans.

SPEAKER_01:

Yeah, no, it absolutely is. It's at the forefront of my own mentality and how I view security and privacy. And now I have little kids, so the security paranoia in me is turning into, okay, well, how do I protect their existence? Because this thing called the internet isn't going away, you know. LLMs are not going away, the genie's out of the bottle. So what do we do? What do we do?

SPEAKER_00:

I completely agree with you, and I share those concerns, which is another of the many reasons that drove me to this decision. That was a big one: I look at my kids and I think about their future. What I hadn't fully understood, and what honestly was a learning journey for me: I mentioned Apple is sort of of two minds when it comes to privacy. I was on the end of invent cool things that protect customer privacy. Great. What I realized, though, is that while I was building Apple first-party services that may have taken that approach, mostly what we do on iPhones as customers is use third-party applications. And the truth is, I sort of imagined that my approach to first-party software at Apple just magically applied to third-party software, which I have come to see as naive, and that's the understatement of the century. So we recently did a test. I don't know if you've seen this; it's on our site. It's really eye-popping. We hired a cybersecurity firm called Raxxis, and we asked Raxxis to take our phone and an iPhone, put them both on closed networks with 33 standard apps we all use, you know, from Spotify to Pinterest to Expedia, stuff we use all the time, select Ask App Not to Track for every iPhone app, and watch the traffic that goes on and off the device at a really acute level. What we saw blew my mind. The iPhone, even after Ask App Not to Track was selected for every app, in one hour opened 3,400 sessions with known third-party data harvesting servers. Meaning the apps that were launched had SDKs in them that reached out to these servers to transmit what? Location in any way they can get it, of which there are many ways, any other types of fingerprinting, trying to get on Bluetooth to see other devices that are around, the orientation of the phone, whether you're moving or still, what you're doing in the app. In those 3,400 calls that were completed, 210,000 packets of data were transacted. This is in one hour. And this is a phone with just 33 apps. Most phones have hundreds of apps. So if you imagine this, this is one hour, and this is happening 24 hours a day. Right? So we're talking about roughly 58 packets of data per second in this test. Now the good news is our device stopped all of them. It shut down calls to all of these known data harvesting servers. But again, this is an example of a discovery, right? It was that test which led us to realize, wow, these numbers are crazy. So we created this feature. Can you see this dashboard on my device? Yeah. What's the number there in the middle?
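For scale, here is a quick back-of-the-envelope pass over the Raxxis figures cited above; a minimal sketch, and the daily numbers are straight-line extrapolations from the one-hour test, not measurements:

```python
# Back-of-the-envelope math on the Raxxis test figures cited above.
# Assumption: the one-hour rate holds all day (a rough extrapolation, not a measurement).
sessions_per_hour = 3_400      # sessions opened with known third-party data-harvesting servers
packets_per_hour = 210_000     # data packets transacted across those sessions

packets_per_second = packets_per_hour / 3600      # roughly 58 packets every second
sessions_per_day = sessions_per_hour * 24         # ~81,600 sessions in 24 hours
packets_per_day = packets_per_hour * 24           # ~5 million packets in 24 hours

print(f"{packets_per_second:.0f} packets/sec, "
      f"{sessions_per_day:,} sessions/day, {packets_per_day:,} packets/day")
```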

SPEAKER_01:

The middle, 976. Oh, 977.

SPEAKER_00:

977. Okay. It's 11:24 here. Oh, 978. That's the number of times today that my device has stopped the apps on my phone from opening sessions with third-party data harvesters. By the time I go to bed, that'll be five, six, seven thousand. Jeez. So why does that matter? Okay. Back to our kids and the future and security. What I didn't know, and what I have come to get very educated on, is that there are layers here. Number one, I did not understand. I thought all tracking was the same. Google, Meta, there are differences, they're different species, and they each have different risks. But what we're talking about here is third-party tracking services that are part of this data harvesting ecosystem behind this huge world of online advertising. The data that is gathered here is publicly purchasable by basically anybody. This is not something I understood. I did not understand that there were reams of data spilling off all of our phones that is being held in databases and analyzed and is publicly purchasable. So I've now done this myself. With a few bucks and a website, you can go to an advertising firm, a data harvesting firm, and basically design campaigns to pinpoint devices that go to a certain address and that open certain websites. You can say, put this on a phone in this area that reads this website, Breitbart versus CNN. But you can also reverse engineer this data. It's very easy to buy a bunch of data from a certain area and say, hey, I know this person goes to the gym here and goes to school here. Show me all the phones that do that. Oh, there's only one phone that goes to those two places, and here's where they sleep. This is all totally doable. I'm not the one lifting the curtain here. Byron Tau wrote a book on this called Means of Control about a year ago, and it became popular. The main subject of his book is a guy called Mike Yeagley, who we work with, an intelligence person who really revealed this when he went to the head of the DOD and did the process I just described to get the home addresses of an entire Delta team, which is obviously a super secret thing, right? So when we think about the future and LLMs and all this data that's out there, it's purchasable by our government. So the first risk, in my view, is that there's no Fourth Amendment protection on this data. It's considered third-party doctrine-covered data because you've shared it with the company. It has zero Fourth Amendment protection. So that means the government can and does today buy this data and say, show me the people who go to gun stores, show me the people who go to crypto conventions, show me the people who go to gay bars, whatever. Happening all the time. That's one. Two, foreign countries are doing this, profiling people in our country who have important jobs or who might be involved in government or defense. This is huge, risky stuff. Also, individuals can do it. Crazy individuals or private investigators are doing this all the time in court cases now. Right. So there are many layers, as we often describe. We see it as sort of a pyramid of risk of smartphone data and security. But this is the big one in our mind, because it's happening so much, and the data requires no warrant to access and has literally no constitutional protection. And I don't think anyone understands that this is happening. So this is a big area that we see as a huge risk.
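To make the reverse-engineering described above concrete, here is a purely illustrative sketch of the intersection idea, matching devices seen at several known places. The record format, device IDs, and place labels are invented for illustration; this is the shape of such a query, not a real broker interface:

```python
from collections import defaultdict

# Invented example records: (device_ad_id, place) derived from purchased location pings.
pings = [
    ("device_a", "gym_on_5th"), ("device_a", "school_on_main"), ("device_a", "home_block_12"),
    ("device_b", "gym_on_5th"), ("device_b", "office_park"),
    ("device_c", "school_on_main"), ("device_c", "home_block_7"),
]

def devices_seen_at(pings, places):
    """Return device IDs observed at every place in `places` (a simple set intersection)."""
    seen = defaultdict(set)
    for device, place in pings:
        seen[device].add(place)
    return [d for d, visited in seen.items() if set(places) <= visited]

# "Show me all the phones that go to this gym and this school" usually leaves a very short list.
print(devices_seen_at(pings, ["gym_on_5th", "school_on_main"]))  # ['device_a']
```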
This third-party tracking, this publicly purchasable data about all of us, is sort of the base of the pyramid for us, and again, it doesn't just cover you, right? You can very easily find your wife's phone, right? And where she works or where you work. It's crazy. It's very easy to draw patterns of life for people from this. I can actually show you an example of this if you're interested. It's crazy. So this is the bottom of the pyramid, right? And then the next level up, above passive data harvesting, is what happens if your phone is actually taken. That's another thing I don't think people realize: we're walking around with devices that have our most personal thoughts, our relationships, our banking information. And whether it's a criminal or a law enforcement situation, or if you're crossing a border, by the way, if you're going on a vacation into another country, you have no rights to protect that phone, right? So in this scenario, we have unique features in the product, not just to block the third-party tracking, but to make it very easy to basically, like, remotely wipe your phone. One of the things we have, which I love, is what we call emergency reset. I have a false PIN on my phone, which I won't tell you. But if someone demanded that I open my phone, I can tell them a code that will wipe the phone and replace it with essentially fake data. I can also tell the phone, hey, if I'm not back here in 30 minutes or two days or two weeks, wipe. Things like that, you know? And then lastly, there's the top of the pyramid, which is all of the unintended electronic tracing or penetration that can happen outside of the kind of commonplace third-party stuff. This is things like turning off your phone and not realizing that when it's off, it's actually pinging towers and doing data transmission you aren't aware of, and you've just left a trail of your location, right? So we have a very simple feature: a hardware switch right here. With a little pen or pencil, you can just flick it, and it separates the battery from the electronics, and the phone becomes completely inert. It's like putting it in a Faraday bag, right? We have a ton of customers, many of whom are in very sensitive security-related jobs. They have to go into situations where maybe they're walking into a room with very senior people in a defense organization. They need to physically turn the phone off, because for someone like that, you kind of never know, right? Is my phone being turned into a monitor and I don't know it? The only way to get around that isn't software, it's hardware that literally shuts it down. So in any event, these security risks related to our phones are myriad, they're layered, and they're definitely not going away. As more and more of our lives are lived online, as our screen time grows, as more of our lives and relationships are conducted on these phones, and, as you pointed out, as the technical capabilities via LLMs emerge to give everyone the tools to quickly synthesize data sets and target people and extract information, the risks here are just gonna keep multiplying.
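As a rough illustration of the two behaviors described above, the false "emergency reset" PIN and the "wipe if I'm not back in time" timer, here is a minimal sketch of the decision logic only. The names, the placeholder hashing, and the wipe stub are all assumptions for illustration, not Unplugged's implementation:

```python
import time

# Placeholders for illustration: a real device would use a proper KDF and secure storage,
# not Python's built-in hash(), and the wipe would be handled by the OS, not a print().
REAL_PIN_HASH = hash("1234")        # the owner's actual PIN (hypothetical)
DURESS_PIN_HASH = hash("9999")      # the "false PIN" that triggers a wipe (hypothetical)
DEADLINE = time.time() + 30 * 60    # "if I'm not back here in 30 minutes, wipe"

def wipe_and_load_decoy():
    print("wiping user data and restoring a plausible decoy profile...")  # stand-in action

def on_unlock_attempt(pin: str) -> bool:
    """Unlock normally on the real PIN; wipe and show fake data on the duress PIN."""
    if hash(pin) == DURESS_PIN_HASH:
        wipe_and_load_decoy()
        return True                 # the phone still "opens", but onto fake data
    return hash(pin) == REAL_PIN_HASH

def dead_man_check():
    """Run periodically; wipe if the owner never returned before the deadline."""
    if time.time() > DEADLINE:
        wipe_and_load_decoy()

print(on_unlock_attempt("9999"))    # duress path: wipes, then reports an ordinary unlock
```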

SPEAKER_01:

Yeah. That's crazy, right? So I'm a huge Apple iPhone guy. I've got MacBooks, I have iPads, I love the products. I have new ones coming in because they just launched some devices, right? And I guess this is what I would say: I think there's a delineation, which you also hinted at, between Apple-native services or apps and third-party services and apps, and how they're both using the resources and the limitations of the device. Can you talk about that a little bit? Because it sounds like Apple itself is indeed following the privacy and security practices that it claims it has and offers to its users. It sounds like the limitations are looser for third-party apps because Apple isn't able to control them the exact same way. Does that make sense? Am I off?

SPEAKER_00:

Yeah, I think that makes sense. And I'm certainly not speaking on Apple's behalf. And I will say that I was very proud of the privacy work I did at Apple. What we discovered in this test is, I would say, a couple of things. We also tested a Samsung in that same test, and it actually made fewer calls to third-party data harvesters, by about half, compared to the iPhone. Now, some people saw that and said, Joe, does that mean the iPhone is less secure? I did not take it to mean that. What I took it to mean, anecdotally, but I stand by this, is that the iPhone is a much richer target for data harvesters. iPhone users are much more valuable to advertisers than Android users on average, by an order of magnitude. It's not a small difference. So my takeaway is not, oh, therefore iPhone is less secure than Android. I don't have any information to suggest that. But what I do see from this test is that third-party apps targeted at iOS appear to have more data harvesting eyeballs on them, because there's so much value attached to iPhone customers, especially American ones. So regarding the delineation between first-party and third-party data, Apple speaks very publicly about its own policies with data. I think those are admirable, and I've never seen anything that would indicate a divergence from those. We should double-click into that, though. Yeah, yeah. While Apple has services that are designed to prevent Apple apps or third-party apps from seeing your data, one of the things I was involved in is the smart share sheet. It's one of these examples of an out-of-process system experience. You know, when you go to share a photo in iOS and a sheet comes up with suggestions of who you might share with? Our team designed that. And that experience is out of process from any application. What that means is the app gets backgrounded, the sheet comes up, and you're interacting with the OS at that time, meaning the data about who you're talking to is in the OS, not the app. I consider that an elegant solution; I'm personally excited about it. Apps can't see that data. Okay, great. To me, that is one type of choice. Another type of choice is the symbiotic deal that Apple has with Google. That is a very different type of choice from my perspective, right? In which rumored tens of billions of dollars, and I don't have any private information here, I've just seen news articles saying a lot of money is transacted, essentially rent the default search engine space on iPhone. Those two choices to me represent different types of choices. In one case, the default is privacy. In the other, the default priority is revenue. So those seem divergent to me. When it came to this Ask App Not to Track situation, I was really alarmed by the outcome we saw in that test. And again, I'm just speaking here as a customer; I have no special insight other than viewing this from the outside, and the test was done by a third party. But I will say, when I looked at the language of Ask App Not to Track as a result of this test, it really hit me. I was like, this is different than I thought it was. And I really slowed down. I thought this was a permission. Like, can this app have access to my contact list or not?
And then I looked at the other permissions, and those were saying, can this app have access to your contacts? Can this app have access to your camera? Can this app have access to your location? This wasn't saying that. This was saying ask app not to track, which is very different. The other ones weren't saying ask app not to access my camera. They were saying, are you gonna allow the app to access your camera? I sort of started pulling that thread, and I realized, from my perspective, the language and the presentation were less than clear. As a customer, I thought I was setting a permission, but I wasn't. I was asking it not to do something, and it just seems like such a strange thing to ask. Why would I want an app to track me? It's like if I go to a burger joint and I order a cheeseburger and the waitress says to me, oh, hey, before I go, I gotta ask you, would you like the chef to not spit in your burger? I'd be like, wait, why are you asking me that? Does the chef want to spit in my burger? What's happening here? You know what I'm saying? It seems so strange. So my sense here is that this is an area where we really want to differentiate. Again, I described areas where I see Apple making really smart decisions about privacy, others that are not as clear, and then some that get even less clear. We want it to be really, really, really simple. So we have some basics in our product that eliminate any of this ambiguity. Number one, we make zero revenue off customer usage. We have no upside at all. I think everyone who makes phones knows that the more someone uses the phone, the worse it is for their life. Phone usage is directly correlated with worse health outcomes, loneliness, polarization, all these things. Okay. But these other platforms make more money the more you use the phone. They just do. We don't. We make money by selling phones and by selling subscription services. That's it. That allows us to do something pretty cool, like, you know, this here at the top. How many minutes since I've unlocked? Eight. Eight. Okay. This is my actual favorite feature of the phone. I use my phone too much. It's chronic. I'm blessed with this beautiful family, and I have a terrible tendency to sit down at dinner and keep working on the phone. It's very bad. A famous speaker I love once said: because of phones, we can work anywhere, so now we work everywhere. I'm guilty of this. So at family dinners or church on Sunday, my favorite thing now is I put my phone in my pocket and I have this impulse, like, let me check it, let me look, let me look. And I'm like, no, I know what's gonna happen. I'm gonna let it sit there. And when I leave church or when I leave dinner, I'm gonna pick the phone up and it's gonna be like 75 minutes since I looked at the phone, and I am gonna be the boss of my phone. For me, that's enormously valuable. And I feel like this reframing, forget screen time, which is about the phone. We called the feature Time Away, which is about you building something positive, not minimizing something negative, right? So we want to help people increase their time away from the device rather than making them feel like they need to feed this addictive habit.
So what I'm getting at here is that when we start having privacy and business-model blurriness, we end up, I believe, with less clear loyalty to the customer only. And we end up having to make sacrifices and compromises for other interests, like Wall Street or whatever, right? We don't want to have to make those compromises. So to answer your question around the delineations, my observation is that other companies have some great policies, some blurrier policies, and some really blurry policies. We want there to be zero confusion. We want the phone you buy out of the box to obviously not sell your data, obviously discourage apps from harvesting your data, and make it really easy to know what's happening. The last thing I'll say is you can't really have security without transparency. And what I mean by that is, if you don't know what the phone is doing, then how do you know where your security boundaries are? This is why we're shifting a lot of our experience in the product to showing people what's happening. That dashboard I just showed you is the tip of the iceberg. When you tap into it, you get a pretty detailed graph of when these calls are made and by what apps. And we're gonna keep going in this direction so customers can really understand specifically what servers are communicating with their device. So, yeah, for us, eliminating the ambiguity is a big part of the sales pitch.
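A tiny sketch of the kind of aggregation that could sit behind a "blocked calls by app and by hour" view like the one described here; the log format, app names, and hostnames are assumed for illustration, not taken from the product:

```python
from collections import Counter

# Hypothetical firewall log entries: (hour_of_day, app_name, blocked_destination)
blocked = [
    (9,  "photo_app", "tracker.example-analytics.com"),
    (9,  "photo_app", "ads.example-network.net"),
    (10, "music_app", "tracker.example-analytics.com"),
    (10, "photo_app", "ads.example-network.net"),
]

by_app = Counter(app for _, app, _ in blocked)     # total blocks per app (the headline number)
by_hour = Counter(hour for hour, _, _ in blocked)  # blocks per hour (the graph over the day)

print(by_app.most_common())      # [('photo_app', 3), ('music_app', 1)]
print(sorted(by_hour.items()))   # [(9, 2), (10, 2)]
```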

SPEAKER_01:

Huh. Yeah, that's crazy, because I remember when Apple came out with that new feature, right? Ask App Not to Track. And I questioned it myself. I'm like, well, it's saying ask. That's not shutting it off. And then you go into the settings and it looks like all of the other permissions, right? I mean, it looks exactly the same, same color, same kind of option. And so in my head, I'm thinking, well, maybe it's just poorly worded and it's doing the same function, just worded differently to appease, you know, Google or the third parties that are in the app. Surely, if I say ask app not to track, it's gonna limit how much Meta is tracking me, right?

SPEAKER_00:

By the way, I believe there are some limitations that do occur. It's not very easy to understand, but the more I've been reading into it, it appears that it may, and I believe it does, do something like hide the device's ad ID from the app. However, what we see clearly is that our phone has no ad ID at all, and that does not stop these apps from trying to fingerprint in other ways. This is a multi-hundred-billion-dollar business of data harvesting on phones. All of these SDKs are invested in finding ways to fingerprint devices, follow you around, and get your location for everything you do, regardless of what settings you choose on your phone. And this is what we're trying to stay way ahead of. So it may be that, for example, the ad ID is shut off in that scenario. I believe that's possible. The issue is they don't need the ad ID. There are numerous ways that these applications and SDKs fingerprint the device and end up with the same result. So, again, to your point, the other thing that does seem clear, as far as I understand the language and the literature on this, is that the OS is telling the app a customer preference, but it's not enforcing the customer's decision or monitoring what happens. And that's what we're seeing in this test, right? I'm using an app where I've said ask app not to track, and the app is making insanely repetitive calls to data harvesters, sharing information from the phone.
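To illustrate why removing the ad ID alone doesn't end tracking, here is a generic sketch of device fingerprinting: combining ordinary attributes into a stable pseudo-identifier. The specific signals listed are assumptions; what an SDK can actually read varies by OS version and permissions:

```python
import hashlib

# Hypothetical attributes an SDK might still read even with no advertising ID available.
device_signals = {
    "model": "PhoneModel X",
    "os_version": "14.1",
    "screen": "1080x2400",
    "timezone": "America/Chicago",
    "locale": "en_US",
    "storage_bucket": "64-128GB",
}

def fingerprint(signals: dict) -> str:
    """Hash a stable combination of signals into a pseudo-identifier for the device."""
    canonical = "|".join(f"{key}={signals[key]}" for key in sorted(signals))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

print(fingerprint(device_signals))  # same signals tomorrow -> same ID, no ad ID required
```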

SPEAKER_01:

Wow, yeah, that's really eye-opening, honestly. So I have one of the Unplugged phones. I haven't switched over to it yet as my main device, mostly because I'm just a little bit lazy with everything going on. It's like, I'll get to it and experience it. But here's what I'm really interested in. You've got the ad part of it, the tracking part of it, down, it sounds like. Are you expanding the security posture of the phone in other ways? I'm thinking of things like Wi-Fi attacks, Bluetooth attacks. Are you looking at securing it in those ways as well?

SPEAKER_00:

Yeah, there are already existing features and there's more in development. By the way, let's make sure we get you on the latest OS, because we dramatically revamped it and now it's much easier to switch over. I understand that on the older version it can be a hurdle. It's an order of magnitude easier now. One thing I will call out here is the privacy center. One of the things we did is change it to make it a lot easier to drive. We created this four-tab experience that lets you go from the firewall, which is here, to the usage stats, which include things like your time away and how much you're picking up the phone, but also these device controls. This is really important. Here at the top level, I can determine, I want no ads and trackers, I want no porn, I want no gambling on the phone, but I can also control, do I allow USB to transmit data? This is very important. People will plug their phone in at an airport for a USB charge and not realize they're hooking up to something that's pulling data off the phone. So this feature is important because it lets you get the benefit of power from USB without any data being pulled off. We also have obvious things like Bluetooth controls, but limiting 2G is also really important, and I think a lot of people miss this. IMSI catchers leverage the super insecure 2G network to be able to grab identifiers off phones. Right. So these are just examples of ways we're hardening things that other phones are sort of leaving as blind spots. And I think you're gonna be seeing a lot more from us in this area as we move forward. So yes, whether it's the physical off switch, which is, by the way, the most secure way to make sure the phone is not being compromised, physically turning it off, or a clearer way than other phones provide of easily understanding the layers of access you're granting, all in one simple-to-use place, these are the ways we're addressing these issues. We also provide, and I'm sure you've seen this on the original device, at the top level, right below my time away, software controls over the OS's access to camera, Wi-Fi, microphone, location, et cetera. I can turn these features off outside of the operating system itself, and this allows an extra layer of protection in case the OS itself is compromised. So again, we see these as facets of a diamond; our goal is to increase protection in each area and just stay ahead in every area. Areas where you'll see more work from us in the coming cycles are the prevention of first-party data harvesting. We've talked a lot about third-party data harvesting, but a lot of the bigger companies do first-party harvesting, meaning they don't call out to third-party SDKs; they build it directly into their own app. And a lot of action is taken there. So that's a focal point for our coming releases as well.
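For the firewall side, here is a minimal sketch of blocklist-style outbound filtering, the general idea behind "stop calls to known data-harvesting servers." The blocklist entries are made up, and a real firewall would work at the network layer rather than in application code:

```python
# Hypothetical blocklist of known data-harvesting hostname suffixes (made-up domains).
BLOCKLIST_SUFFIXES = (".example-tracker.com", ".example-adnetwork.net")

def allow_outbound(hostname: str) -> bool:
    """Return False for destinations that match a known harvester domain or subdomain."""
    host = hostname.lower().rstrip(".")
    return not any(host == suffix.lstrip(".") or host.endswith(suffix)
                   for suffix in BLOCKLIST_SUFFIXES)

assert allow_outbound("api.weather.example.org")          # ordinary traffic passes
assert not allow_outbound("metrics.example-tracker.com")  # tracker call is counted and dropped
```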

SPEAKER_01:

Wow. So it sounds like I could actually maybe go to China and not be fully tracked through my device.

SPEAKER_00:

Well, we do have a unique VPN that is, to our knowledge, one of the only VPNs that's operational in some of the more repressive environments. We have a number of, I would say, security-oriented professionals who are operating in the worst environments you can imagine on our device, and our VPN is designed to hide the fact that it is a VPN in those environments. There are a number of countries that have extremely closed internets and prevent VPN usage, and our VPN is uniquely designed to camouflage itself among normal consumer traffic.

SPEAKER_01:

How's that even possible? Basically, don't give me your trade secrets, but... Yeah, I'll just tell you an example.

SPEAKER_00:

Let me just give you a corollary example, okay? As I mentioned, a lot of teams, people who are on the pointy end of the spear, use our device. And something we've learned is that our device is so much quieter than other phones that electronically savvy adversaries can pick out our device because it transmits so much less signal. Other phones, you don't realize, are constantly calling these data harvesters. They're also constantly calling home base, their own OS home base, with information and updates. Our phone is very quiet. In most cases, this is an enormous advantage. But if you're in an environment where an equipped, savvy adversary is trying to find an individual who's different from other individuals, this quietness is actually a liability. So for those people, we literally deploy special software that transmits fake activity, so it's indistinguishable from a normal phone, right? Meaning it looks like someone scrolling through Instagram and ordering pizza or whatever, you know? Wow. So that is really interesting. These are things that you end up discovering when you're dealing with these different areas. Now, again, most of us aren't in that situation. The problem most of us face is not state actors trying to target us to put a drone on our head, God forbid. What most of us are facing is this passive, extra-constitutional data collection in which our whole lives are being recorded in public databases. And I think I'm not alone in feeling that, after COVID and seeing what can happen in a crisis, maybe we don't want that information out there. Maybe it's better for us to follow the direction of our founders. Something I've really become obsessed with is the founders' intent with the post office. To me, this is the most important lesson for us today in our internet age. When the founders wrote the Constitution, a big part of their debate was how we needed a private, secure way to transmit information to maintain the republic, both private letters, but also to get news. So Article I of the Constitution establishes the post office. Literally, right after establishing currency, they established the post office. Right after this came the Post Office Act of 1792, which established that it's a felony to open another citizen's mail, and that it's a crime for people in the post office to sell or leak any information about what was shared or mailed. So their vision was: for us to retain a republic, we need totally secure, private, encrypted communication and the ability to get news without anyone knowing who reads what newspaper, for example. What changed was in the 70s, with a couple of Supreme Court cases in the Warren Court. At the time it was phone companies and banks that had some third-party data, and it was determined that, hey, if you've shared information with a phone company or a bank, you don't have an expectation of privacy for it. That has been applied to our phones, which are very different. The amount of information I just mentioned, hundreds of thousands of packets per hour, the amount of information flowing off of our phones and what that makes discoverable about us, is really dystopian. So this is really where, you know, we don't think legislation's gonna fix this, although maybe there are some improvements that can be made.
We really just think we need better products that are more aligned with the interests of customers and don't have these other financial interests involved. So our vision here is: let's innovate our way back to an internet communication and data infrastructure with products that are designed to protect our rights rather than sell out our information. Again, whether that's first party, third party, whatever. We just want it very easy to understand, with no exceptions or blurry terms of service.
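Circling back to the cover-traffic idea mentioned a moment ago, here is a purely illustrative sketch of the concept: issue benign requests on human-like, jittered timing so a quiet device blends into ordinary consumer browsing. The destinations and the timing model are invented; this is not Unplugged's software:

```python
import random
import time
import urllib.request

DECOY_SITES = ["https://example.com", "https://example.org", "https://example.net"]  # placeholders

def browse_like_a_person(rounds: int = 3) -> None:
    """Generate a small amount of benign, irregular traffic instead of radio silence."""
    for _ in range(rounds):
        url = random.choice(DECOY_SITES)
        try:
            urllib.request.urlopen(url, timeout=5).read(1024)  # fetch a little content
        except OSError:
            pass                                               # offline? just skip this round
        time.sleep(random.uniform(5, 40))                      # idle like a person, not a beacon

if __name__ == "__main__":
    browse_like_a_person()
```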

SPEAKER_01:

Hmm. That is so fascinating. This entire world, I mean, what you're describing is essentially tradecraft, right? That's really what it is, because of the way you have to think about how you would be tracked, what you bring into a certain area, how it can be used against you, and how you would subvert those data collection techniques. And this phone is absolutely user-friendly, which is not something you would expect from a highly-secured-by-default device. You would expect that kind of device to be something that someone highly technical like myself, who lives in the weeds, is using, right? Like only that kind of person would use or could even use this device and actually understand what's going on. But it's huge for this device to be designed the way you guys designed it and be as user-friendly as it is, which I was actually pretty astounded by. I was impressed by that.

SPEAKER_00:

You know, that's really great feedback, and I'm really happy to hear that, Joe. And we want to keep making it easier and easier, because I think your sensibility is where we are. We are very aligned. So far, think of it like a two-by-two, an old-school business diagram. You've had phones that are really easy to use, like Apple and Google, but as we've pointed out with these tests, at their heart they're not really private as ecosystems. Then you've had other phones that are sort of more private, but they're very hard to use, they require loading special software and OSs and bootloaders, or they don't run apps, right? You're sort of forced to choose: you can run normal apps and have no security, or you can have no apps and have some. And our vision is, what if there's an easy-to-use daily-driver phone that works just like your iPhone? It's simple. We have cloud storage that holds all your stuff. We have a password manager. You can easily run your apps, all the things that simplify the experience, without the giant security and privacy risk. That's what we're trying to solve for. And we think there's a meaningful market of people who want an independent phone that's very easy to understand, that out of the box is protecting your privacy, but that just works like a normal phone, you know? And we just did something that I think is gonna further amplify the automatic ease of use. For us, we run Android without Google, which means a lot of apps require Google to do stuff, and we have to route their calls, for example for notifications or maps, away from Google to an open-source alternative. And as we've gotten better and better with this, what we call the compatibility layer, which is now making pretty much every app work totally great, we can open up our app center. So we have, let's say, the top 10,000 apps that our customers download most on our servers. If they're ever censored by big tech, they're alive on our servers. However, when you do a search, we now also have access to an open-source library of all Android apps. So if you search for some really niche IoT app, like I had the other day, I needed an app for my daughter's baby monitor, and I was like, man, normally I'd have to call our guys, hey, can we test this app? But now we've integrated it into the flow where I can just search for it and it shows up as an external source. It's available in our app center, but it's not on our servers yet because it's not super popular. And because of this compatibility layer, these just work, right? So this is one of the reasons why we think this is possible today, this easy-to-use-but-private approach: because apps are ubiquitous. There's sort of already an app for everything. So our job is, A, make sure we have fluid compatibility with apps, and then B, design these features of the OS that prevent those apps from taking data in ways that are unclear and dangerous.
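The compatibility-layer idea, routing calls an app would normally make to proprietary services toward open alternatives, can be pictured with a toy lookup like the one below; the service names and substitutes are illustrative, not the actual mapping Unplugged uses:

```python
# Toy sketch of routing proprietary service requests to open-source stand-ins (names invented).
OPEN_ALTERNATIVES = {
    "push_notifications": "open_push_relay",
    "map_tiles": "openstreetmap_tiles",
    "location_lookup": "open_geocoder",
}

def route(service: str) -> str:
    """Send a requested service to its open substitute when one exists, otherwise pass through."""
    return OPEN_ALTERNATIVES.get(service, service)

print(route("map_tiles"))       # -> openstreetmap_tiles
print(route("camera_access"))   # no substitution needed -> camera_access
```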

SPEAKER_01:

Yeah, because when you secure the device by default, you're able to just put anything into the walled garden or the sandbox, right? And it'll be fine. So when did that feature come out, the one where a third-party app gets pulled from another server like you talked about?

SPEAKER_00:

This is literally rolling out right now. You'll see it in a week. We can also send you an APK so you can run it locally and give us any feedback. But we're rolling this into the OS literally right now. Our guys are at the logistics facility in Nevada right now, flashing all of the devices with a new operating system that includes this feature.

SPEAKER_01:

Wow.

SPEAKER_00:

Which is just such a great innovation.

SPEAKER_01:

You know, you guys should just throw my device into an internal test pool. I'm happy to test it.

SPEAKER_00:

Great. Well, we'll put you on the list, man. Yeah, that'd be fun. I mean, yeah, definitely, definitely.

SPEAKER_01:

Man, well, Joe, I really do my best to stay on time, right? I know we started a little bit late today, but it was a fantastic conversation. I really try to stop on time because I know everyone's schedule is so busy that we're not always able to go over, but I really do appreciate you taking the time out of your day to come on and talk to me about this.

SPEAKER_00:

Joe, thank you so much. I really appreciate it. It's been super fun. We'll make sure to get you that software too.

SPEAKER_01:

Yeah, absolutely. Well, before I let you go, how about you tell my audience where they can find you if they want to connect with you, and where they can find the Unplugged device if they want to actually buy one. There will be a link in the description of this episode. I have an affiliate link that people can use to purchase the phone at a small discount.

SPEAKER_00:

Thank you so much, man. Yeah, so use Joe's link, and we're at unplugged.com. We're about a week or two out from shipping right now, so if you order right now, you'll get a phone in the second half of this month. We are also on X and other social media apps. I'm posting on our X channel frequently, so the easiest place to find me is probably our Unplugged X account. We're putting up videos there all the time, we're really trying to educate people through the internet on these issues, and we would just love feedback. So that's gonna be the easiest place to message us; DM me there. We also have a support mechanism on our site, and in the phone as well. One thing I want to make sure people know is that we have a US-based team of experts who help people get set up on the phone and answer any questions. We really want input and feedback and are there to help. I end up getting involved in many of these conversations myself, so don't be shocked if one of those emails makes its way to you. But again, this has been really fun and we're super grateful.

SPEAKER_01:

Yeah, absolutely. Well, I'm really happy that you came on. Well, thanks everyone.

SPEAKER_00:

Thanks, brother.

SPEAKER_01:

Yeah, yeah, absolutely. Well, thanks everyone for watching this episode. I hope you really enjoyed it. Go ahead and pick up a phone. It's a fantastic device. I was very surprised by it. Again, I'm an Apple fanboy, so I'm very used to the Apple ecosystem. I absolutely hate Android with a passion, but when I picked up this phone, it was very easy to use, first time out of the box, no issues. All right. Well, thanks everyone. Peace, brother.
