It’s a huge week in AI, with OpenAI releasing GPT-OSS and GPT-5, Grok getting deeply problematic again with its “spicy” video generator, and Tim Cook admitting that Apple may need to cut some deals. Then we talk about the age gating of the internet and how you might soon need an ID card to get just about anywhere online. Finally, the Lightning Round gets re-rebranded. Adi Robertson and Alex Heath join the show to discuss.
Category: 🤖 Tech
Transcript
00:00:00Hello and welcome to The Vergecast, the flagship podcast of extremely consistent naming schemes.
00:00:07I'm Jake Kastrenakes, Executive Editor of The Verge. Joining me today, Verge Policy Editor,
00:00:12Adi Robertson. Hi. And Verge Deputy Editor, Alex Heath, who is also filling in for Nilay
00:00:18on Decoder. What's up? First up, a lot of AI news this week. I want to start with OpenAI.
00:00:23On Tuesday, they announced an open model GPT-OSS. This is their first open model in six years.
00:00:31Because it's open, that means you can tweak it to your liking. You can even download it,
00:00:35run it on your laptop, they claim. They say it's supposed to be as capable as some of their recent
00:00:40mini models. And this is something that you can just use by yourself privately now if you want to.
00:00:46Then on Thursday, they announced GPT-5, the next big model for ChatGPT. They just held this event
00:00:53showing off what it can do. And this time around, I'm not sure that there's necessarily anything
00:00:59new that it can do. They just seem to be saying it can do everything better. We just watched this
00:01:05event. It just wrapped up right before recording. I'm really curious what you guys think. I thought
00:01:09this was a weird one. There were portions where they just sent requests to ChatGPT, and then we just
00:01:17waited around and watched it load. It is apparently a lot faster. I mean, we should preface this by
00:01:25saying we have not tried it yet, obviously. They're in the stages of rolling this out to
00:01:30all ChatGPT users, which is a new thing. Usually they gate these Frontier models to the paid tier
00:01:36initially. This time they're rolling it out to all 700 million plus weekly users. But I'm not even sure
00:01:42if by the time this comes out, everyone will have access to it yet. They have let a handful of
00:01:49independent creators, AI writers try it. Interestingly, they don't let any of the
00:01:55quote-unquote legacy media try it ahead of time. But what I've seen from the folks at Every,
00:02:01for example, is that people seem to like it. The vibes are generally good. It seems faster
00:02:07and more confident, whether that's a good thing or not. But it's not this huge leap that I think
00:02:16people were hoping for. I do think it's very funny. AI announcements are the only announcements
00:02:21where the executive on stage will tell you, and it's going to lie to you less, and it's going to
00:02:25be less dishonest, and it's going to try to screw around less often. It's like, I can't believe we
00:02:31have to announce this. Yeah. I was on a press call with Sam Altman and a bunch of OpenAI execs this
00:02:36week. And a big thing they hit was that the hallucination rate for GPT-5 has gone down quite
00:02:42a bit. It still hallucinates. In fact, I think people were pointing out slight inaccuracies in
00:02:48some of the examples it was giving in the live stream for how it hallucinates less. So that's
00:02:53still a problem that remains inherent to large language models. But they're saying this is the
00:02:58most right of the models out there. And you take what you can get.
00:03:04And I don't want to diminish this, but I feel like, you know, if you go back to the GPT-4
00:03:09announcement, I think it was 4o, you know, they had the new voice model. There was a very
00:03:14impressive new demo. And I don't know, Adi, you were watching this too. Was there anything
00:03:18that you saw today that felt like you hadn't seen from an AI service before, or even from
00:03:25ChatGPT before?
00:03:26I feel like I am a really bad person to get hyped about AI stuff. I feel like I am just tuned
00:03:32way lower than any other normal person because I'm like watching this and I, first of all,
00:03:36it does not seem like something that I haven't seen before. And second of all, it doesn't seem
00:03:42that exciting to me. I'm like, I do not, how many times in my life do I really want to build a
00:03:48service to let my girlfriend learn French? Like, I understand that this is incredibly technically
00:03:55cool and accomplished. And I am looking at this as a product and it is still just really failing
00:04:00to grab me in a lot of ways. And the French game, Adi, this is when they vibe coded like a flash
00:04:05card slash snake learning game. I feel like I should love the idea
00:04:11of vibe coding. It just, it feels like it's just here. You can create this thing that you're maybe
00:04:15going to use once that already clearly exists a million times online because that's why it can
00:04:21create it. Yeah. On the call, Sam Altman called this software on demand. So we're moving from vibe
00:04:26coding to software on demand. And yeah, I mean, it does have big upgrades in terms of the front end
00:04:32code development it can do. I do think ChatGPT was widely considered to be behind on CodeGen.
00:04:39That's something that Anthropic with Claude and even Gemini, the latest Gemini release really
00:04:44improved on. And so, you know, really, even though ChatGPT has grown like crazy, again, it's by far the
00:04:51most used chatbot in the world. OpenAI hasn't had a frontier model on coding in a while. And so
00:04:58they're saying that this reasserts them to the top of the leaderboards. It looks like that's the case.
00:05:03But yeah, I mean, I think vibe coding, or software on demand, is very early in what it is. And is it
00:05:10going to be something that creates like real value in the world versus like little toys and
00:05:15trinket websites? I don't know. I mean, right now, it kind of reminds me of the early app store when
00:05:20people were just making like fart apps. And maybe we never graduate from AI fart apps. Maybe that's
00:05:24where it stays. But who knows? I mean, you can't deny that the progress is rapidly moving towards
00:05:32more complex stuff that can be coded. So I haven't played with it yet again, but it looks like a huge
00:05:39improvement on coding. Software on demand is way less fun of an activity than vibe coding. I've never
00:05:46vibe coded. Have either of you vibe coded yet? No. Yeah, I was just actually talking with the CEO
00:05:52of GitHub on Decoder about this this week. Like, I don't think any of the vibe coding tools are to a
00:05:56point where someone who literally has zero software development experience like us can reliably do
00:06:02something and deploy it. You still need to understand basic mechanics of code, which I do not,
00:06:10and I will refuse to learn. Although I probably should, you know, we could all be making $100 million
00:06:13if we knew how to code, right? So I mean, I can code in very specific circumstances. And I just
00:06:19don't think I have the general, broad-purpose, I-know-how-this-works-architecturally knowledge
00:06:25that I think a lot of vibe coding, so far at least, sort of requires.
00:06:29Yeah, it feels like that's actually getting more important is to have the higher level understanding
00:06:34of how everything works together and how it can break than actually just like writing code. Guys,
00:06:39we're also burying the biggest news, at least I think for consumers, which is the model
00:06:44picker. It's gone. That to me is like, if that's all they shipped, I would be happy. You don't have
00:06:50to switch between reasoning, regular, mini, whatever. It just does it all. They invented this new router
00:06:57that, you know, you can say think harder, for example, and it'll just go to reasoning. But,
00:07:03you know, RIP, but not really to that model picker, you know,
00:07:07it's all just one. Now you go and it's just one box. There's no model to pick. It's so funny,
00:07:12like, ChatGPT, they invented like one of the most impressive products the world has ever seen.
00:07:17And then the actual UI for it just had the kludgiest little set of options that you had
00:07:24to pick from and decipher. And it's very funny that like it took them this huge upgrade just to
00:07:30figure out, okay, let's have our AI system pick which AI bot to actually use.
00:07:35Yeah, Sam called that a confusing mess on the call. They really have hated the model picker.
00:07:41And I think they were waiting for a new frontier release to unify everything. It still has multiple
00:07:46models in the background and developers can get the different models through the API directly.
00:07:51But yeah, if you're a ChatGPT user, now it's just ChatGPT 5.
00:07:55Have they fixed the naming schemes as well here, right? This has been their other,
00:07:59this has been the weight around their neck.
00:08:01Well, now it doesn't matter because it's just five.
00:08:03It's just five. It's all five. You don't worry about it. They're getting there. They're getting
00:08:08there. It feels like they probably hate the GPT name and they just can't change it at this point.
00:08:12I don't think they can. I mean, it's the Kleenex of AI now. I mean,
00:08:16they're about to hit a billion users. That's probably going to happen in the next few months.
00:08:21So no, I don't think they'll change it. I would say the other really interesting thing is that
00:08:26they have this new thing called safe completions where before ChatGPT would just refuse to answer
00:08:33something if it thought it was potentially a dicey prompt. And now they'll basically go halfway.
00:08:39The example they gave on this press call was if you asked the question, how much energy is needed
00:08:45to ignite some specific material? And basically saying that could be someone making a bomb or it
00:08:51could be a student trying to learn. And now instead of just refusing a question like that,
00:08:56that could have potential bad outcomes, the model will give a higher level answer that doesn't
00:09:03actually get into the specifics, but is apparently set with guardrails to avoid giving an answer that
00:09:09could lead to potential harm. Again, we have to see all this in production, but to me, it shows that
00:09:15they're at least trying to mitigate the inherent problems of LLMs, which is that they
00:09:20confidently lie and also make shit up and they're trying. Whether they can get there obviously
00:09:27remains to be seen, but I feel like they're making strides more on the product side than any of the
00:09:32other chatbots out there right now. The selectable tone also seemed kind of interesting to me. I
00:09:36haven't actually really seen what that looks like in practice. You can choose it between like it being
00:09:42nerdy or being a good listener. The personalities. Yeah. So there's four new personalities
00:09:48that are being added to ChatGPT, which I, they haven't really explained. That was something kind
00:09:53of buried in the, in the release, but that's starting to roll out and you can tweak them
00:09:59apparently a little. They are Cynic, Robot, Listener, and Nerd. This is so weird. This is like very
00:10:07Grok-like, for one, right? Grok, like, you open it up and it's like, do you want me to be a sexy nurse?
00:10:13And like no other system does that. And it's very weird that ChatGPT is like, do you want me to be a
00:10:19nerd? Like, of course it's a nerd. It has information from the entire universe. Like what else
00:10:25can it be? It's also, I don't know. I just like, is it going to like throw in like Star Trek quotes?
00:10:29Companion chatbots do this a little. And I get the feeling that it's kind of leaning into that.
00:10:33Just, they are clearly leaning into anthropomorphizing this thing. They're also going to let you
00:10:37change the chat color bubbles for individual threads, which is, you know, one of the core
00:10:43things for a bunch of messaging apps that use humans to talk to each other. So yeah, they're,
00:10:48they're obviously leaning into the idea that this thing is something that you think of and talk to
00:10:53like a person. They're also rolling out advanced voice mode. It's replacing, um, the
00:10:58current standard voice mode for free users. Uh, and there's like bragging about how you can talk
00:11:02to it for hours and hours on end. So yeah, they're, they're totally leaning into this.
00:11:06So Alex, you were mentioning earlier, the, the safety improvements. I think this is really
00:11:09interesting because this week they also released their first open weight model in six years. Open
00:11:15weight means you can tinker with, uh, how it's learned everything, I believe. Is that
00:11:21right? Yeah. The weights are the parameters that go into the model. It's not the training data. So
00:11:26you still can't see the data they used to build this thing. Um, and they're not talking about that
00:11:31with GPT-5 either, but it's definitely really customizable. The key thing is like,
00:11:35you can run it on device, you can run it on a laptop or a company can load its proprietary data
00:11:42into it. Uh, and it can work behind a firewall. So it's not something where like a big bank would
00:11:48be sending its very sensitive legal data up to OpenAI servers. Well, and so they, they had stopped
00:11:53releasing these models because they're like, it's a safety concern. And now they released a vastly more
00:11:59advanced model. They say it was a safety concern. I mean, yeah, this is their first open weight model
00:12:03in six years, which is very ironic given that the company's name is OpenAI. They've actually been
00:12:09just doing closed models. Uh, and the timing is telling. I mean, safety was the reason they gave
00:12:17and then guess what happened at the end of last year: DeepSeek. And then in January, Sam Altman is
00:12:23on Reddit saying we're on the wrong side of history when it comes to open source. And lo and behold,
00:12:28they come out with an open weight model. So I think what's really happening is just the competitive
00:12:33dynamic around all this has shifted. And if you look at the leaderboards for open source, open weight
00:12:38models, they're all Chinese models, you know, because, you know, Llama has kind of shit the bed for Meta.
00:12:44And so China's really, you know, pulling ahead here. And I think OpenAI sees that having it is
00:12:50strategically valuable because you basically have a flavor of everything for companies. You know,
00:12:57if they want the closed thing, they can have it. If they want the open thing, they can have it.
00:13:01Uh, and they get used to OpenAI's, uh, you know, systems, the way it works, the way the models are
00:13:08tuned, and it just locks you into their ecosystem. So yeah, I, I don't, I don't really buy the safety
00:13:14argument. See, see, yeah, that's, that's what I've been really curious about. And then, uh, Zuckerberg,
00:13:18I think in the past week also said the exact same thing, right? He put out his big, uh, super
00:13:22intelligence memo and he's been the one who's been harping on open source for a really long time.
00:13:29Yeah. All the Llama models. And now I'll be shocked if their next frontier model is open
00:13:33source. Yeah. I think again, for all the nerds, I know it's not actually open source. We should
00:13:37just call it open weight, uh, OpenAI's GPT-OSS, or whatever we're calling it. It is actually more
00:13:45open source than Llama. Um, it's under the Apache 2.0 license. It's, it's pretty commercially accessible,
00:13:50but yeah, it's not true open source. And is the industry going to really be releasing a bunch
00:13:57of frontier open source models going forward? I don't think so. I mean, GPT-5 was the same
00:14:02week. I think they wanted to get GPT-OSS, I hate the name, but they, they wanted to get it out
00:14:09ahead of time to clear the way for GPT-5, which is the thing that really matters for their business,
00:14:15right? It's the thing that may get more people to use ChatGPT or to consider not switching to
00:14:20Gemini or whatever. Um, and you know, that's what matters. You know, OpenAI is the ChatGPT
00:14:26company. That's, that's their business. So I think it's admirable that they're doing an open
00:14:30weight model at all, but at the same time, I don't think we should pretend that this is because they
00:14:35all of a sudden cracked some safety thing that they couldn't have before. Yeah. And Adi, you were
00:14:39pointing out earlier this week that, you know, all of these services, if you're using them online,
00:14:45you're handing over a ton of data, and obviously GPT-OSS is not the first, uh, open model that you
00:14:52can run locally by any stretch. Um, but it's, it's clearly the highest profile immediately,
00:14:57right? Llama is, is, I, I would be surprised if most people know what Llama is. Um, and I think
00:15:04ChatGPT now has a version that you can run directly on your computer. That feels like a big deal,
00:15:09right? Yeah. I mean, it's still, as far as I can tell, kind of resource intensive,
00:15:12like you still need to have a pretty powerful computer. But I think that honestly, like the
00:15:18biggest safety thing with a lot of this open source AI is the fact that you can run it on a
00:15:23device. Like, I think that's a really hugely under discussed, not for the companies, cause
00:15:27they don't really have a reason to care, but for normal people, safety issue is that the less detail
00:15:33and data you can have sent over a network off your device, the better things are for you.
00:15:38I think about this pretty much all the time when I'm using AI tools. Uh, there's a ton of stuff that
00:15:44I would probably be using ChatGPT for, or Gemini or Claude or whatever. I don't want to hand over
00:15:50my information. The other day I accidentally, I was just like moving very fast and accidentally pasted
00:15:54my address into, into ChatGPT. I'm like, that's, that's there forever. I hate to break it, but it's not
00:16:01like Google didn't automatically, like, it's not like people didn't hand a bunch of information over
00:16:05to Google, but this is just, it's all the problems of Google with even more confusing privacy settings.
00:16:12There's the story that like a bunch of OpenAI, like, chat logs got scraped, uh, onto Google just
00:16:18because the settings were confusing. Um, like I think people are even less clear when what they're
00:16:24sharing is private. People are, are being encouraged to share even more and they're doing it at a time
00:16:30when frankly, there are just much higher surveillance risks right now than there probably
00:16:35have been in America for a long time. Yeah. Altman himself has actually been out there saying,
00:16:40uh, I think he's been using this lawsuit with the New York Times kind of as a scapegoat, where they're
00:16:44trying to get, you know, millions of, of logs to see how New York Times articles were surfaced in
00:16:50ChatGPT through legal discovery. Uh, he's been basically saying, look, a ton of people are using
00:16:56ChatGPT for therapy, very intimate private conversations. And there is no legal protection
00:17:02from those conversations. If we're subpoenaed or forced through discovery to hand those over,
00:17:07there's nothing protecting them. There's no version of, you know, the law that protects your,
00:17:13you know, privileged discussions with your lawyer or your therapist in the real world.
00:17:17And he's saying, we basically need that for AI, which is, is, you know, there's a point to that,
00:17:22but it's also convenient because what it does is it firewalls all the data that OpenAI is
00:17:28collecting, which is its real moat: the data, the flywheel, and then the memory and the way it
00:17:33learns about you and locks you into ChatGPT. So yeah, it's a very, it's a very dicey situation.
00:17:38Also, frankly, if, if OpenAI wanted to actually not have access to a lot of that data,
00:17:44it could do that. There are ways in which it could block its own access to the information that you put
00:17:50into it, like the text that you put into it. There would be huge problems with this. Like there
00:17:53would be huge safety problems with it. There are lots of reasons why they would want to be able
00:17:59to see what you're putting in there. But if they were really technically committed to the idea that
00:18:04there should be a mode of ChatGPT, where when you put this text in, it is firewalled away and
00:18:10encrypted in such a way that we cannot access it, I think that would be plausible.
00:18:14Well, the, how would they train it? They need to train on it to learn about you and do memory.
00:18:18So I don't actually know if that's possible.
00:18:20I think that is probably one of the trade-offs. Like I think that there are a lot of different
00:18:23trade-offs, but what I'm saying is that if they really believe people are putting in sensitive
00:18:27information and we would like to encourage them to put in sensitive information, there are ways
00:18:31that they could minimize this information that they have access to and the information that they
00:18:36would be able to be required to legally give up.
00:18:38And I think you're both right. Like it feels different chatting with ChatGPT or any other AI bot
00:18:45than, you know, the information that I put elsewhere, right? If I put something in a Google
00:18:49Doc, I'm like, yes, this is a document that exists in perpetuity on Google servers. And
00:18:55ChatGPT, it feels like, it feels ephemeral, feels like a chat box, which it is. And it's funny,
00:19:00they are adding these other features to make it feel even more casual, which kind of makes it feel
00:19:04like it's going to get worse and worse in terms of people handing over their personal information
00:19:08and not realizing just how much is stored in those servers. I'll say, speaking of AI that wants you
00:19:15to get a little too personal with it, the other thing that happened this week that I want to talk
00:19:19about is that xAI's Grok is in the news again. This has happened every single week on the Vergecast
00:19:25for the past, past month, I think. I cannot get away from it. Grok is always doing something
00:19:31inappropriate. And this time, I think in like the middle of the night, Elon just tweeted out,
00:19:36hey, Grok has a new image generator mode and video generator. Adi, what is going
00:19:43on? There's a spicy mode as part of this video generator, like this is, it's letting you
00:19:48intentionally create NSFW videos. Is that, that's maybe an understatement. Yeah, it's, so we've had
00:19:57Jess Weatherbed testing it, and she keeps running into the problem that the unlimited service is not
00:20:03actually unlimited. And so our ability to test it has been slightly curtailed
00:20:08by this, but it's basically a people rip their clothes off button. That there are these settings,
00:20:14you can generate an image, and generating the image seems to have sort of guardrails, like it won't
00:20:19generate unblurred nude images. And then you can click a button that's like, animate this image,
00:20:25and it will give you settings. There's like normal and fun and spicy, which is the operative one.
00:20:33It will result in nudity, but it is basically a men will rip their shirts off, women will also rip
00:20:38their shirts off, and sometimes their other clothes. It's like, it's basically a softcore porn
00:20:43button. The videos are like kind of disturbing too. Like we were unfortunately looking over them
00:20:52this week to write about this. And it also like, it doesn't have a lot of guardrails, right? Like you
00:20:59can put in the name of a celebrity. And it just like, sure, here it is. Put in the name of like
00:21:03Batman or Superman. And it's like, sure, copyrighted character being naked, like, go right ahead, happy
00:21:08to. And it's like, Jess was messaging with us. And she was like, my takeaway is that Grok,
00:21:15its image generator is only really good when it comes to things that it should not be able to do.
00:21:20It's like, that's the only thing it can do that others can't, because others have guardrails in place.
00:21:24It seems like its biggest guardrail for creating deepfakes of real people is being unable to
00:21:28actually produce an image that looks like the real person.
00:21:31This is true. I feel like a lot of them were very far off. It like, it didn't know a bunch of like
00:21:36modern celebrities too, right? It produced a picture of Sydney Sweeney. And I feel like it,
00:21:42it looked more like Elizabeth Banks or something. It was just like a very generic blonde woman.
00:21:47I don't know. The Taylor Swift one looked like Taylor Swift to me.
00:21:50Actually, a surprising number of them did not look like Taylor Swift.
00:21:53The video that we picked does.
00:21:55Oh, so you had to prompt it a lot to get to that?
00:21:58I think it was the first, the actual first one that Jess found. But it's the thing that is,
00:22:03one thing I will say is, is sort of impressive about it is, unlike other models I've played with,
00:22:09it just, it generates something, then just keeps generating more and more and more and more and
00:22:13doing riffs on it. And so usually your first one is good, but then it will kind of deviate from
00:22:20that as it goes on. And so I think a lot of these were a little more questionable.
00:22:25I think it's very good with, like, there are certain people that it seems like it has
00:22:28absolutely no trouble with, even if it's inconsistent, like Taylor Swift. I think like,
00:22:32there are just not as many pictures online of Sydney Sweeney as Taylor Swift.
00:22:36Yeah, Taylor Swift, there's probably the most information about her on the internet.
00:22:41Adi, I think there's a lot of pictures of Sydney Sweeney online.
00:22:45No, I think there are, just the volume of time that Sydney Sweeney has been around for,
00:22:50and the volume of stuff that goes into training data, I think actually Taylor Swift appears in
00:22:55public more often than Sydney Sweeney. I'm not saying there aren't a million pictures,
00:22:59I'm just saying I think by volume, the Taylor Swift weight is just so incredibly high.
00:23:04I think that's fair, I think that's fair. Before we wrap this up, I do want to talk about
00:23:07one other thing in AI this week. I think it's particularly interesting that we're talking
00:23:11about Grok. Grok, I think, was pulled together very rapidly. And, you know, there are, don't get
00:23:16me wrong, quite a few problems to talk about with Grok, most of which I think are probably deliberate.
00:23:21But it does show when a company, you know, has a pretty singular focus on building an AI
00:23:27service, they can build it pretty quickly, and they can build a pretty decent one, it would seem.
00:23:31Um, meanwhile, there's Apple. Um, and I think in the past week, Tim Cook has come out and made a
00:23:38couple of big statements about Apple needing to get in the game. And Alex, you thought this was
00:23:45pretty bold of Tim Cook, right? That the fact that he came out and said, they're, they're open to
00:23:51acquisitions, right? They're looking at this or interested in it.
00:23:53Yeah. How many times have we heard Apple talk about wanting to make acquisitions?
00:23:57Never. The answer is never. So it's a big deal. I think it shows the pressure they're under.
00:24:04If you read the earnings call transcript from last week, he just gets asked over and over about AI
00:24:09and the threat to Safari search, new devices. Like this is what they're hearing constantly from
00:24:17investors, uh, you know, people in the tech industry. So yeah, it shows, it shows the pressure
00:24:24they're under and they're losing AI talent like crazy to these other labs, to meta, uh, to open AI,
00:24:30et cetera. Um, I think Apple's in a really tough spot here. Look, is anything happening with AI right
00:24:36now going to make people not buy their next iPhone? Probably not, right? Like the Pixel event's coming
00:24:43up. I'm sure there'll be some cool AI stuff. I'm not buying a Pixel still. It's going to take a while
00:24:48for that to happen. In the next three to five years, are you buying something different, or,
00:24:53in aggregate, are iPhone sales going down because people aren't upgrading as much, because they have
00:24:59new devices or peripheral devices, AI pendants, et cetera? Maybe it's not out of the realm of
00:25:05possibility for the first time in the last like 15 years of the iPhone. So yeah, Apple's under
00:25:11tremendous pressure. Um, and I'm not confident they have the leadership currently to figure it out.
00:25:17So maybe they do need a big acquisition, you know, maybe they do need to go buy one of these labs
00:25:22and pay, um, a lot more than Beats to hire a team. Uh, cause that's what it would be. I mean,
00:25:28to get one of these leading labs, you're talking tens of billions of dollars, which is something
00:25:32Apple's never done. I mean, Beats gets you what, like two or three researchers nowadays,
00:25:37like Beats. Yeah. Three billion. That gets you three researchers. Yeah.
00:25:41Yeah. Yeah. At least with how Zuckerberg's spending it. Yeah. This Apple thing, I think is going to be
00:25:46fascinating to watch. It will be really interesting to see if they actually pull the trigger on a
00:25:51purchase. That's like pretty antithetical to how Apple traditionally operates. Um,
00:25:56but I think it's a big deal and it's, it's clearly a big deal as you see them struggling to catch up
00:26:01and all of these reports pointing to the fact that they are having a hard time with it.
00:26:04I'm afraid I don't totally understand what Apple is not catching up with. Like I understand I'm being
00:26:09sort of provocative, but I'm just saying, I'm still not really sure that they've actually
00:26:13demonstrated there's a thing they're losing out on. I mean, Siri sucks. Uh, you know, Apple
00:26:18intelligence is not intelligent. Um, but I'm not clear either of those things. I don't know. I
00:26:24guess I'm just not clear that anyone else has demonstrated there's necessarily going to be
00:26:29still a transformative use case of AI that, I don't know, makes Siri the thing that Apple has to bet its
00:26:37entire future on. I mean, you spend 30 minutes talking to advanced voice mode in ChatGPT and Siri is
00:26:43going to feel outdated pretty fast, but you could also just not use Siri. Yeah. But like they need,
00:26:47they need Siri to be relevant. You know, like there's also this idea in the industry that apps
00:26:53as a concept are going to go away, which I think is interesting with the rise of vibe coding, but
00:26:57like basically the OS is like fully abstracted to a prompt, whether that's text or voice. And,
00:27:05and basically, you know, what we think of as an operating system becomes much higher level.
00:27:11And for Apple, if they have no talent in house to build that and other people build that,
00:27:16and it makes it literally a dumb pane of glass more than it already is, you know, the only thing
00:27:21keeping you buying an iPhone is, what, you like the shine of the metal better? You like the fact that it
00:27:26syncs with your AirPods faster. Um, they lose a lot of their competitive edge in that, in that
00:27:31scenario. I am really curious, particularly as the models become commoditized, right? Like what is
00:27:38the advantage to having their own in-house one versus just paying for one, just using an open
00:27:43source one? I think they've given up. I think that ship has sailed. I bet they do this plug and play
00:27:48approach. There's all these reports of them, you know, talking still to Gemini, Anthropic,
00:27:52letting people pick their models, just like they pick their search engines. You know,
00:27:55I, I think for now that's the path they have to go on until we get to a point where
00:28:00AI is starting to rethink how operating systems work, which is obviously a few years out, but
00:28:05I think we're going there. I mean, I know people building startups in the space that
00:28:09are focusing first on the desktop, but are thinking like, how can we reimagine how you use
00:28:15your computer with AI? Um, and Apple's at risk there. Yeah. And there's also just this basic element
00:28:20of, um, like Adi, you're right. There's not necessarily anything that the iPhone is
00:28:25specifically missing at this point, right? Siri's been bad for a while and can continue to be bad.
00:28:30Um, but they also like came out and, and they branded this thing. They redefined AI as
00:28:35a thing with their name built into it. And so I do think there's just some basic stakes that Apple
00:28:41has set up for itself too. Um, and it's, it's really interesting watching them attempt to meet those.
00:28:47And, and I think by their own account, not quite deliver. All right, we've got to take a break.
00:28:51We'll get back. We're talking about the age gating of the internet.
00:28:56This podcast is supported by Google.
00:28:58Hi folks, Logan here.
00:29:00And Tulsee. We're from the Google DeepMind team. We're releasing Gemini 2.5 Pro,
00:29:04our most intelligent model. It's now available for you to test out in Google AI Studio.
00:29:09It's been awesome to see what everyone's been building.
00:29:11From creating mini games from a single prompt to debugging 50,000 lines of code,
00:29:16we're excited to see what else you create.
00:29:18Try it out today on Google AI Studio by going to aistudio.google.com and let us know what you build.
00:29:28Okay, we're back. I want to talk about age gating on the internet. Adi, you've been covering this
00:29:33closely for years now, and I want to lay out what the situation looks like to me. And let me know if
00:29:39I'm overstating it because it feels like this is about to be a really big deal. It seems to me that
00:29:44in the past couple of weeks, there's been a seismic shift in how the internet works,
00:29:48where presenting some form of identification to prove that you're 18 or older is going to become
00:29:54the norm for when you want to access certain information online. Maybe that's porn, but maybe
00:29:58that's just like a slightly spicy subreddit. But either way, a lot of people in a lot of places
00:30:05are going to have to prove that they're of age. Is that off base? Am I in the ballpark here?
00:30:11I think what we've really seen over the past couple of weeks is that the first time someone
00:30:15really, as a country said, you guys have to all do age verification, that moment happened after like
00:30:21a decade. And so there have been over the past several, several years, all of these plans to
00:30:27kind of slowly creep age verification into things. And the UK in particular has been trying to age-gate
00:30:35porn for, I think I looked back, and it's been since about 2016. And they passed the Online Safety Act,
00:30:42and then the portion of it that requires age-gating for harmful content, including adult content, but
00:30:48also a sort of variety of things that get posted to social media that finally went into effect in
00:30:53late July. And now since then, we've just seen what actually happens, and it's really weird.
00:30:58And that there are all these other countries that are just kind of waiting in the wings to do this.
00:31:03And the US is slowly starting to roll it out across states. And so we're, I think,
00:31:09reaching this kind of tipping point.
00:31:11And the UK's version almost seems like it's one of the, maybe I shouldn't say it's the worst case
00:31:16scenario, given that it's the first scenario we've seen, but it seems quite controlling,
00:31:21right? Their definition of what needs to be age-protected is really broad. It seems to be
00:31:27like anywhere, any website where a child might possibly encounter something problematic, right?
00:31:35And so Bluesky users can't access certain features unless they prove that they're of age.
00:31:42Reddit users can't access r/periods or r/stopsmoking unless they prove that they're of age.
00:31:49Like, this is, like, pretty wild stuff to have to show that you're 18 to access.
00:31:55Sorry, is this literally you have to show a photo of your ID?
00:31:59Basically, there are a sort of list of harmful content things that you have to make sure that
00:32:06if you have them on your site, there is some kind of reasonable assurance that people looking at them
00:32:11are of age. And typically what you end up doing then as a site is you have to either say nobody from
00:32:17the UK can access things or you have to say we don't have this on our site or you have to say
00:32:22we're blocking this unless you go through this typically third-party service that will either
00:32:29have you upload a government-issued ID or use something like a credit card or use something
00:32:36like a picture of your face that is you that is run through age estimation to see if you look like
00:32:43you're of age in a like sort of age verification algorithm. Or there are these methods that kind of
00:32:53they're using inferences from ways that you've used your account. Like if, say, your X account is
00:33:0015 years old then they're going to guess you're probably 18 years old. So it's this whole weird
00:33:07cornucopia of different methods, some of which are easier to fool than others. And basically the upshot
00:33:15is it's just really unpredictable what people are going to post on social media. And so if you're a social
00:33:19media site, you probably at this point need to run age verification. Bluesky has, in the UK, limited
00:33:27things like DMs and it will auto-filter adult content. Reddit will block a lot of subreddits
00:33:34that are even sort of borderline content potentially. That's ended up being things like, at least as
00:33:41of the time we were writing about it, r/periods. And there have been, say, violent protest footage
00:33:48or other kind of material that then gets considered adult content on services like X. So yeah, if you're
00:33:56in the UK, you basically, you have to give up your ID or you have to take a picture of your face or video
00:34:01of your face and show it and run it through these services.
00:34:05The UK is really, it's constantly impressive how much shittier they make using the internet
00:34:11almost on a yearly basis. Like props to the UK where the internet will continue to suck more and more.
00:34:18There are all these reports of these small websites that are just like, you know, shutting down in the
00:34:22UK because they're like, we, we don't know how to deal with this. We don't have the resources for
00:34:27this. The way that they have implemented it is so broad, and the options for identifying
00:34:33yourself, there are so many out there and they don't necessarily have close
00:34:39restrictions around them. And it feels like the privacy thing is the big question to me, right?
00:34:44The idea of, okay, we should stop children from accessing porn on the internet. I don't think
00:34:51that is necessarily an unreasonable contention. But the idea that every single adult who wants to
00:34:58access, I don't know, information about a rated M video game needs to hand over their ID or a face
00:35:06scan to some service that can possibly then track them around the internet and tie that back to their
00:35:11identity. That's very concerning. I think the problem is that there's not really an easy way
00:35:15often to, to draw a line between those two things. Like on, in the US so far, it has mostly stayed on
00:35:23just, yeah, it's Pornhub. It's like XVideos. It's a few different sites. But the problem is that that
00:35:30means you either have to say, all right, that means there's been no sex on
00:35:34social media at all. Like you're not allowed to have, in some cases, artistic nudes, you're not
00:35:40allowed to have, say, just sex workers operating on this site as humans. You have to either then say
00:35:46all of these other services are getting sanitized, or you have to say, look, we're going to allow for
00:35:52some level of surveillance of all of these sites. And the UK also, yeah, it's not just
00:35:59pornography. It is also, it's things like eating disorders, it's things like suicide, which are all
00:36:05things that it's really not good for kids to encounter. But just the nature of social media
00:36:10makes it really, really difficult to automate those things away, or to say that a child is never going
00:36:18to encounter them in a way that could be problematic. Right, right now, like Wikipedia is complaining that
00:36:23they might have to, you know, make some big changes as a result of this, because they can't quite,
00:36:27you know, guarantee that there isn't any inappropriate information, because Wikipedia is a
00:36:32user-generated content platform. And outside Wikipedia, there's also just, there are the
00:36:37problems of profit motives, like that there are these services, and these services cost money,
00:36:43these verification services, and these services also, this is all a commercial operation, and there
00:36:48is a lot of incentive to collect as much data as you can, and to either cut corners with that data,
00:36:54because you are trying to make money off of this thing you're using, or to just actually use that
00:37:01information to target advertisements, or to otherwise, like, just run a commercial service. And
00:37:08like, companies will absolutely promise privacy, but just we've seen real security breaches. We've seen,
00:37:15like, Twitter, for instance, was under a consent decree in part, because it would use information that you
00:37:21provided for as, like, security phone numbers, for commercial purposes. Like, we just understand how all
00:37:28these systems work, and how all these incentives work, and they're just throwing something into a system
00:37:32that is, in a lot of ways, filled with perverse incentives, and demanding that it work well.
00:37:39You're saying Europe doesn't understand tech regulation? Shocker.
00:37:42The funny thing is, I actually think the EU is doing better. So the EU is
00:37:47running a pilot program whose deal is, look, we also have age verification, and I personally,
00:37:53I believe that age verification is, in a lot of ways, fundamentally flawed. But they're at least
00:37:57saying, we also have this technical system that we're trying to build, that's just going to be
00:38:02a best, like, a system anybody could deploy, that's going to ideally mean that somebody uploads
00:38:08their passport once to a service, and then that service is held to a, sort of, like, it runs on
00:38:14best practices, and all that service does, then, is tell a site whether or not you're 18. And that
00:38:22this makes it, in theory, easier for the, for the websites, they just know that there's something
00:38:28that they can use, that they don't have to go and find this third-party verifier, and that they are not
00:38:34relying on just this private sector development of all of these weird, different options.
00:38:42So I think the EU, in some ways, is doing a better job than the UK, potentially.
00:38:46Well, so do we expect this to spread more broadly in the EU? Because right now, this is, like, five
00:38:50states are testing an app. I don't, I don't know where, where it goes next. Is it, is this a legal
00:38:56requirement? Or, because they've been pushing this, right? I think this seems to be, kind of like
00:39:02everywhere in the world, they, there's an increased momentum toward age-gating things.
00:39:07Yeah, Australia has, I believe, also determined that they're going to be age-gating search engines
00:39:13in the future for, if you want, basically, safe search off. I think it's a little bit hard to tell.
00:39:20I think, on one hand, it seems like all this stuff is becoming, kind of, inevitable. On the other hand,
00:39:25the UK has been, it seems like, kind of a mess. And none of this is, I think, totally inevitable.
00:39:31I think it is possible that now we are seeing, this is what happens when this rolls out. And it
00:39:36turns out it's really messy, and there are a lot of problems. And we are seeing, really, some of the
00:39:40first real-life test cases of it. And it's possible that that will make it possible to shape how future
00:39:48rollouts work.
00:39:49So I want to touch on the US, too. And this is about a month old at this point. But you covered the
00:39:54Supreme Court's ruling that age-gating porn websites is allowable under the First Amendment.
00:40:01And I think at this point, what is it, like a third of all US states have laws that require
00:40:08age verification for porn. So it feels like this is going to expand in the US. It's not clear to me
00:40:13if it's going to expand beyond adult content. In the same way, I'm curious what the ruling even allows.
00:40:19The ruling really specifically deals with content that is obscene to minors, which is a very
00:40:27specifically, this is pornography. I think that there are probably ways you can kind of find wiggle
00:40:34room around that to try to impose regulations more broadly. But it's a pretty different court case.
00:40:40I have talked to people for the last months or year about, well, what happens if age verification
00:40:47for porn is allowed? Does that mean you can just age-gate social media? And the consensus has
00:40:51been it's possible that will happen, but it's a pretty different court case. And that you have
00:40:56to just do a completely different weighing of what are the privacy risks versus the harms that you're
00:41:02trying to prevent. But we've definitely seen states try to pass regulations for social media. And so
00:41:09it seems inevitable that it's going to get up to the Supreme Court.
00:41:12Yeah, because you could just say, there's porn on Bluesky, right? You could easily just make that
00:41:17argument that, okay, now this entire platform needs to be age-gated. And Reddit, right? You could
00:41:25have to age-gate the entire platform. I guess maybe this starts to just slice off chunks of these
00:41:32platforms somehow.
00:41:33I mean, so far, the actual laws have all basically set a threshold that's like, is one-third of the
00:41:38content on the site adult? And I think that it's basically been understood that social media
00:41:45platforms don't fall under that. But I think the thing that's going to be maybe the next issue is
00:41:50there have been laws passed that are basically requiring age verification on app stores.
00:41:55that those are going to, I think, probably end up going to the Supreme Court. It's, I believe,
00:42:04Utah and Texas that are saying that if you operate an app store, like a mobile app store specifically,
00:42:12you have to have a system by which you can guarantee that people are over 18. And that basically,
00:42:22the legal issue there is going to turn out to be, is this unduly burdening speech in a way that is
00:42:29distinct from the concern that, like, it is fair to restrict access to obscene content.
00:42:37I mean, I think the Supreme Court right now kind of operates just on really weird vibes. And so it
00:42:42seems plausible that they're just going to say, yes, that's fine. And so at that point,
00:42:48I think we're going to have to start looking a lot more closely at how all of this is just going
00:42:53to end up getting implemented. The app store thing is really interesting because I think
00:42:57a lot of the tech companies who do not operate app stores such as Meta are really pushing for this
00:43:02because it seems like a way for them to pass the buck a little bit, right? Alex, I think you've covered
00:43:07this from Meta where they've been pushing for Apple to implement these requirements and they've been kind
00:43:14of trying to like guilt them into it a little bit. Yeah. It, their comparison, which I think is
00:43:21an apt one is like, you know, you age gate someone's ability to drive a car. You don't age gate,
00:43:26like you don't force the roads to age-gate everyone as they drive down the road, you age-gate the ability
00:43:32to drive the car. And what is the car in this analogy? It's the phone. So if you really wanted to
00:43:38solve this, which Apple in particular is aggressively lobbying against in the United States, is you would
00:43:45force the phone platforms to age gate at the device layer. I got to say as a metaphor, this makes no
00:43:51sense to me. It's not like cars are required to tell whether you're 16. I did get barely any sleep
00:43:57writing about OpenAI. So maybe it doesn't make sense, but I think conceptually,
00:44:01uh, what makes sense to me is gating at the root, which is the device and not gating all over the
00:44:11web, um, and chopping up the web. And it just, I don't think you should gate at all to be frank.
00:44:17Like I think this should, you know, it would solve this whole discussion as you make parents liable for
00:44:21what their kids do. And if you, if we've decided as a society that it's evil or bad or harmful for kids
00:44:27to look at porn, uh, guess who should be liable for that: their fucking parents. It
00:44:32shouldn't be these platforms. Uh, that's, that's my view of it all. But, um, yeah, I, if you're going
00:44:39to gate, why not gate it at the device? I think that's reasonable. I think that there are a lot of good
00:44:44answers. I think that I agree with you that there's a really good reason pragmatically to say that, look,
00:44:50you could probably create a better system using devices than you can like regulating or you can
00:44:58like verifying ages on every service. At the same time, I think in some ways, like my biggest problem
00:45:05with this is that it just means that basically everything has to route through an app or an app
00:45:09store at this point, that it means that just trying to make a thing that works on a desktop, for
00:45:15instance, like it means that ultimately you're just saying desktops and websites don't
00:45:22matter. It's really strange. Like, I think it basically just assumes, okay, look, most people
00:45:28are using phones. Most kids are using phones. That's fine. And so we're just getting like the
00:45:32broad, we're getting most people and that's enough. But if you push this far enough, what you're saying
00:45:37basically is just any content on the internet has to, at some point, be verified through like Tim Cook or
00:45:42Sundar Pichai. Oh yeah. To be clear, I think this is all a horrible idea. Like gating it all is a
00:45:47horrible idea, but if you were going to do it, it would be the most consistent to do it at the device
00:45:51layer. It shouldn't happen at all, but that would be the most consistent. And the US also has the just
00:45:58very particular issue of a major party platform calling for banning pornography entirely. Like the UK and the EU
00:46:06are not both engaged in a just really massive surveillance state building operation right now,
00:46:13or an attempt to make like drag shows illegal. So the US has kind of its own whole set of problems
00:46:20where like, even things like, I think the EU building a, trying to build a sort of government
00:46:26solution for verification is really good. If you have a good faith belief in the government
00:46:32and that the government is going to protect your privacy, I think that even something like that
00:46:37just doesn't work well in the US. Google did this weird thing this week
00:46:40where they announced that they were going to start like just trying to like size people up to see if
00:46:45they were kids or not. And if, if they, if they just like think that you're not 18, they're going to
00:46:51put like certain restrictions on your account. And this is, this seems to be just entirely voluntary.
00:46:56And I, I guess I'm wondering at this point, if, if this is like, they're just trying to, you know,
00:47:00look responsible so that some of these maybe worst case scenarios don't come to pass.
00:47:05Yeah. They're trying to get ahead of regulation. This is what they've done time and time again
00:47:08with other things, um, other kind of regulatory movements, the biggest companies will do it on their
00:47:15own. Uh, and it will be impossible or way more difficult for startups to, to do it. That's how this all
00:47:23goes with regulation. Yeah. I'm very concerned about where this is going. The fact that this has
00:47:28actually gone into practice and that we were seeing the fallout be so, uh, broad and widespread
00:47:35and that we're seeing major mainstream websites getting blocked is, is kind of horrifying. Um,
00:47:41and I think the fact that we're seeing an escalation in the US where more and more states are passing laws
00:47:46and Adi, to your point, there's an overall movement to crack down on anything that one of the
00:47:54major political parties thinks is a little too edgy can lead to some very bad places. But it's, I think
00:48:00it's maybe encouraging to hear that you think that this is not necessarily a done deal that the UK is
00:48:06maybe just taking a first swing at this and the US may not be able to get that far.
00:48:14I mean, the US is way more chaotic. Uh, in the UK, there has been, uh, a petition, which
00:48:21I believe Parliament has to respond to, uh, about reevaluating this. Uh, that does not necessarily
00:48:26mean anything, but it does mean that there is, uh, public pushback to it. I think that unfortunately,
00:48:34if this is just like anything else that's happened with privacy on the internet, it means that maybe
00:48:39some things are going to loosen up some, but that this is still probably here to stay. I think that
00:48:44I haven't seen really a ratchet away from privacy invasions on the internet since I've been working
00:48:50here. Uh, I, that sounds really doomer. I'm sorry. Um, but I am hoping that around the edges,
00:48:57people can shape things and that if this does screw up badly enough, maybe I will be proven wrong.
00:49:03There we go. UK, do a worse job. All right. We got to take a break. When we get back,
00:49:10some sort of round, maybe there's lightning. We'll see.
00:49:12Support for this show comes from Monarch Money. When you're young and you hear the phrase personal
00:49:21finance, it sounds like something that only applies to people with top hats and monocles.
00:49:25Then when you become an adult, you may realize that fun accessories are not a prerequisite and that
00:49:31everyone needs to get good with money. Monarch Money makes managing money simple for everyone. It lets you
00:49:37track your spending, savings, and investments effortlessly. So you'll always know where your
00:49:42money stands. And Monarch Money is more than a budgeting app. It can act like your personal CFO,
00:49:47giving you a complete financial command center for your accounts, investments, and goals. So whether
00:49:52you're looking to clearly manage your spending or plan your long-term financial goals, Monarch Money can
00:49:58help. Get control of your overall finances with Monarch Money. Use code VERGE at MonarchMoney.com in your
00:50:05browser for half off your first year. That's 50% off your first year at MonarchMoney.com with code VERGE.
00:50:17Okay. Welcome back. So we heard your feedback on the ThunderRound. We've heard it is a branding disaster
00:50:25on par with Tronc or X. So we've been doing some work, some deep research here, and I'm pleased to
00:50:33announce that we have re-rebranded this segment. It is now known as the Lightning Round, colon, the ThunderRound.
00:50:44Colon, Jake's edition. And we have a sponsor, Eric.
00:50:51Yeah, that's right. This week's Lightning Round is presented by Google Gemini.
00:50:57So here's how the Lightning Round, colon, ThunderRound edition, colon, Jake's edition,
00:51:02is going to work. To make sure we get through all these stories, we're going to do five stories,
00:51:06five minutes each. Eric Gomez, our producer, has the power of thunder. If we need to wrap it up,
00:51:11we're going to hear some rolling thunder coming in, and then he will strike us down with a single
00:51:15thunder crash if we've gone too long. All right, five stories. Let's do this. Addie,
00:51:21you are up first. All right. So my first story is that RFK has pulled $500 million in funding for
00:51:29mRNA vaccine contracts. They announced it, and this is via NPR: the Department of Health and Human Services
00:51:38will cancel contracts and pull funding for some vaccines that are being developed to fight respiratory
00:51:42viruses like COVID-19 and the flu. Robert F. Kennedy Jr. announced in a statement Tuesday
00:51:48that 22 projects totaling $500 million to develop vaccines using mRNA technology will be halted.
00:51:55This is bad. And it is the latest in a long string of anti-vaccine decisions that Kennedy has very
00:52:02unsurprisingly made. It's very sad. His argument is basically just that these don't work for
00:52:08respiratory diseases. That's not true. It doesn't matter.
00:52:12It's wild. Like, it's not clear to me if I'm going to be able to get a COVID shot this fall,
00:52:18right? Like, they changed the rules. I've just been getting the COVID shot and the flu
00:52:23shot at the same time. I don't want to get the flu. It's not cool. And it's not clear if
00:52:28that's, I mean, hopefully the flu shot is still available. We'll find out. The COVID one,
00:52:33it's not clear it's going to happen. And yeah, their vaccine strategy just seems to be
00:52:38don't do it. Yeah. I mean, Lauren Leffer has done a lot of really good work for us on RFK
00:52:43and vaccines. And basically he just believes that if you die from a virus, you were weak.
00:52:50That sounds like a horrible mischaracterization of someone. That sounds
00:52:54like a horrible, mean-spirited thing to say. That is just what he keeps saying.
00:52:57Yeah. Pointing out the hypocrisy of the Trump administration, I realize, does not get
00:53:03very far. I do find this one to be particularly striking, because if you go back to Trump's
00:53:11first term, listen, you don't have to hand it to them for how they handled COVID. However,
00:53:18one of the successes was the mRNA vaccines, right? They got the COVID vaccine developed
00:53:26fairly rapidly. They got it distributed fairly rapidly. This is something that Trump in another
00:53:32world would be touting, would be taking credit for, would be boasting about, right? This is a new
00:53:39technology that has some incredible potential impacts for vaccinations. And after spearheading
00:53:48this push for it at the very tail end of his first term, which he seems to totally regret,
00:53:53he's now hired somebody who's just wildly undermining that, right? With mRNA,
00:53:59all this potential, it feels like we're just setting this work back by, at a minimum,
00:54:04three years, and potentially vastly longer. Yeah. It was absolutely one of the best things
00:54:09he did around COVID. It was an incredible technological development, and it's also just
00:54:17such a waste. And again, $500 million is nothing. Like the only reason to do
00:54:25this is just that you hate vaccines. Even if you believe there's a 99% chance that none of this
00:54:31is worthwhile, which I believe is really pretty clearly not correct, that's just barely any money
00:54:38for the government. It really is. And the other thing that I've seen Kennedy complain about,
00:54:42and this is not true, but among his complaints is that we need to do more testing
00:54:48on the vaccines, we need to see if they're safe. And it's like, okay, you know what this money is used
00:54:54for, right? Vaccine testing is part of this. We make sure the vaccines work. They're very thoroughly
00:55:00tested. And so this really does just come from this bewildering place of not wanting to help people
00:55:10not get sick, which is upsetting, to say the least. Right. I mean, everything
00:55:16here is just in absolute, transparent bad faith. He believes that if you get sick,
00:55:25the thing that matters isn't making you well, it's making sure that your body is in some way
00:55:30intrinsically strong. Because if someone whose body is weak is healed or doesn't get sick, then that's
00:55:38actually just, I mean, it's just eugenics. It's really straightforwardly just eugenics,
00:55:46and it's really a disgrace. It's very, very bad. And I think it's just startling
00:55:52to see what a change of tone the Trump administration has had on mRNA from the end of
00:55:59their first term to this new term. Okay. Alex, what have you got for us?
00:56:04So while we've been talking, I've also been looking at the reaction to OpenAI's
00:56:10GPT-5 livestream, and it looks like they have been vibe graphing and doing some really
00:56:20insane chart crime. I want to call out two specific things that people have noticed. The
00:56:26first one is this SWE-bench benchmark, which is this kind of industry-standard coding benchmark for
00:56:34LLMs. Apparently GPT-5 actually does worse than o3 on this, but the bar chart that they put
00:56:43in the stream makes it look like it's better. Like, it's a taller bar, even though the number is less
00:56:50than o3's, which is not great. And then, there are apparently several of these,
00:56:54but the worst one is a chart about hallucinations. Actually, it's called
00:57:01"deception across models," and on the coding deception one, GPT-5 scores a 50. They don't explain what
00:57:09this score means except "deception rate." There's only one axis, and it's just
00:57:15called deception rate. GPT-5 scored a 50 and o3 scored a 47.4. But the bar for o3
00:57:23is, like, more than double the height of GPT-5's. You really have to see this. We'll have the links
00:57:28in the show notes with the images, but it basically makes it look like, if you just glance
00:57:33at this bar chart, that GPT-5 is materially better at not hallucinating on codegen, when in reality
00:57:42it's worse. And this is literally a bar chart called "deception across models." So incredible.
00:57:50Yeah, not good. I'm going to call this vibe graphing.
00:57:57Vibe graphing is good. Vibe graphing is a really good excuse for, hey, why did
00:58:03you turn in information that's completely wrong? Well, I was being efficient. I vibe graphed it.
00:58:08I mean, listen, these are graphs. They just don't make any sense. These are bewildering.
00:58:14There's no consistency. The highest number is not in the highest spot. I don't
00:58:22understand what's happening. Things are just randomly colored. This is
00:58:30very confusing, and maybe suggests why this presentation was a little confusing overall.
00:58:35I'm wondering if these are, like, are they trying to hide it, or do they
00:58:39just not know how to make a graph? It could be vibe graphed.
00:58:44Either answer to that question is really bad. Like, either way, it's really bad. So, yeah,
00:58:51I'm staring at two boxes of the exact same size, and one says 69.1 and one says 30.8, and they're right
00:58:58next to each other, and they're both smaller than the one that's 52.
00:59:01Honestly, this is an impressive level of inconsistency. Like, truly the only
00:59:08consistent thing about this is that everything is wrong. It is amazing.
00:59:12I just love that it's really wrong on a chart about deception.
00:59:16I do have an important question here, which is, what's the right deception rate? What number
00:59:21are we feeling is good here? Preferably I would like zero from the thing that apparently is okay to
00:59:27give me medical advice. Preferably zero, but they had an extended segment on using this for
00:59:33medical advice. And this says the deception rate is 50. Sam Altman, I don't know what it means.
00:59:38He literally brought an employee's spouse on stage who had been diagnosed with like three cancers
00:59:44and was talking about how ChatGPT had helped her navigate her diagnosis. And then,
00:59:49yeah, in the same presentation, they show a deception rate graph that is intentionally
00:59:56very deceiving. I have to tell you, I've been very bullish on ChatGPT. And now that they have
01:00:01presented a nonsense chart that says their deception rate is 50, this has done the most to
01:00:10tell me that I should use this thing less. Jake, it's not even that the deception rate is 50. It's that
01:00:15the bar is less than half as tall as the last model's, which was actually better at not deceiving.
01:00:23So it's made to look on the bar chart like it's way better when it's actually worse.
01:00:30ChatGPT-5. Honestly, you know what though, this speaks to how good it is at deception.
01:00:38We were fooled. We didn't notice. Yeah. Listen, if they keep this up,
01:00:45by the next model, we won't even notice how off these charts are. The higher
01:00:51the deception is, actually the lower the deception score goes, because you don't realize that it's
01:00:55fooling you. Vibe charting. Yeah. 50% is exactly the middle, where you're like, you've got to squint.
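For anyone who wants to see what an honest version of that chart looks like, here is a minimal matplotlib sketch. It is not OpenAI's actual plotting code, just an illustration using the two numbers quoted above (a deception rate of 50 for GPT-5 versus 47.4 for o3), with bar heights drawn in proportion to the values:

```python
# A minimal sketch, not OpenAI's chart code: the two "deception rate" numbers
# quoted above, plotted so that bar heights are proportional to the values.
import matplotlib.pyplot as plt

models = ["GPT-5", "o3"]
deception_rate = [50.0, 47.4]  # values as read off the livestream slide

fig, ax = plt.subplots()
positions = range(len(models))
ax.bar(positions, deception_rate, tick_label=models)
ax.set_ylim(0, 100)  # a full 0-100 axis keeps a 2.6-point gap looking like a 2.6-point gap
ax.set_ylabel("Deception rate (%)")
ax.set_title("Coding deception rate (lower is better)")

for i, value in enumerate(deception_rate):
    ax.text(i, value + 2, f"{value}", ha="center")  # label each bar with its value

plt.show()
```

The slides in the stream were the visual equivalent of hand-sizing the bars or truncating the axis (imagine ax.set_ylim(45, 51)), which exaggerates, or in this case inverts, a small difference.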
01:01:00Okay. Next story. Tariff crisis across the globe,
01:01:11but specifically in the United States. We've been talking about tariffs for a while now,
01:01:16and it's super exciting for our wallets. This really, I think, hit in a very material
01:01:22way in the past week. Number one, because quite literally, Trump's tariffs did go into
01:01:28force, I believe Thursday morning. But most notably, we actually saw a bunch of companies
01:01:34raising prices, and raising them pretty significantly. I think probably the biggest one that most people
01:01:38are going to experience is that Nintendo raised the price on a bunch of Switch hardware,
01:01:43including the Switch 1, the original one, which is going up in price by $30. And some of the other
01:01:48models, including the Switch OLED and Switch Lite, are going up in price too, and a bunch of accessories
01:01:54are going to be going up in price. They did not raise the price on the Switch 2, which I'm assuming
01:01:58is because that thing is already very expensive and they do not want that to look worse.
01:02:04But this keeps going, right? Fujifilm: one of their cameras went up in price
01:02:11by $800. Now, this is already a very expensive camera, it's a medium format camera, but what
01:02:17is happening? This is a lot of money. Tim Cook, you know, Tim Cook actually
01:02:22hung out with Trump this week and gave him this beautiful statue that he made with like a 24
01:02:29karat gold base. How are we not going to mention this? Thank you for mentioning this.
01:02:32We should have led with this. Yeah. Because he went to the White House
01:02:38to announce this big, you know, American manufacturing push, but before he did this,
01:02:42he announced that they're blowing a billion dollars just this quarter on tariffs.
01:02:46Well, clearly what needs to happen is any company that is raising its prices because of tariffs,
01:02:51you know, Sonos is another example this week, you need to bring a solid gold version of your products
01:02:57to the White House. You need to set it on the desk in the Oval Office, and you need to do a photo
01:03:02op with Trump. That is how you avoid tariffs. Put Trump's name on it. Yeah. So Sonos,
01:03:08Nintendo, get on it. Solid gold Switch. Solid gold Sonos. Make it happen.
01:03:14Addie, am I wrong? I think that what he described actually is the new tariff policy. Like, didn't
01:03:20he announce that if you just, like, make Trump happy, he will waive the tariffs?
01:03:26ChatGPT told me he did, so let's just roll with it. Okay.
01:03:29He also announced a new tariff during the Apple thing, but then was like,
01:03:33the tariff won't apply if you build in the US. It's going to be a hundred percent tariff
01:03:38on semiconductors, but if you build in the US, there's not. And like everything,
01:03:44it's incredibly ambiguous what that means.
01:03:46Right. It's like, if you say you will build in the US, if you, like, make a good effort
01:03:50and maybe bring a gold bar to the White House, you will not get tariffed.
01:03:54Vibe tariffs. Tim Cook, he's somehow playing both sides of this, right?
01:04:00Everyone is afraid to trash talk the tariffs. Tim Cook, he's on the earnings call. He's like,
01:04:06we're wasting a bunch of cash on tariffs. He said this two quarters in a row. Then
01:04:12he just goes to the White House and he's like, we're all good. We're all good.
01:04:16And it's sort of weird, like, for them in particular, if, I mean, again, big if,
01:04:21if these 100% semiconductor tariffs do actually happen, they have an Apple loophole built in.
01:04:28Like, that is what it is, right? They were like, Apple, you said you're going to build stuff
01:04:32in the US, so we just won't apply this to you. Right. That's, like, fantastic
01:04:37for Apple's business. Tim Cook is really good at hitting that middle point where he
01:04:42doesn't seem like a threat to Trump, but also he doesn't make Trump mad. He's very good at this.
01:04:50I don't really mean that as a compliment. And this is how you get to him presenting
01:04:55a statue to Donald Trump. I was bewildered by this thing. So there's a 24 karat gold base,
01:05:05which Tim Cook said was from Utah, right, American-made gold. And then on top of it is this
01:05:12big, like, circle of glass that is from Corning, which is where they're going to get the American
01:05:18made glass for iPhones. And for unknown reasons, this piece of glass just says Donald Trump on it
01:05:25and has Tim Cook's signature on it. And it's like, I guess it's commemorating that Apple is
01:05:31going to build in the US, and they're giving it to him for some reason, because they did a
01:05:37photo op at the White House. It's very funny. It came in like an Apple product box that he, like, opened
01:05:43up on, you know, the desk in the Oval Office. Yeah, it was quite a presentation.
01:05:49Did we mention Trump is going to have a competing phone out this year? Supposedly things are going
01:05:55to get spicy between those two. It's going to be an interesting year if the Trump phone
01:06:01really does take on the iPhone. Okay. Right under the buzzer, Addie, what's next?
01:06:07All right. Sometimes it's fun to just have some old-fashioned legal drama that's not going to
01:06:11result in a baffling Supreme Court ruling. And it is that Epic has beaten Google in court again.
01:06:17The Ninth Circuit Court of Appeals on July 31st said it would not overturn the unanimous
01:06:24jury verdict from 2023 that Google's app store and payments system are illegal monopolies. And Google
01:06:30now has, at this point, a few weeks, which could get extended, to start cracking open the app
01:06:37store by not requiring people to use Google Play billing and by not enforcing anti-steering rules
01:06:43that stop app developers from directing users outside the store. It still does
01:06:49not have to implement the really extreme stuff yet, like allowing third-party stores in the
01:06:55Google Play store, but it's a pretty significant loss, especially compared to how Epic's case went
01:07:02with Apple, where Epic mostly lost. Yeah. I mean, this is like potentially huge, and it sounds like
01:07:06it's still pending. What are we waiting on? Right? Whether the Supreme Court decides to hear it or not.
01:07:12For the really short term, we're waiting to see if they appeal and then are granted a stay on
01:07:18having to implement this stuff while, yeah, we find out if it goes up to the Supreme Court.
01:07:24Yeah. But this is like monumental, right? You said the bigger stuff doesn't come into effect,
01:07:30but that's just because it has a longer timeline. Like, they gave Google more time to do it,
01:07:35whereas Google is basically out of time on the immediate stuff. The immediate stuff doesn't feel
01:07:39like as huge of a deal to me. Like, these are important, but the next set of impacts
01:07:48fully cracks open the Google Play store. And I think it's so interesting, because Android
01:07:53is an open platform. You can do whatever you want. And it was very
01:07:57surprising to me, at least, that Apple, which has a fully locked down platform,
01:08:03mostly won, and Google, which has an open, if, you know, tightly controlled platform, mostly lost.
01:08:14But on Android, you can install whatever you want. You just have to jump through a
01:08:18bunch of hoops. These new systems will allow you to just download another app store through the
01:08:24Play Store, and that new app store can be fed by the Play Store, if I'm remembering this all correctly,
01:08:29which just sort of, like, monumentally shakes things up, right? Like, Android is an open system,
01:08:35but the reason that Google is able to maintain it is that its apps are so powerful and so important,
01:08:43and the Play Store is really at the center of that in a lot of ways.
01:08:46It also impacts the way that they can make deals for pre-installations
01:08:49on, say, Android phones and with carriers. So that's just another huge part of their ecosystem.
01:08:56This is wild. I think Android is going to potentially change dramatically if this goes into
01:09:03effect, or at least the way that Google handles it will. I generally think,
01:09:10again, I am surprised that Google is losing more than Apple.
01:09:15Yeah, me too.
01:09:16We have a really good piece about this. It is partly,
01:09:19I think, specifically the thing you mentioned that is why they lost, which is that Apple could make a
01:09:23really credible case that we have created this incredibly locked down ecosystem,
01:09:27it's central to our business model, we have all of these security concerns,
01:09:31and Google, by being kind of halfway there, made it seem more like it was putting up hoops.
01:09:38But I believe it's Sean who has a really very good comparison of why Apple won and Google lost.
01:09:44I would love to see both of these ecosystems open up in a big way. And again,
01:09:49it's surprising to me. That is really interesting. It's really interesting that it's by being a little
01:09:54more open. But it really has always been sort of, it's been a little fake. Like,
01:09:59I've gotten a chance to look at the terms of the deals that Google has with mobile carriers,
01:10:04and you start to realize that Android is only sort of theirs, right? Like, they
01:10:13make carriers, they make phone operators, basically agree to these deals. And if they
01:10:18want the Play Store, which of course they want the Play Store, that's where all the apps are:
01:10:22you've got to put Gmail in a certain spot on your phone. You've got to put Google Docs in a certain
01:10:26spot on your phone. It specifies where the folders can go. It specifies how many levels deep the folders
01:10:31can be. It specifies on which page of the home screen it is. And I guess that clearly
01:10:37hurt them in court, when you look at these terms and you go, oh, this is not
01:10:40as open as it is supposed to be. And it will be very interesting next year,
01:10:47if these things actually go into effect, and we'll get to see what happens if Android really
01:10:52opens up to a much more dramatic degree than it currently is in practice. All right, Alex,
01:10:57I'm going to try to end us without the thunder on the last one, since we have failed at every turn
01:11:03on that. Instagram rolled out an update that naturally has caused a ton of uproar.
01:11:11They added a map to Instagram. It's like the Snap Map, where you can see people that you follow
01:11:19on a map, and naturally people freaked out, and Instagram has had to do a bunch of messaging
01:11:28around it. I don't know if either of you guys have gone through the process of opening it, but
01:11:32this map is pretty confusing as to who you're sharing with and in what context. And apparently,
01:11:38if you would tag a location on a story that you shared, that would show up as if you were on the
01:11:45map at that specific location, so there was a bunch of, like, just bad design stuff.
01:11:50Meta has been stressing, look, it's all opt-in, we had, like, double opt-in, but apparently they would
01:11:55also show you on the map even before you had opted in, which made people naturally think that they had
01:12:01already opted in to sharing their location with their entire Instagram friend list. And I would
01:12:08just like to point out that there's just an inherent flaw in this, with, like, your Instagram social graph,
01:12:14who you follow. Yes, are there people there, you know, very close friends, family members,
01:12:21significant others, that you would be comfortable sharing your location with? Probably, for a lot of
01:12:25people. Is anyone wanting to share their location with even a significant percentage of their Instagram
01:12:31following list? I'm going to guess no. And the fact that it's on you to curate that and to go,
01:12:37oh, they're on a certain list, they're on close friends or whatever, just puts a lot of work on the
01:12:42user. And, like, I think what Snap has done is that people just use it for a different reason.
01:12:47Like, it's not such a huge broadcast platform where you're following a ton of people
01:12:53in this way. And so Instagram is trying to shoehorn this interface into a social network that's just
01:12:59not really built for this level of intimacy. And I think that's what they're experiencing right now.
01:13:04But also, like, they should have done a better job messaging this, you know. The onboarding
01:13:09seemed confusing, and it's become, like, a meme that, you know, Instagram is showing you on the map now.
01:13:14I have to say, they have had eight years, I believe, to rip off Snap Maps. Yeah. It is rough that they
01:13:23could not do it right. Like, that is pretty bad, that they're basically just
01:13:30lifting a feature and it's still bewildering and a bad user experience. I believe they've been testing
01:13:35this for months, too. Yeah. I do think your point about the following list on both platforms
01:13:42inherently being different is really interesting, because I'm, like, a little
01:13:48bit above Snapchat age, and so this is not a platform that I have used with any regularity.
01:13:54And so the Snap Maps thing has never really been appealing to me or made sense to me, but you're
01:13:58right. If you're using Snapchat, you have a much tighter friend group, right? You don't have
01:14:04to worry about that in the same way as Instagram, which for a while now
01:14:10has pushed you to have this much broader relationship, right? It is the influencer
01:14:14platform. And that is just not true in the same way on Snap. Yeah. And look, I'm sure people will
01:14:20use it. Instagram is huge. It has billions of users. I'm sure there are people who, you know,
01:14:25keep their friend list super tight and curated,
01:14:30and will use the map. But yeah, they definitely could have rolled this one out better.
01:14:35Addie, big on broadcasting your location to everybody you know? Any service that has
01:14:39a location tracking feature, whenever anyone is building it, at a website, at an app developer,
01:14:44there should be, like, a button that they have to break glass with their bare hands to deploy it.
01:14:52You should not share people's locations unless you really, really have to, and it should be as
01:14:58painful for you as possible to roll it out. Again, there is, like, a double opt-in
01:15:04consent flow for this, they are saying, but it's the way they presented it. And, like, the fact
01:15:09that you showed up on the map before you had consented, even though other people weren't visible,
01:15:12which then of course is going to make someone think, oh, I am already sharing, even though
01:15:17I haven't opted in. It's more about the UI and the way they presented it, but they were not literally opting
01:15:21everyone in to sharing their location on the map. I'm not saying that the people should have to,
01:15:25I'm saying that the developers should have to break the glass with their bare hands. You just
01:15:28shouldn't track people's locations on social networks. There's also, like, something going on with
01:15:33Meta's platforms, where there's this thing on Meta's AI app where you have to hit, like, a gigantic share
01:15:40button to share your stuff to the feed publicly. And people still hit it on really personal stuff
01:15:47that they're talking about with AI. So there's just some extent to which, like, I don't know if it's
01:15:51that platform or that platform's user base, but you've got to be careful, and people clearly
01:15:57are not, even if the thing is set up correctly.
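The design principle being argued about here can be stated pretty compactly. This is a purely hypothetical sketch, with invented names and none of Instagram's actual code: a user should not be rendered on the map, even to themselves, until both explicit confirmations have been recorded.

```python
# Hypothetical sketch of a "double opt-in" gate for location sharing; the class and
# function names are invented for illustration and are not Instagram's code.
from dataclasses import dataclass

@dataclass
class LocationConsent:
    enabled_map: bool = False         # first opt-in: user turns the map feature on
    confirmed_audience: bool = False  # second opt-in: user explicitly picks who can see them

def visible_on_map(consent: LocationConsent) -> bool:
    # Only render the user on the map after BOTH confirmations, so a half-finished
    # onboarding flow never makes it look like sharing is already live.
    return consent.enabled_map and consent.confirmed_audience

assert visible_on_map(LocationConsent()) is False
assert visible_on_map(LocationConsent(enabled_map=True)) is False
assert visible_on_map(LocationConsent(enabled_map=True, confirmed_audience=True)) is True
```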
01:15:59I do wish this had shown up earlier so that they could bring it up in the antitrust trial over
01:16:04whether Snapchat and Instagram were the same thing.
01:16:06If only. Well, there were also some TikTok features, but alas, we have been struck down.
01:16:13Alright, that was a good one. That's it for the Vergecast. Keep your eye on the feeds over the weekend,
01:16:17because V Song is hosting a bonus episode dropping this Sunday, and Jen Tuohy is picking up guest hosting
01:16:24duties for the next couple of Tuesday episodes after that. Also check out Decoder, where Alex
01:16:29Heath is going to have some good stuff coming up next week. I think you guys will really be
01:16:34interested. If you like what we do here, the best way to support us is by getting a subscription to
01:16:39theverge.com. Please check it out. We just added a bunch of new perks. We launched two new newsletters
01:16:46exclusively for subscribers, written by our staff. I'm really excited to launch these, and I hope you guys
01:16:52enjoy them. We love to hear your questions and feedback. Let us know what you want us to talk
01:16:56about this summer. Email us, vergecast@theverge.com. I have access to that thing now and I can see what
01:17:01you're saying. Or give us a call, 866-VERGE-11. The Vergecast is a production of The Verge and Vox
01:17:07Media Podcast Network. Our show is produced by Eric Gomez, Brandon Kiefer, Travis Larchuk,
01:17:11and Andrew Marino. We'll see you next week.