WEBVTT

1
00:00:00.100 --> 00:00:15.980
[on-hold music] Welcome to the Rebooting Show. I'm Brian Morrissey. I'm spending January on some preview episodes of the year ahead.

2
00:00:16.180 --> 00:00:18.880
This is poised to be a pivotal year in the media business.

3
00:00:18.980 --> 00:00:28.760
I mean, already the year has begun, unfortunately, with cuts at The Washington Post, at Vox Media, at HuffPost, and this tracks with a lot of the conversations I'm having.

4
00:00:29.360 --> 00:00:38.380
Twenty twenty-four wasn't a great year for the industry, and twenty twenty-five is set to be another year of retrenchment, while at the same time retooling for a vastly changed environment.

5
00:00:38.760 --> 00:00:41.840
And I don't think there's any use in sugarcoating that exactly.

6
00:00:42.180 --> 00:00:52.060
But it's hard not to contrast the state of affairs with that in the tech industry, and particularly the biggest tech platforms, now often called the Mag Seven or Magnificent Seven.

7
00:00:52.560 --> 00:01:00.670
These companies saw their market caps grow sixty-three percent in twenty twenty-four as they rode the wave of excitement about AI.

8
00:01:01.300 --> 00:01:06.280
Meanwhile, publishers fret about AI further compressing their already compressed businesses.

9
00:01:06.420 --> 00:01:17.080
The Mag Seven accounted for fully seventy-five percent of the S&P 500's growth last year, just to give you a sense of just how powerful the tech industry has become in this country.

10
00:01:17.640 --> 00:01:27.780
And Big Tech has become an entrenched power center, no matter how much cosplaying its many vocal cheerleaders on X like to do about how unfairly treated they are by the, quote-unquote, "elites."

11
00:01:28.180 --> 00:01:38.620
And I often say media is downstream of Big Tech, which controls the distribution and also eats up most of the monetization as the duopoly has expanded to an oligopoly.

12
00:01:38.680 --> 00:01:54.240
So to get a read on the year ahead in Big Tech, I turned to Alex Kantrowitz, who writes the Big Technology newsletter and hosts a podcast of the same name, to discuss what we should expect and get his analysis of the different moves the big technology companies will be making.

13
00:01:54.280 --> 00:02:15.280
We discuss in this episode the slightly unseemly kowtowing to the incoming Trump administration we've seen from Meta and others, OpenAI's wonky economics, Alex's surprising bet on AI companions being a breakout AI product, why X has proven its doomsayers wrong, and the bright spot of individual creators amid a lot of this media industry doom and gloom.

14
00:02:15.780 --> 00:02:22.080
I hope you enjoy this conversation with Alex. I did. Be sure to check out the Big Technology newsletter and podcast. I'm gonna include links to it in the show notes.

15
00:02:22.480 --> 00:02:28.100
But first, thank you to Exco for sponsoring these year in preview episodes that will run throughout the month.

16
00:02:28.600 --> 00:02:38.460
Exco is the machine learning video platform trusted by leading media groups like Advanced Local, the Arena Group, Hearst Newspapers, Nasdaq, News Corp, and more.

17
00:02:38.980 --> 00:02:46.960
Last week, Exco announced the expansion of its award-winning ad server to upgrade programmatic auctions in CTV and digital out-of-home environments.

18
00:02:47.460 --> 00:02:59.280
The solution empowers media owners to drive higher revenue through smarter automated ad auctions using a machine learning-based yield engine that dynamically optimizes auctions in real time.

19
00:02:59.800 --> 00:03:08.040
Outdated programmatic pipes, fragmented technologies, and missed revenue opportunities have plagued CTV and digital out-of-home media owners for far too long.

20
00:03:08.480 --> 00:03:22.859
Try Exco's expanded ad server today to generate more revenue than you ever have before. Get in touch with Exco's media experts by visiting ex.co. That is E-X dot C-O. Thanks so much, Exco.

21
00:03:23.280 --> 00:03:37.760
And now onto my conversation with Alex. [on-hold music] All right, Alex, welcome back to the podcast. Thanks for, uh, joining me for this little look ahead at, uh, big technology.

22
00:03:37.960 --> 00:03:46.700
I thought who better to, like, look at the year ahead than Mr. Big Technology himself. [chuckles] Thank you, Brian. Great to be here. Okay, so let's get right into it.

23
00:03:46.820 --> 00:04:00.320
I mean, we're coming off, like last week, Meta, you know, made waves with Mark Zuckerberg coming out and saying the content moderation he built is coming down, and it was all the media's fault, or various other liberals'.

24
00:04:00.340 --> 00:04:09.460
And it's basically genuflecting, I think, to Trump. I mean, they just killed their DEI infrastructure too today, and they've just been making all these kinds of moves.

25
00:04:09.870 --> 00:04:18.459
But I wanna get into that part right now. So this seems part of Big Tech's, like, Trump accommodation. Is that fair to say?

26
00:04:18.480 --> 00:04:25.880
And what are you seeing across, um, all of these, and how will this play out in the year ahead? Facebook and Mark Zuckerberg have always played to the political winds.

27
00:04:26.100 --> 00:04:36.520
Zuckerberg himself has talked about how basically he doesn't have any, like, real values or morals in terms of, like, what should be on Facebook. He just wants to give people what they want.

28
00:04:36.560 --> 00:04:50.040
And I think he was pretty clear in his statement that he saw what people wanted in the election in twenty twenty-four, and that's Trump and the associated policies and the associated dialogue, I suppose, and said, "Okay, well, we're going to go with that."

29
00:04:50.080 --> 00:04:53.890
So that sort of fits his content moderation philosophy.

30
00:04:53.900 --> 00:05:02.820
And then, of course, there's a lot to be gained when it comes to trying to get in the good graces of this administration, which Zuckerberg has seen, which Tim Cook has definitely seen.

31
00:05:02.880 --> 00:05:10.560
He's gonna donate to the inauguration, which Jeff Bezos has seen. He's now, you know, he's gonna be, uh, best buds with Trump, and which Elon Musk has seen.

32
00:05:10.600 --> 00:05:24.060
And he-- they all stand to gain a lot by having the US government basically say, "We're gonna be on your side on the issues that you care about," especially after the last bunch of years where the US government has been strongly anti-big tech.

33
00:05:24.080 --> 00:05:37.780
And of course, the tech lash started under Trump, where we started to talk about, like, the power of these companies, and we just started to see the DOJ and the FTC bring cases against them, but the tides are shifting.

34
00:05:37.900 --> 00:06:01.034
I think Big Tech has taken the government's best shot and is still standing, and sort of like the government sees that these companies aren't going away, and these companies now see an opportunity to change the narrative and change their relationship with the government, and they're doing what they can, you know, pragmatically to get in Trump's good graces. Yeah, I think sometimes it's almost forgotten that this is how companies sort of always acted until- Yeah...

35
00:06:01.034 --> 00:06:14.064
all of a sudden they sort of got into social activism. [chuckles] Like, uh, that was a departure from the norm, and we're sort of back in the norm. I mean, companies are not about...

36
00:06:14.084 --> 00:06:21.584
They're about delivering shareholder value. That's what they do. They're not about furthering causes, exactly.

37
00:06:21.704 --> 00:06:34.264
And I think we got into some kind of ahistorical period, probably between twenty seventeen and, like, into, I guess, up to the election. But I think it was sort of petering out anyway. Yeah.

38
00:06:34.324 --> 00:06:45.384
One thing: Matt Stoller had this great tweet where he's, like, actively watching Zuckerberg on Joe Rogan, which, of course, Zuckerberg went on to sort of herald the end of DEI at Facebook.

39
00:06:45.464 --> 00:06:56.104
And Stoller says, "Mark Zuckerberg doesn't care about any of this stuff. He wants, one, an end to the FTC antitrust suit against the firm. Two, removing the consent decree that bans the targeting of children.

40
00:06:56.144 --> 00:07:06.584
And three, the government to legalize mass copyright violations for AI [chuckles] training models." And it's like, yeah, that's it. It's pretty simple. This is a business, and this is Zuckerberg doing business things.

41
00:07:06.924 --> 00:07:13.964
Yeah. The RSF changes- It's almost kind of silly to be, like, debating the sort of merits of it [chuckles] 'cause it's- Yeah... that's what it comes down to.

42
00:07:14.044 --> 00:07:26.664
But let's, uh, let's actually bring that into, you know, how, how much Big Tech is now embedded with the government because, you know, we're seeing, and maybe it's not necessarily big tech, but tech overall, let's just say.

43
00:07:26.744 --> 00:07:41.244
Because, just like in the finance industry, when Goldman Sachs was Government Sachs and there was a regular sort of pipeline between the Treasury Department and Goldman Sachs, people from the technology industry are now heading to Washington.

44
00:07:41.304 --> 00:07:44.804
They're part of this, this very strange MAGA coalition.

45
00:07:44.964 --> 00:08:03.764
I mean, I think it's very interesting because we saw the debates over the use of H-1B visas erupt, where there are clear fissures between, you know, the investors, the people from the tech industry who invested in the Trump campaign, and the grassroots MAGA America First people.

46
00:08:04.044 --> 00:08:19.204
So I mean, how do you-- I mean, it seems inevitable that the technology industry overall is going to be more enmeshed with the government, because the technology industry is one of the nation's biggest competitive advantages.

47
00:08:19.784 --> 00:08:28.144
Absolutely. And so I think it really begins, I mean, there's so much opportunity there for the tech industry to sell into government, to have government bless some of the policies that they want.

48
00:08:28.154 --> 00:08:35.304
I mean, we just talked about some of the policies that Zuckerberg is interested in, and this sort of combining of private enterprise and government, right?

49
00:08:35.404 --> 00:08:45.504
Sort of like [chuckles] kind of a central part of the American system only seems like it's going to get, you know, much more enmeshed as the tech industry and the Trump administration get closer.

50
00:08:46.004 --> 00:08:55.224
Of course, there was a period within Silicon Valley where all the employees protested military contracts, which is just one part of cloud computing's relationship with the US government.

51
00:08:55.774 --> 00:09:04.864
The companies have basically said, like, "Stop protesting or we're gonna fire you." And I know that Google has fired some people who were against some of its cloud contracts with certain governments.

52
00:09:04.894 --> 00:09:16.444
And we also see Microsoft has taken, like, a pretty strong stance that they were like, "We're just gonna use our technology and [chuckles] basically give it to the government for the purposes they want, and that will make a strong country."

53
00:09:16.464 --> 00:09:33.194
And so I was speaking with the CEO of AWS, Matt Garman, late last year, and he said, "Look, like we only have about twenty percent of all computing that's moved to the cloud, and eighty percent that's still being done, like, in servers within companies and governments and agencies."

54
00:09:33.864 --> 00:09:42.964
And his goal is to flip that: to go from twenty percent cloud and eighty percent on-prem to eighty percent cloud and twenty percent the rest. And like, how do you do that?

55
00:09:43.204 --> 00:09:49.164
You move really reluctant organizations who don't wanna change. You move them to change.

56
00:09:49.664 --> 00:09:57.804
And so you can do that by selling into the government and getting them to move a lot of what they do to the cloud, and that sort of helps you get to that number.

57
00:09:57.924 --> 00:10:06.464
So there's definite advantages, uh, to be had for Microsoft, for Google, and for Amazon on that front. And then there's just the obvious stuff.

58
00:10:06.504 --> 00:10:17.684
There's Elon Musk, who wants to get SpaceX to take more of a load for, for the government and further advance the space program. There's Jeff Bezos, who wants to do the same thing with Blue Origin.

59
00:10:18.064 --> 00:10:19.144
There's even, there's Meta.

60
00:10:19.244 --> 00:10:28.684
Again, part of Zuckerberg's announcement today wasn't only, like, you know, "We wanna now pursue similar policies to Trump, so please, like, you know, get us these benefits where you can."

61
00:10:28.724 --> 00:10:37.464
But they also said, "Listen, like we have other governments that are pushing us to take down content. We don't want that. We don't wanna do that, and we need an ally in the White House, and we're gonna look to you."

62
00:10:37.524 --> 00:10:46.004
So it's all, all across the board that these companies and the government are just gonna get closer and closer. Yeah. And, and also when you think about AI, right?

63
00:10:46.064 --> 00:10:57.364
I mean, I remember, like, I read a little bit about this last week, and I got an email from someone in Europe, and you always get reminded when you get an email from someone in Europe about, like, data privacy stuff.

64
00:10:57.924 --> 00:11:04.604
And then think about, like, the AI. Like, I don't think there's gonna be... I don't know how the New York Times lawsuit's gonna go.

65
00:11:04.684 --> 00:11:15.904
But like, you know, we're just very business-oriented, and I don't think we're gonna get hung up on a lot of the data privacy stuff.

66
00:11:15.964 --> 00:11:22.194
But I, when you think about, I mean, Zuckerberg has mentioned not being able to deploy some of their models in Europe.

67
00:11:22.704 --> 00:11:32.804
And, you know, with all of these advances in AI, it's all based on a, a lot of presumptions of things that you can do with data and people's data.

68
00:11:32.844 --> 00:12:01.106
I mean, this example from someone was saying, "Well, if they're, like, combining a lot of data about someone, that is violating these data privacy restrictions." I'm like, uh, it seems like it's gonna be a mess in Europe when it comes to AI and deploying a lot of this because, one- Let's be real, I mean, this tech was not built there, and they don't really love the idea of being so far behind in it, in tech overall.

69
00:12:01.116 --> 00:12:11.116
And two, you know, Europe, they regulate a lot, and that's just how they do things. And so that gets ironed out on the governmental level, you know? Yeah.

70
00:12:11.136 --> 00:12:19.256
Unless you have, like, I don't know, maybe strict lobbying from the US government, and even that probably won't do it, you're just not gonna get this AI stuff in Europe.

71
00:12:19.436 --> 00:12:27.436
Like, one of the biggest tells is Apple Intelligence, which does nothing, is not being released in Europe because of data privacy issues.

72
00:12:27.536 --> 00:12:41.856
And I think, at least in the near term, the AI that's gonna be most useful to us, outside of those of us that use, like, the ChatGPTs and Claudes, is gonna be from companies like Google that will take our Gmail, our Docs, our Calendar, and that's...

73
00:12:41.936 --> 00:12:47.676
They, they've been doing this for a while, right? You get a flight confirmation in Gmail, it goes on your Google Calendar before you even put it there.

74
00:12:48.216 --> 00:13:04.116
So they're gonna start to really max this out with Gen AI, and you're just not gonna be able to use that in Europe because there's gonna be all these concerns that the European Commission will have, and it's not worth it for these companies to launch there if they're gonna get hit with these fines.

75
00:13:04.316 --> 00:13:23.896
And it's amazing because every couple days it seems like there's another story about a new one, two, three, five billion dollar fine that Europe levies on these tech companies, which like at one point, like, it's like, all right, well, they're not gonna, you know, pull out of your countries completely, but on the other hand, they're like, "Well, why are we gonna launch something new if we're just gonna be fined?"

76
00:13:24.236 --> 00:13:33.436
So... And you're not a big enough market to take that risk. Yeah. So that's what it's gonna look like in Europe. And it's like, yeah, I think it's part of the overall, you know... I guess it's the splinternet.

77
00:13:33.476 --> 00:13:41.176
I know it's beyond internet now, but, like, the "World Wide" of WWW is kind of gone because there's different...

78
00:13:41.226 --> 00:13:52.655
There's gonna be different, like, versions in a lot of different countries, a lot of jurisdictions, because of the different laws, and it's not gonna be uniform at all. That's right.

79
00:13:52.696 --> 00:14:02.676
Yeah, somewhere I read earlier this week about how that was dead, so I'm probably stealing it from there. Let's talk about who's positioned well in AI in 2025 and who is not.

80
00:14:03.316 --> 00:14:08.316
I think there's a case to be made, it seems, that Google's actually really well-positioned.

81
00:14:08.356 --> 00:14:17.616
I think [chuckles] the sort of sentiment for Google and AI in 2024 was down and down, and then it sort of rose up.

82
00:14:17.696 --> 00:14:25.076
I've actually been impressed by some of their... Now, leave aside the AI overviews in search. I'm not impressed by those that much.

83
00:14:25.116 --> 00:14:31.606
But Gemini, like Advanced, I've been using that the last, like, few days, and it's really good.

84
00:14:31.706 --> 00:14:41.676
And, like, some of the things that NotebookLM does, you know, some of it is parlor tricks, but it's pretty good. And then, you know, just within their products, eh, it's okay.

85
00:14:41.856 --> 00:14:53.616
But I don't know, is Google well-positioned now compared to, like, say, an OpenAI? Yes and no. I mean, the reason why everybody ragged on Google all through last year, well, there are really two reasons.

86
00:14:53.716 --> 00:15:04.296
One is they just publicly demonstrated incompetence in building AI products, like the eat-rocks example or, you know, the founding fathers who were every race but white, stuff like that.

87
00:15:04.436 --> 00:15:15.336
And that was embarrassing for them. I think like the real issue for Google is the search situation. I mean, Perplexity hardly ranks right now in the App Store.

88
00:15:15.876 --> 00:15:27.256
That being said, like everyone is aware that search is gonna change, and search will be offloaded to AI conversational search engines, or Google will have to just change completely, which changes their business model.

89
00:15:27.336 --> 00:15:36.726
Google's really interesting when you speak with them about search. They always give like a, an answer of like, you know, you, you ask like a straightforward question like, "Do you... Did people click more ads?"

90
00:15:37.126 --> 00:15:45.316
And they always say, "People were more engaged in the search results and asked longer questions." And it's like, "Yeah, but the ad thing is kind of important to your business, don't you think?"

91
00:15:45.346 --> 00:15:50.656
And they're like, "People did more clicks within the results." [chuckles] Yeah. It's like, "Well, what about the web?" You feel like you're in an AI sort of testing. Exactly. Yeah.

92
00:15:50.666 --> 00:15:58.816
It goes back to my early reporting days talking to Google product managers. Yeah. I did feel like I had an early experience with talking with AI. Exactly.

93
00:15:58.856 --> 00:16:11.376
So I think that we shouldn't discount the fact that there is still a somewhat existential threat to Google when it comes to search and AI search. Like, AI will change search. Can Google ride that wave?

94
00:16:11.856 --> 00:16:12.585
We don't know yet.

95
00:16:12.736 --> 00:16:26.816
That's why the stock tanked when generative AI came out, 'cause everybody was aware of that, and then it just became clear in the aftermath, let's say in the two years following, that we weren't just gonna, you know, take all of our search and put it on ChatGPT right away.

96
00:16:26.896 --> 00:16:36.736
Like this was gonna take a while. Maybe it's gonna take years. It's really hard to sort of dislodge a long-standing consumer behavior. Yeah. And so that's why Google has bounced back.

97
00:16:36.756 --> 00:16:45.616
Their revenue looks amazing in the middle of this AI moment, but again, it's like all about the long game on search. Yeah. So that's the biggest drawback for Google. Yeah, I wanna get into the search.

98
00:16:45.676 --> 00:17:02.536
It's a good segue because I've been amazed that Google's, you know, share price has been doing what it has done, because their core product, the way they make the majority of their money, is clearly not good right now compared to where it was.

99
00:17:02.596 --> 00:17:12.136
Like, I mean, the search results are, at least to me, like I think they're a mess. They're, they're clearly trying to do so many different things. You've got AI overviews.

100
00:17:12.436 --> 00:17:26.176
They're shoving Reddit down your throat and forums everywhere. They're trying to clean out a lot of the SEO arb, you know, affiliate stuff. There's... The ads are kinda all over the place at this point.

101
00:17:26.256 --> 00:17:28.836
And, and then like you said, if you go and you start...

102
00:17:28.976 --> 00:17:41.936
You're using, like, a Perplexity, not for every search, but, at least for me, it's a better product for most of the searches that I do.

103
00:17:42.146 --> 00:17:54.876
I choose... I go to Perplexity now more often than Google for, I would say, at least half of my searches. Yeah, so one thing that I've found with Perplexity is like I'll try Google 'cause that's just my default behavior.

104
00:17:54.936 --> 00:18:00.136
Like it's the default in my Safari on the iPhone, so I'm like Googling. No. And on Chrome, right?

105
00:18:00.296 --> 00:18:08.928
They pay good money for that, and it's worked for them. But I've turned to Perplexity for the hardest queries, which is, like, that is pretty bad for Google.

106
00:18:08.968 --> 00:18:15.608
It's like, all right, I, you know, this is, this is something that's gonna like take some real digging, then I go to Perplexity, and I get the answer.

107
00:18:15.928 --> 00:18:24.168
[chuckles] Like, if I try to cut through bureaucracy, like, you know, go through all these bureaucratic websites and try to find a simple answer, it's like, that's a job for Perplexity.

108
00:18:24.178 --> 00:18:35.528
And I think that as Perplexity gets those tougher queries right, then it's almost an easier battle to get the low-hanging fruit of traditional search right. Yeah. So I mean, media's always downstream of this, right?

109
00:18:35.588 --> 00:18:37.208
So how does...

110
00:18:37.388 --> 00:18:53.688
And I always think, like, what's going on in the search results pages right now is just Google sees this threat to it and is trying to reposition itself, and it's really difficult to do 'cause search is just so critical to so many different things.

111
00:18:54.288 --> 00:19:03.568
And, you know, publishers get trampled, and it's just like it's not personal, it's just [laughs] they're, they're just collateral damage, unfortunately, in a lot of this.

112
00:19:03.828 --> 00:19:14.508
What are some of the things you think you can see, like, Google having to do in the year ahead, and what impact, if any, could you see it having on, like, publishers? So, you know, the...

113
00:19:14.918 --> 00:19:22.008
It's interesting 'cause I've always looked back at the Google News debate, which I always thought was so silly, when I've considered where Google might go here.

114
00:19:22.148 --> 00:19:29.308
So the Google News debate was basically Google News was like this aggregator page. I don't even know if, I mean, it still exists in some format. No one goes to it, I guess.

115
00:19:29.948 --> 00:19:37.408
Maybe it's like on the default for Android- Google News?... and that's what gets traffic. Who uses Google News? Yeah. Okay. So let me apologize to Google News. I guess some people still use it.

116
00:19:37.758 --> 00:19:47.478
[laughs] But basically- I do, and now I feel bad. What are you- [laughs] Yeah. Anyway, sorry. We can, we can revisit this over drinks one day, and I, I'll apologize. But look, here's the thing about Google News.

117
00:19:47.568 --> 00:19:58.508
Publishers always hated the idea, or some publishers hated the idea, that Google took their link and they took a snippet, and it provided value to Google, but people only had to go to one link.

118
00:19:58.588 --> 00:20:10.188
So like there were like 10 publishers providing value. People clicked once, only one publisher got paid. And I think that publishers generally were a little bit too whiny on that front.

119
00:20:10.348 --> 00:20:16.648
Like, you're getting traffic from Google News, like take the traffic, right? That's always been my perspective. Yeah.

120
00:20:17.228 --> 00:20:24.628
But it gets interesting when it comes to the AI overviews or how news will be baked into generative search.

121
00:20:25.328 --> 00:20:40.408
In that case, I really think that Google and Perplexity and others are gonna have to make deals with publications to get in-the-moment information within their search engines in a way that they can sort of digest and spit out to people.

122
00:20:40.468 --> 00:20:47.388
Like, we saw Perplexity basically try to steal a Forbes article last year, and that didn't go well for them.

123
00:20:47.988 --> 00:21:11.828
And so I think what you're gonna see this year, and we saw a lot of it last year too, is, like, the Perplexities and the Googles of the world just signing deals with news publishers, uh, and maybe smaller publications like us, that basically are just like, "Okay, you know, we value your ability to weigh in with high-value information in the moment, and therefore we wanna pay you a little bit for it."

124
00:21:11.908 --> 00:21:18.608
Yeah. And I mean, they already took the evergreen. That's gone. Yeah, that's gone... all that stuff. Like, they already trained everything. There's no taking it back.

125
00:21:19.068 --> 00:21:23.748
And yeah, it's the fresh content that they're gonna need.

126
00:21:24.228 --> 00:21:32.278
I think, you know, the question is always gonna be what kind of licensing deals are you gonna get out of these things? I mean, if... What did Reddit get?

127
00:21:32.288 --> 00:21:41.608
Reddit only got $200 million or something? I think they got a good amount of money. But like if it's Reddit, like the amount- Right... of information on, on Reddit, like what are they gonna pay?

128
00:21:41.648 --> 00:21:54.508
And I think with some of these deals that have already been cut with, like, OpenAI, there's always gonna be a bid-ask spread, and that's gonna be the question.

129
00:21:54.548 --> 00:22:03.528
But I do think, I do agree with you, that Google hasn't been cutting these deals yet, but, you know, they're gonna have to at some point. And- Yeah... and it's kinda right.

130
00:22:03.568 --> 00:22:16.738
It's, uh, actually better than these schemes from governments, to have governments basically take money from Facebook or Google and then distribute it to a few publications [chuckles] at the end of the day.

131
00:22:16.768 --> 00:22:20.408
Yeah, that's weird. Yeah, and look, the revenue, like you're right, what are the deals gonna look like?

132
00:22:20.668 --> 00:22:27.957
I always think that like if you're counting on search or social revenue to be your entire business, you're probably doing it wrong. Right.

133
00:22:27.978 --> 00:22:35.448
Uh, try to build a strong core business outside of that, and then just use this Google or Perplexity, you know, generative AI money as gravy.

134
00:22:36.048 --> 00:22:43.248
I, I don't, I mean, having lived through the BuzzFeed experiment, that would be my perspective at least. [laughs] That's a good point. You're a veteran. All right. OpenAI.

135
00:22:43.548 --> 00:22:51.328
I mean, I've been listening to your podcast, and you've been writing about it too. Seems like you're a little bearish on OpenAI in 2025. Yeah.

136
00:22:51.348 --> 00:23:14.728
Well, look, they raised a lot of money, and they're losing a lot of money. And even on their most popular products, like this new unlimited ChatGPT that they released, Sam Altman just came out and said they're losing money, because it costs so much to serve the responses to people, and they underestimated how much people would use these things, so people are actually getting more than $200 of value out of the products.

137
00:23:14.738 --> 00:23:24.808
So like, for me, like I think it's just- That's so, I mean, that's such a Silicon Valley thing to do. It's like- Yeah... whoa, so we're losing a shit ton of money because people love our product so much.

138
00:23:25.068 --> 00:23:27.548
It's just like [laughs] God. It's very Silicon Valley. Yeah.

139
00:23:27.848 --> 00:23:37.548
We just said on our show that Anthropic will probably come out with, like, their own version of this, like a $1,000-per-month edition of Claude called Anthropic or Claude 1000.

140
00:23:37.748 --> 00:23:49.508
[laughs] And, and I think that will probably sell. Very Silicon Valley. In fact, this whole story is very, very Silicon Valley because it is like a story of you lose money, and you grow until you start making money.

141
00:23:50.188 --> 00:23:59.608
And so, like, the argument in favor of OpenAI is that, well, they just scaled from 100 to 300 million users on ChatGPT. But let me be clear. They're bringing in, like- Yeah... a lot of revenue.

142
00:23:59.648 --> 00:24:03.668
They're just- Yeah... losing, they're spending a ton of money- Losing per query... to bring it in. Yeah. And that's the problem.

143
00:24:03.728 --> 00:24:15.544
I mean, the problem with GPT-5 and the problem with these reasoning models is that they're just so expensive to run because they're so big. And if they can't find a way to get those costs down, there's one of two things that happen.

144
00:24:15.664 --> 00:24:27.314
One is they raise costs tremendously on the people that use them, and this is an industry that hasn't yet really shown a deep ROI on its applications. Or, B, they shut down, right?

145
00:24:27.404 --> 00:24:36.884
It's like one of two things. Yeah, it's binary. You cannot... I mean, they just raised-- OpenAI just raised the biggest VC round in history, more than six billion dollars last year. They lost five billion last year.

146
00:24:37.344 --> 00:24:45.724
So how much is this [chuckles] gonna, how much is this gonna, gonna lead them? Like, how much runway do they have and how are they gonna be able to raise again? Anthropic raised four billion last year.

147
00:24:45.784 --> 00:24:51.374
They're in the process of raising another two billion, which is gonna last them what? A quarter? I'm being facetious here. Yeah.

148
00:24:51.404 --> 00:24:58.284
But I do think there's, like, very real questions to be asked about, like, how financially feasible these companies are in the long term.

149
00:24:58.344 --> 00:25:07.364
Or if the-- And we still don't even know whether scaling up the models is gonna lead to further exponential improvement. Even, like, right now, we're hearing a lot about how we've hit this data wall.

150
00:25:07.504 --> 00:25:18.064
Ilya Sutskever, like, the co-founder of OpenAI, former chief scientist there, basically said, "We've hit the data wall. We need new methods." So that to me would be the bearish case. I'm not bearish.

151
00:25:18.224 --> 00:25:25.044
I, I-- And to me, like, I think these are real business questions to ask about these companies. I'm not bearish about the technology.

152
00:25:25.144 --> 00:25:34.384
I think it's amazing technology, and it's only gotten better since it became popularized to the world a couple of years ago. I mean, I'm in these generative bots every single day, and I just think they're...

153
00:25:34.464 --> 00:25:43.564
it's amazing what they can do. I'm rooting for the industry to find a cost-effective way to be able to deliver this stuff for us and to keep improving it. But the math...

154
00:25:43.624 --> 00:25:48.784
I mean, maybe there are smarter people than me that know how this math works, but I, I don't yet. Yeah.

155
00:25:48.944 --> 00:26:04.564
I mean, I think what, what I wonder is when you're gonna have the products that come out of this, because a lot of this is like, yeah, there are different things when you use these tools and I don't-- It reminds me of the early, you know, internet with...

156
00:26:04.624 --> 00:26:10.344
And, and that was a bubble, right? And I think this is probably a bubble, but I don't think it has to necessarily be a bad bubble.

157
00:26:10.404 --> 00:26:19.344
I mean, bubbles existed with the railroads, they existed with the early internet, and, and they'll probably exist with this. That's just how, how these things go.

158
00:26:19.913 --> 00:26:24.184
And a lot of people will, will lose money, and, um, it doesn't- Yeah... it's not gonna affect me. But my- I don't care...

159
00:26:24.264 --> 00:26:35.044
my perspective on this is what if a lot of the applications just exist within the chatbots themselves? So what if we just kind of code our own applications by our prompts?

160
00:26:35.384 --> 00:26:44.054
I know that sounds like, like a, you know, tech guru thing on a- Yeah, I like this. Like, that sounds very shiny. It's like Friday afternoon. [laughs] But [laughs] but I, I think that...

161
00:26:44.124 --> 00:26:48.424
Okay, so what if I told you that there was...

162
00:26:48.444 --> 00:27:12.494
A couple of years ago, what if I told you there was a new weight loss app where you would have a conversation with an AI bot about what you're eating and give it some basic parameters of what you wanted to put in your body, and it would grade you on the food that you were eating and give you a calorie count, and you'd weigh in in the morning, and you'd be able to speak with it about like how, you know, how you're keeping with your goals.

163
00:27:13.004 --> 00:27:23.504
And whenever you wanted, you could always say, "Hey, how am I trending? What's my progress? What are some patterns that you're seeing?" I feel like that app would get VC funding if the chatbot could perform well enough.

164
00:27:23.544 --> 00:27:30.664
Well, that's something that I'm using in Claude right now. I don't need a separate, you know, sort of diet coach AI bot to download.

165
00:27:30.704 --> 00:27:36.504
I can just prompt that in Claude, and I've had that conversation running for months now, and I... and it's working.

166
00:27:36.604 --> 00:27:45.064
And when I hit, like, the conversation limit, I just copy it and then paste it into my next conversation and say, "This is your memory. Let's pick up." And it picks up.

167
00:27:45.604 --> 00:27:56.604
So I do think that, like, where are the applications is a, is a great question. I think a lot of the time we'll be able to build them ourselves within these bots, and that's why these bots have a lot of appeal.

168
00:27:56.614 --> 00:28:01.124
Yeah, but people are not gonna wanna build their own applications. Okay. I think...

169
00:28:01.164 --> 00:28:07.964
Remember, we just have three hundred million people using ChatGPT, and two hundred million of them started using it within the last couple months. Okay.

170
00:28:08.044 --> 00:28:17.294
So I, I think over time, there's a chance that they will, especially, like, let's say you have, like, a singular, like, a singular bot that just remembers you and you speak with.

171
00:28:17.344 --> 00:28:27.333
All these companies are working on making memory better. Yeah. So I think that's, that's one thing. And there are some interesting applications out there today. I just started...

172
00:28:27.424 --> 00:28:39.544
Okay, so this is a weird one, but I just started testing Replika because I'm about to speak with their CEO- Okay... for the podcast, and Replika is a crazy, crazy app. So for those who don't know, it's a...

173
00:28:39.904 --> 00:28:43.704
You can have an AI companion. I guess a lot of people fall in love- Oh, boy... with these companions.

174
00:28:44.404 --> 00:28:55.074
And you, like, design the personality in the beginning, and then you can show up and either chat with it or, like, actually speak with it, like FaceTime with it. And it's, it's insane.

175
00:28:55.164 --> 00:29:04.024
I think that's-- I think Replika, weirdly, I'm gonna say it, I think Replika is gonna be one of the biggest winners of- Really?... this AI moment for sure. Okay. Okay.

176
00:29:04.104 --> 00:29:12.954
We're gonna- I think Replika is the- I thought it was gonna be more agents. I wanted to go more... I li- I, I want someone to book [chuckles] my, my, my travel. I don't want... I don't, I don't- Maybe... need...

177
00:29:12.964 --> 00:29:22.544
I don't know if I need, like, a- I'm, I'm a big maybe on agents, and I don't think the agents- Really? Because I would assume, like- Yeah... the, my default assumption is that 2025 is the year of, like, a-agents.

178
00:29:23.044 --> 00:29:28.054
I'll believe it when I see it. Why, why aren't you- I'm not, I'm not fully, fully bought in. Why aren't you fully bought in yet?

179
00:29:28.524 --> 00:29:36.593
I just think that, like, a lot of things can go wrong with these agents, and I don't think people are gonna trust them. Okay. I mean- I could be wrong... I trust for a lot.

180
00:29:36.644 --> 00:29:47.684
But I think that, like, me giving an agent my credit card and saying, you know, "Go book me a flight," I don't know if that's gonna happen. Maybe it will. Yeah. I mean, there could be... I don't know. There could be.

181
00:29:48.254 --> 00:29:56.104
So do you see any, though, that like, uh, candidates for breakout AI products? If you don't see agents, like, what, what do you see?

182
00:29:56.364 --> 00:30:07.484
Or, or y- or, or your point is like i-it might just be the existing people, the existing players that, you know, just get, like, their early ones get more and more traction? Yes.

183
00:30:07.764 --> 00:30:13.034
So I think that the existing bots are definitely gonna get more traction. They're just gonna get... I mean, you think about...

184
00:30:13.084 --> 00:30:24.098
I've been using Claude pl- pretty religiously, and the amount of improvement that you see in that bot is insane. Yeah. It's crazy. I use Claude. I, I kind of prefer Cl- I prefer- Yeah... Claude to, to ChatGPT personally.

185
00:30:24.168 --> 00:30:33.508
I think it's better. I think it's better. And they've improved tremendously, and like you can, you can even right, right now, like go into Claude and prompt it to build a game for you, and it will just build a game.

186
00:30:33.568 --> 00:30:44.528
It will show up in the side panel. So I think this stuff is gonna grow. I, I wanna go back to the Replika example. [laughs] I, again, I know it's weird. I kind of think it's gonna- Trying to bring your friend in?

187
00:30:44.648 --> 00:30:52.928
I- What? It's fine if you- No, no- Trying to bring- No, no, no, no. Let me- It- Let me be clear. He or whatever. [laughs] No, it's a woman. I'm gonna be clear. Oh. This is-- I had to- See me a woman. Come on.

188
00:30:52.998 --> 00:31:02.708
I-- no, I had to test the actual use case here. I'm not, I'm not gonna shy away from it. And I think that this is gonna be, I think this is gonna be the OnlyFans of AI.

189
00:31:03.148 --> 00:31:11.578
And OnlyFans- Can I tell my wife that I got a, a female AI friend? I have to say, I feel bad about building it. I do. [laughs] But I do think that this is...

190
00:31:11.758 --> 00:31:19.917
And it's not gonna be for me, and I'm gonna delete it after this interview. But I do think that OnlyFans is a huge business, right? Isn't it like one of the, isn't it like the- Yeah...

191
00:31:19.968 --> 00:31:26.768
fastest-growing media business in- Yeah... in recent years? I think that, that this Replika thing is gonna be, gonna be the equivalent of that.

192
00:31:27.128 --> 00:31:36.008
And no one will say it because it's so weird to say it out loud on a podcast or write it in. [laughs] Good. You're, you- And-... you are brave. I'm in, I'm gonna be in some deep shit, I'm, I'm sure.

193
00:31:36.028 --> 00:31:46.628
But I, I, I can see how this could be- No, I mean, it does make sense... appealing to so many people. Yeah. Look, I mean, I think the, the current meme is around the loneliness e- epidemic and- Yes... all of that.

194
00:31:46.708 --> 00:31:54.828
And I don't think people are [chuckles] gonna solve it with less technology, unfortunately. I think- That's exactly, that is exactly the Replika pitch. Exactly the Replika pitch. You know.

195
00:31:54.888 --> 00:32:04.508
And when I signed up, they said today t- twelve million seven hundred fifty-six thousand men in their 30s have already experienced the benefits of having Replika in their life, which I guess means- Yeah...

196
00:32:04.518 --> 00:32:16.828
the number of users. So- [laughs] They have, they have, I think they have a lot of people that are trying it. And as this LLM technology gets better, they're only gonna get better. And it's a little bit scary and creepy.

197
00:32:17.058 --> 00:32:26.528
But you asked me a question, what do I think is gonna be the breakout? And I'm giving you an honest answer, even if it makes me look kinda bad. [laughs] Okay. I love that. So with it, where do you...

198
00:32:26.608 --> 00:32:37.788
I mean, 'cause like you focus on the tech side, but you, like, you live in the, in the media side, right? And obviously 2024 was a kinda horrible year, I think, for the institutional media.

199
00:32:37.798 --> 00:32:44.348
I don't know, whatever you wanna call it, mainstream media, corporate media. Everyone has a different like term for it, but you know what I'm talking about, right?

200
00:32:44.388 --> 00:32:54.508
I think everyone listening to this knows what I'm talking about. You know, there's obviously like a lot of growth in individuals. We both have our own things, and there's tons of...

201
00:32:54.608 --> 00:33:11.708
I'm just amazed by how deep it is like on, on YouTube with the different creators of all kinds, and it's just unbelievable. So it's not like media at all is dying. Uh, it's just changing quite a bit.

202
00:33:12.248 --> 00:33:16.368
What do you-- how do you see all this like, you know, playing out for...

203
00:33:16.388 --> 00:33:26.108
Well, we'll start with the sort of, you know, like who are the winners and losers of this in, in, in media as far as like, you know, creating content and then making money off it, hopefully? Yeah.

204
00:33:26.268 --> 00:33:36.088
Look, I mean, we both experienced it. I think that right now, and not to just talk our books, but like as an individual creator, there are so many different revenue sources that you can tap into.

205
00:33:36.128 --> 00:33:44.648
Like it used to be like, all right, set up a YouTube if you wanna be a YouTuber, and then make the AdSense money, and you're good, you're good. That worked for such a small amount of people.

206
00:33:45.208 --> 00:33:47.888
But now you can do things like you can have an assortment of properties.

207
00:33:47.928 --> 00:33:57.228
You could have a podcast, you could have a newsletter, uh, you could have a YouTube page, you could do, you know, brand posts, you could do events, you could appear at other people's events.

208
00:33:57.748 --> 00:34:15.668
And I think that's really becoming a viable product for those that are trying to either, A, crack into the media industry, or B, have like been at places in like the digital media world or middling publications and have an audience and just want to figure out a way to keep doing what they're doing.

209
00:34:15.688 --> 00:34:21.048
And this is like a pretty good way to, to make money and to sustain, right? To sustain what you're doing.

210
00:34:21.108 --> 00:34:30.468
So to me, I think that like I'm more optimistic now than I ever have been, and I've been doing this for four years, close to five now, actually. Wow. Congrats. It'll be five years in May.

211
00:34:30.988 --> 00:34:39.787
And I'm more optimistic now than I've been from the start. So I think this, this individual creator route is really in a, in a good place.

212
00:34:39.847 --> 00:34:48.668
It doesn't seem to me like any of the mid-size digital media companies have really figured it out. I mean, we're talking in a week where, like, I don't know, the strongest one of them, Vox, has engaged in some layoffs.

213
00:34:48.728 --> 00:34:55.068
I mean, everybody lays off all the time, so it's not like they're failing. Yeah, it's not even like news at this point. I mean, Vox- Yeah... had layoffs this week.

214
00:34:55.188 --> 00:35:00.808
Huff- HuffPost had- Yeah, the editor-in-chief left HuffPost. Um- She laid herself off, I guess. Yes.

215
00:35:00.888 --> 00:35:06.768
So I, I mean, there are some that are doing well- Washington Post cut 100 off this, this week from their commercial side.

216
00:35:06.968 --> 00:35:17.068
Yeah, The Washington Post seems to me like it's just, it's in a, I don't wanna say tailspin, but something like that, right? I just think that their, the business side is struggling there. I mean, I don't know.

217
00:35:17.468 --> 00:35:24.418
If you work at The Washington Post and wanna correct me, you know, [chuckles] you can email me. Well, they just cut- But-... 100 people from- Yeah... from the business side. The business side is struggling.

218
00:35:24.448 --> 00:35:33.068
Jeff Bezos is, like, kinda using a heavy hand in a way that he hasn't since, since he joined or since he bought the company. Mm. There's discontent among the journalists.

219
00:35:33.088 --> 00:35:41.138
They're losing a lot of their Washington talent to The New York Times. New York Times is doing well. The vibes are not great. Uh- Bad vibes... w- we, bad vibes- Yeah... at the, at- Yeah... Post.

220
00:35:41.148 --> 00:35:47.388
I mean, I hope, I hope that- The Washington- I like, I like a lot of the people at, at the Post. Yeah, Matt Murray seems to know what he's doing. Yeah. So...

221
00:35:48.048 --> 00:35:53.948
And they're gonna, and they're gonna keep him, and I think that's the right move. Mm-hmm. And I think Matt's a great, a great choice for that. I think he's got a really...

222
00:35:53.978 --> 00:36:00.098
I think he knows that he's got a really tough job- Yeah... you know, there. But why, you, you wrote a book on Amazon.

223
00:36:00.488 --> 00:36:09.788
So are you surprised that it, at least, I mean, by the results, like, I mean, Jeff Bezos just completely bungled this. What, that he bungled The Washington Post thing? The Washington Post.

224
00:36:09.848 --> 00:36:14.248
I mean, what, like what in the world? Like w- uh, did w- what happened there?

225
00:36:14.308 --> 00:36:23.548
I mean, like he came in, he bought this thing, and like you can say, fine, he's focused on Blue Origin, then don't, then don't get involved in this. What, what did you think this was just gonna be cocktail parties?

226
00:36:23.648 --> 00:36:38.568
Like, what- Yeah... in Kalorama. Like, w- why, why did this go so wrong? He's a brilliant innovator, obviously, you know, just like, you know, a, a legendary American businessperson. Why, why did this go so wrong?

227
00:36:39.088 --> 00:36:46.368
I think that they had some audience capture there. Yeah. And they, like, sold themselves as this, like, resistance publication, democracy, you know, dies in darkness.

228
00:36:46.918 --> 00:36:52.667
They reported really hard, like admirably against, you know, on the Trump administration, and there were a lot of good stories there.

229
00:36:53.108 --> 00:37:05.738
But I think they kinda sold a brand to an audience that was into it for a while, and then they-- that sort of, I don't know, that sort of perspective went out of favor or just lost steam once Biden won.

230
00:37:05.738 --> 00:37:12.427
But he came up with "Democracy dies in darkness." I mean, [chuckles] that's what they said, like. I think that... I mean, maybe he did. [laughs] I, I don't know.

231
00:37:12.528 --> 00:37:19.518
Obviously, like, media's a tough business, and it's one of those things where you can- Well, it's reassuring. Yeah. I will say this, it's reassuring- It's nice to know... to know like when- Yeah...

232
00:37:19.518 --> 00:37:25.168
when Jeff Bezos comes, comes in and, like, steps on a rake, like [chuckles] yeah. Oh, and has he ever.

233
00:37:25.778 --> 00:37:32.898
So and, and I think that, like, you know, Jeff Bezos has this, like, thing, he calls it one-way doors and two-way doors. Do you know about it, this decision framework? Yeah, I do. Yeah.

234
00:37:32.928 --> 00:37:39.228
Like the one-way door, like you can't go back; the two-way door, you can go back. I think that, like, the-- So this is my perspective.

235
00:37:39.288 --> 00:37:48.068
He probably says, probably thinks this about what has happened with the Washington Post. He's like, "We went one direction. It didn't-- It, it worked for a while, and then it didn't work."

236
00:37:48.578 --> 00:37:57.448
And it would be a one-way door for most companies, because if you go back from that and take a different editorial perspective, you're gonna lose 100,000 subscribers right off the bat.

237
00:37:57.968 --> 00:38:07.688
But it's a two-way door for a company owned by Jeff Bezos, because Jeff Bezos can take the hit, and ultimately he'd rather reverse the decision than continue with a strategy- Mm... he thought was bad.

238
00:38:07.768 --> 00:38:15.508
And so what do you- So do you think he, do you think he sells it? I mean, from his, like-- or is this, like... It's not about, I mean, it can't be about money.

239
00:38:15.588 --> 00:38:24.898
I mean, it's always about money to some degree, but like- Mm-hmm... I think, you know, there's, there's, there's-- it becomes about ego. It's, I mean, it's like, come on, I mean, why, why even continue all this stuff?

240
00:38:24.948 --> 00:38:36.208
It's about, it's about ego. Like, I would guess that, like, he, he does not want to just, like, unload this at such, like, a low point. So Bezos has a lot of business in front of the US government, right?

241
00:38:36.248 --> 00:38:48.228
He has Blue Origin. Amazon is still something he's involved with, and I think that his, like, embrace of Trump, [chuckles] has been part of that, right? Yeah. Again, another pragmatic move.

242
00:38:48.448 --> 00:38:57.588
He also sees that one of his main competitors, Elon Musk, is, like, hanging out at Mar-a-Lago, and that's probably making him uneasy. He's living there, guys. [laughs] So the first buddy.

243
00:38:57.668 --> 00:39:04.468
So this is the thing about- He's got a bungalow. Yes, exactly. Okay, so, so what's Bezos gonna do with the Washington Post?

244
00:39:05.168 --> 00:39:10.978
I think he's basically gonna say, if it's serving his other interests, fine, he's gonna keep it, and if he's...

245
00:39:11.008 --> 00:39:18.188
Like, in some ways it, it gives him some power in Washington, some soft power in Washington to be like, "Yeah, I'm the owner of the Washington Post. I matter in this way."

246
00:39:18.208 --> 00:39:25.588
I mean, he would've mattered as the, you know, founder of Amazon, one of the richest people in the world, you know, s- Blue Origin founder as well, but anyway, we're splitting hairs here.

247
00:39:25.628 --> 00:39:34.548
But I think that, that to him, I don't think the business matters really. He's just gonna be like, "How is this serving my interests, whether it is or it isn't?" And then we'll go from there.

248
00:39:34.608 --> 00:39:43.538
In fact, maybe killing the Kamala Harris endorsement served his interests, you know, more than anything- Well, I think it obviously-... in Washington... served his interests, for sure.

249
00:39:43.568 --> 00:39:54.488
Into the door, into the administration. I mean, that was an easy... I mean, I would assume that was, like, an incredibly easy decision to make. It's like for-- Well, like, I mean, you w- you make that, like, every day.

250
00:39:54.568 --> 00:40:03.768
I don't think he probably spent that much time on it because, I mean, it's, it's obviously he, he knew. They all knew which direction this was going. Right. And- I just hate...

251
00:40:03.968 --> 00:40:10.848
And yeah, I just hate that he played it off as like, you know, look at the trust in journalism. It's so low. Well, sa- same with- Yeah... Zuckerberg.

252
00:40:10.948 --> 00:40:20.728
Zuckerberg went on and on in that, like, video and everything, and it's like, okay, dude, you were the one who came up with this entire apparatus, my friend. Like- Yeah... what are you talking about?

253
00:40:20.768 --> 00:40:33.328
The, the media, the media didn't make you do this. And, like, let's be real. You wanna keep Section 230, as, uh, my other podcast reminds us, Alex Schleifer.

254
00:40:33.548 --> 00:40:46.228
And, you know, this is a very pragmatic business decision, and I guess you just have to sort of dress it up as something other than, you know, being just pragmatic about things. I'm sure there is some...

255
00:40:46.288 --> 00:40:56.348
Look, I think platforms trying to figure out which speech is okay is obviously going to be a disaster for them. It is always.

256
00:40:56.408 --> 00:41:10.108
Like, none of them wanna be in that business, and I understand why they wouldn't wanna be in the business, and it, it w- maybe the more, quote unquote, responsible decision would've been to fix your content moderation apparatus.

257
00:41:10.648 --> 00:41:19.328
But, you know, doing the Pontius Pilate, uh, is very expedient, and his track record is, you know, he will, you know, be kind of shameless.

258
00:41:20.008 --> 00:41:30.568
Since this is a media show, from a media standpoint, there was one thing that I found quite interesting in Zuckerberg's remarks, and that was the return of civic content. He said, "We're bringing back civic content.

259
00:41:30.608 --> 00:41:40.158
We're gonna start phasing this back on Instagram, Facebook, and Threads. We're working to com- keep the communities friendly and positive." My translation there is news and politics is coming back to Facebook.

260
00:41:40.368 --> 00:41:51.238
I think if you use Facebook or Threads, you see there's just, like, no urgency there at all because news and politics are gone. And so it seems like it's coming back and gonna come back- Yeah, and there could be more-...

261
00:41:51.238 --> 00:42:03.268
in a serious way and- Hey, traffic, traffic is back. I think if you built a publication built on f- social referrals on Facebook that are- [laughs]... entirely political and news-driven, you'd be in good shape.

262
00:42:03.388 --> 00:42:14.138
Okay, let's start like a, like a, you know, all politics, little things- Yes... and just [both laughing] cash in. I mean, this, if there's ever a moment, this is gonna be that moment. Yeah, this is it.

263
00:42:14.168 --> 00:42:17.018
So get ready for those referral traffic dollars to come in, folks. Here it is.

264
00:42:17.018 --> 00:42:31.212
So on that, like, uh, evaluate h- now, like looking back, 'cause, like, when Elon Musk, you know, bought Twitter and, you know, the, the, the endless coverage, you know, that- Casey Newton was on it like every [laughs] every minute, right?

265
00:42:31.372 --> 00:42:37.791
Of like, you know... And it was, you know, the conventional wisdom was, wow, he really stepped in it.

266
00:42:38.042 --> 00:42:46.112
And, and you leave aside like the money, 'cause the money, I mean, you look at how much money these, these people are worth, it's like ridiculous. He's gonna become a trillionaire at some point.

267
00:42:46.452 --> 00:42:58.092
And then he got other people to actually give him a lot of the money, which is amazing. But like, I think X is like a really fascinating media platform. I like, I'm fascinated by it, I'm repelled by it.

268
00:42:58.212 --> 00:43:13.952
I, I'm, I'm possibly addicted to it, and it seems to me like it's actually having a serious impact. Like, it is not-- it is a major, it is a major force within this, what I call, like, the information space.

269
00:43:14.212 --> 00:43:19.592
Like, it can't be ignored, I don't think. No way. I mean, it was never... It never became irrelevant.

270
00:43:19.902 --> 00:43:28.152
I mean, Musk definitely made some changes to the algorithm initially that, like, drove me nuts and a lot of people nuts, and I think drove users away from the site.

271
00:43:28.832 --> 00:43:38.072
Basically said, "Okay, it's, you know, gonna be a lot of tweets from me and the menswear guy and some other people that I like," and that was your- I never see him. I never see the menswear guy. I don't know why. Anymore.

272
00:43:38.132 --> 00:43:39.582
Is it fair to say anymore? Oh, it is.

273
00:43:39.582 --> 00:43:48.792
So I think, I can-- I mean, I can't really put my finger on this, but it certainly feels like there was an algorithm shift, and the algorithm, you know, after a certain point just got better.

274
00:43:49.232 --> 00:43:58.062
Like, they tweaked it again, and it got better. It's much more of a For You algorithm than a Following algorithm now. I don't know if you see this with your own tweets, but like mine will pop up or- Yeah.

275
00:43:58.062 --> 00:44:03.292
No, you gotta go full. Don't even open it- Yeah... if you're gonna do follow. You, you just- Yeah... you gotta go all in. It's like- Right.

276
00:44:03.332 --> 00:44:19.152
But, yeah, but the For You, like, even For You or whatever it was beforehand was algorithmic, but it still really kept your follow signal as an important part of Twitter, and now it's just like everything is just algorithmic, algorithmically recommended, and it's a lot better.

277
00:44:19.852 --> 00:44:28.702
So it factors. It's been actually, I think, a pretty good tool to follow the LA wildfires. Yeah. Although, like, you know, you get- I think the hard part for me is, like, I wanna use it.

278
00:44:28.722 --> 00:44:37.682
Like, it's incredibly useful to, you know, find new, like, ideas and things for, to, to, like, write about and, and it's, it's very useful.

279
00:44:37.932 --> 00:44:48.742
And then in between that is, like, you know, I don't know, some sort of immigration outrage in some- Right... European country or, you know. And by the way, you and I both follow sports, right?

280
00:44:48.772 --> 00:44:56.532
So I feel like we can both agree that there's no better place to follow sports in the moment than on Twitter. Yeah. Yeah, like, I was like- It never emerged on Threads, never emerged on Bluesky.

281
00:44:56.572 --> 00:45:02.072
It's not on Facebook, not on Instagram. And not in- Its own place... and not in, in mainstream media.

282
00:45:02.172 --> 00:45:11.332
Like, I mean, there was, you know, like, for example, the, the, there was, like, an inju- Jordan Love, the Packers quarterback, got, like, you know, an elbow injury, and he's, he's playing the Eagles next week.

283
00:45:11.372 --> 00:45:22.182
And so I wanna know [chuckles] about this elbow injury. Left, like, holding his elbow. And, you know, they just said, "Oh, he's gonna be evaluated in, like, you know, ESPN and everything." I go on Twitter, man, Dr.

284
00:45:22.632 --> 00:45:32.532
David Chao, some orthopedist- [laughs] He's good... guy. [laughs] Who cares if he, he didn't examine Jordan Love. He was telling me about it, it doesn't look like an ulnar injury and all these kind of things.

285
00:45:32.552 --> 00:45:39.052
He'll be fine for the [laughs] game. That guy's usually right. Yeah. And I'm like, okay, I've, I know this guy from his... And I'm like, I...

286
00:45:39.692 --> 00:45:44.952
Look, I'm not gonna be mad if he's wrong 'cause, like, he's a guy on Twitter who clearly didn't examine him. Yeah. Like, okay.

287
00:45:44.992 --> 00:46:00.262
And that's why, yeah, I don't know where the, you know, with the misinformation, I think we're just gonna have to accept that we're gonna be in this world where there's gonna be, like, a massive amount of information that's coming at you, and some of it is gonna be true, and some of it is not gonna be true at the end of the day.

288
00:46:00.292 --> 00:46:07.941
Well, Community Notes is actually a decent solution to that. Oh, yeah. And one of the Facebook announcements was that they are gonna implement Community Notes the same way that Twitter has.

289
00:46:08.532 --> 00:46:16.792
And I don't know, is it perfect? Def- Classic Zuckerberg move. [laughs] Classic Zuck. He's like, "We're gonna take their best thing and put it in our product." But for this, for this one, I think it makes a lot of sense.

290
00:46:17.092 --> 00:46:27.252
It's not perfect. It misses a lot of things. Some of those notes are wrong, but they're right more often than you'd expect, and they go on all sorts of really interesting things. I mean, they go on Elon's posts.

291
00:46:27.542 --> 00:46:34.032
They go on ads, like if an ad is scammy, like you'll get community noted. Yeah, I know. And the system has really just been...

292
00:46:34.132 --> 00:46:43.592
I mean, they developed it under Jack Dorsey and expanded it under Musk, so sort of like a team effort there [laughs] at the place that can't do anything right, but I do think that they've done this right.

293
00:46:44.172 --> 00:46:55.772
Yeah, no, Community Notes was, was a great... I mean, it's not perfect, and I think some people obviously don't like that it has a personality of sorts, I guess, in that, like, some of the notes are...

294
00:46:56.372 --> 00:47:05.392
And I guess maybe it hurts its credibility. I mean, some of its notes are just, like, kinda like snarky replies, I guess, [laughs] from what I'm seeing. That's true. But it's having...

295
00:47:05.412 --> 00:47:19.182
But X is having an impact, and I think that, to me, is just part of this decentralized media system that for mainstream media, it's figuring out its place within it. It is not going to replace all of this.

296
00:47:19.422 --> 00:47:33.072
It's going to live alongside it at the end of the day. Oh, most definitely. Yeah, it will live alongside. I mean, they are putting a lot of money into Grok. Speaking of AI and media collabs, their, the xAI bot- Mm-hmm...

297
00:47:33.092 --> 00:47:42.172
is living within Twitter. I was playing with it yesterday. It's gotten better. It really searches the web and, and tweets to give you answers to questions. Does a pretty decent job.

298
00:47:42.252 --> 00:47:50.262
I was really testing it with some tough questions, and it's pretty good. So that'll be a very interesting part of the next play, and it's definitely relevant. It's not the be-all, end-all.

299
00:47:50.432 --> 00:47:56.792
Like, you need a combination of X and the mainstream media to really understand what's going on. But it is, it is one of the pillars.

300
00:47:56.872 --> 00:48:07.682
And, and I think you started this conversation asking, like, you know, was Elon's, you know, forty-four billion dollar purchase of the platform using other people's money the right move? Has it paid off?

301
00:48:08.372 --> 00:48:16.452
And initially, it seemed like there was no way, but now you see the influence that he wields, and he's, you know, the first buddy in Mar-a-Lago. Yeah, I don't think he could have pulled off- Yeah...

302
00:48:16.472 --> 00:48:32.742
you know, getting into this position with, with Trump without weaponizing X to some degree. Yep. And that alone, if you look at how much it added to [laughs] his, the valuation of his companies- Tesla...

303
00:48:32.742 --> 00:48:43.432
and therefore- Yeah... to him, like, it, it was probably worth it. You know, I just think- But can I also say, it seems like he's gonna fly a little too close to the sun, and he's- Yeah... gonna get burned at some point.

304
00:48:43.592 --> 00:48:51.212
Whether that is what everybody's predicting, this sort of like, you know, sort of divorce from Trump that's bound to happen, that will probably happen.

305
00:48:51.332 --> 00:49:01.672
Or let's say the next administration comes in and doesn't wanna deal with him because he's been so partisan or what he's doing in Europe. And, you know, sup- I, I just find... I, I don't, I don't mind saying it.

306
00:49:01.732 --> 00:49:11.082
I, I just find his support for the AFD in Germany, uh, which is the- Yeah... far-right party pretty disgusting. Yeah. And, and he's, he's- And also strange. Yeah.

307
00:49:11.132 --> 00:49:21.252
I don't know why he's g- I, like, I sort of, I'm like, okay, I'm trying to understand your, your point, but I'm like, okay, you, you, do you have a factory in Germany? If you don't like them, then just move your factory.

308
00:49:21.312 --> 00:49:29.752
Give me a break. Yeah. But you're not- He's totally out of his depth... it's not like Germany is, like, a big deal to him. Right. It's strange. He's out of his depth. I don't know.

309
00:49:30.012 --> 00:49:34.742
And, and he just doesn't, he doesn't know the market that he's dealing with there. Yeah.

310
00:49:34.792 --> 00:49:45.562
And so I just think that, like, he is a person who makes, who loves making high-stakes gambles when he thinks he can benefit from them. And man, have a lot worked out, right? [chuckles] Yeah.

311
00:49:45.692 --> 00:49:56.292
I mean, a lot of them have worked out. But even the best gamblers lose, like, regularly. Exactly. So that's my point. All right. Yeah. Alex, let's leave it there. It's a Friday afternoon.

312
00:49:56.312 --> 00:50:09.972
The weekend needs to begin at some point, so why not now? [upbeat music] Sounds good. Thanks for having me, Brian. All right. Thanks again. See you. Appreciate it, Alex. [upbeat music]
