WEBVTT

1
00:00:00.100 --> 00:00:16.690
[upbeat music] This week's episode is presented by Beehiiv, the platform trusted by enterprise publishers like Newsweek and Time.

2
00:00:17.220 --> 00:00:26.320
Newsweek is in the midst of an exciting transformation. With AI disrupting search traffic, they're building direct relationships with their audience through newsletters.

3
00:00:26.540 --> 00:00:32.940
Newsweek has aggressive newsletter plans designed around adding new audiences and launching new products.

4
00:00:33.320 --> 00:00:44.480
Barak Kriech, Newsweek's Chief Product Officer, said this about why they chose Beehiiv: "Beehiiv's consumer-first, tech-driven DNA is exactly what we were looking for.

5
00:00:44.560 --> 00:00:51.850
The platform came from a B2C background and grew to become enterprise-ready. That startup mindset resonated with us.

6
00:00:52.080 --> 00:01:01.430
I'm excited about partnering with a team that's tech-first, engineering-first, and has an entrepreneurial mindset. We want to collaborate and grow together."

7
00:01:01.550 --> 00:01:11.660
With more and more industry leaders like Newsweek migrating to Beehiiv and 2026 planning in full effect, now is the time to see how Beehiiv can take your content to the next level.

8
00:01:12.380 --> 00:01:27.420
If you wanna see what your next stage of growth could look like, and I hope you do, go to beehiiv.com/trb. That is spelled B-E-E-H-I-I-V.com/trb.

9
00:01:28.120 --> 00:01:40.600
And you can meet with Beehiiv's team of growth and newsletter experts today. Thanks so much, Beehiiv, for your support and partnership. Welcome to The Rebooting Show. I am Brian Morrissey.

10
00:01:40.640 --> 00:01:46.720
This is a real treat because I'm joined by Vineet Khosla, the CTO at the Washington Post.

11
00:01:47.180 --> 00:01:59.540
Vineet and I had what I consider one of, if not the most interesting of my conversations at Cannes this past summer, and so I'm hoping to reprise it in, in podcast form here. A little bit about Vineet.

12
00:01:59.580 --> 00:02:06.860
He worked on the Na- Natural Language Core of Siri at Apple as one of the earliest engineers building conversational systems.

13
00:02:07.220 --> 00:02:12.560
He spent years subsequently to that at Uber working on routing, mapping, real-time infrastructure.

14
00:02:12.960 --> 00:02:26.400
Then he joined the Post in 2023 as CTO to modernize the technology stack, and, you know, one of the things that I've seen, like under your leadership of the Post is, you know, expanding particularly into audio as an interface.

15
00:02:26.440 --> 00:02:37.380
I mean, obviously, your background is in that, and I wanna get into all of that. But I wanna start with a fundamental question, Vineet. I've, I wrote, uh, a few weeks ago about media people 'cause I think this is a type.

16
00:02:37.800 --> 00:02:46.080
There is a... They think in narrative. They like to like, you know, cause trouble. They like heroes and villains. They're more storytelling. They're...

17
00:02:46.420 --> 00:02:55.980
I have, throughout my career, I've been, I feel like I've been trapped in this like men are from Mars, women are from Venus, but it's, it's usually tech guys and media people. That's my...

18
00:02:56.240 --> 00:03:00.780
And I know that I don't mean it gendered in either way. But there's always been this divide.

19
00:03:00.860 --> 00:03:16.320
You have spent your career mostly on the tech side, and I know that like tech and media is like enmeshed, but you know what I mean, right? Right. Give me like what... You've now been at the Post for two and a half years.

20
00:03:16.880 --> 00:03:20.870
You know, give me the differences that you see on a

21
00:03:21.820 --> 00:03:42.900
w-with building product and just the instincts for it from tech engineering-driven organizations like Apple and Uber and, and not necessarily just the Post, but like, just like what you see in the media because tech has never been the, the, it's not the main thing that these organizations do.

22
00:03:43.800 --> 00:03:57.840
Thank you for having me, Brian, and I hope I can bring back some of the charm of South- The magic. The magic of all the rosé we had that day. I'm sitting in an office right now, so for my PR here, I'm not drinking rosé.

23
00:03:57.900 --> 00:04:09.020
That's a great question you ask because as I... I feel like I am so lucky that I lived in the tech world, and I get to live in the media world, and I get to have multiple lives.

24
00:04:09.700 --> 00:04:18.409
When I was in the tech world, I thought I was so lucky that I got to build Siri, and then I went and did Maps, which is like an entirely different problem. Right.

25
00:04:18.440 --> 00:04:31.960
So when you ask what is fundamentally different between tech people and media people, I think one thing is media had too much success. And what do I mean by that? News is a really good business.

26
00:04:32.420 --> 00:04:47.480
It has a product market fit like nothing else on the planet. For three thousand years, we were etching it on tablets. We were writing it by hand. Then the technology of press came. Like it's a great product market fit.

27
00:04:47.900 --> 00:04:58.600
And in the tech world, you know, people literally die for this level of product market fit. So naturally, when you find a really good business and you get good at it, right, like it's not just a business.

28
00:04:58.640 --> 00:05:10.720
You get really good at it. It's serving a real purpose in the world. You tend to aim towards perfection a whole lot. When something is put in print, it's there for the rest of your life.

29
00:05:11.340 --> 00:05:16.280
And by the very instinct that tech people are more experimental. Right.

30
00:05:16.700 --> 00:05:28.240
So in our DNA, getting something slightly wrong is not the end of the world, but getting something slightly wrong in the world of media is often perceived as the end of the world. So there- Yeah...

31
00:05:28.300 --> 00:05:40.400
I feel like becomes a fundamental difference, and I can see both sides of it, right? Like, I want to be very clear. I'm not saying, "Oh, this makes media people like dinosaurs, and they're not willing to change." No.

32
00:05:40.620 --> 00:05:49.320
I mean, the opposite of that. I mean, this entire industry is named after the biggest technological innovation of its time, the press. Yeah.

33
00:05:49.700 --> 00:05:54.300
[laughs] So it's not like people here don't want to experiment or do something new.

34
00:05:55.140 --> 00:06:11.474
I just feel the training has been such, it is kinda hard to overcome. So just I, and I want to talk about the printing press, but just to stay on that for a little bit, how does that translate into the product that is shipped?

35
00:06:11.654 --> 00:06:22.904
Because you, you were very complimentary of media people, and as I said, I, I self-identify. I think I do. I'm definitely not a tech guy, so I'm, I, I have to be a media person by default in this conversation.

36
00:06:23.424 --> 00:06:28.624
And my evaluation of the products that media companies ship is that they suck.

37
00:06:29.104 --> 00:06:46.904
Uh, not all of them, but when you compare a product-focused technology organization to the, the websites that I visit to get great content from, there's no comparison. It's not even close.

38
00:06:47.004 --> 00:06:56.674
You can tell who is a product-driven organization and who frankly is not, and we can go through all the reasons for, you know, with monetization and the market, et cetera, et cetera, et cetera.

39
00:06:57.444 --> 00:07:10.584
But I don't think that you could be within an organization like, say, Uber, and have products that have that kind of usability, no matter the excuse, and that seems to me...

40
00:07:10.594 --> 00:07:28.424
'cause it's not unique to any organization that with where things are going, that is the biggest challenge because nobody will go to your products, nobody will use them if they suck.

41
00:07:28.744 --> 00:07:42.524
Like I, I... like to me it's just fundamental. Yeah. No, a-and I think this is the question I ask. Maybe I should ask you first. Uh, I've heard of this- Don't fear this. [laughs] I'll turn the tables on you, Brian.

42
00:07:42.683 --> 00:07:57.084
[laughs] We're only seven minutes in. There is a belief that news is the product. Yeah. Right? And for majority of the lifetime, that was very true. I am not saying news is no longer the product.

43
00:07:57.504 --> 00:08:08.904
It is still the product, but product is also the product. See, what has happened is there is a container that has been put on our interactions with the internet. Right.

44
00:08:08.944 --> 00:08:15.264
Which is a beautiful app, which is audio and vertical videos. Like these are the expectations that have been set.

45
00:08:16.084 --> 00:08:24.844
Since they have been set by parties outside of the media, media needs to catch up and needs to play within those containers very effectively.

46
00:08:25.064 --> 00:08:39.524
That's the area where I think a lot of big media companies did not in a timely manner recognize that their audience is moving away from them, and going to Instagram is not because Instagram has better journalists.

47
00:08:40.184 --> 00:08:51.204
They have zero journalists, right? They are moving because the container, the product that Instagram is putting out, is attracting some of their journalists to put their content on. Right.

48
00:08:51.724 --> 00:09:07.314
So if we get outside of our, you know, thinking that news is the product and nothing else can be the product, and say, "Well, news is the core of the product, but the product is also the product," then I feel we can get to success.

49
00:09:07.384 --> 00:09:20.134
And I think that is in my assessment from, you know, in two years in this media world is we are hesitant to make that jump and say that statement. So let me... and, and, and then we can move on.

50
00:09:20.204 --> 00:09:33.844
But like w-what in your diagnosis are like the three things that, that likely hold back a lot of media organizations from shipping I wouldn't even say great products.

51
00:09:33.944 --> 00:09:37.594
I, honestly I would say like good products from a u- from a user standpoint.

52
00:09:38.144 --> 00:09:45.184
I mean, I can come up with my own sort of like ideas, but I would love to hear yours from that 'cause, I mean, I think that there, there is an acute problem here and I just...

53
00:09:45.324 --> 00:09:57.904
I don't mean it to like, you know, denigrate like publishers and media companies. Yeah. It's just like you have to be honest that, you know, the, the product itself is, is objectively losing in the marketplace.

54
00:09:57.944 --> 00:10:06.664
There's a reason that people want to open Instagram, and the ads are, are a good experience. They fit and, and these are...

55
00:10:06.784 --> 00:10:18.064
yes, there's algorithms and they're like addictive and whatnot, but then, you know, sometimes we all feel this way. Like we'll, we'll click on a link and I'll like brace myself, you know?

56
00:10:18.264 --> 00:10:30.484
[laughs] Like I'm not sure what's gonna happen. Yeah. My phone might start heating up. I, I don't know if I have three nicely tied things in a bow. [laughs] But it is the investment, right?

57
00:10:31.004 --> 00:10:41.804
You have to invest in a tech team which is thinking like a tech team, which is thinking like a product team, and in the engineering and AI talent to match it.

58
00:10:42.564 --> 00:10:55.964
One example, one phrase I have used a few times is everyone has heard this. There is no longer just a company in the world. Every company is a tech company selling product X. Right. Which is very true.

59
00:10:56.044 --> 00:11:11.484
If you look at mattresses, Avocado mattress, like they are running like a tech company. Right. So you are a tech company selling product X, and that level of alignment needs to happen at the executive level, right?

60
00:11:11.544 --> 00:11:16.364
Like for, to not to toot my own horn, but at ex- at Washington Post it has happened, right?

61
00:11:16.944 --> 00:11:32.544
We have said that we are going to get the best of the best in the fields that we need, whether it is growth, whether it's advertising, whether it is technology, and we will let them do their job because we are the world's best news company, right?

62
00:11:33.144 --> 00:11:42.124
We are the world's best tech news company, and our core product is news. That will... we are nothing if we don't have great news, period. Right? Like that's the- Sure...

63
00:11:42.164 --> 00:11:50.984
foundation and the heart and the meat or however we want to define. Yeah. That's it. It's like a- The restaurant not having a good kitchen. Like- Exactly, right?

64
00:11:51.024 --> 00:11:59.074
Like- You, you can have the great ambiance, you can have a wonderful host, and all this stuff, but if you don't have a... If, if the food stinks, like you don't have much of a shot, I don't think. You nailed it.

65
00:11:59.074 --> 00:12:08.404
Maybe some. That's exactly it. We have the best chefs with the best ingredients, right? Yeah. And now we are going and making sure all tech companies should do that.

66
00:12:08.444 --> 00:12:19.334
Like, they should not have a crappy restaurant that smells bad. It should smell good, it should look good, it should make people want to spend money. It should add value to their life. Mm-hmm.

67
00:12:19.334 --> 00:12:22.004
It should not feel like homework. The- Yeah.

68
00:12:22.044 --> 00:12:34.864
So let me, let, let me ask you this then off that, and, and then we'll get into, to more of the specifics, but this is a little bit philosophical 'cause I, I wanna get into, like, how you think the sort of news experience will be in three years or five years, okay?

69
00:12:34.944 --> 00:12:45.424
And five years might be too much to ask these days, but I ask this of a lot of CEOs in this industry, and I never get a coherent answer. They usually change the subject, so I'm gonna try to [chuckles] pin you down on it.

70
00:12:45.453 --> 00:12:52.464
[laughs] And I understand why they change the subject, because so much is changing right now. You mentioned the, the printing press, right?

71
00:12:52.574 --> 00:13:05.764
And you've compared this moment basically to the printing press in that AI is going to change creation, delivery, and consumption. Like, it will all be redefined. I do this other podcast called People Vs.

72
00:13:05.824 --> 00:13:16.304
Algorithms, and one of my co-hosts is not from the media, media industry. He's, he's the former head of design at Airbnb. He's, like, out in Silicon Valley, and he keeps saying, "The spaceship is over the White House."

73
00:13:16.424 --> 00:13:25.704
You know? [laughs] Like... And- [laughs]... and I think he's, he's, he's right. I don't know the timeline of that, but I, I think he is, is correct.

74
00:13:26.324 --> 00:13:39.364
So I think a lot of media organizations have to make the decision about they're downstream of technology. Technology cr- controls the tech industry, controls the interface layer.

75
00:13:39.744 --> 00:13:44.704
They are experts at interface, and therefore, they control the distribution.

76
00:13:44.764 --> 00:13:59.004
And, you know, all of th- the, the angst about Google Zero and AI overviews and stuff, all of that is because media companies do-- are, are not at the interface layer. They are downstream of that.

77
00:13:59.084 --> 00:14:12.144
And to me, if you play all that out, a lot of companies are gonna end up being content suppliers to those who control the commanding heights of the digital economy, which are the interface layer.

78
00:14:13.144 --> 00:14:24.924
Some companies are going to be able to compete on the interface layer. Am I wrong in thinking that? Or, or like how are you thinking about that? No, I, I, I think you're very correct in that.

79
00:14:25.704 --> 00:14:39.644
There are few things I would point out is- Yeah... I see more often than not media industry giving up the mantle to the tech industry, as if what they're doing is something very magical or very difficult.

80
00:14:39.704 --> 00:14:50.364
It's actually not. But when we start the conversation with saying, "Oh, we are the content providers," right? You, you already put yourself in a hole. You have defined yourself in a box. Mm.

81
00:14:50.644 --> 00:15:01.754
And my challenge to everyone is why are you not the platform? Why are you not the first place, the first destination that people come to when they think about news?

82
00:15:02.444 --> 00:15:11.164
I want Washington Post to be the place where people come, and the first app they open when they have a curiosity about something in the world, right?

83
00:15:11.924 --> 00:15:24.304
So that's my challenge number one to everybody who is dealing with this problem is do not let the world frame you. Go find your own audience. Go build your own audience. Give them the experience.

84
00:15:24.344 --> 00:15:36.644
This is not rocket science. This is just simple engineering. You guys have done the hard work of journalism, right? No matter what happens, at the end of the day, computers and AI cannot do the hardcore journalism.

85
00:15:37.384 --> 00:15:43.644
That will always be, like you have all the power. Just build the container around it. People will come to you.

86
00:15:44.484 --> 00:15:51.794
And if you don't, then you will exactly end up in that spot, which is you end up becoming content suppliers to somebody else's platform.

87
00:15:52.424 --> 00:16:03.954
And those platforms, as good as they are, they have very, you know, and they're not evil by design. [laughs] They have very little business incentive to promote you. Right.

88
00:16:03.964 --> 00:16:11.734
Like, if I am on Apple News, Instagram, TikTok, it's great. Like, people understand the brand. It's like a lead generation, right?

89
00:16:11.744 --> 00:16:17.024
But what is your strategy to get people back to your owned and operated to become subscribers?

90
00:16:17.124 --> 00:16:27.664
Do you have a content strategy and business strategy where you're giving some of it almost like you would do in an advertisement, you know, a trailer of a movie, and then what do you do to get them back?

91
00:16:28.044 --> 00:16:33.104
And people have just not really spent too much time thinking about that whole life cycle.

92
00:16:33.644 --> 00:16:44.574
But I, let, let me just push on that a little bit, is I think one of the central questions that, that, that brings up is, is that keeping up with user behavior? 'Cause like anyone who operates in a market, right?

93
00:16:44.604 --> 00:16:51.473
Like you, when you get into a market, I always say the market's gonna tell you what it wants. It, you might... It might just not be what you wanna hear, right?

94
00:16:51.544 --> 00:17:08.064
And it seems to me very clear with the various developments that AI is, you know, is hap- having already, the reason people are, the reason ChatGPT, it, has, has gotten unbelievable adoption so quickly, right?

95
00:17:08.424 --> 00:17:19.714
Is because people prefer this. People prefer to get the answer. They don't wanna click and go to someone's website. Like, they're voting with- Yeah... their, their, their swipes or whatever.

96
00:17:20.304 --> 00:17:32.504
And it might not be what people, what, what publishers or anyone operating on the internet want to hear, but the people building these systems are driven by data, right?

97
00:17:32.864 --> 00:17:39.984
And the data is clearly showing that this is what people want. And the entire... Y- you have to respond to what people...

98
00:17:40.024 --> 00:17:53.004
So my question is, and this is the sort of give me the news experience in three years, if publishers want people, want these people to not just be going to the platforms and say, "Give me whatever I need to know" or whatever.

99
00:17:53.524 --> 00:18:03.324
Like, let's leave this, the content aside because they ha- uh, like let's just assume the content is, is top-notch and everything. How are they going to...

100
00:18:03.364 --> 00:18:16.312
What is the news experience like that is differentiated and is attractive enough to people to not take the lowest common denominator, the easiest option?

101
00:18:17.082 --> 00:18:28.452
The kind of good news in this is people are actually telling you how they want their news. So all we have to do is make sure that we offer it to them in that format, right?

102
00:18:29.052 --> 00:18:44.992
Somebody else took in billions of dollars of VC money, ran GPUs, created this LLM technology, and created God knows how many billions in this to tell you, "Ah, look, this is how people want to interact with news."

103
00:18:45.072 --> 00:18:57.612
So our job's, like, really easy now, the way I see it, right? So this is the reason why we made Ask the Post AI. It's like all multi-turn and conversational. But going three to five years now, let's answer that question.

104
00:18:58.432 --> 00:19:12.672
News has two aspects, and this is like somebody who's two years into this industry making... quantifying it a little bit. News does two things. It tells you what is important, and it tells you why it is important, right?

105
00:19:13.012 --> 00:19:25.972
Yeah. When we would open our newspapers when we still had print, and I actually still do 'cause I love print. I, it relaxes me. Hear, hear. Any distractions of my phone, and we... I l- love the print paper for that.

106
00:19:26.532 --> 00:19:37.992
I almost feel like there is a generation of digital detox, you know, people like me who would probably subscribe more to print- Yeah... become involved. It's a trend. No, no, this is, this is a documented trend.

107
00:19:38.372 --> 00:19:44.072
But see, what happened at that time when we would open and it will tell us everything that is important in the world and why.

108
00:19:44.712 --> 00:19:57.132
Now that what has gone away, what is on Twitter, what is on Instagram, what is on TikTok, what is in my WhatsApp messages, I already know what is important, okay? For the most part. Mm-hmm.

109
00:19:57.152 --> 00:20:09.712
Then the why is something that we still surface. The why is why somebody would come to a newspaper and read a full article about what's happening in the White House and be taxed to get into the details of it, right?

110
00:20:09.772 --> 00:20:22.832
Yeah. So if you focus on the why, which now I believe fundamentally nobody other than journalists can do. They're the smartest people on the Earth. They can take any topic and talk in five, six hundred words in details about it.

111
00:20:22.872 --> 00:20:26.872
Like, that's amazing. But not every why is equal.

112
00:20:26.932 --> 00:20:40.172
How silly it is to assume that one person, like, wrote an article for 800 words, and there are 100 million people who had the same curiosity, who had no further follow-up questions. Like, it fit them all.

113
00:20:40.232 --> 00:20:41.652
Like, one size fit them all.

114
00:20:41.712 --> 00:20:55.952
That is so silly, and this is why we see ChatGPT and Gemini and these type of interfaces that are curiosity driven, that are question-driven succeed because they let people follow their own path.

115
00:20:56.132 --> 00:21:04.382
They give them enough information, but as a follow-up, you can say, "But wait, I really don't get this. Can you go deeper into that topic for me?" Mm-hmm.

116
00:21:04.452 --> 00:21:17.992
Now, if you look at the Israel-Gaza war, for a 20-year-old, you probably need to spend a few hours giving them context why this is happening, what is the importance of it, how did this all come about?

117
00:21:18.052 --> 00:21:24.892
Versus somebody your and my age who already has accumulated a lifetime context- Yeah... around these topics, right? Yes.

118
00:21:25.052 --> 00:21:33.212
And I think- We've read the, we've read the same article, like, 50 times over the course of 30 years, face it. [laughs] We've read our history books. We know what happened in World War.

119
00:21:33.252 --> 00:21:40.072
We know how these countries were formed. Yeah. We understand colonialism. There's... It is so multidimensional, right?

120
00:21:40.092 --> 00:21:53.532
But that is the reason why these kids, generally kids, are going towards these type of platforms first is because we are not able to serve them. That's why we made, you know, Ask the Post, made it multi-turn.

121
00:21:53.592 --> 00:22:07.172
We let it go, you know, where people want to go. And we see with data, and we have done some studies with the university, we will publish that one very soon, it has actually led to more consumption of news.

122
00:22:07.412 --> 00:22:20.812
So people who use it, like, properly use it, and are coming- Mm... spending time with it, they actually consume Washington Post more. They go to more and more different sections 'cause we are not sending them away.

123
00:22:20.972 --> 00:22:30.512
We are basically answering their questions, getting them more curious, and now they wanna know more. And that's where we need to be in the three to five years as a media industry.

124
00:22:30.572 --> 00:22:38.372
This one size fits all as the article, I think that needs to go away. I mean, it's- So you think...

125
00:22:38.582 --> 00:22:49.392
So let me just jump in because I'm, I'm obsessed with, with, with webpages and the 700-word inverted pyramid article, having written many of them in my life. They're imperfect.

126
00:22:49.512 --> 00:23:00.572
I mean, a 700-word inverted pyramid article came from the limitations of, of, of l- physical print. You go back to the printing press. Yeah. I mean, it came from column inches, right? There was scarcity.

127
00:23:00.752 --> 00:23:11.082
I mean, anyone who has written for... I kinda miss it because, like, if you're writing for, like, a magazine, like, you've got, you've got s- specifics. You're trying to cut out little words. In digital- Yeah...

128
00:23:11.112 --> 00:23:25.132
like, you, you can just go on forever, and that's why I think I, I actually like print because constraints can lead to better products. But the shipping a one-size-fits-all article, that is a hard for...

129
00:23:25.172 --> 00:23:36.932
That's hard for me to believe that in 700 words it... or, or, uh, is gonna be the case i- in, in three, in three to five years. I just don't see it. Everything is customized. Everything is, like, on demand.

130
00:23:37.472 --> 00:23:47.092
The needs of you a- i- are different than the needs of me, and they're very different from a 20-year-old who, who just found out about, like, the, the Gaza-Israel conflict.

131
00:23:47.692 --> 00:23:57.812
And to me, I compare it to, like, choose your own adventure. I used to love those books as a kid. You know, you could go in different directions, and you could go deep, or you could just skip ahead to the end.

132
00:23:58.472 --> 00:24:09.512
Give me an idea of what a news article is like in three to five years, and then we'll come back in three to five years and see if it's the case. I think the news article- You actually get more free.

133
00:24:10.312 --> 00:24:22.752
You can write a whole lot more. You're not constrained by seven, eight hundred words. Mm-hmm. I think the future is if you write literally a book when you think about a topic, go ahead, write a book.

134
00:24:23.512 --> 00:24:34.572
If you think about a complex topic and you wanna do stick figure animation, do the stick figure animation. If your art, your skill is making a video or making a podcast, do that.

135
00:24:35.392 --> 00:24:49.052
Let that get fed into AI and let people consume it the way they want to consume it. Right. So I think it frees up people a whole lot more, 'cause you're now outside the restriction of these print columns.

136
00:24:49.772 --> 00:25:03.332
So will somebody ever read your original piece in its entirety? I don't know. I don't know the answer to that, but I do- Some, some percentage will. Some people want the package, and I get that.

137
00:25:03.872 --> 00:25:12.702
But th-that-- to me, it's, like, necessary but insufficient. Like, some people wanna see the background information. Like, for me, sometimes when I, I... I'm, like, a nerd, right? And, like- Very good...

138
00:25:12.702 --> 00:25:24.632
I want, I want to see, like, the actual transcript. I wanna see the original. Sometimes I read news articles, and they keep, like, describing, like, these things, and I'm like, "Well, why can't I just see it?

139
00:25:24.652 --> 00:25:32.352
Why aren't you just showing me the thing?" 'Cause I wanna, like, query it. Like, what is it? Like, I'm a fairly intelligent person. Just show- I feel like I can do this myself. [chuckles] Yeah.

140
00:25:32.732 --> 00:25:42.302
I think the news article, right, like, the point I'm trying to make is, like, I think it gets bigger. It gets more freer. It gets mixed modality.

141
00:25:42.332 --> 00:25:50.672
It basically becomes what you as a person are good at in storytelling, right? What is your special skill and your special art of storytelling?

142
00:25:50.772 --> 00:26:01.692
You just do it at that, and you get more free 'cause you're no longer- Yeah... by print. So there is no one-size-fit-all consumption, and there is no one-size-fit-all creation.

143
00:26:01.752 --> 00:26:04.252
Even creation doesn't have to be one-size-fit-all.

144
00:26:04.932 --> 00:26:14.612
Of course, somebody somewhere will either invent technology, of course, with people and editors in the middle to say, "Okay, I got this one-hour video from Brian and Vineet.

145
00:26:14.752 --> 00:26:29.492
I need to turn this into a six-column article for my Sunday special edition." Right. Sure. We will have the technology help you do that. But at least you and I are now free to completely practice our skill and our art.

146
00:26:29.932 --> 00:26:40.952
So- Yeah... this more is more type of world, I see this as a world where you kinda get a lot more unshackled. Yeah, for sure. So let's talk about the interface layer, right?

147
00:26:41.072 --> 00:26:55.252
Because, you know, you, you were a-at Apple working on Siri all the way back i-in, like, 2008, was it? Like- It was- When you started... it was an independent company in 2008 and '9. We sold it to Apple in 2010. Right.

148
00:26:55.732 --> 00:27:05.012
So, like, you've been-- And, and just to be clear, like, you've been working on AI. You have a degree in AI from, like, 2005. So, like, you, you go back. This is not, like, a, a recent sort of thing.

149
00:27:05.052 --> 00:27:20.572
There's a lot of, like, you know, you're not an AI prompt engineer exactly. [chuckles] So tell me about what that experience and how that informs now, all these years later, how you think about

150
00:27:21.592 --> 00:27:34.222
interface and particularly the role that audio will serve in, on the interface layer. Because I'm always-- Like, I don't have a, I don't have a mouse right now, but, like- [chuckles]...

151
00:27:34.232 --> 00:27:52.012
you know, like, a lot of the way we interface with information is, to me, going to change completely in three to five years, and I also add to that thinking about the screens. Like, the screens are a scourge.

152
00:27:52.072 --> 00:27:59.112
I don't think anyone gets-- Very few people get to the end of the day and say, "You know, this was w-- this was a really good day.

153
00:27:59.892 --> 00:28:12.272
I wish I spent more time looking at my phone and scrolling on my phone, like, on the couch. That, that would've made it a better day." And so inevitably I always think, you know, we're gonna look back at this era.

154
00:28:12.992 --> 00:28:21.732
Like, for instance, I'll, I'll just go. Like, when I walk around New York City, I think about-- 'cause New York City is a very analog city. You're in New York right now, right? Like- Right... it's a very analog city.

155
00:28:21.802 --> 00:28:32.672
'Cause I think about it, I've, I've lived there s-uh, since 1999, and I think if I were to tr-transport from 1999 to today, I would walk around New York City and be like, "It's not that different."

156
00:28:33.032 --> 00:28:42.392
The biggest thing that would stand out to me is everyone walking around looking at their phones, nearly getting run over by Ubers instead of taxis, so that would be different. That would stand out to me.

157
00:28:42.412 --> 00:28:54.752
And it's just hard for me to, to not think that the way we interact with information is not going to be completely different in three to five years. Not Ubers, but self-driving Waymos and Ubers.

158
00:28:55.092 --> 00:29:06.692
[laughs] Well, we don't have them in New York yet- They're, they're ac-... for obvious reasons. Uh, it's a topic for another podcast. [laughs] They are really good. They're much safer, the math and the stats show.

159
00:29:07.072 --> 00:29:15.672
And the, you know, I had friends in 2003 and '5 when I was working on AI. I specialized in natural language, so I went the path of Siri. Yeah.

160
00:29:15.682 --> 00:29:23.332
And I had friends who specialized in self-driving 'cause this was a big DARPA project. This was sponsored research.

161
00:29:23.872 --> 00:29:34.492
And I know in 2005 and '6 they had the cars that could drive from New York to New Jersey and back on their own. So this te-- that technology- Yeah... is also very safe.

162
00:29:34.532 --> 00:29:41.012
It's regulatory and insurance that is, like, you know- Uh, the world will come... holding it up. But coming back to the Siri question, right?

163
00:29:41.532 --> 00:29:47.612
I got contacted by the founders of Siri, Adam Cheyer, he was the main engineering head behind it.

164
00:29:48.472 --> 00:29:59.782
I was very curious, and I went and met with them, and I interviewed with them, and we had to, you know, do a code exercise and everything 'cause I was a young college grad at that time. But it was a very easy sell, right?

165
00:29:59.832 --> 00:30:07.492
How many people in the world are actually using a computer right now? Very few. What's really holding them back?

166
00:30:07.512 --> 00:30:16.156
Well, you kinda have to learn how to use computers. You have to understand the mouse and the clicks and these design paradigms, like minimize.

167
00:30:16.196 --> 00:30:22.596
Like, I still try and get my dad to maximize and minimize a web window and it fails every time, you know?

168
00:30:22.656 --> 00:30:36.296
It's almost a joke of, like, parents are the only people who know three different ways to insert the USB in the wrong end. [laughs] But if you start talking about voice, if you start- You're talking to a computer.

169
00:30:36.836 --> 00:30:46.356
You're talking to a computer. Yeah. Everybody on this planet can talk. They will have a different language, but at- Mm-hmm... the heart of it, everybody knows how to talk. And that was it.

170
00:30:46.496 --> 00:30:58.395
That was the whole promise of Siri, is just talk to your computer. Tell it what to do. You could buy movie tickets, you could book restaurants, you could call a cab, you could order food, you could do shopping.

171
00:30:59.216 --> 00:31:09.336
I have gone to movies, you know, that I used... Like, when Siri was still the app and we had this full integration with Fandango and movie tickets, I would just use it to buy movie tickets and go.

172
00:31:09.436 --> 00:31:19.896
I would use just voice to book a restaurant- Hmm... and, you know, go for dinner and then make sure the movie times aligned after that. So voice is really freeing.

173
00:31:19.986 --> 00:31:31.896
It, it just literally enables all of the planet to use computers. So the way we think about it, the way we thought about it at the time is, yes, it's great, it's technology and

174
00:31:33.116 --> 00:31:44.376
it, it should, you know, be in everyone's hands, but really the core motivation of all of this was, this really frees up all of humanity to use computers. Wouldn't that be awesome? Right.

175
00:31:45.036 --> 00:31:56.176
And when I look at that world and I come over to news, it is not that different, right? People want information. People are curious.

176
00:31:56.876 --> 00:32:14.866
If you're forcing them to go down the design patterns of getting an iPad and learning how to use a browser or, or use their app, and there is no consistency behind these apps, every app looks different, then of course they're going to use less and less of it, right?

177
00:32:14.866 --> 00:32:27.416
Right. Now, if we do the same thing that applied to us in 2007, we bring it all the way to news and say, "This is just another modality. If you wanna click, if you wanna read a full article, please read a full article.

178
00:32:28.176 --> 00:32:38.356
Okay? If you wanna hear a podcast, hear a full podcast. But if you want something that is more conversational, we have an option for you."

179
00:32:38.436 --> 00:32:49.176
Even in the world of when you think about, you know, a single unit that fits all- Hmm... when we, when I say no one size fits all, I don't just mean text.

180
00:32:49.496 --> 00:32:53.916
We are actually taking that paradigm and blowing it apart at Washington Post.

181
00:32:54.716 --> 00:33:05.996
And I don't know when this podcast will come out, but hopefully by then we would have released our AI podcast, which is the AI system for Washington Post- Hmm...

182
00:33:06.076 --> 00:33:16.636
knows what the things are that you are most interested in and creates almost like a personalized daily podcast for you. Okay? With an AI voice? With an AI voice.

183
00:33:17.016 --> 00:33:24.216
It is based on what you read and what your topics are and what's happening in the world. 'Cause our personalization model is not a silo.

184
00:33:24.296 --> 00:33:35.136
We use like a two-tower model, which knows what's happening in the world and what your interests are, and matches those. But even in that podcast, Brian, we're not gonna say, "Oh, I know, I know it all.

185
00:33:35.256 --> 00:33:41.356
I'm just gonna give you like eight minutes of what you want." You will be able to interrupt that podcast and ask it questions.

186
00:33:42.176 --> 00:33:53.956
You will be able- Okay, so it's, it's giving me an overview of like, the latest, like Gaza, Israel, and I'm like, "Wait a second, what, what, why... What is the relationship between like Egypt and, and Hamas?

187
00:33:54.056 --> 00:34:00.846
Like, why aren't they, why, why does Egypt..." Right. Like, you know, whatever. Like, it's something like- Right... I, I would be able to say that and they, it, it would change. Yeah. Okay.

188
00:34:00.876 --> 00:34:11.216
You will be able to bring the voice-driven Ask the Post into the middle of that podcast- Okay... interrupt the host, get your questions answered, and then go back to your podcast.

189
00:34:11.656 --> 00:34:22.596
'Cause once again, one size doesn't fit all. One size might not even fit for you in different times of the day [laughs] you know? You could have forgotten things and you just wanna talk about it.

190
00:34:23.076 --> 00:34:27.396
So this is almost like the long answer to, why is voice so important?

191
00:34:27.436 --> 00:34:40.956
Because we can all talk, and if I have a curiosity in my head, I want to make it easy at a product level for people to ask that question, get that need satisfied, and maybe they exit.

192
00:34:41.016 --> 00:34:50.776
Like, that's the downside of it, by the way. You know? The world of internet is all around engagement metrics, and sometimes when you answer people's question, they kinda leave you a little bit too soon.

193
00:34:52.016 --> 00:35:01.856
[laughs] You don't get to show them ads. Yeah. I am fine with that, right? Like, it's a value exchange. We need to do value exchange. Well, ads, ads are friction. I mean, ads are friction.

194
00:35:01.916 --> 00:35:11.546
This is why, you know, Silicon Valley's typical... I'm gonna speak in complete generalities. Typical aversion to ads, only because I've seen it, like, so often. Yeah.

195
00:35:11.576 --> 00:35:19.926
Like, I always joke, like, the first, the first step of, like, a technology company running advertising is the vow that they will never run advertising.

196
00:35:20.076 --> 00:35:28.856
It's like along the sort of, it's like a Kübler-Ross kind of [laughs] you know? And we saw this. We've se- we're seeing this play out in real time with Sam Altman, you know?

197
00:35:29.746 --> 00:35:36.956
[laughs] Well, and eventually they're gonna, they're gonna stick ads in there because, you know, they get over the friction because, like, the math is the math.

198
00:35:37.136 --> 00:35:43.056
I, I mean, the other ironic part is Silicon Valley's built on advertisement. Yeah.

199
00:35:43.096 --> 00:35:53.616
That's how Google made its money, and it seeded a lot of companies in Silicon Valley; it was people at Google who made a lot of money and they broke away and they funded other companies and so on and so forth. Yes.

200
00:35:53.816 --> 00:36:02.316
It's all been- But I will say this, having spoken to a lot of them over the years. They used to always talk about turning on the revenue spigot with ads, which I was always...

201
00:36:02.336 --> 00:36:13.086
As someone who was a media person, I always found that kind of hilarious because, like, when you don't control the interface layer, you can't, like, say things like, "Turn on the revenue spigot." [laughs] Yeah.

202
00:36:13.096 --> 00:36:26.108
Like, you have to go out there and eat what you kill. But tell me about, like, so the conversational part, 'cause I want to tie this into the voice, because ChatGPT came out three years ago, and it came out as a conversational interface.

203
00:36:26.688 --> 00:36:38.548
AI, as y- you, you've been in this world, like has existed for a long time. You know, Google had all this, and to me, like the real sort of secret sauce of ChatGPT was that conversational interface.

204
00:36:39.068 --> 00:36:44.787
I used to cover the early search industry, and I always thought that Ask Jeeves was a great model.

205
00:36:45.228 --> 00:36:58.728
[chuckles] You know, because it was, it was basically a differentiated product, a differentiated approach to 10 blue links. And the problem was they didn't have AI. [laughs] Yeah. The flaw.

206
00:36:59.348 --> 00:37:10.088
But ChatGPT made this thing conversational, okay? And the, and it exploded, and I think that's related. Yeah. Like, I don't think if they, they chose a different... I don't know if that's a modality. I don't, I don't...

207
00:37:10.188 --> 00:37:17.708
Like, but like- [laughs]... if they chose a different approach, I don't know if it would've been such a breakout success. Yeah. It's, it is the scale problem, right?

208
00:37:17.788 --> 00:37:32.078
Like the tech industry has a scale problem at a different level because pre-LLM technology, doing conversational was still very algorithm-driven and, like, what we call manual algorithmic, right?

209
00:37:32.108 --> 00:37:41.768
Like we had AI in it, but you kinda have to do a lot of templating and logic, so it will always sound very machiney and it will never be natural.

210
00:37:42.608 --> 00:37:55.528
This is what LLMs inherently unlock, is the conversational ability, 'cause they have been trained on these, you know, all of the internet. So they know how people write, they know how people talk.

211
00:37:55.548 --> 00:38:06.708
They actually know less about how people talk. When people complain, "Oh, AI is too verbose," well, it's because what we have fed it is a lot of verbose stuff that was printed on the internet.

212
00:38:07.528 --> 00:38:20.968
Once it starts to get a lot of audio, a lot of podcasts, a lot of talking, you will see the LLMs evolve and they will become even more terse and better in conversations. So this is a...

213
00:38:21.028 --> 00:38:33.188
You know, that's the reason why the older version of Siri, you know, when we even started, the conversational part was just not natural, and LLMs inherently unlock it. Yeah. So

214
00:38:34.108 --> 00:38:48.488
tell me about how this is gonna ha- e-end up affect- changing like what the news product experience is. Like, if, if voice is going to be a major form of interface, like when people are...

215
00:38:48.528 --> 00:38:56.107
Like, they're gonna be using voice and, like, it's not gonna be using voice to get a bunch of, to get like a 700-word article, like, spit out at them.

216
00:38:56.188 --> 00:39:07.568
Like, I mean, it's, it's, it's going to be conversational and it's like voice on voice, right? And, and look, the AI voices right now, like in my view, like y- you take a hit.

217
00:39:07.668 --> 00:39:11.268
Now, I don't know whether that is just because it's new.

218
00:39:11.528 --> 00:39:28.628
Like for instance, I just saw the iHeartMedia, you know, they're having their DJs say, "Guaranteed human," to basically make cl- clear to people that this is not an AI voice, which is interesting to me.

219
00:39:28.718 --> 00:39:45.068
And I don't know whether that is just now, like as a human who speaks a lot, uh, i-into a microphone, like I'm hoping that human premium does, [chuckles] does maintain for, for a long time, and I know that I will be, I, I will be accompanied by robot voices too.

220
00:39:45.528 --> 00:39:50.428
But how... Let's just talk about that for a little bit. Is that just a temporary phenomenon?

221
00:39:50.468 --> 00:40:05.428
Because w- I find just having done a lot of podcasts in my, in my life, that sort of voice, the human voice has a lot of connection to people. Yeah. It's, it's qualitatively different than, than print.

222
00:40:05.508 --> 00:40:09.848
Like I've, I've done a lot of... And in print, text. I've, I, I write a lot, right?

223
00:40:09.868 --> 00:40:23.668
But I've noticed there's a complete difference if I speak with someone who has read my articles but not listened to podcasts versus someone who has just listened to podcasts. The, the connection is absolutely different.

224
00:40:24.308 --> 00:40:39.688
Yeah. I, I think you're probably right for a decent amount of time. Okay, good. Uh, 'til AI goes quantum and then- Oh, no... I'll be off. Give me 10 years, Vineet. Yeah. I, w- we have, we have a lot of time for that.

225
00:40:39.758 --> 00:40:45.028
[laughs] But I do think this is like a product market fit type of question, right?

226
00:40:45.788 --> 00:40:58.588
There will be a group of people who will always prefer the human nuance, the uncertainty, the unnatural pauses, the ability to say, "I don't know"- Yeah... and explore together.

227
00:40:59.408 --> 00:41:10.828
And there will be products where I want just a briefing, right? Yeah. Then I might just choose AI over there. Yeah, that's true. The, the voice- It's like, it, it's like the, you know, the...

228
00:41:11.468 --> 00:41:20.488
I think about weather, right? Yeah. You can get weather. It's like such a... It's like the most commoditized information, right? I mean, first of all, the government collects it, like all the other, like...

229
00:41:20.948 --> 00:41:31.108
But, you know, you can easily get the... The Weather Channel still exists. Yeah. I j- I just don't see, you know... My mindset is more is more. Okay. That's the world we live in, right?

230
00:41:31.688 --> 00:41:44.068
When technological advances happen, human beings decide whether this is a value exchange and they decide to use it or not use it, right? Like crypto for the longest time, you know, not so much.

231
00:41:44.188 --> 00:41:50.938
You know, the majority of the world did not get into it. But when you look at AI, the majority of the world got into it, right?

232
00:41:51.308 --> 00:42:01.008
S- that is because they're getting some value exchange out of it, and we should recognize that as media industry and do it. It doesn't mean we stop doing what we are doing, right? That, that's not the case.

233
00:42:01.908 --> 00:42:14.388
Speaking of voice, right? Like, I just had a great experience on Amtrak yesterday, of all places, when I was coming up to DC. They have a cafe car where you can go buy sandwiches and drinks and, you know, beer and stuff.

234
00:42:14.728 --> 00:42:25.158
They have a hot dog too. They have a- Not bad... not a bad hot do- I mean- I'm very- It's a microwave hot dog, but... Yeah. I'm very wary of doing hot dogs on a train. You know, I just feel like- [laughs]...

235
00:42:25.168 --> 00:42:40.488
it will never end well. Fair. They could have used an automated machine voice to say, "Hey, the cafe car is open. Please come and get your stuff." But the guy announcing it, he literally made the cafe car sound like a party.

236
00:42:40.718 --> 00:42:50.638
"We've got champagne for you, and I got-" Yeah..."this wine, and I got that red for you, and I got chips. And if you don't like wine, I got beer for you that goes with the hot dog." You know?

237
00:42:50.828 --> 00:43:01.797
I was like, "Man, I gotta get to that cafe car. This guy is so much fun." Yeah. We're all gonna become Southwest Air pilots, basically. But isn't that awesome, right? Like, now the premium on the human, right?

238
00:43:01.868 --> 00:43:12.308
Like, imagine the premium on that human actually doing something in the moment and infecting us with his energy. Computers are a long way from doing that, right?

239
00:43:12.368 --> 00:43:23.448
Like, they are still tools for us to live, live our best life, you know? That's how I use them, and that's how we should use them. That's how even in media we should think about AI.

240
00:43:23.578 --> 00:43:33.028
It is helping us get better at everything we do, so let's do that better. Yeah. Give it to our audience. Don't take it away 'cause then they're going to other platforms.

241
00:43:33.728 --> 00:43:45.378
So, so explain to me h- will, will, so will voice be the default for starting a news interaction? Like, will people- I don't think so... ask? I... No? Okay. No, I think it's still the back to the what and the why, right?

242
00:43:45.408 --> 00:43:52.828
Okay. If what is important and why it's important, there are definitely days I want surprise and delight. I want to open up a newspaper.

243
00:43:52.928 --> 00:44:07.288
I want to see what this really smart editor thought is a really important story that the world should know, right? So I, I really do want that expert opinion and the expert curation. It's, it's like being a DJ, right?

244
00:44:07.488 --> 00:44:18.158
I- Mm. You can think about it, there are an infinite number of auto DJs possible for God knows how many years, right? Like, 30, 40 years you could have programmatic DJs.

245
00:44:18.748 --> 00:44:33.708
You could have this computer and software that could run an entire club for you or an entire playlist for you. But human DJs have never gone away. That value of having an expert blow your mind, you know, still exists.

246
00:44:33.768 --> 00:44:41.928
So even in the world of news, whether you do print or whether you do app or iPad, you want the expert view, and there are times you don't.

247
00:44:42.648 --> 00:44:55.728
There are times you come in with your question, and voice just happens to be the easiest mode for you to ask that question, get an answer back and move on. Yeah.

248
00:44:55.768 --> 00:45:08.148
So I feel like with AI, let's talk about AI for a little bit. And so I feel like with AI, at least in the early part, it's, it, it's been good at summarization. It's been good at versioning, you know.

249
00:45:08.208 --> 00:45:18.068
Like, so here's a bunch of text. Give me a version of this as, you know, a presentation. It's like NotebookLM is, is now pretty good at this, and it'll get better, right?

250
00:45:18.468 --> 00:45:32.648
I think the agentic stuff is not there. It's just not. Like, I'm, I am not, I'm not holding my breath for this century to be able to book a, a vacation through agentic AI. Now, I would take that bet.

251
00:45:32.728 --> 00:45:47.788
I just think that there's too many, there's too many variables in it, and this is the use case that a lot of the people building these systems have put out, and I don't think it's necessarily gonna hap- Like, I think we'll have full self-driving before we actually have that.

252
00:45:47.808 --> 00:45:56.008
Same for a robot who actually is able to fill and empty the dishwasher in- Right... not in a controlled setting. I just don't think...

253
00:45:56.408 --> 00:46:07.028
There's too-- I think a lot of the things, it's like a lot of these systems, it seems sl- like to me, need things to be very structured in such a way to be able to, to operate.

254
00:46:07.038 --> 00:46:17.128
And my, like, uber fear, right, is that society will have to conform to these machines and become less manic and chaotic. Like, you know?

255
00:46:17.168 --> 00:46:31.408
Like, I see delivery robots all over Miami Beach stranded and, like, confused because it's actually easier, I believe, to have a self-driving car go from point to point because you know where the roads are and everything.

256
00:46:31.468 --> 00:46:43.428
Sidewalks are, like, messy. There are dogs. Dogs are very unpredictable, and the robots, if a dog comes near one of these delivery robots- Mm-hmm... the, the delivery robot freaks out.

257
00:46:44.288 --> 00:46:54.368
A- and I don't know where I was going with that. But, like, e- e- explain to me, like, where, where you see that, that, that, uh, going when it comes to, like, the media experience, and then I'll let you go.

258
00:46:54.768 --> 00:47:05.608
The, the, te- My brain's also now firing in 10,000 different direction. [laughs] On the world of just agentic, I would take the other side of that bet. Okay. Right?

259
00:47:05.858 --> 00:47:18.048
I do think there will be tasks that you can do with agentic AI, and we will learn very fast which ones can be done now and which ones will come five, six years down the line.

260
00:47:18.548 --> 00:47:28.708
But I think it's gonna happen much faster than you imagine. And I say that because in the earlier versions of Siri that we built, you know, we did planning.

261
00:47:29.148 --> 00:47:40.648
This is simple planning, and I would say, "Hey, I want to go on a date tomorrow." And then Siri would have come back and said, "Here is a suggestion for a restaurant, and here's a movie," right? Mm.

262
00:47:41.308 --> 00:47:53.628
All of those type of interactions, they can start to learn from you. You know, what do you mean when you say this? When you want to book a vacation, they can come up with an itinerary. And it's not gonna be one and done.

263
00:47:53.668 --> 00:48:03.288
Like, this is a conversation. You're talking- Yeah... to a machine. So it's gonna come back and say, "Hey, Brian, based on your history, I see you like a hotel by the beach always.

264
00:48:03.328 --> 00:48:18.298
Do you want me to keep looking for that?" "Yeah, okay. Go look for it. Keep it under $600," right? This is all doable. This is no longer rocket science. We have the MCP protocols and just general old API style. Mm-hmm.

265
00:48:18.368 --> 00:48:29.076
You know what a software service can provide. You turn that into natural language, right? So when you look at, for example, a, a hotel booking website, it brings you all these options.

266
00:48:29.156 --> 00:48:35.216
It is not a stretch of imagination to say, "Hey, AI, can you translate this into a paragraph for me so you can tell me?"

267
00:48:35.256 --> 00:48:45.256
And the AI could come back and say like, "Brian, I got nothing right on the beach, but I got something a bit from the beach, and the reviews say the views are not that bad. Would this be a good compromise for you?"

268
00:48:45.476 --> 00:48:55.316
And you would say yes, and it would go ahead and book that for you. So agents doing commerce for you is gonna happen faster than you think- Okay... 'cause this is a lucky part of the internet, right?

269
00:48:55.376 --> 00:49:05.936
Like I always say- Yeah... this is... in, in Washington Post, I've gone on stage and said that let AI do the sucky part of your job. The sucky part of a vacation is planning the damn vacation. Right.

270
00:49:05.976 --> 00:49:12.806
So I would very happily- That's true... the sucky part- Although some people like the, the, some people like the planning of the vacation. And I know you gotta go, but like just- Yeah...

271
00:49:12.886 --> 00:49:26.226
r- real quick, I think a big question always in journalism and in news when it comes to AI is, is AI gonna write the news, right? Now, AI is used to summarize financial filings and, and all- Right...

272
00:49:26.276 --> 00:49:34.816
kinds of different things by, by, by news organizations. You know, Business Insider, for instance, now is like tiptoeing into like an experiment.

273
00:49:34.916 --> 00:49:45.256
They are, you know, they're very cautious about it, that they're gonna have an editor look at it, but th- that they're going to have specific pieces where th- that is gonna be created by AI.

274
00:49:45.656 --> 00:49:57.336
Where do you stand on the ability of AI to serve a primary role as the creator of news content? I think you nailed it over there, right?

275
00:49:57.616 --> 00:50:10.396
There are things that we do which can be done by AI cheaper and better than us or at the same level of quality, and why would you not spend time doing the harder things?

276
00:50:10.896 --> 00:50:18.636
And AI cannot do investigative journalism, and AI cannot do this deep analysis, and AI cannot have a human opinion, right?

277
00:50:18.656 --> 00:50:30.416
Like, when we have our opinion section, it's not the fact that they are saying some opinion that matters the most, it's the fact that a human had connected a bunch of those dots and written that opinion. Right.

278
00:50:30.896 --> 00:50:35.616
So more is more. There will be a world where AI will write a lot of stuff.

279
00:50:36.376 --> 00:50:46.636
There will be a world where AI will r- write a lot of stuff, and some human will edit it, and there will be a world where a human will write everything, and there will not be an AI involved anywhere in the middle.

280
00:50:47.296 --> 00:50:58.516
So- Yeah... there is no one-size-fits-all, again, Brian. There- Yeah... it's all going to happen at the same time. Yeah. We as consumers get to choose it, right? Like h- here's a beautiful part of it.

281
00:50:59.136 --> 00:51:09.296
Let's say that financial report thing. I read this day in, day out, and there is no damn analysis in it, and I buy that stock, and I lose money. What do you think as a human? I'm gonna be like, "I'm done.

282
00:51:09.396 --> 00:51:21.096
I'm going to Brian Morrissey, my financial advisor, 'cause what he wrote actually made sense to me." So don't forget, we are humans on the other side consuming it too. Yeah. We get a choice. Got it. Awesome.

283
00:51:21.336 --> 00:51:28.756
Well, I think you, I think you hit on the theme of this episode, which is more is more, and that the one-size-fits-all era is, is definitely ending.

284
00:51:29.096 --> 00:51:46.476
I coulda done a Joe Rogan style three-hour podcast, but we're gonna leave it there. Vineet, this was a, this was a real pleasure. Thank you so much. Thank you for having me, Brian. Glad to be here. [outro music]
