WEBVTT

1
00:00:00.100 --> 00:00:09.140
[on-hold music] She had this one line that really struck me where it was, "All our existential questions are now engineering problems."

2
00:00:09.340 --> 00:00:16.520
[laughs] Well, I don't know that, that I agree with that, but look, I, I think- But then one of the things, it's like the soul, like when, when we- Yeah. I mean, look- It's very arbitrary.

3
00:00:16.600 --> 00:00:33.780
We've been trying to figure this out since really- Millennia... we started thinking. Right. I think that the human components of an organic soul, mind, brain, body is something that will not be capable by AI. However...

4
00:00:35.629 --> 00:00:53.500
[on-hold music] Welcome to the Rebooting show. This is Brian Morrissey.

5
00:00:53.840 --> 00:01:02.990
Each week, I speak to those in the media ecosystem who are either building sustainable media businesses themselves or play an important part in their construction.

6
00:01:02.990 --> 00:01:10.760
This episode is what I call a spotlight episode that I do with select partners of the Rebooting. I've been fortunate to work with several great companies, and KERV is one of them.

7
00:01:11.260 --> 00:01:24.500
If you don't know KERV, they have a very innovative technology that, yes, uses AI in order to identify objects within video. This allows KERV to do all kinds of things, including make shoppable video a reality.

8
00:01:24.540 --> 00:01:38.020
In this episode, KERV CEO Gary Mittman and I discuss realizing the promise of shoppable video, along with the need for media to move beyond the tried and true monetization methods of subscriptions and traditional ad formats.

9
00:01:38.060 --> 00:01:48.700
We also dip into the role AI will play in advertising, including how it will soon be able to, quote-unquote, "fix ads" that aren't drawing enough of what Gary calls active attention.

10
00:01:49.500 --> 00:01:56.240
And that is a perfect segue to tell you a little bit about the event the Rebooting is doing at Cannes in partnership with KERV.

11
00:01:56.740 --> 00:02:06.300
For three days, we'll explore the new attention economy from June nineteenth to June twenty-first. We have nine hours of programming lined up and over thirty speakers joining us.

12
00:02:06.740 --> 00:02:16.200
I'm recording several podcast episodes during the event. I hope I will not lose my voice, and they'll be published in this feed daily that week. A few select highlights of the agenda.

13
00:02:16.600 --> 00:02:27.119
I'm speaking with Hearst CRO, Lisa Howard, about the modern media playbook, as many so-called legacy media companies have emerged in far better shape than the digital upstarts that were supposed to replace them.

14
00:02:27.600 --> 00:02:34.120
Bloomberg CRO Christine Cook will join me to discuss reasons for optimism amid the doom and gloom.

15
00:02:34.160 --> 00:02:41.880
And Group Black's Bonin Bough, someone who I've known many, many, many years, will discuss how media needs to adapt to reach younger audiences.

16
00:02:42.180 --> 00:02:49.540
And I promise I will push Bonin on when Group Black will actually buy one of the companies it is always reported to be looking at.

17
00:02:49.580 --> 00:02:59.000
I'll put a link to register in the show notes, or just visit the rebooting.com and you'll find a link at the top of all of my recent newsletters, which I hope you're already reading. Please do if you're not.

18
00:02:59.440 --> 00:03:07.120
Also, send me your feedback on the show, good and bad, ideally helpful. My email is brian@therebooting.com, and leave the show a rating and review.

19
00:03:07.260 --> 00:03:20.440
I always appreciate those, particularly the good ones, I have to say, and supposedly they help people find podcasts. So here's my conversation with Gary. [on-hold music] Gary, thanks so much for joining me today.

20
00:03:20.500 --> 00:03:30.120
Where are you? Um, I'm- Are you in LA? Well, as you know, our corporate's in Austin, Texas, but I'm- Yeah... based in Los Angeles. Yeah. Okay, cool. So I'm really looking forward to hanging out in Cannes.

21
00:03:30.180 --> 00:03:37.140
You know, we're doing three days of programming together at the KERV Cafe- Yeah... which is on the Croisette. Well, it's off the Croisette.

22
00:03:37.180 --> 00:03:43.100
I guess it's not technically the Croisette, but it's, like, across, I think, from the Palais. I know that area from my many years in Cannes.

23
00:03:43.140 --> 00:03:53.500
I never would have thought, you know, honestly, like, growing up in, like, suburban Philadelphia, that I would, like, have to go to the south of France as much as I have. [chuckles] And be knowledgeable of the town.

24
00:03:53.510 --> 00:03:58.049
[laughs] Yeah, I can, like, give people, like, "Don't go there. Go here." So anyway, yeah. Mm-hmm. I'm really looking forward to it.

25
00:03:58.120 --> 00:04:08.940
We've got an amazing group of speakers that are coming, and everyone should obviously join us there. I'm biased, and so is Gary, but- Yeah, no, we're gonna have a lot of fun. We're gonna have a lot of fun.

26
00:04:08.970 --> 00:04:15.560
And really, the speakers you put together are great. I'm very much looking forward to the conversations. Cool. Yeah. I'm really happy with it.

27
00:04:15.600 --> 00:04:23.920
And I'm gonna be doing a daily newsletter from there with help from Mike Shields, 'cause I'm only one person, and, and a daily podcast and a few other things.

28
00:04:23.980 --> 00:04:30.130
So it's gonna be, it's gonna be a busy week, but I'm gonna be in the south of France, so what am I... Nobody's gonna be crying for me.

29
00:04:30.180 --> 00:04:34.570
Whenever I'm like, "I'm gonna be so busy that week," they're like, "Whatever," [chuckles] which I get. So I wanna talk about...

30
00:04:34.610 --> 00:04:47.040
'cause one of the things that I think is gonna be a major, I mean, we're gonna be talking about it at the New Attention Economy event at the KERV Cafe, which everyone should come to, i- is AI is gonna be a major issue.

31
00:04:47.080 --> 00:04:49.990
And, you know, you guys use AI, like, with your tech.

32
00:04:50.040 --> 00:05:08.500
I wanna just, like, first start off with, for those who are unfamiliar with KERV, 'cause I, I think KERV's, like, really interesting company and technology because I'll be honest with you, like, I've said it in the ads for this, like, I was hearing about, like, making shoppable video, like, fifteen years ago when I was at Adweek, and it, like, never worked, right?

33
00:05:08.539 --> 00:05:18.980
[chuckles] Yeah. Yeah. And, like, the thing is, like, you know, ideas are one thing and sometimes, you know, technology catches up. So just give a quick background on, on what KERV does- Sure. Sure...

34
00:05:18.990 --> 00:05:28.090
for those who are unfamiliar. So to, to your point, people have been looking at shoppable since the infamous Jennifer Aniston sweater and chasing the blue dot around the screen. Oh my God, I remember that.

35
00:05:28.140 --> 00:05:35.520
The Hudson Seaw was hot spotting, right? Yeah. It was just... It, it's been a... It-- there's a long road of dead bodies to get here. Yeah. [chuckles] Let me tell you.

36
00:05:35.600 --> 00:05:45.240
But what makes KERV unique, what we do is we have a, we have patents around pixel edge recognition on video with AI. So that means that we don't use bounding boxes.

37
00:05:45.320 --> 00:05:54.320
We have a proprietary polygon solution that allows us to identify objects uniquely, and within clusters and overlaps we can identify.

38
00:05:54.860 --> 00:06:05.476
So in difficult positions or situations, we can be far more accurate than any other technology out there. So we leverage that to make both ads and shows, and/or other content, shoppable.

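NOTE
Gary's point about polygons versus bounding boxes comes down to hit-testing: a rectangle claims every pixel in its extent, so overlapping objects become ambiguous, while a polygon only claims the object's actual outline. A minimal sketch of the idea in Python, assuming nothing about KERV's patented implementation:
def point_in_polygon(x, y, polygon):
    """Ray-casting test. polygon: list of (x, y) vertices in order."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Toggle on each polygon edge a rightward horizontal ray crosses.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
# An L-shaped object: (2, 8) is on the object, while (8, 8) is inside the
# object's bounding box but NOT on the object, so a click there should miss.
l_shape = [(0, 0), (10, 0), (10, 4), (4, 4), (4, 10), (0, 10)]
print(point_in_polygon(2, 8, l_shape))  # True
print(point_in_polygon(8, 8, l_shape))  # False
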
39
00:06:05.976 --> 00:06:17.296
We started with the lowest-hanging fruit of ads, which made sense, meaning someone was running an ad in a digital environment and wasn't giving the advertiser or the consumer an opportunity to purchase.

40
00:06:17.796 --> 00:06:26.696
We consider that a lost opportunity. So if you're digital and you're running campaigns and you're not making it capable of being shoppable, it's a mistake.

41
00:06:27.196 --> 00:06:39.436
So we started there, and as you say, the shoppable capabilities have been a long road. Many of these other solutions were embeds and were not programmatically distributed.

42
00:06:39.716 --> 00:06:50.156
So we created wraps that allowed us to be programmatically distributed so that we can control this environment at any destination on any- So wait, let me just jump in there.

43
00:06:50.196 --> 00:07:00.645
What does that mean, programmatically distributed? So, uh- I gotta ask the obvious questions- No, go ahead. What is that? Yeah, no problem. So running an ad through a VAST tag or a VPAID tag or whatever type of- Okay...

44
00:07:00.656 --> 00:07:09.066
distribution you're doing, what we did was we created a solution that works within that. So we made it compatible with every player that could be a destination.

45
00:07:09.736 --> 00:07:21.946
So you're buying programmatically through multiple det- networks, and it lands on a publisher's website. It hits their player. The... Our intelligence within the wrap calls out for the overlay in real time.

46
00:07:22.676 --> 00:07:34.156
So it matches it up in real time at the player, and that was not an easy feat to do, let me tell you. It took quite a while to build that out, figure it out, make it agnostic to all players and distribution channels.

47
00:07:34.216 --> 00:07:43.336
But that was the starting point of allowing advertisers to leverage shoppable ads within a, their distribution platforms. Okay. Cool.

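NOTE
For readers unfamiliar with the terms above: VAST is the IAB's XML standard for serving video ads programmatically, and a "wrap" in Gary's sense rides along inside that response. A minimal sketch of pulling the media file out of a VAST document, with an illustrative URL; KERV's actual overlay call is proprietary and only gestured at in the comments:
import xml.etree.ElementTree as ET
vast_xml = """<VAST version="4.0">
  <Ad id="1"><InLine><Creatives><Creative><Linear><MediaFiles>
    <MediaFile delivery="progressive" type="video/mp4">https://cdn.example.com/spot.mp4</MediaFile>
  </MediaFiles></Linear></Creative></Creatives></InLine></Ad>
</VAST>"""
root = ET.fromstring(vast_xml)
media_url = root.findtext(".//MediaFile").strip()
print(media_url)
# The player plays media_url; a wrapper like the one described would then
# request the interactive overlay for this creative at render time.
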
48
00:07:43.776 --> 00:07:49.206
So yeah, I mean, that was something that, a-again, like I- it's funny you bring up the Jennifer Aniston sweater because I like- Yeah... had...

49
00:07:49.216 --> 00:07:58.736
People can't see this 'cause it's a podcast, but I got this look of recognition 'cause it was bringing back memories from that time. Um- Yeah. And, you know, it a- But then it, it leaps into the TV world to that point.

50
00:07:58.976 --> 00:08:09.315
If you don't mind, I-I'll go there. Sure. So then it gets into the world of OTT, CTV, et cetera. And in those environments, today we're using a QR code as the second device.

51
00:08:09.956 --> 00:08:21.436
So on OTT, someone's watching a show, they pause a scene, little tiles pop up on the bottom, they can scroll left or right, select the thing of choice, and those tiles represent the objects in that scene.

52
00:08:22.016 --> 00:08:26.266
So if they open up the larger tile, it gives a description, price, other information.

53
00:08:26.836 --> 00:08:39.516
Using a QR code, they can then go to their second device and purchase with either one click or however that retailer wants to play with us. But additionally, it's going now to registration data.

54
00:08:40.036 --> 00:08:45.816
So these platforms who have credit card information now can integrate with the retailers.

55
00:08:45.996 --> 00:08:55.876
So the capability of having a one-click transaction off of television with your remote is where we're heading, and it's getting really exciting that it's getting to that place. Yeah.

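NOTE
The second-device handoff Gary describes is, mechanically, just a deep link rendered as a QR code on the paused frame. A minimal sketch using the third-party `qrcode` package (pip install qrcode[pil]); the checkout URL and SKU parameter are made up for illustration:
import qrcode
checkout_url = "https://shop.example.com/checkout?sku=SWEATER-42&src=ctv_pause"
img = qrcode.make(checkout_url)  # renders the link as a QR code image
img.save("scene_tile_qr.png")    # shown on the tile the viewer selected
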
56
00:08:55.956 --> 00:09:04.336
And it's something like, you know, for media owners, they just have to do. Like, I mean, I think Terry Kawaja says, like, you know, performance media is sort of eating the world- Yeah...

57
00:09:04.356 --> 00:09:14.325
because they're a major advertising player next to Amazon. And then retail media is gonna be a massive thing. And to be able to compete... In many ways, it's true, right? Yeah. Like, I mean, you have to...

58
00:09:14.356 --> 00:09:25.836
When-- I think one of the big things we'll see in Cannes, I mean, first of all, it's like we're gonna go to Cannes and we're gonna have, like, Apple having, like, a massive display in Cannes, which is kind of bizarre if you really think about it.

59
00:09:25.996 --> 00:09:34.216
You know, media owners have to really tie to transactions at the end of the day. It's-- The ROI is the ROI, no matter how you look at it. Yeah. You know what I mean?

60
00:09:34.276 --> 00:09:44.676
Is it lifting sales at the retail outlet or, you know, at the brick-and-mortar, or is it driving conversions digitally or remotely? But it's still, they have to look at it as an ROI.

61
00:09:46.996 --> 00:10:06.496
[gentle music] Yeah.

62
00:10:06.616 --> 00:10:12.096
So explain that because I feel like, you know, I-- for me, one of the big themes, and I'm interested to see...

63
00:10:12.256 --> 00:10:23.856
'cause I think Cannes, it gets a lot of grief and stuff like this for the excess and whatnot, but it is a really good temperature check for what is going on in the media industry 'cause all sides do come together for that week.

64
00:10:23.936 --> 00:10:33.936
And I think one of the things that I'm very focused on is that we're sort of between eras in a lot of parts of the media industry, and the media industry has many different parts, right?

65
00:10:34.556 --> 00:10:46.676
And one of them is in the streaming industry, right? Y- We went through this period where there was clearly a land rush, and w- we've seen this before. Everyone, uh, you know, pours in.

66
00:10:46.716 --> 00:10:57.786
It's all about growth coming out of a zero interest rate era, and that, that certainly fueled a lot of it. But I think what's very clear is that era is giving way to a new era.

67
00:10:58.176 --> 00:11:07.116
Now, whether it's an austerity era or just, like, a profitability era, how do you see the streaming industry at that inflection point? So it's interesting.

68
00:11:07.516 --> 00:11:14.716
When we started this company seven years ago now, we knew that subscription fatigue would kick in. We knew it, you know? We were looking at it.

69
00:11:14.756 --> 00:11:24.696
And if you look at it logically, it's the same evolution as pay websites and how it all started, right? Where people were paying for access to The Wall Street Journal and then not.

70
00:11:25.196 --> 00:11:37.060
And [chuckles] you know, all the newspapers asking and then not because it wasn't working, right? People are not gonna pay for 100 different channels at $5, $10 a month. It's just not gonna happen. Right?

71
00:11:37.100 --> 00:11:40.900
So we knew that was gonna come. So we were looking at it as, what's the solution?

72
00:11:41.600 --> 00:11:53.120
And now everybody's looking at the interstitial ads, a 1950s model, being the solution, which doesn't make much sense 'cause Netflix has trained the consumer to watch without ads, right?

73
00:11:53.220 --> 00:12:04.560
So this has become the standard in streaming. So how do the streaming people make money beyond subscriptions with these consumers looking to watch with no ads?

74
00:12:04.820 --> 00:12:19.800
That's their dilemma, and we think that the solution is in-show monetization and user-curated experiences, meaning that John Doe is watching the show, or Jane Doe is watching the show, and they see an object, a, a couch, a sweater, whatever.

75
00:12:20.309 --> 00:12:28.730
They pause the show, they see that object, they can interact with it and purchase it, and that's an ad opportunity, and that's a user-curated ad opportunity.

76
00:12:29.170 --> 00:12:43.020
So I think a hybrid of what the world sees today is where we're heading. Yeah. So a hybrid of potentially sponsored ads, meaning opening, closing, potentially w- in the middle, but not your traditional interstitial.

77
00:12:43.080 --> 00:12:50.660
I think that's annoying to consumers, and I think people are not looking for that experience. Yeah, but just to push back, like, I mean, I can remember back in, like...

78
00:12:50.800 --> 00:12:53.050
So we're going back to the Jennifer Aniston sweater. Yeah.

79
00:12:53.060 --> 00:13:01.200
We'll just, like, stay there for a little bit because at the time, you know, when we talk about Cannes, there was this notion that the 30-second spot was going away.

80
00:13:01.360 --> 00:13:13.860
And, like, I kept-- 'cause I was covering the digital stuff, these, like, you know, digital agencies like RGA and whatnot, and they were all saying, "Oh, the 30-second spot's, like, done, and it's all gonna be, by the way, these websites that we happen to build.

81
00:13:14.320 --> 00:13:20.560
Just c-convenient." But it's endured, right? And it-- and one of the things is consumer expectations.

82
00:13:20.600 --> 00:13:31.280
Like, and I think what's interesting with your example of Netflix is whether that really has broken the consumer expectations because, you know, we're both of an age where we were s- we're used to it.

83
00:13:31.460 --> 00:13:35.740
We're used to a lot of things, right? [chuckles] Whereas younger people just simply aren't.

84
00:13:35.780 --> 00:13:43.530
Like, I mean, you know, we just would go to the kitchen and get a snack during the ad breaks, and Chuck Woolery would say, "We'll be back two and two." Right. Which I never understood- [chuckles]...

85
00:13:43.540 --> 00:13:50.480
but it was two minutes and two seconds, and I finally found out. [chuckles] Or four minutes, right? And these days it's like, yeah, why don't we just make them four minutes?

86
00:13:51.200 --> 00:14:00.820
E-explain that, the consumer expectations part of it, but then also normalizing this behavior of shopping within video.

87
00:14:00.940 --> 00:14:11.480
That's not something that people have been used to, to doing, and I think w-we're seeing a bunch of different, you know, expectations changing in consumers, but it's also just, like, habits changing. Sure.

88
00:14:11.880 --> 00:14:23.320
So, so let's first look at, like, DVR. Yeah. So people that have traditional, uh, cable or satellite are DVR'ing everything, right? And they're fast-forwarding through the ads.

89
00:14:23.820 --> 00:14:32.340
So that is already a demonstration of the consumer behavior, how they don't wanna have the ads, right? They're not looking to. It's a force-fed thing.

90
00:14:32.660 --> 00:14:41.900
So I think major films, especially when you're looking at the feature films that are streaming across these platforms and you have major feature films that people go to theaters to watch.

91
00:14:42.320 --> 00:14:55.760
They're sitting at home with their, you know, $50,000 sound systems and watching this and getting ads. That-- They're not happy with that. So I, I think that's an example of the consumer wanting it to be ad-free.

92
00:14:55.780 --> 00:15:00.460
But how does it work for in-show is your question, and will people actually buy? Mm-hmm. Right?

93
00:15:00.620 --> 00:15:12.440
So the testing we've been doing in Australia and some of it with major networks domestically has clearly proven that consumers will buy. They will engage. They need to understand how to do it.

94
00:15:12.980 --> 00:15:23.120
So the user adoption is really the hurdle here- Yeah... getting the consumer to understand the behavior and what it is. But I think it's a pretty natural thing.

95
00:15:23.480 --> 00:15:37.660
I mean, the consumers now with social media have gotten very accustomed to the in-content click and action, so that's a behavior that the mostly younger generation who are growing quickly have become accustomed to.

96
00:15:38.100 --> 00:15:45.800
So I think we're simply adopting a behavior they're comfortable with in a new environment that needs innovation. Yeah.

97
00:15:45.820 --> 00:15:56.460
And it's also, I think there is, there has been a normalization 'cause, again, with sort of going back in time a little bit [chuckles] is, you know, co-content and commerce was something I was hearing about, like, in, in 1999.

98
00:15:56.540 --> 00:15:59.760
Yeah, sure. Um, [chuckles] if you like- Sure... forget about Jennifer Aniston's sweater.

99
00:16:00.340 --> 00:16:11.030
But now it's a reality, and people are-- I think, you know, people are used to that blending of entertainment and, and shopping to some degree.

100
00:16:11.100 --> 00:16:28.400
I mean, one of the things that I'm-- I don't know if you have an opinion on this, but, like, I, I'm sort of, like, surprised that, like, live shopping has not really taken off in, like, US and European markets compared to China in particular.

101
00:16:28.740 --> 00:16:32.020
I don't know why that is exactly. I don't know if that's one of these cultural things.

102
00:16:32.080 --> 00:16:36.279
I, I actually have an interesting-- Yeah, I have an interesting point on that 'cause we have a very large investor based in Hong Kong. Yeah.

103
00:16:36.300 --> 00:16:44.480
And we spoke to her about that, about what's her opinion of why live shopping is doing so well in Asia and not in these other regions.

104
00:16:45.100 --> 00:16:52.800
And the point she made was that she doesn't believe that people are following the brand. They're following the talent.

105
00:16:52.900 --> 00:17:06.000
So when they have somebody, like, a, an influencer, let's say, selling shoes on a podcast or on a live streaming and in wherever, "Hi, I'm gonna be at X store- Yeah... streaming live.

106
00:17:06.120 --> 00:17:14.420
Check out the shoes I'm gonna be doing," that attracts the viewers, right? So the viewers are gonna come. Then that same person says, "Oh, I'm gonna be at the market streaming this in.

107
00:17:14.480 --> 00:17:25.240
Come check it out," those viewers follow. If it's a sales rep saying, "I'm gonna be at the store doing a live shopping experience," are they gonna go? Yeah. That's not really inviting.

108
00:17:25.320 --> 00:17:38.150
So I think what we're seeing is the evolution of people perceiving the term live-stream shopping as the new thing, but consumers, behaving normally, reacting to what's attracting them. Right?

109
00:17:38.520 --> 00:17:47.910
If a live stream says 50% off Louis Vuitton, well, yeah, people are gonna watch and go there and buy. So there has to be a hook. There has to be something to drag the people there. Yeah.

110
00:17:47.910 --> 00:17:56.360
I think that's what's missing with most of these people's perceptions of live shopping. Yeah. So explain then how does, like, an institutional brand...

111
00:17:56.370 --> 00:18:02.340
'Cause one of the things that I really track is this shift from like institutional brands to individual brands.

112
00:18:02.390 --> 00:18:08.260
And like I hope with the Rebooting to sort of, you know, I don't want it to all be about me like [laughs] particularly long term, I really don't.

113
00:18:08.300 --> 00:18:21.360
But like, you know, the individual part, the personal part, the reputational part obviously gives it, you know, some leverage in the marketplace that I think, you know, regular institutional brands, particularly if you're building from scratch, don't necessarily have.

114
00:18:21.460 --> 00:18:34.140
And I wonder about how like you see legacy brands, you know, dealing with that in that like, like what you're saying is that it's really personality driven, like with the live shopping. Mm-hmm.

115
00:18:34.160 --> 00:18:49.260
And I'm wondering about how you see like partners trying to have that divide where if it totally works with an individual, how does like, you know, a brand, say on NBCU or something, how do they end up, you know, realizing the potential of this?

116
00:18:49.920 --> 00:18:57.840
Well, I don't know that I have all the answers to that [laughs] but- Well, it's a podcast. You can just say that you do. [laughs] What do you think I do, Gary? [laughs] Yeah. And you do it well.

117
00:18:58.240 --> 00:19:15.980
No, I think that the, the error in the live shopping stream strategies is the lack of perceiving it as an event, right? So it has to be looked at as something that draws attention. It has to be an event.

118
00:19:16.100 --> 00:19:25.680
It has to be a marketing strategy. They have to come up with why would I as a consumer go to watch this event and buy something. There has to be- Mm... a hook, right?

119
00:19:25.800 --> 00:19:39.970
So if it's, like I said, if it's a 50% off deal on highly, uh, recognized brands, that's a big attraction. If it's some unique release like Nike doing a Dunk drop, right? Uh, you know, like th- that's- Yeah...

120
00:19:39.970 --> 00:19:50.500
a big deal. I mean, these kids go on there at 7:00 AM to buy these shoes 'cause they know there's a limited drop happening, right? Oh, I know. So that's a hook, right? So, and my kids do it, you know?

121
00:19:50.620 --> 00:20:17.120
[laughs] [instrumental music] But I think that it's really looking at these as events as opposed to just making a s- live streaming strategy.

122
00:20:17.840 --> 00:20:21.320
Yeah. My last job, we had an office on Mercer Street in New York- Mm-hmm...

123
00:20:21.340 --> 00:20:35.080
and it was sort of next to VFiles, the Nike store, and one other place that, I don't know, Parade or what's the-- I forget the name of the streetwear brand, but our street would be madness every drop. Yeah.

124
00:20:35.100 --> 00:20:43.080
And I remember, I re- I remember this was many years ago. I remember when Ky- Kylie Jenner dropped her lip kit and there were like...

125
00:20:43.700 --> 00:20:51.079
Y- you would've thought that the president was like visiting the neighborhood with the amount of security that was needed. Yeah. And I was like, "Whoa, something here is going on."

126
00:20:51.100 --> 00:21:02.740
And this is many years ago, and it really does speak to the power of individuals. But one example of what you were saying, like e- explain a little bit what you're doing with Must ShopTV in partnership with NBCU.

127
00:21:03.700 --> 00:21:12.640
Sure. Sure. So the people at NBC are, are really smart and there's a, a couple of champions there that we're working with that are leading the charge and really get it.

128
00:21:12.670 --> 00:21:20.650
And what they've done is they've created their own catalogs of products, and KERV is the engine behind the correlation and matching.

129
00:21:21.320 --> 00:21:31.620
So our artificial intelligence is identifying the objects, correlating and matching against product catalogs, bringing them together and pushing out the shoppable component.

130
00:21:31.700 --> 00:21:39.930
But because NBC owns the catalog, it makes it fairly easy 'cause it's the one catalog. We're working with other companies where it's multiple catalogs- Mm-hmm...

131
00:21:39.960 --> 00:21:44.900
where we have to query against millions and millions of SKUs in near real time to match it up.

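NOTE
The correlation-and-matching step described above is, in generic terms, nearest-neighbor search: embed the detected object, then find the closest product in the catalog. A toy sketch with random stand-in embeddings; a production system at millions of SKUs would use an approximate-nearest-neighbor index (FAISS or similar) to stay near real time, and nothing here reflects KERV's actual pipeline:
import numpy as np
rng = np.random.default_rng(0)
catalog = rng.normal(size=(100_000, 64)).astype(np.float32)  # one row per SKU
catalog /= np.linalg.norm(catalog, axis=1, keepdims=True)    # unit-normalize
detected = rng.normal(size=64).astype(np.float32)            # on-screen object embedding
detected /= np.linalg.norm(detected)
scores = catalog @ detected          # cosine similarity via dot products
best_sku = int(np.argmax(scores))    # index of the best-matching product
print(best_sku, float(scores[best_sku]))
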
132
00:21:45.620 --> 00:21:53.300
But that experience is one that we're very excited about because they really get it and they're pushing it out in a really smart way with Must ShopTV.

133
00:21:53.460 --> 00:21:59.880
Obviously they have the power of being NBC, so, and Peacock, so it's exciting to roll out with them. Yeah.

134
00:22:00.040 --> 00:22:09.780
So what kind of challenges do you see to this becoming sort of normalized and habitualized, like as, you know, both behavior but also getting the economics right?

135
00:22:09.900 --> 00:22:24.120
Because a lot of these things, like I have a lot of meetings with people, and one of the meetings I had like this week was with someone who is at like a broadcast network, um, and this person was going through all these different opportunities and, "Yeah, newsletters are interesting.

136
00:22:24.160 --> 00:22:37.920
We've got personalities. Maybe we build like a little franchise around them. Yeah, like, you know, the events are kind of interesting." Like, but he's like, "This doesn't replace what is eroding at a very fast pace."

137
00:22:38.060 --> 00:22:49.310
I want this to be totally optimistic, but I'm also a realist, right? [laughs] And how do you make sure... 'Cause I remember over the years, right, I would always hear from people who are promising incremental, right?

138
00:22:49.320 --> 00:23:01.060
And I was like, "Oh, that means small." But like the incremental never makes up for what is being eroded. Yeah. Well, it's a new world, and you said it earlier, we're in a hybrid space and we're all gonna learn together.

139
00:23:01.160 --> 00:23:10.060
I mean, I don't think that there's any single answer. I don't think anybody right now at this time... Let me give you some examples of some of the hurdles, right? Mm-hmm. If you look at legacy content, right?

140
00:23:10.120 --> 00:23:23.260
Legacy content has talent agreements that could be a problem for creating shoppable content. Some of the people we've spoken to, a- some of the content owners, are asking us to create negative recognition.

141
00:23:23.340 --> 00:23:30.572
So we do facial recognition, identify that talent, so that we cannot create shoppable things off of them. Wait, why can't you?

142
00:23:30.632 --> 00:23:40.172
Because just the agreements or the talent agreements don't- Yeah, the terms and conditions of the original talent contracts. Oh, my God. Right. So, so like, like- Hollywood loves lawyers. Oh, my God, do they?

143
00:23:40.392 --> 00:23:48.372
[laughs] No offense to the lawyers out there. I'm just gonna spread it on now. You didn't go to law school, did you, Gary? No, I did not. I did-- I am not a lawyer. Thank God. No. I take no relation to that whatsoever.

144
00:23:48.792 --> 00:23:57.052
Although I leverage- Well, you're in LA. All your neighbors must- I leverage great lawyers, I have to say. Um- Yeah, well, law-lawyers are always great to talk to 'cause they're billing six hundred dollars an hour.

145
00:23:57.132 --> 00:24:11.252
Yeah, or more. Be on the phone. [laughs] No, but I think that the moment in time that we're at is experiencing a lot of change. And you look at, like, reality TV and the 360 contracts there make it easy to do shoppable.

146
00:24:11.312 --> 00:24:19.032
That's simple, right? But if you take a look at legacy content, if you look at the new talent agreements, and I mean, there's a lot of variables.

147
00:24:19.372 --> 00:24:30.192
And remember that, that the engagement, the interactivity doesn't have to be shoppable either. It could be relevant ad-driven. It could be... I mean, there's a lot of components there. There's informational.

148
00:24:30.392 --> 00:24:40.442
There's, I mean, Amazon X-Ray and IMDb, you know. I mean, that's an example of somebody that's purely just informational. Um- I'm sorry, what is the... I don't know the Amazon X-Ray. Amazon X-Ray.

149
00:24:40.472 --> 00:24:46.932
So Amazon Prime has, if you pause a show, they can pull up information on the talent in the show.

150
00:24:47.552 --> 00:24:58.132
And so you pause, you see who the actor is, you can go to their IMDb profile, and that's the way they have you interacting with the content. And the users are engaging. People are doing that.

151
00:24:58.592 --> 00:25:11.672
So, uh, it's interesting to look at the evolution of where this is going content-wise into both shoppable and information, so commerce and content. Great.

152
00:25:11.732 --> 00:25:21.212
So let's talk a little bit about AI, where we, like, started, because, you know, like, these changes are coming, like, fast and furious. Maybe it's that I'm getting older, but they seem very fast to me at least.

153
00:25:21.852 --> 00:25:28.332
[laughs] And, you know, I think they're causing a lot of, I don't know, anxiety, I would say, in, in, in a lot of quarters.

154
00:25:28.412 --> 00:25:38.721
I mean, I- as a person who types a lot of words, you know, I was like, I was much more into the technological changes that were focused on like, you know, automating more blue-collar work and stuff like this.

155
00:25:38.751 --> 00:25:48.082
So I was saying, "Oh, let's just... You know, this is how it goes. We'll have to retrain people." When the robots started, you know, making content, I was getting a little... I was like, "Well, let's put a pause here."

156
00:25:48.852 --> 00:26:01.402
Like, how do you see just overall-- obviously, AI is baked into what KERV does, and AI is going to be baked into pretty much everything, it seems like, that is operating in the digital world, right? Yeah.

157
00:26:01.512 --> 00:26:14.012
Yeah, a hundred percent. I mean, look at the new releases from Google and everybody else on the generative AI opportunity for optimization and ad placement. That's now the hot ticket, right? That's what they're releasing.

158
00:26:14.152 --> 00:26:27.422
So on contextual, on banners, on, on all the different opportunities, rather than using your traditional algorithmic optimizations, they're now suggesting that people start using this new generative AI that they've released.

159
00:26:28.032 --> 00:26:39.132
So the AI can ideally not look at the history of analyzing but know where to place it in advance, is what they're suggesting. I haven't tested it. Yeah. I don't know how it works. Sure.

160
00:26:39.232 --> 00:26:46.212
But I do know that's the new release that everybody's doing. So what is generative AI? Peop- I don't think people have a grasp on that.

161
00:26:46.242 --> 00:26:56.512
Artificial intelligence is about training models and computers being able to leverage those, right? So you get to generative AI, that becomes creative AI.

162
00:26:56.632 --> 00:27:07.962
So it isn't analyzing and doing something post, it's doing something requested in advance. So as an example, um, all the different image generative AI platforms- Yeah...

163
00:27:07.992 --> 00:27:18.101
where you can say, "Make me a picture of a unicorn on the beach," right? And it does that, right? Yeah. It, it has to have the training models to be able to know what to create, right, when you ask it. Mm-hmm.

164
00:27:18.192 --> 00:27:28.572
If you ask it something it doesn't have in the training models, it's not gonna be able to do it, right? So it-it's really, it's a hybrid of human training models into what becomes generative AI.

165
00:27:28.672 --> 00:27:41.422
Now, as the AI starts learning to create its own models and evolving it, that's where it starts getting to the massive explosion, like the ChatGPT and all the things that are going on out there text-related and now moving into visuals.

166
00:27:41.752 --> 00:27:51.182
I mean, one of the things that we're constantly looking at is the dynamic creation of visual content, right? So if you look at, like, Photoshop just released- Mm-hmm...

167
00:27:51.182 --> 00:27:56.132
Adobe Photoshop just released a brand-new release that's amazing. It allows you to push...

168
00:27:56.262 --> 00:28:06.112
If there's an image that has things in it, you just push it out, and it opens up duplicating those pixels and creating an image expanded of the same content.

169
00:28:06.572 --> 00:28:19.132
So it allows you to move things around extremely easily, just cut and paste kind of thing. So I think that the automation of creation of content is going to be very interesting.

170
00:28:19.151 --> 00:28:34.970
[gentle music] So let's talk about the automation of creation of content, right? So we're gonna be in Cannes, right?

171
00:28:35.020 --> 00:28:41.810
And Cannes is many things these days, but I think when I first started going there, it was still mostly a creative ad festival.

172
00:28:42.060 --> 00:28:54.140
I mean, it-- there was still, there was media there and some tech, but it was still the big night was Saturday when the, you know, the Grand Prix was for film- Right... which the rest of us know as TV.

173
00:28:54.720 --> 00:28:56.920
And, you know, that has changed quite a bit.

174
00:28:56.960 --> 00:29:08.340
And there's been like, you know, consternation in the creative communities obviously for a long time, and I think it's-- we see what's going on with the writers' strike right now, and AI is an issue in there.

175
00:29:08.440 --> 00:29:18.400
I don't think it's the main issue, and, you know, putting things in agreement for four or five years in advance, I can see why, you know, on the other side, they wouldn't wanna do it.

176
00:29:18.480 --> 00:29:24.320
But I remember I was talking with, you know, a writer and, you know, I mean, he was using some very colorful language.

177
00:29:24.560 --> 00:29:30.580
You know, he's a writer, so he knew a lot of different words, but he used some very colorful language when it came to AI getting into the creative process. Right.

178
00:29:30.820 --> 00:29:50.070
And I guess what I end up, like, wondering, 'cause I think everyone goes to replacement instead of, like, augmentation, but how do you see this augmenting and allowing for more creativity versus inevitably replacing the humans at the heart of true creativity?

179
00:29:50.320 --> 00:29:57.540
Or is it time that we sort of admit that a lot of, quote-unquote, creativity is, like, just versioning anyway? Well, I think that's true.

180
00:29:57.640 --> 00:30:06.370
I mean, like, you know, I have a background in the music industry, and, you know, look, there's a model. You know, it's intro, s-section, chorus. You know what I mean? Like- Yeah...

181
00:30:06.400 --> 00:30:14.140
there are models for building these things. But, I mean, you look at, like, the fake Drake stuff, right? That's really impressive. I mean, it's really impressive.

182
00:30:14.680 --> 00:30:21.200
So it looks like there's a whole new layer of legal liability and legal opportunity and various things there.

183
00:30:21.240 --> 00:30:33.660
I mean, using Drake's voice and using all the AI components of other music is much like when rap started using segments and pieces of songs and incorporating them.

184
00:30:33.720 --> 00:30:45.120
In the music publishing world, the lawyers started getting involved and started licensing out those segments to be used. So I think we're gonna see in the music world a similar process.

185
00:30:45.540 --> 00:30:58.969
When you start getting to scripting and automating the creation of script, that's formula stuff to start with. Not to belittle writers who are brilliant, but when you look at, you know, the animated show Cars- Mm-hmm...

186
00:30:59.040 --> 00:31:08.980
that's as formulaic as it gets, and it works. It's brilliant strategy. Cars 1, in my opinion, [chuckles] is one of the greatest formulas of content out there. I mean, it's...

187
00:31:09.080 --> 00:31:19.170
I have kids, so I've seen it a million times, but it's not something that can't be duplicated. You know what I mean? Okay, so, like, Cars 6, what I wonder is, like, when Cars 6 inevitably gets made- Yeah...

188
00:31:19.170 --> 00:31:27.659
'cause that's how these things go- Yeah, yeah... like, how is it made? And, like, where-- like, how much is done by quote, like I'm saying, quote-unquote, humans. I think I can...

189
00:31:27.700 --> 00:31:37.939
I think I'm just gonna leave out the quotes at this point. How much is it gonna be done by humans, and how much is gonna be done by AI, and what is the combination of both? You know what I mean?

190
00:31:38.500 --> 00:31:42.500
It's not unreasonable to think that the entire thing could be done through AI.

191
00:31:42.560 --> 00:31:53.460
I mean, it's not unreasonable to think that the script could be written and created by generative AI and the content visuals can be created by AI.

192
00:31:54.520 --> 00:32:04.000
Automating the creation of an animated show like that is not unreasonable to think that could be coming. Okay. So if [chuckles] so let's spin this into then advertising, right?

193
00:32:04.100 --> 00:32:10.440
'Cause, like, you know, I remember when I first started writing about advertising, I was just learning about this industry. I really had no idea about it.

194
00:32:10.540 --> 00:32:16.200
And, and someone was explaining to me the difference between, like, working media and non-working media.

195
00:32:16.710 --> 00:32:27.850
And there, I was like, "Okay, so wait, working media is when you pay to place the ad, you pay for distribution, and non-working media is the ads itself?" I'm like, "That's non-working?"

196
00:32:28.220 --> 00:32:39.510
And I always thought that was a little bit of a tell on the client side of what they really thought about, like, you know, what they were, how they were valuing the creation aspect, leave aside the Saturday night in Cannes- Right...

197
00:32:39.520 --> 00:32:47.420
'cause it was like, okay- Right... you guys get your little lions or whatever, but, like, the reality is we look at you guys as a cost center and, like, you know, overhead.

198
00:32:47.940 --> 00:32:59.260
How much of ad creation do you think can be done by AI or will be done? So that's a great question. We're actually doing a lot of attention analysis.

199
00:32:59.430 --> 00:33:02.780
We're releasing something, maybe I shouldn't say it here, but- No, you should.

200
00:33:02.960 --> 00:33:22.360
[chuckles] We're releasing something called Active Attention, which is an analysis of video attention metrics, being able to analyze the user's engagement with the video, and that includes ads and content, and then analyzing the content itself for where and when people are engaging and not engaging, dropping off, et cetera.

201
00:33:22.880 --> 00:33:33.660
So if you take that type of analytic data and you bring it toward content creation, it can provide a huge value towards automating the creation of this content.

202
00:33:33.720 --> 00:33:47.540
I mean, if an ad created by an agency is falling off at a certain place consistently, and that can be fixed by AI, is that a good thing or is that a bad thing? I don't know.

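NOTE
The "falling off at a certain place" analysis Gary sketches is a retention curve: for each second of the spot, what share of the audience is still watching, and where is the steepest loss. A toy version with made-up watch times; Active Attention is KERV's product and its actual metrics aren't public:
import numpy as np
watch_seconds = np.array([30, 30, 12, 5, 30, 12, 12, 6, 30, 11])  # per-viewer watch time
length = 30
retention = np.array([(watch_seconds >= t).mean() for t in range(1, length + 1)])
dropoff = -np.diff(retention)        # share of audience lost second over second
worst = int(np.argmax(dropoff)) + 1  # second with the steepest loss
print(f"biggest drop-off after second {worst}: {dropoff.max():.0%} of viewers")
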
203
00:33:47.620 --> 00:33:53.730
Oh my God, a client's gonna choose it nine times out of ten, and the tenth time on Sunday. Like- [chuckles]... the, like, always, right? I mean, like- Right...

204
00:33:53.760 --> 00:34:01.540
the idea of, like, waiting, you know, it's like agency life kind of stinks in a lot of ways outside of the, like, you know, glamour of Cannes and stuff.

205
00:34:01.600 --> 00:34:08.140
There's a lot of, like, long nights and weekends because clients want it yesterday- Yeah... and they don't really care about, like, the personal lives.

206
00:34:08.220 --> 00:34:13.220
And you don't have to get the, like, you know, Droga to stay around all weekend.

207
00:34:13.280 --> 00:34:22.609
Like, you can, you can have an agent or whatever that is, like, optimizing to a goal and is sucking in all the data and then just, like, changing versions, like, itself, I guess.

208
00:34:22.660 --> 00:34:28.240
You could automate the versioning, by far, yeah. Which makes it extremely efficient. Time-wise.

209
00:34:28.640 --> 00:34:54.380
So in other words, if you, in a digital environment, let's use that as an example, you release an ad out there with mid-roll, post-roll, et cetera, and the ad campaign is running, and you see right away that there are flaws and reasons where people are falling off every time, and AI can potentially go and automate, change, edit, and re-release and push that out based on the actual user behavior, that's an interesting model.

210
00:34:54.420 --> 00:35:01.960
I mean, that's gotta be valuable to the brand who's looking to spend the dollars and get the ROI we talked about. Yeah.

211
00:35:02.020 --> 00:35:13.820
I guess what I end up wondering is, and maybe this is hopeful on my side, [chuckles] is if there ends up becoming, like, a human premium in that, like, we're gonna be surrounded by, I guess, what some would call synthetic content, right?

212
00:35:13.920 --> 00:35:27.120
I mean, by s-some estimations, 98% of digital content, it'll probably be more, is gonna be synthetically created, which is that it is gonna be, you know, created to some degree by AI, by a bot in some degree.

213
00:35:27.460 --> 00:35:36.900
And just like, you know, mechanized agriculture, like, I, I think the farm to table movement, like, grew up for a reason, right? And I don't think it's gonna be, like, a massive function.

214
00:35:36.940 --> 00:35:47.840
But I guess what I end up thinking is, like, what-- whether people will place a human premium on things that they know, like, are human made with all its flaws and whatnot. There's some Japanese term for that.

215
00:35:48.390 --> 00:35:51.080
But that, like, it was made by, you know, it was human made.

216
00:35:51.140 --> 00:35:58.760
You know how, like, they put, like, labels on, like, clothes, I guess it was back in the '80s, like, made in the USA became a thing, like, because it was like, "You're gonna pay more."

217
00:35:58.820 --> 00:36:03.660
Like, [chuckles] "If you want the stuff that's made in the USA, you're gonna pay more." And, you know, guess what?

218
00:36:03.760 --> 00:36:14.640
Like, you know, in a lot of ways, like, our, you know, clothing manufacturing supply chains and whatnot aren't as good. I don't know. Do you see, like, a human premium entering media at all? Am I being too hopeful?

219
00:36:14.650 --> 00:36:19.580
Well, I think it's a great- 'Cause we can't compete with robots. Yeah, yeah. I can't. I know that. It's a great question.

220
00:36:19.620 --> 00:36:33.060
Like, where's the line between the organic creation of something valuable versus a digital automated creation? Where is that line? And w- [chuckles] I don't know where it is, and I don't think anybody does right now.

221
00:36:33.100 --> 00:36:55.840
But I had a conversation the other day with a friend of mine, and we were talking about that the world's gonna lead to having, like, Phish concerts and Grateful Dead show fans w-who are 100% organic and, like, leaving their phones in the car and experiencing the human reaction to the music and the environment being organic, right?

222
00:36:55.900 --> 00:37:05.480
And so how is that leveraged in a world of marketing and advertising? Is it leveraged, or does it not matter? It-- I don't know. [chuckles] Yeah.

223
00:37:05.540 --> 00:37:14.740
It's gonna get messy, and I think we're gonna, we're just gonna figure it out as we go along. I mean, I'm kind of excited for it in some ways because it's such a massive change. It's gonna be weird and interesting.

224
00:37:15.100 --> 00:37:21.080
No, I think we're- I know that's that... on the precipice of good stuff, though. I mean, I think, look, what were they saying when radio moved to television, you know?

225
00:37:21.160 --> 00:37:32.520
What were they saying when, you know, the printing press moved to radio? What were they saying? Who, you know, this is another evolution, and we're simply at a precipice of automating, of animating, of- Yeah...

226
00:37:32.529 --> 00:37:43.240
the creation of something new. So the end result becomes the human engagement, right? Yeah. Are people enjoying what they see? Are they watching? Is there value in it?

227
00:37:43.480 --> 00:37:54.740
Because at the end of the day, somebody's gotta make a dollar, right? They're gonna sell a ticket or sell a subscription or sell an ad, you know? So- Yeah... are people engaging with it? We're gonna see. Yeah.

228
00:37:54.820 --> 00:37:58.700
And also, like, I think we always go to the doomsday scenario of, like- Yeah...

229
00:37:59.029 --> 00:38:05.360
AI taking over and, I don't know, killing us, killing-- ending civilizations 'cause they're trying to make a paperclip or something like that.

230
00:38:05.380 --> 00:38:11.260
But, you know, the more prosaic thing is I think people go to, you know, replacing jobs, a lot of this stuff.

231
00:38:11.300 --> 00:38:18.380
And I think the question ends up being, you know, is the amount of value created, you know, worth the value that is destroyed?

232
00:38:18.440 --> 00:38:38.180
Because the reality is, you know, there are winners and losers to any shift, and I think there's more focus on the, quote-unquote, "losers" or potential losers in this because the potential losers are more middle class or even upper middle class, and those people have voices, uh, that are loud and they are- Well, but is it destroying or repurposing?

233
00:38:38.700 --> 00:38:48.960
You know what I mean? Like, is it just adapting- Yeah... to a new repurpose? Is it finding the new careers that fit with managing within that environment? I-I- Yeah... there will be.

234
00:38:49.020 --> 00:39:01.000
You know, I mean, automating car manufacturing lost jobs for the assembly line, but it created new jobs for managing the machines. So I-I don't know if it's- Yeah, I know... killing it or adapting.

235
00:39:02.160 --> 00:39:28.460
[gentle music] I think one of the advantages, if it is, of, like, you know, getting older is, like, you see, like, cycles and stuff like this.

236
00:39:28.510 --> 00:39:31.760
Yeah. And, like, you know, nothing stops, and things, like, appear that you would never...

237
00:39:31.800 --> 00:39:41.700
Like, for instance, like, I was trying to, I was trying to actually use ChatGPT, it wasn't good for this, to figure out how many personal trainers there are now, and they literally, there were no personal trainers, like, back when I was a kid.

238
00:39:41.780 --> 00:39:44.890
[chuckles] And, like, now it's, like, a massive industry.

239
00:39:45.180 --> 00:39:56.000
Like, and betting on what people say i-is sometimes not a good thing 'cause remember it was like, it seemed like only a few years ago everyone was telling everyone that you gotta, like, learn to code and stuff like this.

240
00:39:56.060 --> 00:40:08.870
Well, like, code's going first. I mean, yeah, that is very ripe for... You know, it's not gonna take over all of that, but, like, a lot of the, like, you know, basic coding is already being d- Yeah...

241
00:40:09.160 --> 00:40:18.360
you can do it- Yeah... with ChatGPT. You absolutely can, and to what extent, I don't know. But I've been hearing that people are doing that. They're telling it what code to write, and it's writing it perfectly.

242
00:40:18.900 --> 00:40:24.594
So- Yeah... You know, TBD on that one. I mean, look, and 'cause it's also, another thing is...

243
00:40:24.644 --> 00:40:41.324
I mean, I could go on with this, like, forever, but, like, the-- It's gonna cause us to interrogate all sorts of things that we took for granted that are fairly arbitrary, for instance, like the notion of creativity, when a lot of it is really just sort of, like, pattern matching and, like, it is mostly just versioning and stuff.

244
00:40:41.764 --> 00:40:56.724
In the same way, you know, on a much larger scale, like, you know, we have a very arbitrary definition of, like, what consciousness is and intelligence that happens to bias us versus other creatures that we claim are inferior, but, like, they clearly have some forms of consciousness.

245
00:40:56.764 --> 00:41:02.114
And once the robots are more intelligent than us, we're gonna have to question [chuckles] whether or not- Yeah. It, it's funny.

246
00:41:02.464 --> 00:41:12.264
I saw, I forget where it was, one of these shows that had a, a robot, and they asked, "How are they different?" And the robot said, "Well, I don't have a soul. You have a soul." Okay.

247
00:41:12.274 --> 00:41:15.654
So I'm reading this really interesting book. We can cut this out later, Gary, if you want.

248
00:41:15.954 --> 00:41:34.904
[chuckles] But I'm reading this really interesting book, God, Human, Animal, Machine, and the woman who wrote it, she had grown up in, like, Evangelical religion, and then she moved away from that, and she really has been studying, like, transhumanism and technology as the new replacement for religion.

249
00:41:34.944 --> 00:41:43.224
I mean, religion always was a way for us to explain the world around us, and then we just decided, like, "Oh, this is all very metaphorical" and stuff like this.

250
00:41:43.364 --> 00:41:53.864
Let's move into, like, more of a deterministic, let's just really test out the things, and that now re- technology has become, you know, the religion that, that we sort of look to.

251
00:41:53.874 --> 00:42:03.324
And she had this one line that really struck me, where it was, "All our existential questions are now engineering problems." [chuckles] Well, I don't know that I agree with that.

252
00:42:03.404 --> 00:42:13.064
But, look, I, I think- But one of the things, it's like the soul, like when we- Yeah. I mean, like- It's very arbitrary. We've been trying to figure this out since really- Millennia... we started thinking. Right.

253
00:42:13.464 --> 00:42:24.404
I think that the human components of an organic soul, mind, brain, body is something that will not be capable by AI.

254
00:42:24.964 --> 00:42:42.164
However, I have a relative in Sweden who is cloning cells for the purpose of things like fire victims and creating- Yeah... solutions to get their healthy skin back. So I mean, the world of bio is pretty sophisticated.

255
00:42:42.724 --> 00:42:53.084
So I mean, I can't say that I know where that's going. Yeah. But you-- I don't think we're ever gonna be able to create a soul. No. I don't think so. Yeah. I could [chuckles] keep going on this thing 'cause I just...

256
00:42:53.524 --> 00:43:05.364
We're, we're all-- We've always tried to become gods ourselves, and, like, I think AI is really going to test that because it is going to... Yeah, it's pushing the envelope farther than we've gone.

257
00:43:05.384 --> 00:43:17.244
But, you know, I'm a believer that we are a technological species and, you know, we will always push forward. And so the only choice is to adapt to some degree. Agreed.

258
00:43:17.284 --> 00:43:28.854
I think there's a role for the government to play, and I think societies need to have a voice, and it's not just a small group of people clustered in California o-oftentimes [chuckles] that get to decide- You know...

259
00:43:28.904 --> 00:43:36.404
the fate of humanity. Yeah. I feel like we should get a vote. [chuckles] Yeah. It's interesting. It's interesting. But I mean, like- Okay. So- All of it comes back to- Yeah...

260
00:43:36.464 --> 00:43:43.584
in, in the reality of media and content, that if somebody's watching and enjoying it, there's a way to monetize it.

261
00:43:43.604 --> 00:43:53.844
And the question becomes, is monetizing it going to be enough profit against the way these streaming companies are losing hundreds of millions- Yeah... of dollars consistently?

262
00:43:53.904 --> 00:44:06.114
So that's really the dilemma going on, is that where's the profitability in all this? And I think that there's a hybrid that we all think is where the answer lives. Yeah.

263
00:44:06.164 --> 00:44:22.004
So what are some, just to wrap it up then on that, like, what are some, what are, like, three, like, sort of key themes that, you know, you're focused on emerging, say, from, like, Cannes and the discussions that you're having and the discussions we'll have at the New Attention Economy, which everyone should come to?

264
00:44:22.204 --> 00:44:31.844
Yes, absolutely. So I think the key component becomes what is attention? Because everybody out there is touting their perception of what attention is.

265
00:44:31.884 --> 00:44:50.184
But at the end of the day, video and attention is the most critical component because an ad placed in a video has to be placed based on relevance, based on is it targeted to the consumer or relevant to the content contextually or what, and how is that attention analyzed?

266
00:44:50.784 --> 00:45:05.364
And I think that's really where the core value of what we're releasing and what we're looking at as the future is being able to really provide the asset a placement that gets it seen by the right person at the right time.

267
00:45:05.824 --> 00:45:12.944
Yeah. Yeah. And that's like, and meaningful attention, I think, is, like, a good sort of- We call it active attention, but yeah. Yeah, okay. Active, meaningful.

268
00:45:12.984 --> 00:45:28.024
We'll go with active because, you know, a lot of media has been trying to quantify attention in some very haphazard and superficial ways, in my view. Like, I mean, the viewability era was kind of ridiculous.

269
00:45:28.104 --> 00:45:35.644
Like, I mean- Yeah, no... it's just like- [chuckles] Hundred percent. I mean, look, you turn away from your computer, it's still playing, you know? Yeah. Are you watching? No. You know?

270
00:45:35.784 --> 00:45:42.824
And even the bot world figured out how to falsify that. Well, that's the thing. This is-- I mean, we're gonna, like, face this ho-honestly with AI.

271
00:45:42.904 --> 00:45:50.263
It's like, you know, bots can actually be better at looking like humans than humans. Like, you can, like, [chuckles] you know, they're, 'cause they're consistent. Yeah.

272
00:45:50.304 --> 00:45:59.004
Like, humans are inconsistent, but if you want the bot to be consistent, like, it'll be consistent. Yeah. No, the bots are trained to be inconsistent and look like humans. Yeah. All right.

273
00:45:59.044 --> 00:46:08.364
What are a couple other themes you're looking for? So I think one is the attention metric. Two is- Yeah... really this thing about contextual relevance. I mean, we're doing a couple of different things.

274
00:46:08.844 --> 00:46:13.584
We're now releasing a contextually relevant banner on a video.

275
00:46:13.684 --> 00:46:23.874
So as an example, you're watching a show, there's a Starbucks logo, there should be a contextual banner on the bottom that, that's placed in there, a Starbucks banner as an ad, right?

276
00:46:24.264 --> 00:46:28.984
So that's a way to monetize the contextual relevance. Mm. Also ad slots.

277
00:46:29.044 --> 00:46:42.414
I mean, look, all the SSPs are still on RTB 2.4 and not 2.6, and when everybody moves to RTB 2.6 and implements it, the- Ad podding becomes ad targeting, right?

278
00:46:42.454 --> 00:46:54.514
So rather than buying an ad placement based on a show and pods, you can buy an ad by the target. So again, contextual relevance. That Starbucks logo in the scene, the next ad slot should be Starbucks conceptually.

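NOTE
The 2.4-versus-2.6 point: OpenRTB 2.6 added structured ad-pod fields to the bid request's Video object, which is what lets a buyer target a specific slot rather than the pod as a whole. A sketch of those fields with illustrative values (field names per the 2.6 spec; everything else is made up):
bid_request = {
    "id": "req-123",
    "imp": [{
        "id": "1",
        "video": {
            "mimes": ["video/mp4"],
            "poddur": 90,       # total pod duration in seconds (new in 2.6)
            "podid": "pod-1",   # which pod this impression belongs to (2.6)
            "podseq": 1,        # first pod in the content stream (2.6)
            "slotinpod": 1,     # guaranteed first slot in the pod (2.6)
        },
    }],
}
# A buyer can now bid on, say, the first slot of the pod right after the
# scene with the Starbucks logo, instead of buying the pod blind.
print(bid_request["imp"][0]["video"]["slotinpod"])
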
279
00:46:54.854 --> 00:47:09.214
So I think we're looking at the value of video AI in analyzing the content with accuracy and relevance, creating the opportunity for ad placement and ad insertion based on the content itself. Okay, cool.

280
00:47:09.234 --> 00:47:13.794
Well, I'm looking forward to being in Cannes with you. We're gonna have a good time. Everyone should... Yeah, it's gonna be great.

281
00:47:13.894 --> 00:47:25.874
And again, we'll, we'll put a link to the New Attention Economy event that we're having Monday through Wednesday at Cannes. I think that's the 19th through the 21st. We've got about three hours of programming each day.

282
00:47:25.914 --> 00:47:34.034
I think we're up to, like, 25 different speakers. We've got some really amazing people. It's really impressive, the people who put the... I'm really- Yeah. -looking forward to these conversations. Yeah. I am too.

283
00:47:34.114 --> 00:47:43.364
I have to do a lot of them, so, like, I might need, like, you know... I gotta stay hydrated, and I gotta, like, stay rested. Well, we have a whole bar and buffet and food and everything there, so you'll be fine. No bar.

284
00:47:43.394 --> 00:47:49.114
No bar. That's why, like, I gotta make it to the end of the week. I gotta make sure my voice is intact. I'm gonna do a lot of speaking.

285
00:47:49.554 --> 00:48:01.254
The real rookie move in Cannes, you know, I've been to Cannes probably, like, 15 or so times now. Yeah. Big rookie move is you lose your voice by, like, end of day- Tuesday? Yeah. Like, no, I'm a veteran.

286
00:48:01.394 --> 00:48:09.194
I know, I know how this goes. There you go. Well, awesome, Gary. Thank you so much. I really appreciate it. All right. Great talking to you. We'll see you in Cannes. All right. See you there. All right, bye.

287
00:48:09.294 --> 00:48:20.374
[upbeat music] And thank you all for listening. Hope you enjoyed this podcast. I wanna thank Jay Sparks from Podhelp Us, who produces this podcast. If you are interested in a podcast, you should get in touch with Jay.

288
00:48:20.574 --> 00:48:28.714
You can find out more at podhelp.us. And send me your feedback, would love to hear what you would like to hear from this podcast.

289
00:48:28.794 --> 00:48:39.274
I did one the other week with Sara Fischer and Peter Kafka that I got a lot of notes about, so I'm thinking about doing more of those, and hopefully we can make that happen. So let me know what you think.

290
00:48:39.534 --> 00:48:47.463
[upbeat music]
