# Tim Wilson (Sr. Analytics Director: Search Discovery; Head of Solutions: Facts & Feelings; Co-host: Analytics Power Hour; Co-author: Analytics the Right Way) on Thinking Before Measuring in the Agentic Commerce Era


In this episode of Knowledge Distillation, Katrin Ribant talks with Tim Wilson – one of the most respected voices in digital analytics, widely known as the Quintessential Analyst (a title he’ll deny but absolutely deserves) and equally famous for climbing on a soapbox and delivering the kind of rants that somehow leave you smarter when he’s done. Tim has been working with digital data full-time since 2001, holding senior analytics roles at Search Discovery, Analytics Demystified, and across multiple agencies and Fortune 500 consultancies, and is widely known for his no-nonsense, clarity-first approach to getting business value out of data – earning him a reputation as one of the industry’s most beloved (and self-admittedly cranky) analytical thinkers.

Continuing the agentic e-commerce series, this episode goes upstream: before you instrument, before you build the data layer, how do you decide what to measure? Tim draws on over two decades of experience to argue that the agentic commerce shift – comparable in scale to mobile, Amazon, and GA4 combined – demands business clarity first, not more data collection. Together they explore why organizations keep repeating the same measurement mistakes across every technology disruption, how to use hypothesis testing and primary research to cut through the hype, and why the analyst’s real superpower is resisting the urge to solution before the business question is clear. The conversation also dives into the evolving skills analysts need now, from understanding LLMs to prompt and context engineering as the new SQL.

All episodes on our website: www.ask-y.ai/knowledge-distillation-podcast

Learn more about ASK-Y: www.ask-y.ai

Chapters

00:00 Introduction and Setup Challenges
10:09 The Rise of AI in E-commerce
20:05 Understanding Measurement in the AI Era
29:53 Hypothesis and Experimentation in AI Analytics
40:08 Practical Steps for Organizations
49:57 Navigating the Future of AI and E-commerce
52:04 The Evolution of Marketing Strategies
55:54 Data Management and User Experience
58:43 Customer Relationships and Behavioral Data
01:01:41 The Challenges of Data Loss
01:04:39 The Importance of Intuition in Marketing
01:10:25 The Role of Loyalty Programs
01:17:59 Navigating Change in Analytics
01:20:44 Skills for the Future of Analytics
01:30:39 Prompt Engineering and AI Utilization

Katrin (00:00)
Welcome to Knowledge Distillation, where we explore the rise of the AI analyst. I'm your host, Katrin Ribant, CEO and founder of ASK-Y. This is episode 10, and we're continuing our deep dive into the agentic e-commerce era. In episode eight, Sunny Manage showed us how to make websites readable by AI agents. In episode nine, Josh Silverbauer walked us through the data layer implications: what breaks, what needs to be captured, and how sessions decouple from conversions. Today, we're going up a level. Before you even start this journey, how do you decide what to measure? Shouldn't you think before you act? And yes, I am insistently looking at you, Tim.

To talk through this whole thinking-before-acting idea, and asking yourself some questions about what is really important to measure, I'm really thrilled to welcome today's guest, none other than Tim Wilson, Head of Solutions at Facts and Feelings. There's lots of feelings, right? It's Facts and Feelings, yeah. Founding co-host of the Analytics Power Hour podcast.

Tim Wilson (01:05)
What?

Katrin (01:13)
and co-author of the book Analytics the Right Way: A Business Leader's Guide to Putting Data to Productive Use. And, dare I say, also known as the Quintessential Analyst. That was not in the prep notes, but I was gonna put it in. Before we get started, were you even a tiny bit jealous when I called Michael the Quintessential AI Analyst?

Tim Wilson (01:25)
Had to get that in. No.

Alas.

I was thrilled, because I feel like the mantle has been passed, and now I can shovel that shit back at him at the same rate for an infinite period of time. So, yeah.

Katrin (01:47)
That was the idea! I was very proud of that one. It also wasn't in the prep notes for him, by the way.

So Tim, can you talk about your current focus? And maybe introduce yourself. I did it perfectly, but tell us your version of it, for the members of the audience who might not know you. And really, if you don't know Tim, please do go and listen to the Analytics Power Hour, guys. There are many episodes to sink into, full of wisdom and, mostly, rants. I'm actually jealous of the people who haven't heard any of that yet. So...

Tim Wilson (02:26)
I

Katrin (02:27)
Tim, please, an intro.

Tim Wilson (02:30)
Sure. And I laughed: there's a guy at my gym who I've gotten to know, and he and I do weekend runs periodically, and on long runs you never know what you're going to talk about. Somehow, he's kind of in an IT technical space, and the podcast, the Analytics Power Hour, came up, and he's now become something of a listener. Now he gives me grief. He was like, yeah, I know you're cranky, Tim. So I'm like, well, you've known me in person for long enough. I don't really think I'm that different on the podcast, but whatever.

So yeah, my background: I'm getting kind of long in the tooth. I came up through architecture and then technical writing and then MarCom in the early part of my career, and then stumbled into this thing called web analytics about 25 years ago. That pretty quickly grew into broader customer analytics, marketing analytics, BI, this exciting new space. And I was starting out in-house, and when somebody says "in-house," that means they must be a consultant now, because those are the words we use. But the last 10 or 15 years I've been mostly on the agency or consultancy side, working with primarily large enterprises to get value out of the data that they have. Which means, I think, a lot of my crankiness has come through from

time and time and time again running into companies that have all of this data and are not getting the value out of it that they expected, and where their default setting is: I need to change tools, or I need to clean up my data, or do a new data integration, or I need to capture more data. And then, once I've done that, three months or six months or 18 months down the road, then I'll be ready to start getting value out of the data. And that cycle just goes on and on and on. When I've been most successful with companies, it's when it's like: put that to the side. My default setting is that every company out there has enough data, and it is clean enough, for them to get value out of it without chasing any additional data, data integration, or data cleanup.

And partly I can say that because that machine cannot be stopped; that train is going to move forward. If an analyst says, I'm not going to worry about that at all, it's still going to happen plenty. So that's kind of what I do now. At Facts and Feelings, even though we are deeply grounded in data, research, analytics, and experimentation backgrounds, most of our focus is upstream of the data: helping organizations get focused and aligned

internally on what it is they're trying to do as a business, what ideas they have that they want to validate with some evidence and getting real clarity on that before turning to the data. Because consistently what we find out is we need much less data. It's a much faster path to value if we've got that kind of upfront thought happening first.

Katrin (05:42)
Who do you typically talk to in these organizations? Who's your main entry point there?

Tim Wilson (05:49)

So that was kind of a big shift. Coming out of where I'd been historically, I had been hooked into the data people or the analytics people or the experimentation people. Where we are now, and this was very intentional, our primary contacts are the head of digital, the head of marketing, the head of product. It is somebody on the business side

who generally is feeling the frustration of getting a ton of reports from their analytics team, but not the steady flow of actionable insights that they were promised. And every time they turn around, they're getting told: you need to invest more because this privacy thing is going on, or you need to invest more because we need to instrument for AI. It starts with the ones who are feeling frustrated, like they keep having to pay more and more and more, and they're still not getting value from what's supposed to come back to them. And I've got a million... that's like a whole other episode, all of the forces that are unintentionally conspiring to perpetuate that unproductive, ineffective cycle.

Katrin (07:15)
You're just gonna have to come back, what can I say? So, today's topic. This is actually really the reason why I thought it would be really important to talk to you about this in the middle of the series, because it dawned on me... you know, I haven't really planned this very well. Literally, I was doing the episode with Sunny, and it dawned on me that I hadn't realized how big this was.

Tim Wilson (07:18)
Hahaha

Katrin (07:43)
I hadn't realized what it meant for brands to actually adapt to agentic e-commerce, and the impact on the business questions, on what they should ask, because this is going to be a lot of work. So my thinking is: OpenAI's Instant Checkout and the like is creating two distinct segments of valuable users, with very, very different needs. We've got humans, and we've got agents, bots. And adapting marketing and commerce assets like websites is basically forcing a work-stream cycle that I think is comparable in scale to the mobile shift, the re-platforming; the Amazon shift; and GA4; some version of those three combined. Because it's about the whole customer experience, right? It's about the relationship management, the ownership of the relationship. That's kind of the Amazon piece, as I see it. And then it's really a data layer migration, like GA4. But GA4 was a migration with a playbook, right? We were retagging, we were going from here to there, et cetera. Agentic measurement is really


Katrin (09:10)
a construction. Everything, the metrics, the models, the infrastructure, everything kind of has to be invented; maybe not completely from scratch, but mostly from scratch. And if you're going to start doing that, cleaning up the website, redoing the data layer, deciding what to measure, et cetera, you probably want to think before you act. And so then I thought of you.

So you've been through a few of these cycles, obviously. First of all, does this one feel different to you? Do you think there are new questions to be asked? Or is this just another version of: new technology breaks old measurement, you know, whatever.

Tim Wilson (09:52)
I mean, it's tough. I think the question is: is it different or is it the same? And the answer is yes. And I think that's the case with everything; that was the case with mobile, that was the case with Amazon. So I think we're at super, super high risk of repeating many of the same mistakes, or taking much of the same kind of

Katrin (09:59)
Hmm? Of course, it's both and it depends. Yeah, obviously.

Tim Wilson (10:20)
inefficient mentality, which is: I've got to figure it all out, and then I've got to get everything instrumented. Which is comical with something like agentic AI, which is still such a moving target that there is no way anybody is going to know today. There is going to be something a year from now that makes today look really quaint. But in that case, I think we're still in... wait. Yeah.

Katrin (10:47)
in three weeks.

Tim Wilson (10:49)
And we're still caught up in that world. Like, I remember early in my analytics career going to The Data Warehouse Institute conference, and that's where I started to hear that data warehouse projects kept failing. Why were all data warehouse projects failing? It turned out it was because data warehouse projects were being treated like software development projects: figure out all your requirements, run it through, build it, ship it, and it should be good to go. And it's like, well, that's not the case

when you're using data; you should be getting smarter and more refined. So it has to be something more nimble, which people then take as: so it needs to be agile. And it's like, well, that doesn't necessarily help either; that's not the right path either. So I think we've been through the disruptions, but the fundamentals of business haven't changed. I mean, to throw out the John Wanamaker quote from the early 20th century:

Half my advertising spend is wasted; the trouble is, I don't know which half. That gets thrown around in the marketing analytics world as this sort of pat cliché. It's like, we still can't figure it out, no matter how much more sophisticated we get. And... I had a thread there that I was following that I have now slightly lost. But I'm back, I've got it. At the end of the day, businesses, like,

they're trying to make money. They're trying to sell stuff to humans. The means 50 years ago was people went into a store, or they flipped through a Sears catalog and filled out an order form and mailed it. And now we're introducing a new buying channel, but fundamentally, the outcome that for-profit businesses are trying to drive is profitable sales. And there's a tendency to say: yeah, yeah, yeah,

but I've got to figure out all this stuff about whatever the latest new thing is. And that latest new thing was mobile. I think mobile is kind of the better parallel, because there were lots of things where it was just taken as given. There was so much chatter out in the ether about mobile that every company thought they had to get a mobile app. That just came down from on high. And I have not spent a lot of time with agentic commerce. I have not

Katrin (13:07)
yeah.

Tim Wilson (13:14)
tried to purchase anything, but there are some things I'm having a hard time imagining myself, you know, sending my agent off to buy a new car or sending it to necessarily buy clothes unless the business model of where I'm buying from has changed to the point that I have some confidence that that will work. So I think we have to first recognize like,

Katrin (13:35)
Have you not found yourself

asking whatever you use as your go to LLM, Claude, in my case? I mean, I...

Weird story: I have to make meringue every day now, because my nutritionist told me that meringue is very low-calorie and I'm trying to lose weight. Meringue, you know, egg whites, beaten egg whites: high volume, very low calories. Amazing, I'd never thought about it, fantastic. So now I have to beat egg whites every day, and I don't have the thing to do that. I have nothing, you know, I don't have the right tool, and I don't even know what it's called. So...

Tim Wilson (13:59)
Wait, mahog?

Okay.

Katrin (14:22)
my first reflex: I asked Claude. And it did what would have taken me half an hour to do with Google searches, not having the right words, not even knowing how you say that in English. Turns out you say it the same way in French, just with a different accent. So maybe I did, like, buy it on the actual KitchenAid website.

But that's because Claude doesn't let me do it directly from Claude; I would absolutely have done it from Claude otherwise. Don't you have any experience like that, where you're asking yourself about some stupid thing and you're like, let me just...

Tim Wilson (15:07)
Well, I mean, I think from a findability and discoverability standpoint, which goes more into: what is AI doing to SEO? What is AI doing to SEM, from a findability perspective? And I can certainly see, probably within the next year, yes, I will purchase something. But there's a lot of things I won't purchase. So I haven't purchased anything yet. There's a lot of stuff with being a

Katrin (15:33)
course.

Tim Wilson (15:36)
middle-aged dude in North America, not proud, but a lot of stuff is going to be: does Amazon have it? And I'm going to be a slave to Amazon's kind of abuse of their vendors for a long time. So I have to look at it as: when? I think a business should step back and say, okay, this is coming in some capacity. But I think it gets treated as if it's already here, because it's new.

Katrin (15:45)
Mm-hmm.

Tim Wilson (16:05)
There's just a human-nature instinct to imagine that, my God, this is now impacting 80% of our sales. Whereas when I go and talk to other consumers, they've played around with ChatGPT, sure. Some of them say they totally use it, so there is a change in consumer behavior. But to me, I think there's a lot more value in understanding

the pace and nature of those consumers' expectations, which is in some ways easier and in some ways harder to gather data on. That's primary research: probably committing to asking your customers what their expectations are. And then there's a separate question: how do I compete in whatever my segment is? I mean, Sunny: brilliant, love it, you've got to make some moves. I think Josh: brilliant, you've got to do some instrumentation. I just think there's a tendency to say, let's do all of it, let's do as much as we can. There's a lot of buzz about it; we can get the investment, we can get the funding to do it. And it's pretty easy to lose the thread of: we're really just trying to make money effectively, and that's what I need to worry about measuring, you know?

Katrin (17:02)
That's really...

So it's true that in every sort of hype, every new thing, there's a stream of funding, and there are opportunities to build careers on spearheading projects that come around and are funded and are cutting edge. So there is that, and there is always going to be that; that's just always the case. It's true.

Tim Wilson (17:57)
Yeah.

Katrin (17:59)
It was really interesting, what you said about the primary research; I hadn't thought of that. So, we agree we should measure goals, and those should be business outcomes, not metrics. Something like a million dollars of revenue: that's a goal. Email click-through rate: not so much a goal. Now, in this case, for agentic commerce, I learned from Josh that what you see in the medium/source field is not really representative, because there's a whole bunch of subtleties about how that is captured or not captured, depending on the behavior of the user in the actual LLM. So it's somewhat representative, and may help you somewhat gauge an impact; if you see it going up dramatically, that probably does mean something. But you would say some primary research would be a good idea. I never thought of that, but it is actually a great idea. Have you heard of anybody or any organization doing that?
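As a rough illustration of the kind of blunt, quick check discussed here, instead of trusting the medium/source field, you can bucket raw hits by user-agent substring. A minimal sketch; the token lists are illustrative assumptions that go stale quickly, since agent user agents change often:

```python
# Blunt, quick classification of log hits by user-agent substring.
# Token lists are illustrative assumptions; real agent user agents
# change frequently, so treat this as directional, not precise.
AI_AGENT_TOKENS = ("GPTBot", "ChatGPT-User", "ClaudeBot", "PerplexityBot")
CRAWLER_TOKENS = ("Googlebot", "bingbot", "AhrefsBot")

def classify_hit(user_agent: str) -> str:
    """Return a coarse bucket: 'ai_agent', 'crawler', or 'human(ish)'."""
    ua = (user_agent or "").lower()
    if any(token.lower() in ua for token in AI_AGENT_TOKENS):
        return "ai_agent"
    if any(token.lower() in ua for token in CRAWLER_TOKENS):
        return "crawler"
    return "human(ish)"  # everything else; imperfect by design

hits = [
    "Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)",
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Mozilla/5.0 (compatible; Googlebot/2.1)",
]
print([classify_hit(h) for h in hits])  # ['ai_agent', 'human(ish)', 'crawler']
```

This is exactly the "broad brush" trade-off Tim describes next: the answer is imperfect, but it is fast, and it is usually enough to see whether agent traffic is material.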

Tim Wilson (19:01)
But I mean,

I mean, every client I'm working with right now is thinking about AI, and we have some hypotheses. The best way to answer those is to ask people, you know, with all the imperfections of asking people what their intentions are. And, what was it, was it Ford? If you'd asked people what they wanted... I'm butchering the quote.

Katrin (19:34)
Yeah, If you ask

people what they want, they're going to tell you they want faster horses, because they can't imagine a car. Yeah.

Tim Wilson (19:39)
Yeah. So,

yeah, but I think so. So certainly understanding like, can I tease out this behavior? But I will say we've also as an industry tended to get into, yeah, but it's tricky because this thing will look like this other thing. This, this bot will look like a something. I mean, that's, you know, bot filtering and there is a degree of saying, well, is it close?

enough and it's a really, real hard judgment call to make to say, can I invest a whole lot more to still get imperfect data, but make it less than perfect? Or can I kind of find the broad brushes? I mean, I think it's a great point. You do want to identify the distinction between humans who appear to be clicking, probably want to know if they're on a mobile device or a desktop device. You want to know.

What are just bots? want to be able to exclude them if they're just kind of scrapers and they're not really whatever you probably want to know. You would like to have some sense of, of the agents, but then you start to heading down the slippery slope of like, yeah, but I want to know like which agent and I want to know exactly what they did. And is that precise or am I under counting by 10 % or am I over counting by 10 % and like once that gets cracked open, the entire analytics team, the whole instrumentation team.

can find all of these little gaps and start wringing their hands and spinning up new projects to narrow those gaps. And then all your, all your capacity is gone. And nobody's actually asked a question. They're just kind of of the, ⁓ things are changing. We better get as perfect data as we possibly can. And yet no one has actually said, but like, what are we actually trying to

do, and what's really going on? I'm not opposed; I mean, yes, organizations should be trying to figure this stuff out and make changes to their sites. But even that, I would say, should come from: I've read these smart pieces others have written, or this research others have done. And some of that is how to get a sense of how much agentic traffic is coming to your site. Is it purchasing? Can it purchase? We should try to write an agent and see if it can purchase on our website first, and keep to the blunt and quick answers to the questions, as opposed to thinking through every corner case and every use case. And that's just how we have trained the whole world to operate: every corner case gets treated as one that must be addressed. If somebody can think up a solution to a corner case, then we must go implement that solution. And it just bypasses the question: what's the opportunity cost of addressing that corner case?

Katrin (22:44)
But that's

a great idea. Also never thought of that before. So if like a gData officer or whatnot, one of your, how do you say that in English, interlocutors? People you talk to. If there were to come to you with these sort of questions and you were to think,

Tim Wilson (22:58)
Yeah, sounds a lot better with a French accent.

Katrin (23:14)
what they should actually start with: starting the questions with the idea of building an agent. If we build an agent, what does it see? Because then you can control... you can actually check what the agent sees, and whether it can even buy on our website. It's a really interesting question. And that's not that hard to do.
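A first pass at "check what the agent sees" doesn't even require building a full buying agent. A sketch like the following (stdlib-only, with a hypothetical page snippet standing in for a real fetch of your own site) just checks whether a product page exposes machine-readable schema.org data an agent could parse:

```python
import json
import re

def extract_product_schema(html: str) -> list:
    """Pull schema.org JSON-LD blocks and keep those describing a Product.

    A crude proxy for 'what an agent sees': if no Product JSON-LD exists,
    an agent is left scraping raw HTML, usually with worse results.
    """
    blocks = re.findall(
        r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
        html,
        flags=re.DOTALL | re.IGNORECASE,
    )
    products = []
    for raw in blocks:
        try:
            data = json.loads(raw)
        except json.JSONDecodeError:
            continue  # malformed JSON-LD is itself a useful finding
        items = data if isinstance(data, list) else [data]
        products += [
            d for d in items
            if isinstance(d, dict) and d.get("@type") == "Product"
        ]
    return products

# Hypothetical page snippet standing in for a real fetch of your site.
sample = """<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Product",
 "name": "Stand Mixer", "offers": {"@type": "Offer", "price": "299.00"}}
</script></head><body>...</body></html>"""

print(extract_product_schema(sample)[0]["name"])  # Stand Mixer
```

An empty result on your own product pages is the quick, blunt answer in the spirit of this conversation: agents likely see very little, and a deeper audit may be warranted.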

Tim Wilson (23:37)
Yeah, I mean, I haven't done it. And I can see you'd say, well, we want to do two or three of them. But maybe that goes back to the parallel of mobile: figure out what the top five mobile devices are and buy them, or find which employees have them, and go navigate your website just to get that base understanding. I could see with an agent, you could say: my agent is going to throw in some added little logging. So

I want the agent to kind of record where it feels like... or where it feels. There you go, Tim, a little anthropomorphization of the agents. But is it hitting a stumbling block? Was it able to buy? There's lots of value in getting comfortable with: what even is this space? I think that's another challenge. When you throw it to somebody, if they're not really

enthusiastic about it, and you're like, here's the assignment, and they're a developer who doesn't really care, it's just the next thing on the Jira ticket that got assigned to them: go develop that agent. I don't think that's going to be that useful either. You've got to get the people who are enthusiastic, and then recognize that they are almost certainly not representative of your user base, so always apply that grain of salt. They're like, well, when I, you know,

Katrin (24:58)
Not even remotely,

Tim Wilson (25:03)
wrote an MCP server and custom-built this, it was amazing. And it's like, yeah, but you're selling toothpaste. In the grand scheme of things, do you think that's what our customers are doing? So that's where I think pairing that with going out and asking helps; it's not that expensive to go ask some representative part of your target audience: where are they now? What are they thinking about

Katrin (25:15)
This-

Mm-hmm.

Tim Wilson (25:31)
doing? And I would suspect that most of the time, it's going to be much lower. That adoption curve is going to be longer and slower than we think.

Katrin (25:46)
No, that's really great, because it kind of answers my next question, which is: in all this mess, how would you practically get started in a way that makes sense? Those two things are achievable for most organizations. Maybe not building the agent... but I mean, it's really not that hard, right? It's really doable.

The survey, definitely. And the survey will give you enough information, regardless of the state of your data layer and whether you see anything in it or not, to at least get an assessment of how pressing an issue this is for you and what sort of impact it will have. Because, as you said, all of us live in a bubble, right? The fact that you and your friends

do a certain thing... This is actually one of... I started at a media agency, and on my first day, my boss asked me: so, what TV shows do you watch? This was in Belgium, so, you know, nobody here would know what they are. So I said the TV shows I watched, and I thought those were the TV shows obviously everybody watched, because they were the best TV shows. There was still linear TV at the time,

for the young people who are listening to us. So.

Tim Wilson (27:12)
We used to just call it

TV. Now we have to refer to it as linear TV. Yeah, because we're not talking about YouTube. Yeah.

Katrin (27:15)
Yes, so she, right, exactly.

So she said, okay, here are the audiences. She goes into the audience tool and she's like: here are the audiences for your shows, less than 2% of the population. Look at these TV shows: 40% of the population. You live in a bubble.

Your friends and you live in a bubble. This is not what most people watch; this is not what most people do. If you think something's good, it doesn't mean anything about whether that thing is effective for advertising X, Y, Z. And that always stuck with me: we really do not realize how much of a bubble we all live in, whoever you are.

And so the fact that you and I do or don't do certain things doesn't really mean anything about what most people do in certain circumstances. So the survey to answer that question is great. And the agent, to get an understanding of: am I getting into the context window? What do my competitors do? Is it possible to buy on my website? That totally is brilliant.

No, I really, I totally did not think about that, but I think that's great. But then, you know, you talk about, and I'm going to butcher this, the metrics side of things and the hypothesis side of things, right? So what would be a set of hypotheses that you would want to validate with this? How would you say people should think about: these are really the questions I want to ask?

Tim Wilson (29:00)
I mean, I think that is where... that gets to the core of what I believe we keep skipping, and even some of this discussion has been part of that. Thinking through, you know: I believe that people are trying, or expecting, or using AI to research stuff in my area.

And I think we are uniquely not findable. And the way I would say it is: okay, but what evidence do you have? And they'd say, I don't know, my cousin asked ChatGPT and we didn't come up, or whatever. Maybe weak evidence, maybe strong evidence. And then answer the question: if that's true, if we can validate that we are uniquely unfindable in our competitive space, or not as findable as we could be, what would you do?

Are you ready to commit funds to launching an initiative to become more findable in this world? Because if you're not, then it doesn't really matter. And so that's one. Every one of these is like: I believe that a mainstream agent cannot buy on our website. Or actually, before that, I think a hypothesis would be: I believe there are agents coming to our website, trying to make

purchases, or maybe trying to do research (those are two separate hypotheses), and they are not able to do that successfully or effectively. And I believe that because our website is built on horribly legacy technology, and I read a post about how WordPress is uniquely terrible for this. I have no idea if any of that is the case. And if that's right, then we will explore re-platforming, or we will explore

Katrin (30:46)
I'm sure it is.

Tim Wilson (30:54)
what updates to make; we will conduct an audit of our site, you know, something more detailed; we will hire Sunny to come in and take a look and make recommendations. So I think it needs to start with those. You're afraid, or you're excited, so put some voice to that: what do you really think the scale of this thing is?

And if the scale is on that order, that magnitude, that volume, what would you actually do? What action would you take? You may not know exactly how you're going to fix your website, but first you should validate that this has enough magnitude, enough scale, that it needs to be fixed. But I mean, the hypotheses are kind of infinite. It could be: I believe there's a potential competitive advantage inside of the next year or two

if we really get out ahead of our competition by having an agent-friendly website, because I don't think they're going to catch up. Okay, well, that may go back to saying: how do we actually figure out whether the adoption curve of our audience is such that we can get ahead of this? And we probably also want to ask some critical questions, like: what have we already learned about changes that were made to

cater to the biggest models that we now actually have to undo, because they've changed the way they work? So those aren't super specific. They all just go to: I want to get the people who are asking for deep data dives in a room and, I don't want to say force them, because that's not productive, but gently facilitate getting more clarity around their thinking.

Katrin (32:35)
No, they're-

Nudge.

Tim Wilson (32:52)
'Cause that also can get us to answering some of the questions a lot faster. Say you have a hypothesis of this: if we fail to validate that hypothesis, do all these other ones just drop away? Because it turns out you can table this for two years, you know, or something. So.

Katrin (33:08)
Well, and on that note, you distinguish between hypotheses that require experiments and hypotheses that can be validated with observational data, right? So for agent-mediated purchases, in the light of everything we've talked about, which type do you think we're dealing with? Like, can we even run experiments? What would you say?

Tim Wilson (33:31)
Yeah, I mean, I think experiments are pretty tough, just because it's hard to do random assignment of agents. So I think you're probably somewhat limited to historical data analysis, and making sure that you're capturing at an appropriate level of fidelity. The exception to that

could be if that historical analysis says: we have different categories, and they have roughly the same demand but are addressing different needs. Could we potentially, experimentally, say we're going to make updates, do something that we think will be low-hanging fruit, more agentic-friendly, to one category and not to the other, and then

see if that generates a result? And for a few weeks, not like this is kind of a permanent thing. But that does get more challenging to design an experiment, because you can't... I'm sure somebody will be watching this and saying, here's how you can do clever experimental design.
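
The one-category-treated, one-category-held-out comparison Tim sketches could be read out with a simple two-proportion z-test. Everything here is illustrative: the numbers are made up, and using categories as the experimental unit is exactly the compromise Tim flags as imperfect, so treat this as a sketch, not a recommended design.

```python
# Hypothetical readout for the category quasi-experiment: one category
# gets agentic-friendly updates, a comparable one does not, and we
# compare conversion rates over the same period.
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference in conversion rates A vs. B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Treated category vs. untouched comparison category (made-up counts)
z = two_proportion_z(conv_a=130, n_a=2000, conv_b=100, n_b=2000)
print(round(z, 2))
```

A |z| above roughly 2 would suggest the treated category outperformed beyond noise, with the usual caveat that categories are not randomized units.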

Katrin (34:59)
And they

should contact us and we'll get them on the podcast. But so, in the meantime, what's your current idea of, like, the minimum viable measurement framework for agent e-commerce? What would you say today that's going to be laughable in three weeks?

Tim Wilson (35:00)
Yeah, there you go. Future episode.

So I think it boils down to

some reasonable fidelity of just identifying traffic that is agentic or not, I think, kind of human or non-human. And if the non-human gets broken down into, like, bots versus smart agents, great. And I think there are times where maybe that means you need to go back to your log files.

I'm just not knowledgeable enough to know. I've seen enough posts where people say,

Katrin (35:59)
I think there's no way

you're not going back to your log files for this. There is just no way out of it.

Tim Wilson (36:05)
Okay. Is that probably because agents are going to be, like, not getting picked up at all?

Katrin (36:10)
The only way I can imagine

doing this today with the data the average brand has: average, you know, messy, whatever, somewhat good, somewhat not good, like what basically everybody has, right? It's obviously not going to be optimized to capture this well, and you're not even going to really know what to capture.

The only thing I can possibly think of is you go back to your log files and you do user-based analysis, basically, user-agent-based analysis, and you look at behaviors that are sort of atypical: consuming too many pages too fast, comparing too many things, doing too many actions that do not resemble a classic crawler,

Tim Wilson (36:42)
User agent.

Katrin (37:04)
are too fast for a human, and result in too many things that actually look like they're going to purchase if they could. So that's kind of the only thing I could possibly think of. And for that, you obviously have to go back to your log files, and that's doable, I would say, for pretty much everyone.
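
The behavioral heuristic Katrin describes, sessions that consume pages faster than any human plausibly could, can be sketched in a few lines. Everything here is a hypothetical illustration, not a tested detection rule: the session shape, the 30-pages-per-minute threshold, and the 10-page minimum are all assumptions.

```python
# Sketch of "too many pages too fast" session flagging over parsed logs.
# Thresholds are illustrative; real ones would be tuned per site.
from datetime import datetime

def pages_per_minute(timestamps):
    """Page-hit rate for one session, given its hit datetimes."""
    if len(timestamps) < 2:
        return 0.0
    span_min = (timestamps[-1] - timestamps[0]).total_seconds() / 60
    return len(timestamps) / max(span_min, 1.0 / 60)  # guard near-zero spans

def looks_agentic(session, rate_threshold=30, min_pages=10):
    """Flag sessions hitting many pages at a superhuman rate."""
    ts = sorted(session["timestamps"])
    return len(ts) >= min_pages and pages_per_minute(ts) > rate_threshold

# One hit every 2 seconds for a minute: ~31 pages/minute, gets flagged
session = {"timestamps": [datetime(2025, 1, 1, 12, 0, s) for s in range(0, 60, 2)]}
print(looks_agentic(session))
```

A real version would also look at the other signals Katrin lists, like comparison breadth and near-purchase actions, before labeling anything.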

Tim Wilson (37:14)
Yep.

Well, I mean, I don't even know when, in the log file... I mean, we're so far down the bot arms race that user agent became a completely unreliable field, 'cause it can be edited, and there are reasons that malicious, or even non-malicious, bots want to mask their user agents so that content doesn't get suppressed. But I don't even know how identifiable

Katrin (37:49)
But these... these probably wouldn't.

Tim Wilson (37:56)
in a log file, and this may be me just being old. Does Claude, if it's going to go hit a site with an agent, say: I kind of want to raise my hand and say I'm an agent? And therefore it doesn't have to be behavioral; it's just, look at this one value in the header, and Claude is raising its hand saying, I am an agent. Like, it could be a lot

simpler. I just don't know. And I mean, it may turn out that these dominant, what we think are the dominant agents, are very easy to identify.

So I don't know, and this is one where I'm just not knowledgeable enough; I haven't looked into it. I don't think it would be that hard to look into. But back in the day, when bots, like search bots, were crawling the site, they had kind of an active incentive to mask their user agent to look like a human. And I kind of wonder if, like,

Claude's bots care, or if they actually want to say: I will tell you.

Katrin (39:07)
I'd say it's probably the opposite,

they want to handshake with the website.

Tim Wilson (39:11)
Yeah, so I would think they may, and that's the question: whether they say, well, in order to get a handshake with the website, I have to write my user agent and claim that I am Chrome running on a Windows machine, or whether they say, no, I am happy to use my user agent to say, I am the Anthropic whatever-whatever. It's possible that before having to get to detecting

behavior to try to infer what is a bot, it may be that there's a much simpler way, not to get all of them, but... I just don't know; I should have done more research before this. But trying to get to the simplest thing: does it turn out that there are just user agents in the header that I can get from my log files, and get that very crude, just based on IP and user agent: how much traffic is coming from these bots?

Like, that's the minimum, if that's available. Yeah.
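
Tim's "very crude, just based on IP and user agent" first pass could look like the sketch below. The agent tokens shown are examples that vendors have published, but they change over time and would need to be checked against each vendor's current bot documentation; the log format assumed here is the common Apache/Nginx combined format, which a given server may or may not use.

```python
# Crude user-agent bucketing over raw access-log lines. Token list and
# log format are assumptions; verify both before relying on the counts.
import re
from collections import Counter

KNOWN_AGENT_TOKENS = ["GPTBot", "ChatGPT-User", "ClaudeBot", "PerplexityBot"]

# Combined log format ends with the quoted User-Agent string.
LOG_RE = re.compile(r'^(\S+) .* "([^"]*)"$')

def classify(log_line):
    """Return the matched agent token, 'human/other', or None if unparseable."""
    m = LOG_RE.match(log_line)
    if not m:
        return None
    ua = m.group(2)
    for token in KNOWN_AGENT_TOKENS:
        if token in ua:
            return token
    return "human/other"

lines = [
    '1.2.3.4 - - [01/Jan/2025] "GET / HTTP/1.1" 200 123 "-" "Mozilla/5.0 (Windows NT 10.0)"',
    '5.6.7.8 - - [01/Jan/2025] "GET /p HTTP/1.1" 200 456 "-" "Mozilla/5.0 (compatible; GPTBot/1.0)"',
]
print(Counter(classify(line) for line in lines))
```

As Tim says, this only catches agents that raise their hand; masked ones fall through to "human/other", which is why the behavioral angle still matters.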

Katrin (40:11)
So from what I know from

asking my friends, which means, you know, it's a bubble, obviously. And I say that after having said that we all live in a bubble. So, after asking my friends who are more technical than me around this question, the consensus seems to be that currently it's inconsistent, because most of these companies haven't really optimized for that yet, right?

Tim Wilson (40:17)
you

Katrin (40:36)
It's very immature in their model, in the go-to-market in general: the selling, their go-to-market towards brands, advertisers, anything that has to do with commerce, advertising, et cetera. It's really very much early 2000s. It reminds me of the early 2000s, you know, the first sort of steps into it, contextual targeting, things like that. And just not really very much thought

Tim Wilson (40:54)
Yeah.

Katrin (41:06)
put into: how do we appear to the brand, and how does all that process work? So it's inconsistent, depending obviously on which bot you come from, but also which platform you used the bot on: are you using the desktop app, are you on your mobile, are you using it on the web, et cetera. So it doesn't necessarily appear in the same way. And it's also not very stable in time, because they change a lot of things all the time. So things change. Currently it's not necessarily, yeah.

Tim Wilson (41:33)
But I would argue that means I

don't care that it's not stable in time. I mean, that's another pitfall that businesses fall into: like, we've got to, we better build up our total record. They have this delusion that it's going to matter what their five-year trend for bot usage, or agentic bots, on their site is. So if they're not thinking... like, there's readily available secondary research so that I can generally get a sense of:

Katrin (41:52)
yeah, no, it won't, obviously.

Tim Wilson (42:01)
What are the five most likely agents? Let me go and look at those and get a scale. Because it's just so easy to fall into the, yeah, but this new little upstart out of the Bay Area, they figured out this way and they're way more sophisticated. I'm like, well, yeah, but 12 people are using that, so I don't really care. Right now we're in the broad brush. I think the mobile parallel that you had is like, that's like,

Katrin (42:06)
yeah. Yeah, yeah.

Yes.

Tim Wilson (42:30)
I'm going to be using that. Like, that's genius. There weren't a million different devices. By the time there were a million different devices, a nice tight responsive website would do in the place of an app many, many times. But in the early days, you're like, what are the top five, top three smartphones? I just want to go and use that as a proxy. It is a conservative proxy, because there are corner-case ones, and

you know... but so I think there's an opportunity to say, if they're not really thinking about it, then it may be really easy to suss out. It's like when Juliana Jackson and Jason Packer and Sonny were involved in figuring out that ChatGPT was literally passing entire queries into Google Search Console. 'Cause

Katrin (43:24)
Yeah.

Mm-hmm.

Tim Wilson (43:27)
yeah, they're sloppy. They're not thinking about it. They're chasing a bunch of other stuff. They're not thinking about: does my user agent need to cater to a question that a thoughtful business would be asking? Maybe.

Katrin (43:38)
They also really

don't have that much control over the application they build around the model, right? Those applications are... in fact, there is a floofy episode coming about that. I have built what I call the floofy factory around the model, and it kind of explains all the API gateways and the scanning of files and unpacking of files, and where all of that goes, and what the security risks are around that. And the reality is, those companies just don't really control that architecture.

Tim Wilson (43:49)
Yeah.

Katrin (44:07)
It's too complex. And so you have copies of data that stay in those layers. They don't really know where they are, and those copies persist. So those leaks are happening and are gonna happen.

Tim Wilson (44:23)
No, yeah.

So it's interesting, though. I mean, I kind of love this. Everyone thinks about the measurement; either they over-index to it too early and they over-invest, because they don't want to make the same mistakes of the past and not track enough stuff. That's misguided. The flip side is the people who are kind of generating the data, the companies,

they're not thinking about what that looks like. So it feels like it should be an opportunity to, again, find the quick and dirty. Does this need to be feeding into a dashboard updated in real time with exactly how many agents are coming to our site? Or does an analyst have a Python script that runs once a week and spits out a couple of static charts for now? Like, minimum viable, 'cause it's not changing

Katrin (45:19)
Mm.

Tim Wilson (45:21)
week to week. Some huge announcement comes out and they're like, this changes everything, it should blow everything up? Cool. The analyst goes and runs their Python script a second time that week to see if it actually is affecting that website. So that's one part of the minimum viable. The other part, I think, is actually understanding where your consumers' heads are when it comes to adoption. And that adoption curve is likely going to hockey stick, but I do think it's also

Katrin (45:32)
Yeah. So, yeah.

Tim Wilson (45:51)
different if you take who your consumers are and what you're selling to them, that intersection. I think you can fairly easily find out that it is very different from whatever the average of the masses is.
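
The "Python script that runs once a week" Tim mentioned a moment ago really can be minimal. A sketch, assuming some upstream step has already classified each hit as agentic or not; the rows, field shapes, and weekly grouping are all illustrative.

```python
# Minimal weekly rollup: share of hits classified as agentic, by ISO week.
# Input rows are stand-ins for whatever the upstream classifier produces.
from collections import defaultdict
from datetime import date

hits = [  # (hit date, is_agent) pairs
    (date(2025, 1, 6), True), (date(2025, 1, 6), False),
    (date(2025, 1, 7), False), (date(2025, 1, 13), True),
]

by_week = defaultdict(lambda: [0, 0])  # (year, week) -> [agent hits, total hits]
for d, is_agent in hits:
    week = d.isocalendar()[:2]  # (ISO year, ISO week number)
    by_week[week][0] += int(is_agent)
    by_week[week][1] += 1

for week, (agents, total) in sorted(by_week.items()):
    print(f"{week}: {agents}/{total} hits agentic ({100 * agents / total:.0f}%)")
```

Swap the hard-coded list for a log read and the prints for a couple of saved charts, and that is roughly the once-a-week job Tim describes.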

Katrin (46:03)
Obviously, yeah.

Depending on your segment of the market it's going to be different, but I agree with you. I think it's going to hockey stick. Absolutely. We're at the very beginning, and there's a point where you can just feel it with these things, when you've seen enough of them. And this one, I'm pretty sure, is going to hockey stick. But I want to move away from the mobile example, I know you love it, and switch to the Amazon one, because I actually think this is just as important, if not more,

Tim Wilson (46:27)
example.

Katrin (46:36)
because it really is a very, very different, and not good different, level. One of the first things that I learned when I moved to the US is that when Americans say interesting or different, it's generally not good.

Tim Wilson (46:56)
You

Katrin (46:59)
So it's very, very different from the relationship that most brands have with their customers today. And so, in your experience working with brands, first of all: how much of what they call the customer relationship is actually built on behavioral data that they won't access anymore, or part of which they won't access anymore? And how much is more of a sort of tribal-knowledge guesstimate, et cetera, practically?

Tim Wilson (47:28)
I feel like I'm going to beat the same drum again, but I think it applies. I do have kind of a funny story. I was working with... it was long enough ago now that I can probably say CPG. It was Purina for a number of years, and they kept talking about... actually, I had an experience with Procter & Gamble where they said, we're going to do e-commerce. I'm like, why? I am not interested in buying.

I'm not going to come to your site; I've got other outlets. And their idea was, and this was like pre-Amazon, they said: we're going to set this up, and that way we will learn about consumer behavior on these true e-commerce sites, you know, Gillette.com, I don't know if that was one of them. And we will learn about consumer behaviors, and then we'll use that to go to our channels, to the Walgreens and CVS and

Katrin (48:02)
You'll go to Amazon. You'll go to Amazon. You'll go to Amazon. Yeah. Okay.

Tim Wilson (48:27)
Kroger or whatever, and help them improve their e-commerce experiences. Totally, fatally flawed, because it's such a weird skew of whoever is buying on the brand's site. So then I was working with Purina a number of years later; Amazon had really come up, and they kept talking about e-commerce. And they did have one weird little actually-owned e-commerce site, and I was like, what do you mean? What are you talking about? And it took me a long time. It was like, oh, they're talking about Amazon as e-commerce.

So this is all just advertising that I can be clueless well into the adoption of something and not recognize where it falls. But the hand-wringing around losing all of this behavioral data kind of goes hand in hand with the idea that the nirvana was having hyper-fidelity, user-level data. That's what spawned

much of digital analytics: customer data platforms, customer journey analytics, the 360-degree view of the customer. All of these things are this impossible pursuit of, if we could just get to one-to-one marketing, right? That's 30 years old at this point. Yeah, at scale. So all of that... there are so many of those out in the zeitgeist that it just feels like a pain

Katrin (49:35)
Yes, that one.

scale.

Tim Wilson (49:55)
when there's a channel shift and you're gonna lose that data. And CPG companies have always wrung their hands about, like: we're gonna offer coupons, and that way we can get a person's contact, so we can get them as an individual in our universe for the baby formula that we're selling. And it's like, to what end? That's not really gonna... So I think that this is where we still are under

this idea that customers want to have a direct relationship with the brand, and that if you could force them to have that direct relationship with the brand, you're somehow going to, at scale,

move them because of that. That's a super expensive thing to have, as opposed to... yeah. Okay.

Katrin (50:40)
So that's actually not really what I'm thinking of. What

I'm thinking of is: you talk to any senior marketer who's in any domain and who's been in that domain for a while. Generally, when they're at that level, they've been in a certain segment, or roughly speaking a certain domain or industry, for a while.

They know more about selling whatever they're selling, and their competition, and the business, and where the business makes money, and all of those things. Basically the subject of your latest episode, which I recommend: The Analytics Power Hour, for those of you who don't know. It also has an excellent ad for Ask-Y. I suggest you listen to the episodes just to listen to the ads for Ask-Y; they're worth it.

Tim Wilson (51:30)
that.

Katrin (51:33)
So anyway, so, you know, they have all of this knowledge about the business, about the behavior of the consumers: what they want, why they're buying their product versus the other, et cetera, all of that. And that has been built over years, decades, based on talking to consumers, but also data, data from different...

Tim Wilson (52:01)
Hmm.

Katrin (52:02)
different aspects. Yes, data about revenue, data about when revenue goes up or down, not just data from the website, right? Generally, facts, numbers that they kind of gather. Part of that is something that is built over time and keeps coming from the stream of web data. And...

My hypothesis is that if that stream starts being seriously cut off from that early journey part, you lose information, obviously. You lose information about that segment of customers, and that segment of customers is going to become larger, because I think it's going to hit a hockey stick.

As a professional, you're not going to lose your knowledge of the customer fast, because you have these years, decades, whatever, of knowledge, and you're still going to know what's happening for a while. But there's a point where you're going to need to get some facts in order to adjust your knowledge. That's where I'm not sure where and how that gap happens.

Tim Wilson (53:22)
I challenge that, and this is, I guess, anecdotal, just based on the different brands I've worked with. I've kind of learned that there's an inverse correlation when somebody announces to me proudly, as a marketer or a product or a commerce person...

The more proudly they proclaim that they are data-driven, the less...

Katrin (53:55)
But no, specifically not

those people. I'm talking about specifically not those people. I'm talking about the people who actually just, you know, build the knowledge, and data is just part of it. Like the professional that...

Tim Wilson (53:59)
Well...

But I don't think

the data is as much of it. I mean, I honestly think it's the ones who have an intuition and who are actually more interested in, and I'll say, talking to the customer. And it is easy to spin that around: well, talking to the customers is watching all the shit they click on your website. That is the least effective way to listen to the customer in many, many cases. And I think there's...

There are many people who say, I have built up all this rich knowledge, when part of that is because I've had this feed of these dashboards that come through, and they...

Katrin (54:44)
I'll give you an example

of what I mean. One of the design partners of Ask-Y actually was already a design partner of Datorama. And I met her before that, when we were both agency-side. That gives you the span of how long she's been working in this industry and how long we've been working together. A specialist in education, education marketing: for universities, online programs, these sorts of things, right?

She gets a new job, I don't know, three, four years ago, something like that, gets into a company, and realizes very quickly that everything CRM-related is stale. There has been no investment into renewing the user base. And you're like, okay, well, I need to invest in awareness. She just knows that, because she has 20 years of experience. She looks at

three numbers, three response curves, and she's like: yeah, okay, well, this is very clearly what is happening, right? She knows that because she's seen enough of this over time that she knows what it looks like. And when she sees one, she feels the business; she knows it. Now, obviously, in order to get the investment, there are some proof points with data that need to be built. The reality is, it's her intuition

that is making the decision because of her experience. That experience wasn't built on nothing. It was built on an accumulation of experiences with customers and data and different businesses.

Tim Wilson (56:14)
And I don't think I'm going to win this. I don't...

I don't disagree. I don't think it was built because she was poring over detailed CRM data, detailed digital analytics data, detailed... the stuff that we're pouring so much money into. Information, yes. But I would go to the mat: it was, I would guess, we ran this... I remember this one campaign we ran that surprised us 'cause it was so effective. And then we thought about it,

and we talked to some people, and this was why it was effective. And that piece of knowledge went in and is anchored permanently and is useful. The story that gets told is like, no, no, no, because at that other place we had a 360-degree view of the customer and we were always poring over and building the segments. This is from working with companies: the number of times they have built a next-best-action program that goes nowhere, the number of times they have wrung their hands around how they should be

segmenting. And who is it? It's the people who are actually spending more time thinking about who their customers are, what their wants and their needs are, and not saying, I'm picking that up by watching their behaviors and then assuming I can apply some motivation to the behavior. They're getting closer to the customer. My one other defensive... my defense

anecdote is: when I've worked in digital agencies, and I've watched this with multiple brands, every time a marketer goes and sits and watches a focus group or a usability study with some tiny little number of their customers in an artificial environment, they come back and they're like, oh my god, we have learned so much. And I'm like, and you spent way less on that than you spent

on upgrading to the latest CRM, on investing in integrations with pixels all over the world, and wringing your hands about first-party versus zero-party data. Which is not to say I'm opposed to doing that. I just think that it's revisionist history to say how much we index to hard behavioral data being what builds up that intuition. I think it's

Katrin (58:39)
And I'm smiling because I'm very happy because I got you into a real rant.

Tim Wilson (58:39)
thoughtfulness and intuition. Yeah.

Katrin (58:46)
And you know what? I will ask her and I will get back to you with an answer.

Tim Wilson (58:51)
I don't

know that she's going to be reliable. Like, asking somebody where do they... I couldn't defend exactly. I mean, I could give you a bunch of anecdotes as to how I've come to this position, but we're unreliable narrators of our own lives, and society has pushed so much to us that we are supposed to be data-driven, that I would claim we're kind of incentivized to tell ourselves a story that this was, you know, from a proper analytical perspective.

Katrin (59:04)
That's true.

I can tell you

that from my perspective, I've obviously worked across a ridiculous number of industries, clients, et cetera, et cetera, right? Because I'm on the data side. I'm literally on the analytics side. So I have literally pored over, mostly to fix them, an inordinate number of these things. And for the most part, I look at...

two or three pieces of information, like an ad report, a CRM report, whatever, and I get a really solid picture of what the business is actually about. Obviously, together with the fact that I also know these kinds of businesses, because I've talked to senior people in those businesses and I understand how they work. One without the other would say absolutely nothing, would really not work at all.

But really, when you've done it a lot, I think you do build some level of just automatic intuition about it.

Tim Wilson (1:00:28)
Yeah, and my recognized bias is that I'm mostly working with companies that are raising their hands that they have a problem. So my sample of the marketers and analytics teams and brands that I've worked with are ones who are saying, we are not getting the value out of the data. And this goes back

probably the last 15 years. So I'm not getting approached by the ones who are crushing it, you know.

Katrin (1:00:57)
Makes sense. Yeah, no, that makes total sense.

So anyway, that was not what I wanted to ask you about this. What I really wanted to ask you about is what perspective that puts on loyalty programs, in your opinion. Because it really does have an impact, I would say, if you are in a segment where

Tim Wilson (1:01:05)
Hahaha

Katrin (1:01:22)
the customer wants to have a relationship with the brand, or you as a brand think it's really valuable, you know, it makes sense. Then those programs take on a whole different importance, given that you're losing a whole bunch of what comes before.

Tim Wilson (1:01:28)
It makes sense, yeah.

Yeah. And I mean, I think that's another one: thinking through, what is the value exchange for the loyalty program? Back when I worked with Purina years ago, I think it was Cat Chow, one of their cat brands, had a loyalty program with its own website. And that makes sense. People are devoted to their cats. They can become very brand loyal. There's lots they could get with learning. And that made a lot of sense. I was like, that's a smart move by that

CPG brand, and then they had to kind of lean into their branding of being, like, we're the kind where we can offer you value. Then I turn around and look at Ford. I drive an F-150. My wife jokes about whether it's our dog or the Ford that's number one on my list, and I've got a wife and three kids that are battling it out for third and below. So, very loyal to my truck, but I'm like,

the FordPass program, I'm like, what are you gonna... am I gonna build up enough points to put towards my next Ford? That one seems like...

Katrin (1:02:40)
Yeah.

Tim Wilson (1:02:48)
what is the value there? So I think for a brand to stop... I mean, I've definitely watched other loyalty programs, and I'm like, you're not thinking about your customer; you're thinking about what you would like to have. So if you wanna stand up a loyalty program, you better really think through how you're gonna reward their loyalty through the loyalty program. But then, I agree, it's like: I don't care where you buy, as long as you're

loyal to us, and you have an incentive to come tell me that you bought again, and I can get you to spread the word, or, you know, whatever, I think. I don't know if I took off on a tangent again.

Katrin (1:03:22)
But for the Purina people, right?

For the ones where it actually does make sense. Obviously, there's a bunch where it doesn't make sense, right? But let's say you're a brand where it actually does make sense. It's a problem.

Tim Wilson (1:03:37)
Now why is it a problem?

Katrin (1:03:38)
I mean, you're

losing all of that information pre-purchase. So now, if you are going to optimize your loyalty program and that loop into advertising, marketing, and commerce, you have lost a whole bunch of information about a whole bunch of people who are buying on your website and are part of your loyalty program. You don't even know how they research you.

So like.

Tim Wilson (1:04:08)
Except

they're loyal, right? Like, I mean, it's definitely cheaper to keep them.

Katrin (1:04:14)
everybody is not loyal

the same way, right? The whole goal of the loyalty program is to cultivate that loyalty. And for that, you're typically closing the loop.

Tim Wilson (1:04:27)
Yeah, but it's just... I don't know. And maybe I just have not

thought about it. I mean, if I'm selling apparel and you can be part of, or choose, the DSW loyalty program, all I need to do is make sure that my agent that's shopping for me has my DSW Rewards, whatever it's called. I have trouble worrying too much about

that, honestly. Maybe I'm missing the concern. I guess, broadly, I'm not really concerned about data loss. I'm rarely concerned about data loss. I'm more concerned about people just taking it as fact: well, of course losing data is bad, and we need to try to lose less data. Like,

the entire history of digital has been investing increasingly more to slide backwards at a slower rate on the data loss, because we believe that if we just reach the top of a mountain of complete data, then the hard work's done. I've never seen it; that's not where the hard work is. But we spend so much time climbing that mountain that we

don't stop and say: what is it we're really trying to do, and is there a way to do it in a completely different way, with less data? So I'm a contrarian.

Katrin (1:06:02)
Yeah, that makes sense.

And so, well, you know, let's go back to my third metaphor, the GA4 migration. So you went through, obviously, mobile re-platforming, Amazon, blah blah, GA4. Lessons learned that could be applied to this?

Tim Wilson (1:06:12)

That even Google can shoot themselves in the foot without proper product management? I don't know. That's probably not the...

Katrin (1:06:28)
That's a very good lesson. No, I mean, given the level

of maturity of the companies that we are working with in this, you know, it's true, right? Google is a very mature company, and they did this migration, and it was, yeah, suboptimal, let's say. So it does sort of set, you know, the bar as to what to expect here. It's not going to be good.

Tim Wilson (1:06:46)
Yeah.

Yeah. I mean, it opened doors to a bunch of competitors. Jason Packer's new second edition of his Google Analytics alternatives book, by the time this comes out, is available. I highly recommend it, partly because he spends the first half of the book talking about what is it we're really trying to do, as a framing for tool selection. 'Cause I think what happened with GA4, the cranky bastard's interpretation, is: one, people don't like change.

Katrin (1:07:19)
Mm-hmm.

Tim Wilson (1:07:26)
And two, they assumed that everything they had built up over time with their Universal Analytics implementation definitionally needed to be available in the new implementation, as opposed to saying: you know what, what if we started from the premise of, we have no data? What's the base level that we need? And there are a million tools out there, which doesn't mean buy the cheapest tool. But the migration was

just gold for the consulting industry. And even the consultants kind of hated it. They're like, this is easy money, but it's painful that I'm doing it. So I'm not sure which of my rants... yeah. Yeah.

Katrin (1:08:07)
Let's go back to the analysts because we

actually, you know, Knowledge Distillation, we cater to the AI analysts. So, basically, analysts who use AI to help their workflows, and who also analyze AI. I'm just broadening the definition, because it's my podcast and I do whatever I want. Skill-wise, you know, like,

Tim Wilson (1:08:24)
Hahaha

Katrin (1:08:31)
there's a lot changing these days, faster and faster, and there are upskilling needs, whatnot, et cetera. There's a lot to learn. People have finite time. If you think about that GA4 migration, and what's probably coming with this hockey stick hitting agent e-commerce, and you think about what skills matter for the actual analyst, right:

What skills transfer? What skills do not transfer? What would you say? Where would you go with that?

Tim Wilson (1:09:07)
I think the skill that transfers, but is just woefully underdeveloped, is engaging with the business and not looking at the data at all until there's been a fully internalized, clear, clear, clear understanding of what the business problem is. And the business problem is not agentic AI.

The business problem, like, it's gotta be framed as something that directly matters to the business. And I think that skill, totally transferable. I think the technical skills, SQL, Python, and a deep, like, an understanding of the data that's being generated, which I do think,

is a different type of traffic: how to identify it, where does it get captured, where does it not. I think, like, I need to understand the processes that are generating the data, and hey, there are new processes generating the data. I need to have an understanding of those: where the data lands, what it looks like. I think Ask-Y is super valuable because it's like, well, we don't have to completely gut everything, and we don't have to make GA4

Katrin (1:10:28)
Thank you.

Tim Wilson (1:10:35)
identify agents. Because to do that, we'd have to put this other thing on the site that kind of hacks around through some measurement protocol, whatever. If we instead say, no, you know, what we can do is get the log files going into the data warehouse. We can bind it. We can get kind of a messy query. But now that query is reliably there. And in terms of skills, this is going to sound like I'm doing an ad for Ask-Y. Yeah.

Katrin (1:11:00)
Which you do on a regular basis.
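
The log-file approach Tim describes, classifying warehouse-landed hits by user-agent rather than bolting new instrumentation onto the site, can be sketched in a few lines. This is a hypothetical illustration, not anything from Ask-Y or the episode: the agent signature list and the function names (`classify_hit`, `agent_share`) are assumptions, and real agent identification needs far more care than a substring match.

```python
import re

# Illustrative (not exhaustive) signatures of known AI agents/crawlers,
# matched against the user-agent field of raw server-log lines.
AGENT_SIGNATURES = re.compile(
    r"(GPTBot|OAI-SearchBot|ChatGPT-User|ClaudeBot|Claude-User|PerplexityBot)",
    re.IGNORECASE,
)

def classify_hit(user_agent: str) -> str:
    """Label a single log line's user-agent as 'agent' or 'human'."""
    return "agent" if AGENT_SIGNATURES.search(user_agent) else "human"

def agent_share(log_user_agents: list[str]) -> float:
    """Fraction of logged hits attributable to known AI agents."""
    if not log_user_agents:
        return 0.0
    hits = sum(1 for ua in log_user_agents if classify_hit(ua) == "agent")
    return hits / len(log_user_agents)
```

The point is the one Tim makes: a query like this over log files may be messy, but it lives reliably in the warehouse instead of depending on an on-site hack.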

Tim Wilson (1:11:05)
So I think, like, the analyst having intuition about the data and the scrappiness to say, can I get to good-enough data? Whereas analysts who have just grown up saying, well, whatever the GA interface tells me, I'm going to take it at face value, they've always struggled, because they get burned, because they don't really understand what the data is. So I think all of that's transferable.

Katrin (1:11:32)
So let me see if you disagree with me on this. I've been thinking about what's important to shift to being an AI analyst, as in an analyst who uses AI effectively. Because I don't think you can ignore it. It's there. As an analyst, you really do need to upskill, because it just makes you so much faster and broadens your remit.

it really goes full stack, more technical and more business. So it's just not ignorable, right? So my thinking is one, you need to actually truly understand how LLMs work. To me, that's number one. You need to know what the tool that you're working with, you need to know how it works. That's actually what inspired me to do the Fluffy series because I felt like this is so dry. I cannot be...

Tim Wilson (1:12:24)
If we make

it animated little critters, then yeah.

Katrin (1:12:27)
It's, there

you go. It's much faster. I mean, honestly, would you want to listen to me talk about context windows and attention mechanisms for like, you know, two minutes? Honestly, no, I would fall asleep. So, you know, one, really understand LLMs. Two, I don't necessarily think that, I think, you know,

coding is always useful and you should know how to code as an analyst, right? And maybe you'll become a little less good at it because you practice it less. However, you need to become really good at reading code because if you generate code, you need to read it.

Tim Wilson (1:13:15)
That's one of those, it's not newly thought-through territory, but I do very much worry about it. I code, and have coded enough and long enough, still somewhat hacky but with a decent level of intuition, that I find the tools to be really, really useful. I do definitely worry, because I can have the AI read my code, read the code, and generate the code.

Like, that intuition development, I can't get my head wrapped around it. I'm in the camp of the fearful. For someone who has developed a level of intuition and knowledge about coding, an AI assistant really is an augmenting tool, a really, really powerful tool that just makes you faster and better. And I just...

I struggle, I struggle with that. I look at my son, I'll try one more time. My son, who's now 26, got his master's in computer science. So timing-wise, he basically came up, was literally in school with the guys who founded Cursor, which means he was kind of learning it all the old way. And then he was a junior

software engineer doing stuff the old way for just a couple of years and now he's plugged into that stuff and I'm like he is the last one who I'm gonna have a lot of confidence that he had to go through the hard part of learning how to code and now he just has this secondary tool. I don't know what the kids who are five years behind him, how they're gonna play out. That's probably also totally unrelated to this but you... ⁓

Katrin (1:15:08)
No, actually, I think that's completely

related, because I think writing code, learning how to write code, how we're writing code and architecting systems, but writing code specifically in this situation, versus reading code and assessing code, they're two different skills. They're both valuable, and, like, I really do think they're two completely different skills.

And I say that because, while I'm very bad at writing code, I've actually become quite decent at reading it. Yeah.

Tim Wilson (1:15:41)
Reading. Maybe that's

a good one. If I go way back, like the early days of web analytics, where as an analyst my core remit was to do stuff with the data, I learned pretty quickly I needed to understand how the internet worked and how JavaScript worked. And never... well, I probably have pushed a couple of things to production in JavaScript, but you do not want me to be writing JavaScript. But I got pretty comfortable with reading it. Way simpler than wrapping your head

around an LLM. Debugging, whatever the equivalent of getting under the hood of an LLM is. But yeah, I think you're right. Like, you have to have some fundamental understanding of, like, how is this working? How is this going on? Which is both the data generation and, I think, the code as well. Like, if you can't read through it and say, okay, this block, I don't know what that syntax is or why there's a semicolon there, but it's generally doing this. And then it's doing this.

Katrin (1:16:41)
And the last thing I would say is prompt and context engineering. You really need to understand that, because it's really the new low-code sort of language to get what you need out of LLMs. And I tinker a lot with, obviously, video generation, and a hell of a lot with everything else, generation, obviously, right?

Tim Wilson (1:16:41)
I

Katrin (1:17:08)
And so actually our multi-agent framework is built with a language that our CTO invented. He sort of built the framework so that low-code individuals like me, for example, or our product team, who are not engineers, right, as a product team, are able to build the entire multi-agent framework themselves. So it's basically development work, because you have to

think like a developer, you have to architect, and the language is low-code. So we are able to build it, because I wanted to build that with people who were the closest possible to the business, to the customer, because, yeah, like, you really want the domain knowledge to be as close as possible to that framework as you can. And so I know firsthand how prompt engineering really,

really, really makes a difference. I mean, across the board. And it's a skill. And you really do need to understand how the LLMs work in order to make that skill work well. Even today, and as long as we're in the transformer architecture, I don't see a way out of it, because you're always going to be stuck in the context window and attention mechanism sort of conundrums. And that forces you to really understand how you

can manipulate those hooks and the LLMs to get them to do, as deterministically as possible, what you want. And so, for that reason, I think prompt and context engineering are really the new SQL that you really need to get good at.
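
The context-window constraint Katrin is describing can be made concrete with a small sketch: when there is more candidate context than fits in the window, a context-engineering step has to budget and prioritize what goes into the prompt. Everything here is an illustrative assumption, not a real tokenizer or framework: the `Snippet` type, `build_context`, and the rough four-characters-per-token estimate are all made up for the example.

```python
from dataclasses import dataclass

@dataclass
class Snippet:
    """A candidate piece of context competing for space in the prompt."""
    text: str
    priority: int  # higher = more important to keep

def estimate_tokens(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token for English text."""
    return max(1, len(text) // 4)

def build_context(snippets: list[Snippet], budget_tokens: int) -> str:
    """Pack the highest-priority snippets that fit within the token budget."""
    chosen: list[str] = []
    used = 0
    for s in sorted(snippets, key=lambda s: -s.priority):
        cost = estimate_tokens(s.text)
        if used + cost <= budget_tokens:
            chosen.append(s.text)
            used += cost
    return "\n\n".join(chosen)
```

The design point mirrors the conversation: because the window is finite, deciding what to include, and in what order of importance, is a deliberate engineering step, not something the model does for you.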

Tim Wilson (1:18:53)
Let me ask you... I agree. And, as a result, I also get kind of frustrated with the people who are like, no, no, no, the AI is going to get to where I can just ask it for my insights. Like, is it fair? My premise is that the people who are better prompt engineers and context engineers are the ones that have clarity of thought first. Then the skill that they're learning is how do I get

Katrin (1:19:07)
Maybe.

Tim Wilson (1:19:22)
the thought into a prompt effectively.

Katrin (1:19:25)
I think

that's true for everything, right? If you don't have an intention and a clear purpose and a plan of where you want to go, you're going to be less effective. And the more you have the

habit of building a plan and following the plan, the better you're going to be at this, like everything else, right? It's the same for an analytics project. If you're not very good at segmenting it into steps and going, I'm going to do this, this, this, this, this, and this, then yeah, you're going to be less good. I think that's absolutely true for everything. I think...

Tim Wilson (1:20:03)
But we're not very good at

teaching or looking for that. That's another one of those that's, I think, similar to where we were. We don't really screen for that effectively. I don't know what's teachable versus what isn't. I get very quickly to... somebody goes to a prompt engineering course and it's got all of the right information, but there may be kind of a missing...

a missing upstream component, which I think runs into analysts too. The analyst wants to just dive in and start pulling the data and they're missing that upstream component.

Katrin (1:20:36)
That's true.

That's very true.

Yeah, I think that's true. And it's something that's actually reflected back to me by a lot of analytics leaders I talk to: one of the really difficult things for analysts is planning the analytics work, planning the steps and the sort of logic of how this all chains into the result. Yeah, so that's one that I will add to my list: people should learn how to plan.

Tim Wilson (1:21:07)
Yeah. Learn

Katrin (1:21:10)
So, I mean, this has been, yeah,

Tim Wilson (1:21:11)
how to think. Yeah.

Katrin (1:21:12)
we have, well, yeah, really learn how to plan. Let's not be that dismissive. So I definitely learned a lot. Thank you. I learned some brilliant things that I'm absolutely going to steal, and attribute when possible, obviously. But do you have something that you want people to walk away with, something to summarize?

Tim Wilson (1:21:25)
Likewise.

I

Like, I'm one cranky, cranky person. I don't think anyone's going to go wrong... I guess two things. One, absolutely. Okay, okay. That's the twofer. Really the twofer, because all my last calls are.

Katrin (1:21:41)
You can do a twofer if you want.

You know I learned that word from you. I didn't know the word. Yeah,

exactly. I didn't know the word and then it took me a few of the episodes to understand that was not a mistake, that's an actual word.

Tim Wilson (1:22:15)
Yeah, but somebody will label it as something that shouldn't be a word, but yeah, two for one. I think diving into as many places as possible and having fun and getting comfortable, like, I think spending time with this new world is kind of critical. And that can be kind of undirected, just trying to immerse yourself in, like, what is the new paradigm?

is great. I think my second one, though, is still to resist the urge to dive deep into solutioning the data architecture or the instrumentation or the data querying until there's real, real clarity on, like, very specific business things that you're worrying about. So that's kind of like my broad soapbox

rant just because I think we're going to stumble into the same mistakes again.

Katrin (1:23:19)
I'm sure we will, definitely, I'm

sure we will. And for people who want to learn more, follow your work, get help with their measurement strategy, learn how to ask themselves the right questions, where should they go?

Tim Wilson (1:23:34)
They should just find me on LinkedIn, although Tim Wilson is not a super uncommon name. There is my book, Analytics the Right Way, which I co-wrote with Joe Sutherland; many of the themes we talked about today are in that. My company is factsandfeelings.io. Listen to the Analytics Power Hour podcast. Yeah, you'll be sick of me

long before you get through any of those. Yeah, no.

Katrin (1:24:12)
So we'll put all of those in the show notes. Thank you very much. I had a blast. And that's it for episode 10 of Knowledge Distillation. If today's conversation made you think about how AI is changing data and analytics, visit us at ask-y.ai and try Prism. I should really record this and not do this every time. Thanks for listening. And remember, bots won't win.

Tim Wilson (1:24:18)
It's always a blast talking to you.

Katrin (1:24:42)
AI analysts will.

Resources Mentioned:

Companies & Organizations
  • Facts & Feelings – strategy and experimentation consultancy
  • Search Discovery – analytics consultancy where Tim held a senior role
  • Analytics Demystified – digital analytics consulting firm
Analytics & Measurement Platforms
  • Google Analytics 4 (GA4) – referenced in the context of migration lessons
  • Universal Analytics – legacy platform discussed in measurement transitions
  • Customer Data Platforms (CDPs) – discussed in relation to 360° customer views
AI & Commerce Platforms
  • OpenAI Instant Checkout – referenced in the agentic commerce context
  • ChatGPT – discussed in relation to AI adoption and commerce behavior
  • Claude – referenced as an example of an LLM interface
Media & Publications
  • Analytics Power Hour – podcast co-hosted by Tim Wilson
  • Analytics the Right Way – book by Tim Wilson & Joe Sutherland

Connect with Our Guest:

Host name:

Katrin Ribant

Episode Credits:

Host: Katrin Ribant Guest: Tim Wilson Podcast: Knowledge Distillation
Episode: 10 Runtime: ~85 minutes Release Date: 02/20/2026