# 11: June Dershewitz (Co-founder: InvestInData, Web Analytics Wednesdays, Former President: Digital Analytics Association) on 25 Years of Watching Analysts Reinvent Themselves, Why Data Leaders Are Betting Their Own Money on the AI Stack, and What Community Building Taught Her About the Skills That Actually Survive Every Disruption

Published:



In this episode of Knowledge Distillation, Katrin Ribant talks with June Dershewitz, one of the original voices of digital analytics. June started as a web analyst at a San Francisco startup in 1999, back when the job had no template, and has since led analytics teams at a series of highly successful companies. She co-founded Web Analytics Wednesdays, is a former president of the Digital Analytics Association, founded Invest in Data, an angel syndicate backing early-stage data startups, and writes the Measurecraft newsletter on Substack.

This episode looks at what 25 years of hype cycles, from the first dot-com boom through mobile, Amazon, and GA4, can teach data professionals about the AI shift. June shares early results from her LinkedIn poll on how data work is changing, whether toward sense-making and judgment, getting more done faster, or coordination and orchestration, and why sense-making is the part she is rooting for. Together they explore why analytics is lagging behind software engineering in its AI transformation, the two lanes of the AI opportunity for data teams (building AI into customer-facing products versus internal process improvement), how agentic e-commerce will force websites, data layers, and measurement to serve both human and bot audiences, and why knowing the business remains the skill that survives every disruption.

All episodes on our website: www.ask-y.ai/knowledge-distillation-podcast

Learn more about ASK-Y: www.ask-y.ai

Chapters:

  1. 00:00 The Evolution of Web Analytics
  2. 05:25 Hype Cycles and the Future of Data Analytics
  3. 10:25 Upskilling in the Age of AI
  4. 15:27 AI in Data Teams: Current Trends and Future Directions
  5. 20:23 Connecting Analytics to Business Value
  6. 23:40 Navigating Human and Bot Audiences
  7. 25:32 The Shift in Customer Experience
  8. 27:04 Understanding Customer Behavior
  9. 29:06 Top-Down vs. Bottom-Up Approaches
  10. 30:54 Efficiency vs. Innovation in AI
  11. 33:19 Upskilling for the AI Era
  12. 36:14 The Importance of Data Modeling
  13. 38:10 Data Quality and Trust
  14. 42:05 Context Continuity in Data Processes
  15. 45:19 AI Skills in Data Team Goals
  16. 51:26 Investing in Data Technologies

Katrin (00:00)
Welcome to Knowledge Distillation, where we explore the rise of the AI analyst. I'm your host, Katrin Ribant, CEO and founder of Ask-Y. June Dershewitz is one of the OGs of our industry. June, I think you started as a web analyst at a San Francisco startup in, may I say the year? 1999?

June (00:18)
You may.

It's true.

Katrin (00:23)
Back when

there was literally no template for the job. I mean, literally there was nothing, not even a user manual. Most of the tools didn't work.

June (00:31)
that

I mean, if there were any tools. I think we got reports once a month. And it was uphill in the snow both ways.

Katrin (00:34)
if there were any tools.

Wow.

Although

one might argue most people haven't evolved a lot from that time, having reports once a month. Well, since then you've led analytics teams at various highly successful companies, and you co-founded Web Analytics Wednesdays, so if you've ever been to one, you have June to thank for it. And if you can get to one of those, I really recommend you do.

June (00:46)
Yeah, we still have a long way to go. I know.

Katrin (01:07)
You served on the board of the Digital Analytics Association and you founded Invest in Data, which is an angel syndicate investing in early stage startups. You also write the Measurecraft newsletter on Substack, which I also highly recommend. So June, welcome. Thank you. We are actually both together here in the same room and that's kind of like a little bit of a first for us.

I'm lying, we're not exactly in the same room because we couldn't make the audio work, but we're almost in the same room and we can have a drink together.

June (01:42)
Cheers.

Katrin (01:44)
Which is amazing. So before we dive in, what was that like being a web analyst in a startup in 1999 of all years? Like the hype at that time, I can't even imagine in San Francisco, right?

June (01:45)
Mm-hmm.

Yeah.

It was wild. It was the first dot com boom that came to San Francisco. And I had just finished working as a research assistant for a mathematician on the East Coast. And I knew I wanted to move to San Francisco. And I thought I wanted to be a software engineer. But when I showed people my resume, they were like, she knows SQL. She seems curious about the way businesses work. We think she'd make a good data analyst. And so I got this data analyst job offer. And I was like, whatever. I can quit if I don't like it.

And I loved it. And I think someone saw in me something I didn't even see in myself at that point, that I would make a good data person. And we had a 200 person startup that was entirely devoted to giving away free postcards online, greeting cards online. It was called eGreetings. Yeah, and I was on a five person data team supporting this 200 person startup. So it was ridiculous.

Katrin (02:59)
Crazy!

June (02:59)
We had this

army of people who were doing something that one person could vibe code now. They wouldn't even need their own analyst. Exactly. We eventually got acquired by American Greetings, so there was something to it there. But yeah, at one point we decided we were going to sell stuff online, and we were inventing e-commerce in 1999. And for the launch of our eGreetings store, where we sold

Katrin (03:07)
to give away something for free.

Yeah.

June (03:28)
chocolates and roses for Valentine's Day to go along with your Valentine's Day card. We printed commemorative t-shirts that said, hey, let's sell something.

It's so quaint, you know? Yeah.

Katrin (03:43)
That is hilarious. So

having gone through that hype cycle, which was the hype cycle of all hype cycles until today, I suppose, right? And everything in between. How do you look at this one? Yeah.

June (03:52)
huh.

this hype cycle, I've been through so

many of them at this point. The booms, the busts, the booms, the busts. I'm like a cockroach: I keep reinventing myself and finding a way to still remain useful, which is a good thing. And so I'm used to seeing people around me get really excited about certain ideas, whatever those are. When mobile became a big deal, for instance, that was really transformative, I think.

Seeing robot cars in San Francisco now, it's like, well, we're already in the future, or we're on our way there. So this kind of new thing has become something that I've just gotten used to through my time here. Maybe it's a little bit different this time. This seems like it's really permeating everything and everyone in a way that some of the earlier rounds have not.

Katrin (04:53)
Maybe the internet, right?

June (04:55)
the internet? Well, the physical world too.

Katrin (04:57)
Well, I mean,

no, no, I mean it permeates the layers of society in the same way the Internet did.

June (05:07)
yeah, yeah, on the scale, it's like on the same magnitude as the adoption of the internet by mass society.

Katrin (05:10)
Yeah.

And so you said something recently that really stuck with me. You said 2025 is not the year data science and analytics professionals see the major shift. Not in the way software engineers are seeing it, which I agree with. But what do you think about 2026? Is this the year?

June (05:25)
Yeah.

Mm-hmm.

is this the year? I still think, I mean,

really in 2025, all the data people were watching the software engineering people and seeing how AI was transforming the work of software engineers. And it's really far along now. Software engineering work is quite a bit different than it was just a couple of years ago. We can't yet say, yeah.

Katrin (05:56)
I can tell you, for you

know, obviously I have a team of software engineers and we're AI native. We have been AI native from the onset, both in the platform and in, you know, all the processes around the company. It's nothing like what I've done before. I mean, it is still a lot like what I've done in some ways, but in terms of the actual coding and

June (06:01)
huh.

Mm-hmm.

Katrin (06:24)
the way you envision features and development, et cetera. The bottleneck has moved. It's not that there's no bottleneck. There are bottlenecks, right? But it is really, really different. And the way the software engineers work is very, very, very different. And it's also not necessarily the same type of people that are doing well at this. There's really a certain type of people that just

June (06:30)
Mm-hmm.

Mm-hmm.

Yeah.

Yeah.

Mm-hmm. Right.

Katrin (06:54)
don't do well. I think there is something about, and I hate to say this because I'm a craft person myself, but if you're really bothered about the elegance of your code and your solution, et cetera, it's problematic.

June (06:56)
Right.

Mm-hmm.

Yeah, yeah. I see a similar kind of attitude among data practitioners, and I certainly know how I feel about that, that we want excellence, we want perfection, and there's something about the new way of working that's really disconcerting because we don't have that control. And some of the stuff that's getting produced as data artifacts, it's not high quality. It's really low quality.

Katrin (07:38)
Yeah.

June (07:38)
But it's empowering for everyone who now can dabble in it for the first time.

Katrin (07:44)
And I think there's a real question about: is it good enough? Because at this point, often not for us, obviously, right? In data, not really. The scaffolding that you have to put around it to make it good enough is really significant. And I actually think that's going to stay true for a very long time, because this is data analytics, not software engineering. It is different.

June (07:48)
Heh.

Mm-hmm.

Yeah, it is.

Mm-hmm,

it is.

Katrin (08:10)
without

a solid semantic layer and a solid business understanding of what your data means, your analytics isn't going to do anything much that's useful. Because the paradigm of data analytics is fundamentally different from the paradigm of software engineering, right? We're not building features or building platforms. We are creating a model of reality translated into data, literally called a data model.

June (08:16)
Mm-hmm.

Yeah, yeah.

Mm-hmm, mm-hmm.

Katrin (08:38)
And then we manipulate it, get to some information from it, and then retranslate it into reality. And if we don't retranslate it into reality in a way that something can be done with it, then everything else we did is just an intellectual exercise, right? It may be very fun to do, but it's not going to effect any change in the world. Another completely different process from software engineering.
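Katrin's workflow here (model reality as data, manipulate it, retranslate it into something actionable) depends on shared business definitions, which is what a semantic layer provides. As a purely illustrative sketch, with metric names and fields invented for the example rather than taken from the episode, a minimal semantic layer is just business metrics defined once over raw data:

```python
# Minimal semantic-layer sketch: each business metric is defined exactly once,
# in terms of raw fields, so every downstream consumer (human or AI) shares
# one meaning. All names here are illustrative, not from the episode.

RAW_ORDERS = [
    {"order_id": 1, "revenue": 120.0, "refunded": False},
    {"order_id": 2, "revenue": 80.0,  "refunded": True},
    {"order_id": 3, "revenue": 50.0,  "refunded": False},
]

SEMANTIC_LAYER = {
    # metric name -> (business definition, computation over raw rows)
    "net_revenue": (
        "Revenue from non-refunded orders",
        lambda rows: sum(r["revenue"] for r in rows if not r["refunded"]),
    ),
    "order_count": (
        "Count of non-refunded orders",
        lambda rows: sum(1 for r in rows if not r["refunded"]),
    ),
}

def metric(name, rows):
    """Resolve a business metric by its agreed-upon definition."""
    _description, fn = SEMANTIC_LAYER[name]
    return fn(rows)

print(metric("net_revenue", RAW_ORDERS))  # 170.0
print(metric("order_count", RAW_ORDERS))  # 2
```

The point of the sketch is the single source of truth: an analyst, a dashboard, or an AI agent asking for "net revenue" all get the same number because the translation from reality to data is written down once.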

June (08:51)
Mm-hmm.

Right.

the way that you just described that workflow is part of the reason why analytics is lagging behind software engineering. It's difficult. And I can go on the record, on this recording, to say, I don't even know if 2026 is gonna be the year when analytics sees the same kind of transformation that software engineering has seen. I would like to think that we're progressing in that direction. And I know I'm reading the news every single day and it seems like there's

Katrin (09:18)
It is.

June (09:36)
some new development or new startup or something that's promising. But so far, we just haven't had that silver bullet that has changed things in a remarkable way, as we have with software engineers, who've been through several waves of transformation at this point. And every time, it makes it easier. Now you have AI that's writing software.

The AI is designing the systems that design the systems for software engineers.

Katrin (10:09)
Well, let's meet in a year, have another drink in these same rooms and talk about this again and see what happens. And you know, hopefully Ask-Y is going to be part of the reason we can celebrate that it actually does.

June (10:12)
Yeah, I know. All right, let's do it. It's a plan.

I would love to toast to that.

Katrin (10:27)
Me too. So on that note, you're currently running a poll on LinkedIn asking analysts about upskilling, which is a subject that I'm really, really very interested in, because my thesis around that is people have finite time and ability to focus. We all do, right? And there is a lot out there changing very quickly. And part of what I'm trying to understand is

June (10:40)
Yeah.

Mm-hmm.

Katrin (10:57)
From different points of view, from different people, different angles: what should people focus on in terms of upskilling, for the durability of their skills and their employability? Because it's wonderful to have bots helping you, but we all still want to have a job. So why did you decide to do that poll? And do you have any early learnings you can share?

June (11:05)
Mm-hmm.

Yeah.

I do. So this poll, as we record this, is still running, but by the time this podcast is out it will be closed. The question is: compared to two years ago, which part of your work increased the most? And the options are sense-making and judgment, getting more done faster, coordination and orchestration, or no real change. Because I wanted to know. I see things from my perspective. I see that, you know, my

individual work changing. I see the work of the team I'm on and the company I work for changing. But I didn't know if that's what other people see. And by the way, this wasn't a poll run just to analysts; it was my entire LinkedIn network, so it's pretty broad. But what I found was that only 5% of the 44 people who've responded so far said there was no change. Everyone is feeling the change. But I think we're feeling it in different ways. And my hunch

is that we're feeling it in ways that are complementary to the strengths that we already have. And so maybe we're a data analyst and we're used to flexing sense making and judgment. We might be finding ways to use that more now. Maybe like me, you're the kind of person who goes broad across the entire organization. I can get more done faster.

I don't need to expand on my ability to coordinate and orchestrate because I've already got that. But for me, GenAI in particular is the key to unlocking getting a lot more done faster. And what I found is that sense-making, getting more done faster, and coordination were kind of equally subscribed as responses here.

I find it a little bit dangerous. I was kind of hoping that sense making would come out on top because I think even if we're not doing it, even if we're not doing it more, it's the most critical thing. And if we're just like churning through it and getting it done really fast and like running all our armies of agents and all that, but we're not focusing on sense making and judgment, we're creating chaos. So I think.

Katrin (13:21)
Yes.

Wouldn't we

all hope that sense-making comes out on top of everything?

June (13:36)
I'm

rooting for it, but right now it's actually in third place. But the poll's not closed yet, so there's hope for the future.

Katrin (13:44)
It's in third place, so that means just above no change at all. All right. Yeah. Come on,

June (13:48)
No change. Yeah. And also,

my guess, judging from the coordination and orchestration piece being so popular, is that perhaps it's data engineers or software engineers in my network who are now cross-pollinating with other people in ways they didn't use to, and also dispatching agents to do things and then having to act as a manager,

even if they're an individual contributor. So it's like flexing new skills for people.

Katrin (14:21)
would make sense, yes.

I think that's a really, broadly speaking, very strong reality across the board. Everybody is flexing new skills with this. Yeah, we really are. So you frame the AI opportunity for data teams as two distinct lanes, right? Building AI into products for customers, and using AI for internal

June (14:33)
Mm-hmm. Mm-hmm. We are.

Mm-hmm.

Katrin (14:50)
process improvement. Where are most teams you see right now, whether in your company or in your network? And where do you think they should be?

June (15:05)
So I made this observation that you're referencing two years ago, as I was involved in an initiative for my own company to do the first case, building AI into products for customers. And that was one of the first kind of proofs of concept I got involved in with GenAI.

And it seemed kind of equally balanced at the time. Internal process improvement, and GenAI for employee efficiency, wasn't really as developed then. But now, I think for data teams especially, that balance has shifted. I think data teams are not owning the building of AI into products for customers. We're contributing. And the owners are...

the chief product officer or interface designers, right? Because they're like re-imagining how humans engage with stuff online. And we can contribute to that in the classic product analytics sense, like we always have. But using GenAI for process efficiency and organizational efficiency? Absolutely. I think we're beginning to recognize the potential of doing that.

Getting, and I hate to say this, the same amount done with fewer people, or getting the same size group to actually produce more value. And so I think that the balance is shifting for data teams to the latter. First, as individuals, how can we improve our own personal processes? As data teams, how can we improve the processes of our teams using new tools available to us?

And then, working together with our stakeholders, how can we make the entire company more efficient and productive?

Katrin (17:05)
To your first one, which is making more with fewer people: it sounds bad, obviously, right? Except in reality, I've never met a data team that didn't have too much work. Never, never, right? And I mean way, way too much work. So getting more done with fewer people doesn't actually sound that bad to me, given that there generally is way too much to do anyway.

June (17:10)
Mm-hmm.

No, know. Yeah, exactly. We're still busy. Right.

Mm-hmm.

Yeah, yeah.

Katrin (17:35)
And that

the part that the AI would do for you or with you is not the part most people like the most. It's all this brain-heavy sort of coding, and making sure that the syntax is correct and you've got the right variable names and everything crisscrosses correctly, et cetera, et cetera. It's very time consuming. It's not

June (17:53)
huh.

Katrin (18:02)
necessarily the most creative and most interesting, and I understand that not everybody is interested in creative and interesting work, right? But it's also really not where the value for the company is. Whichever company you work for, that's not where the value is. And ultimately, if you want to continue having a job, you want to produce value.

June (18:15)
Mm, yeah.

Right. For the end customers of your company, not just creating capacity for your team. And I think that's a problem that data teams have in general. I've seen plenty of initiatives proposed where you say, well, what's the benefit to the whole company? And they're like, well, it actually just makes things easier for data analysts, and then that frees up our bandwidth to do higher-order things. And really, there are a lot of steps between that

and actual dollars or customer satisfaction or whatever, which is the thing everybody should be striving for.

Katrin (19:03)
And I would say, you know, if this shift pushes data and analytics people and teams in general to worry more about the value brought to the company, that's really positive. Because you really shouldn't, in my opinion, be content to sort of be an administrative cost center doing, you know, data and analytics processes.

June (19:14)
Mm-hmm.

It is.

Mm-hmm.

Katrin (19:33)
It's kind of not where you generally want to be. You want to be in the value production, value generation part of the business.

June (19:36)
No.

And that's, I mean, that's been a very, very long-running theme: how do we switch from being a cost center to being a profit center? And I think often with limited success.

Katrin (19:54)
And profit center, I don't know if that's necessarily possible in all organizations, but what is possible is to have a profit oriented mentality so that when you are working and you are analyzing data, you do actually think of who your stakeholder is and what they do and what they could possibly action, where the value is and preferably left side of the decimal, not right side of the decimal.

June (19:57)
Mm. Yeah.

Mm-hmm.

Mm-hmm.

Mm-hmm.

Yeah. Yeah.

Katrin (20:24)
which

is something that us digital analytics people have been guilty of a lot, especially in our younger years. We knew everything about CTRs, and we would optimize for the 20 people on the website and think that everything revolved around that. Obviously things have changed from that point of view, right? But it is really something that is sort of

June (20:31)
Mm-hmm.

Yeah, yeah.

Katrin (20:53)
An easy rabbit hole is focusing on things that seem important in your domain, in your remit, but actually the magnitude of the impact is just simply too small.

June (20:56)
Mm-hmm.

Yep. That actually reminds me of a conversation I was having with a new data scientist on my team earlier today. And she was asking for my advice on the kinds of things that she should propose to include in her work. And I said, look at the company goals first. Read through the goals and understand those goals. And then think about how you can connect the work that you do with the skills that you have to further and support those goals.

And I think it wasn't something that she had necessarily considered as someone who was younger in her career.

Katrin (21:41)
And I think that that is a choice that everybody can make, right? Everybody can make individually the choice of dedicating brain power to understanding their environment and the environment of the company they work for and make the bridge between what they do as best they can and those goals. And you will fail in that over and over again as you fail in everything you learn for the,

June (21:44)
Yeah.

Mm-hmm.

Yeah.

Mm-hmm.

Katrin (22:07)
for the first time. It's not like it's going to be super successful generally from the first get-go, but you are the person who needs to bridge that gap. That gap will not be bridged for you. And that, I think, is the attitude when it comes to thinking, oh, AI will steal my job or automate my job. Well, no, not really.

June (22:22)
Mm-hmm.

Katrin (22:34)
Definitely not if you have that point of view to begin with.

June (22:38)
Right, right, that's the right kind of mindset to have right now.

Katrin (22:43)
So, AI is still in the exploration phase for analytics teams, right? I think we agree on that. The changes will, in my opinion, happen from the ground up, because that's how they always happen in analytics, ultimately. If you don't get adoption, if it doesn't get adopted by the people who are actually doing the work day to day, nothing really sticks. I think that one of the trends that will force that, or

accelerate that, is agentic e-commerce. Because I think that the rise of agentic e-commerce will kick off an enormous work stream across organizations, because they will be forced to adapt to talking to two different audiences, and I mean by that the audiences organizations market to online, right? Obviously.

June (23:36)
Mm-hmm.

Katrin (23:40)
So all of a sudden, you have these human and bot audiences, and they have very, very different needs. And when I say different needs, I mean technically different needs in the way you communicate with them. Which means your website has to be structured in a certain way, which means now your data layer changes, which means you also have to think about what questions your data layer needs to answer that you weren't

June (23:52)
Mm-hmm.

Mm-hmm. ⁓

Yeah.

Katrin (24:09)
asking before, because there actually weren't AI agents that did e-commerce before. Questions like: how do you get into the context window, and how do you get selected? But also questions like: how do you make sure the agent can read your content correctly? And then, how do you measure that traffic on your website? And then ultimately, because you are going to lose

June (24:21)
Mm-hmm.

Mm-hmm.

Katrin (24:38)
some of the relationship management capabilities you have with those customers, because they will not necessarily touch your brand physically, how does that then change what happens in your loyalty programs, et cetera? And so what I'm thinking is that ultimately, we're at the very beginning of this, but in terms of magnitude, it spans some aspects of the mobile re-platforming, because it's a very different customer experience.

June (24:45)
Yeah.

Mm-hmm.

Right.

Katrin (25:07)
Some

aspects of the rise of Amazon because there are aspects of the loss of the relationship with the customer that are kind of similar. And then some aspects of the GA4 migration because it really is a physical change to the website and to the data layer that needs to happen, including the metrics layer on top of that. Yeah, yeah, yeah.
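One concrete, hypothetical example of what "structure the site so bots can read it" can mean: exposing product data as schema.org-style JSON-LD that a shopping agent can parse without scraping the human-facing page. The product and its fields below are invented for illustration; nothing here is prescribed in the episode.

```python
import json

# Hypothetical product record, as a site's data layer might hold it.
product = {
    "name": "Valentine's Day Chocolates",
    "price": 24.99,
    "currency": "USD",
    "availability": "InStock",
}

def to_json_ld(p):
    """Render a product as schema.org JSON-LD: structured markup an
    AI shopping agent can read directly, instead of parsing the page
    meant for humans."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Product",
        "name": p["name"],
        "offers": {
            "@type": "Offer",
            "price": str(p["price"]),
            "priceCurrency": p["currency"],
            "availability": "https://schema.org/" + p["availability"],
        },
    }, indent=2)

print(to_json_ld(product))
```

The same data layer then serves both audiences: the template renders it for humans, and the JSON-LD block renders it for agents, which is why the data layer (not the page design) becomes the thing to re-plan.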

June (25:16)
Yeah.

I'm going to add one thing to that, I think, which is multi-channel

marketing. When people realized that people were going into stores and shopping digitally, both, and that it was the understanding of both of those behavior types combined that was valuable to commerce organizations, that was a big shift too. So, anyway, yeah.

Katrin (25:52)
Mm-hmm.

That's true. Yeah, that's

totally true. Yes. So do you think that could force this sort of analytical maturity shift? Or does it just create panic?

June (25:58)
Anyways.

Oh my gosh. I don't think it creates panic. As someone who's been through all the waves of this, I think it's exciting. The common thread that I see, and this is from the beginning of my own career until now, is being someone who strives to understand customer behavior. For me, I've worked primarily in the digital realm, but that's why I brought up multi-channel.

At first it was looking at how customers were engaging with websites. Humans were engaging with websites. And then humans engaging with mobile apps. And then for me personally, I went to go work for a company that did live streaming video. And so it was like, it's not just static pages anymore, it's live streaming video. How can we understand how customers are engaging with that? And then I went from there to work for a company that focused on audio. And so, how are customers engaging with audio?

How are they just engaging with voice, even if you don't have a visual interface? And the common thread still through all of that is understanding how customers engage with your business. And so whatever it is that happens next, it either is the same sort of thing. We're humans. We need a thing. We're waving money around. How does the business have a transaction with us? Or it becomes different, where it's like the humans need a thing.

And the robots need the thing, and those are different things. But I've always also been fascinated by robotics, though I've never worked in robotics. And I think of it as an interesting parallel to understanding customer behavior: understanding robot behavior. So maybe this is the convergence.

Katrin (27:54)
I never

thought about it that way. I'm with you on the, you know, it's really all about customer behavior and understanding customer behavior. Ultimately, I think that's really one of the things that's so interesting about marketing analytics: it's about what people do and what people want, which I find interesting. Obviously, you know, one person's excitement is another person's panic.

June (28:01)
Yeah.

Mm-hmm.

Katrin (28:24)
But so, you know, we've talked about this bottom-up motion. And I think we probably see a lot of that in your poll, or at least I kind of feel it between the lines in your poll. There is also a lot of top-down pressure, right? Pressure especially around efficiency and, sort of, you know, transformation of workflows. So

June (28:27)
Yeah.

Yeah.

Katrin (28:56)
Do you think that top down is ultimately how it actually plays out or do you think that bottom up actually has a chance?

June (29:05)
It's got to converge at some point, but I think it's pretty starkly different right now. A friend of mine shared an article she thought I would like; it came out in the Wall Street Journal about a week ago, on AI efficiency. And it polled workers and the C-suite, and asked both of these groups, how much time do you think you're saving each week by using AI?

For the workers, the people on the ground, the grassroots people, 40% of them said they were not saving any time at all with AI in their job. But for executives, 19% of them said they believed they were saving more than 12 hours a week by using AI. And so there's this stark difference, right? If the C-suite

Katrin (30:00)
I wonder what these

people are doing.

June (30:02)
I don't know. Maybe it's just completely different workflows for the people who are on the ground versus the people in the C-suite. But still, I think we're all experiencing very different realities right now. And so I've seen, and I've talked to, other people who are getting a lot of cheerleading from the C-suite: we're all in on AI, yay, you know, go, learn, do, prototype, build. And workers on the ground,

Katrin (30:06)
Yes, probably.

That's true.

June (30:30)
data teams especially, are like, oh my God, I have more work than I ever did. My data team was swamped to begin with, and now we're swamped by the service desk work we've always been expected to do, plus this: can you please automate all of this stuff and make me an AI chatbot that does everything that you do? And it's been a struggle.

Katrin (30:54)
Have you seen any concrete wins? I mean, not conference demos, real improvement. Things that you've seen and touched and gone, yeah, that's actually useful.

June (31:05)
Yeah, well, I think there's a difference between efficiency and innovation that comes out of GenAI. And I think that many of us are brought into hackathons and proofs of concept and whatever with the idea of efficiency in mind. But I think really what we're getting out of it is innovative solutions to problems. So for example, you might be able to

Katrin (31:12)
Mm-hmm.

June (31:35)
use GenAI to classify and structure a bunch of unstructured data and produce a new data set out of it that's useful to the business. And so the easy production of new data sets that mine unstructured data, awesome. I would say that's innovation. Is it efficiency? Well, we were never able to scale humans across that much unstructured data to begin with, so it's not really efficiency.

Or looking across lots of, say, structured data to pick out patterns and write narrative that describes to business people what's happening. That's great. I mean, humans can do that too. Maybe it's a little bit about efficiency, but I think it is the case that we've never really been fully staffed enough to write that kind of narrative to begin with. So it's not efficiency, it's doing more —

⁓ doing more than we were able to do with just humans alone, if that makes sense.
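As a note for readers: the pattern June describes — using an LLM to turn unstructured text into a new, structured dataset — can be sketched roughly as below. This is illustrative only; `call_llm` is a hypothetical stand-in for whatever model API you use (here it returns a canned response so the sketch is self-contained), and the field names are invented.

```python
import json

def call_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real model API call. Assumption: the
    # prompt instructs the model to reply with one JSON object per ticket.
    # A canned response keeps this sketch self-contained and runnable.
    return '{"category": "billing", "sentiment": "negative", "product": "Prism"}'

def structure_ticket(ticket_text: str) -> dict:
    """Classify one free-text support ticket into a structured row."""
    prompt = (
        "Classify this support ticket. Reply with JSON containing "
        '"category", "sentiment", and "product".\n\n' + ticket_text
    )
    row = json.loads(call_llm(prompt))
    row["raw_text"] = ticket_text  # keep the source text for auditability
    return row

tickets = ["I was charged twice for Prism this month and nobody replies."]
dataset = [structure_ticket(t) for t in tickets]
print(dataset[0]["category"])  # a new, queryable field mined from free text
```

The point June makes holds in the sketch: the classification step was never feasible to scale with humans, so the output is a genuinely new dataset rather than a faster version of existing work.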

Katrin (32:43)
It does. So if you're an analyst listening to this five years into your career — you know, not somebody like us, with 30-plus years, but somebody who's more beginning to mid-career, who has their entire career in front of them — and their entire career is obviously going to happen in this world of AI,

June (32:46)

Yeah, of 30 plus.

Hmm.

Katrin (33:09)
and you're obviously feeling the pressure to become an AI analyst — what would you say they should be investing in, in terms of upskilling?

June (33:13)
Mm-hmm.

Yeah. So I wrote this article about a year ago called To Thrive in Data, Know the Business. And it was kind of a reflection on the course of my career, where at certain points I realized that even though I was hired and valued for the technical skills that I had accumulated, the thing that kept me moving forward was my understanding of the business.

And it came as a surprise to me. I didn't even realize that I'd have to get a full understanding of the business when I came in. I was like, yay, I know SQL and other things. And I thought that that could sustain me, but that wasn't enough. And I think that now, as some of the technical competencies that data people have are getting easier to do with AI, the thing that we can really continue to lean into

is knowing the business. And I'm not saying you don't have to be technically competent anymore. I think that it goes hand in hand with business sense. But that combination of business sense and technical knowledge is incredibly powerful. So I would encourage people who are early on in their career, if they're like, what tech skills should I learn? I'll say, get the grounding of tech skills that excite you and give you energy.

But also, you better be starting now if you haven't already in developing business sense.

Katrin (34:55)
I agree. I think that when you say technical skills are still going to be important — I actually think technical skills are going to be way more important. Yes. Well, because, when I'm thinking about the main things that are important to be a good AI analyst, one is you have to really understand how these things work.

June (35:06)
Really? How so?

Katrin (35:23)
What are these things? What is your tool? It's your tool, right? What is your tool in reality? How did it get trained? Really understand the inner mechanisms. What is a context window? What is an attention mechanism? How does it work? What is dilution — attention dilution? You really need to get that, because if you don't get how it works, you're going to talk to it and it's going to do things you don't want.

You're not going to understand what is happening in the background. It's like working with a database and not understanding the concepts of projections and joins. So that's one thing. The other thing is you need to learn to talk to it. So you need to become really good at prompt and context engineering, which does not work if you don't understand the inner mechanisms of the LLM.
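For readers who want the one-formula version of the attention mechanism Katrin mentions: each token scores every other token, and a softmax turns those scores into weights that sum to 1 — which is also why a very long prompt can dilute the weight any single instruction receives. A toy sketch with invented, illustrative scores (pure standard library):

```python
import math

def softmax(scores):
    # Subtract the max for numerical stability, then normalize to sum to 1.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Toy query-key similarity scores: one token attending to a 4-token prompt.
scores = [2.0, 0.5, 0.1, 1.2]
weights = softmax(scores)
assert abs(sum(weights) - 1.0) < 1e-9  # attention weights always sum to 1

# Pad the prompt with 20 more low-relevance tokens and the top weight
# shrinks: the same instruction now gets a smaller share of attention.
diluted = softmax(scores + [0.1] * 20)
print(max(weights), max(diluted))
```

This is of course a cartoon of a single attention head, but it grounds the practical advice: prompting frustrations often trace back to how the softmax redistributes weight as context grows.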

June (35:55)
Yeah.

Yep, yep.

I would agree with you on that.

Yeah. Yeah.

Katrin (36:17)
Then you have to become really, really good at reading code. And I think that reading code is actually more difficult than writing code. Reading code that you have not written is not easy. And you have to now read this code, understand what it does, decide whether it is sound or not sound, put it together with the rest of the code. And for that you need to... This is something that I've always said to

June (36:28)
Yeah.

huh.

Katrin (36:47)
everybody who asked me: if I had to name one single technical skill that is the most important in data analytics, in my opinion, it's data modeling. If you do not understand data modeling, you do not understand how to translate reality into data that is going to actually give you answers or insights or whatnot that you can really translate back into reality. If you cannot do this translation journey,

June (36:49)
man.

Mm-hmm. Data modeling. Yeah.

Mm-hmm.

Katrin (37:16)
You're just amusing yourself with data, which is very fun, but not contributing to the value that your company is creating. And so that to me is the basis of business sense, in fact.

June (37:18)
Yeah.

Yeah, actually.

Yeah, you're right. I think, connecting that to work that I see kind of on the rise for data teams, one of the categories that I see people doing more of is focusing on developing and maintaining really well-curated data sets that serve as input for AI — and input for humans, too. But we're all kind of

Katrin (37:51)
Yeah. Mm-hmm.

June (37:57)
seeing the light and recognizing that that's an important thing to do. And ⁓ that requires technical competency, like you said, data modeling, and also business sense, all of it together.

Katrin (38:10)
And actually, this is a wonderful segue into what I wanted to talk about next, which is data quality. So in your Superweek talks, your Measurecraft writing — basically all throughout what you write — you keep coming back to trusting data, right? Trusting data, data quality, making sure, as you just said, well-curated data sets. So there's been this promise for years that

June (38:13)
Yeah.

Mm-hmm. Yep.

Katrin (38:39)
started with NLP. I remember at Datorama, we did something with NLP like that — it was ask questions, get answers. I think it was 10 or 12 years ago. There's been this promise for years that a business user will just ask a natural language question and get a beautiful chart or an insightful answer. That makes sense.

June (38:46)
Mm-hmm.

Mm-hmm.

Yeah.

Katrin (39:05)
What do you think about that?

June (39:06)
I'm

Like AI for BI. It's the hardest thing. Everyone just wants to chat with their data. But it's a struggle right now. And I think that you can make it work in a proof-of-concept setting with data sets that are already well curated for one reason or another. But it doesn't take very long for someone to come along and ask a question that's out of scope for the area of coverage that you have in your proof of concept.

And they're like, what do you mean I can't ask about this thing way over here? And you're like, no, you can only ask about certain topics over there. And they're like, but I don't care about that. And so it's a hard problem. I hear you phrasing this potentially as a data quality concern. But I think part of it is the lack of a shared definition and understanding of what data quality means.

I battle with my own coworkers on this all the time.
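The out-of-scope failure June describes is often handled by putting an explicit scope gate in front of the chat-with-your-data layer: refuse, politely and explicitly, anything the curated data cannot answer, rather than letting the model guess. A hypothetical sketch — the topic list and the naive keyword matching are stand-ins for a real semantic layer:

```python
# Topics the proof-of-concept's curated datasets actually cover
# (illustrative; in practice this comes from the semantic layer).
COVERED_TOPICS = {"revenue", "orders", "churn"}

def route_question(question: str) -> str:
    """Answer only questions the curated data can support."""
    mentioned = {w.strip("?.,").lower() for w in question.split()}
    if mentioned & COVERED_TOPICS:
        return "ROUTE_TO_MODEL"  # in scope: hand off to the NL-to-SQL step
    # Out of scope: state the coverage instead of hallucinating an answer
    return "Sorry, I can only answer questions about: " + ", ".join(sorted(COVERED_TOPICS))

print(route_question("What was revenue last quarter?"))  # ROUTE_TO_MODEL
print(route_question("How is employee morale?"))         # explicit refusal
```

A real gate would classify intent with the model itself rather than match keywords, but the design point stands: the stakeholder's frustration is smaller when the boundary is stated up front.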

Katrin (40:06)
You mean a lack

of a common semantic layer on top of the significance of data quality?

June (40:09)
That,

touché, that. But I have this running debate with a colleague of mine who thinks that when data is missing from a data set — when we actually didn't bother to instrument a system to collect the kind of data we need to answer a business question — that that is a data quality problem.

And I think it's a requirements gathering problem. But from the outside, someone might say, we can't use that data set, it has data quality problems. The quality problem is the fact that there's some missing data. But —

Katrin (40:53)
It's

a language shortcut, it's not technically exact. ⁓

June (40:56)
Yeah.

But it's tough because I know, and many other people know, how hard it is to devote enough time and expertise and resourcing to develop well-curated data sets. I've done it. But I think that the thing that drives people to invest in that is a pattern of failure when they haven't done it.

and they see the result and the wreckage and they're like, we can never let that happen again. And then they'll invest in it. It's not something that's just, like, the way we operate around here. You know, we're just not staffed for that. Yeah. Yeah.

Katrin (41:35)
No, Nowhere. I agree. But

your example touches on something that, these days, I would consider the main thing to solve in data processes, which is context continuity. Because it used to be that we didn't have the tools to handle the type of data we needed. The data was too big,

June (41:52)
Yeah.

Mm-hmm.

Katrin (42:05)
and the tools weren't good enough, it was horrible, and we needed better tools. Then came the modern data stack, and the tools are, quite frankly, good enough. Can tools be a little better? Sure, maybe, yes. But they're good enough. The problem is the context continuity between, ultimately, the steps of the analytics process — from connecting to data to producing outputs — and those steps tend to happen in different tools, obviously,

June (42:17)
Incremental,

Katrin (42:34)
with different people, and with a context continuity problem within that whole process, because it is not possible to document — like, it is just not possible to document. And so keeping the context continuity of what happened in the requirements gathering, which ends up with us having this data in this state, and then what transformations have we made, what choices have we made —

June (42:42)
Mm-hmm.

Hmm.

Yeah.

Katrin (43:02)
We made some decisions because we didn't have everything. So we made some choices. And then when the result gets in front of the stakeholder, the first question they ask is: revenue — where does that come from? And I have to retrace the whole thing. But half the people have left the company, et cetera. This is the reality of working with data. This is really the reality. I actually think that that

June (43:16)
Bum bum bum. Yep.

Sure. Yeah. yeah.

Katrin (43:31)
context continuity problem is really one of the — probably the — main issues in most processes. Would you agree with that?

June (43:39)
Yeah,

oh my god. Yeah, I think you've just described some pain that I have definitely felt. And you haven't used this word, but the word that was previously used to describe that is: well, we just need something that shows lineage. And I kind of rolled my eyes at that, because I'm like, lineage of what? From what to what? And it's never extensive enough. You can get a little piece of it here, to say there were these tables and these

Katrin (44:00)
Yes. It's no.

June (44:08)
transformation processes and the summary table that came out at the end, and whatever. But then there's everything that happened before that and everything that happened after that. And you ask, where's that in the lineage? And they're like, we don't have visibility into that. You know — it's the end-to-end. Right.

Katrin (44:21)
But you need the business context around

the lineage. You need to understand why these choices were made. What choices were made? Why they were made? ⁓ Is it possible to make other choices?
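One lightweight way to capture the "why" Katrin is asking for is to record decisions alongside the lineage — a decision log, not just table-to-table edges. A sketch, with all field names and content invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class LineageStep:
    inputs: list
    output: str
    transformation: str
    decision: str           # WHY this choice was made (the business context)
    alternatives: str = ""  # what else was considered, so it can be revisited

log: list[LineageStep] = []

log.append(LineageStep(
    inputs=["raw_orders", "crm_accounts"],
    output="fct_revenue",
    transformation="sum(order_amount) grouped by account, net of refunds",
    decision="Refunds excluded because finance reports revenue net of refunds",
    alternatives="Gross revenue; rejected after finance review",
))

# The stakeholder's "revenue — where does that come from?" becomes a lookup,
# even after the people who made the choice have left the company.
step = next(s for s in log if s.output == "fct_revenue")
print(step.decision)
```

In practice this would live in the transformation tool's metadata rather than a Python list, but the shape is the point: lineage edges plus the decision and its rejected alternatives travel together.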

June (44:28)
Yep. Yeah.

Yeah. There's a lot

of lore. And I think what you're hitting on right now is exactly why data work is different than software engineering work. And with regard to AI maturity right now, it's hard.

Katrin (44:46)
Yes.

So it is hard. Let's talk about goals. You've written a framework for data team goals — I think seven types, from foundational data literacy all the way to quantifying business impact. Do AI skills fit somewhere in that hierarchy? Throughout the hierarchy? What do you think?

June (45:14)
Mm-hmm.

I see it in two different ways. In one sense, it is infused throughout all levels of the hierarchy. For example, if your data team is developing some kind of GenAI platform that makes it easy for business stakeholders to do a thing, then the goal could be shipping the thing.

We're gonna produce the first draft of this platform. Or, if you have already developed this platform, you might take a goal that's adoption and usage: how can you get more people in the company to use this thing? But ultimately, it should be about dollars. So we have a thing, we built it — potentially with GenAI or using GenAI. How is it

relating to the business outcomes that we want to achieve — making money, saving money, keeping customers happy? That should eventually be what we're striving for. I don't think we're quite there yet. When I look at this pyramid of goals for a data team, I think where we really, realistically are right now is at the very bottom — which, when I wrote this a couple of years ago, was organizational data literacy, the foundation along the bottom.

There's certainly something in there where it's also about upskilling everyone on a data team to know as much about AI as they need in order to do their job effectively today. So it might be conducting a certain number of trainings, or developing AI workflows, or producing curated data sets, or

whatever that is — that's kind of the foundational level of the kinds of goals that data teams should be taking on today related to GenAI.

Katrin (47:18)
I'm going to do a shameless plug here: go watch videos of the floofies on the Ask-Y website, because they will teach you everything you need to know about LLMs. You've never watched the floofies? You need to watch the floofies. It's fun.

June (47:20)
Okay.

Mmm.

I need to watch the floofies. I have an

actual floofie in my pocket right now, if I can find it. I probably can't. Oh my god, here it is. Here it is. So for those in the live audience, this is a floofie. You can watch videos of them, but I have a three-dimensional floofie that I'm going to take home. Yeah.

Katrin (47:40)
are you gonna make some people jealous?

3D printed two days ago, by me. So the

floofies — basically, I came up with the floofies completely by chance. I didn't want to have my face on social media to do a promotion of Ask-Y. So I was like, okay, I'll create an avatar. That's fun. I'll create an AI avatar. It was mildly fun. So I was like, well, I'm just going to create a little monster, so that the monster appears, you know, and makes it a little more fun.

June (48:05)
Mm-hmm.

Mm-hmm.

Uh-huh.

Katrin (48:23)
And then everybody wanted to see the monster, and nobody wanted to see the AI avatar. So I was like, okay, well, I'll just redo it with the little monster. And then it dawned on me that that would be a really good way to explain point one of what I think everybody should know about how LLMs actually work. Because ultimately, if you think about it — I don't know, I thought about it anyway — I was like,

June (48:46)
Mm-hmm. Mm-hmm.

Katrin (48:53)
if I think about it, the floofies really can represent weights and biases, right? They are the parameters in the LLMs. And they can be the untrained parameters in the loss landscape, and they can be the trained parameters in the latent space. And if I give them different colors and personalities, they can even be the fine-tuned parameters. So the metaphor really works.
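For readers who want the loss-landscape image in runnable form: a parameter starts somewhere on the landscape and gradient descent walks it downhill to its trained position. A toy sketch — the quadratic loss and learning rate are invented purely for illustration:

```python
# One parameter (a single "floofie") moving downhill in a toy loss landscape.
def loss(w):
    return (w - 3.0) ** 2     # toy landscape whose minimum sits at w = 3

def grad(w):
    return 2 * (w - 3.0)      # derivative of the toy loss

w = 0.0                        # untrained parameter: high on the landscape
for _ in range(100):
    w -= 0.1 * grad(w)         # gradient-descent step, learning rate 0.1

print(round(w, 3))             # ends near 3.0: the "trained" position
```

A real model repeats this over billions of parameters at once, but the intuition in the metaphor — untrained parameters scattered across the landscape, trained ones settled into it — is exactly this loop.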

June (49:01)
Mm-hmm.

Katrin (49:21)
And so I spent an entire weekend in a rabbit hole of mapping Floofyland to AI and ML concepts, to make sure that everything worked in the metaphor when I was telling the story — you know, making the explanation within Floofyland. And so I started producing videos about what is training. And then I shifted to the attention mechanism, because I realized that really what most people need to

understand is how attention works in prompting, because most frustrations with prompting come from a lack of understanding of the attention mechanism. And now I've moved to AI security, because I feel like, especially with, you know, all the multibots, et cetera, there is a real need to understand where your data actually goes between the moment where you prompt

June (50:16)
my god, yeah.

Katrin (50:18)
and the moment where the LLM generates an answer. There's a whole application layer that makes multiple copies of your data. And those copies persist for multiple reasons that are structural, that aren't in any way covered by enterprise contracts, because enterprise contracts cover training, et cetera. They don't cover the structural need of the system to make copies of the data — for compliance, for speed, for calculations, et cetera, et cetera.

June (50:34)
Mm-hmm.

Katrin (50:47)
So when you use an LLM, you really do need to be a little bit conscious of that, I think. So that's the floofies. Yeah.

June (50:52)
Yeah.

That's the floofies.

And it ties back to the goals, I think, because you believe that everyone on the data team needs to understand these fundamentals. And that seems like foundational knowledge for the rest of the stuff. Right, right. So data teams in 2026 should be taking a goal to watch all the videos of the floofies, understand them, and be able to pass the floofy quiz.

Katrin (51:05)
I believe that strongly. Yes. And I'm trying to make it a little amusing, you know, to kind of, like, get you to learn about it.

Absolutely.

June (51:27)
Yeah,

I agree.

Katrin (51:30)
So let's talk about tools — investing in tools and creating the new tools. You're a founding member of InvestInData, which is an angel fund specialized in investing in data. What are you seeing coming across your desk as deal flow that you find interesting? Tell us some good stuff. What's cool and what's coming?

June (51:53)
Yeah.

Sure. So InvestInData is five years old. And that's as much time as I've been paying really close attention to new innovation in the data and AI technology space. But I was thinking back over the course of maybe a decade, about how data tech has changed and evolved — where I was

10 years ago. I think there was a lot of excitement around the data visualization layer, and BI tools for that matter. There were the traditional players, and they were being challenged by some upstarts, and there were a lot of different solutions in the market. And then around 2018 through 2020, there were some consolidations: Tableau got acquired, Looker got acquired, Periscope and Sisense merged, and so on.

Katrin (52:32)
Yeah. Yeah.

June (52:53)
And then it kind of quieted down. I think these days, people don't really talk about data visualization as one of the exciting technologies in the data space anymore. There was a period of time when data quality observability was super hot. The first generation of data catalogs — which, I mean, we could have a debate about, I suppose, because I think it's related to context —

didn't really work out. We all believed in the vision of data lineage solving so many of our headaches, and then it really didn't. These days, though, I think that there's a lot of excitement around new technology related to data infrastructure, AI infrastructure, ML ops, feature stores, inference —

those kinds of things that are like selling picks and axes to the miners. Everyone is building all these systems, and it's kind of that mid-layer of working with data inside AI-native systems where people are building and people are investing. If that makes sense.

Katrin (54:07)
Thank you.

Shameless plug time: where can people find you? What should they read? What are you excited about right now? We'll put everything into the show notes, so you can just rattle things off. No worries, we'll parse it with AI.

June (54:14)

Sure.

OK. I have a newsletter. It's called Measurecraft. I've been publishing articles every two weeks for almost two years now, where you get to hear more of my opinions on all things data. And InvestInData — if you go to that website, you can find out what's in the portfolio, the kinds of products and technologies that we have chosen to put our money behind as a group so far.

One thing I am really excited about — and actually I'm going to see Kat there — is MeasureCamp Los Angeles, which is set to happen on May 30th in LA. I'm a very light member of the organizing committee there. But if you don't know about it, MeasureCamp as an event series is the best free unconference in the data space, and it happens all over the world. And then if you want to find me, you can find me on LinkedIn.

I'm Jay Dersh.

Katrin (55:25)
Thank you. And so, yes, absolutely — I actually should do way more shout-outs to MeasureCamp. If you've never been to a MeasureCamp, or don't know what a MeasureCamp is, go to measurecamp.org. It's really, really good. Actually, we're going to release on our LinkedIns — all of the New York organizing committee — these videos that we did at the New York MeasureCamp last year, of a bunch of people

June (55:31)
Yeah.

Katrin (55:55)
talking about their experience, which I think, you know, hopefully should give a bit of a flavor of what you can get there, because it's very difficult to convey with words. Like, it's an unconference — oh my God, what's that? Well, you know, it's the audience that creates the content — which does not sound great, but it actually really is. It actually really is amazing.

June (56:14)
It is great. It is great. And you know, people are like, oh —

I have to give up my Saturday to do this. And the answer is yes, you do.

Katrin (56:22)
Yes, but

probably the year after, you're not going to think, oh, I have to give up my Saturday. You're going to think, oh, I get to go there on a Saturday. So I'm just going to plug the New York one as well, because the New York one is on March 29th. And I would like to remind everybody that we do have the best pizza. So, you know, there is something to be said about that. Pizza. Get your ticket.

June (56:29)
No, I get to go. Exactly, exactly.

For sure.

Katrin (56:52)
And yes, Los Angeles, so that's going to be the first iteration in Los Angeles, May 29th. I'll absolutely be there. Thank you for that reminder. And well, thank you so much for doing this with us, June. It was really amazing. And ⁓ I'll see you at Measure Camp.

June (57:08)
Thank you for inviting me, Kat.

Yes, you will. Looking forward to it.

Katrin (57:14)
So, correction, Los Angeles is May 30th, right? Not May 29th. Do not go there on May 29th.

June (57:22)
No, that's a Friday.

Katrin (57:24)
Thank you.

If today's conversation made you think about how AI is changing data and analytics, visit us at ask-y.ai and try Prism, our data platform helping analysts navigate complexity with context. Thanks for listening, and remember: bots won't win, AI analysts will.

Resources Mentioned:

Publications & Communities
  • Measurecraft – June Dershewitz's newsletter
  • InvestInData – angel investing group
Industry Events
  • MeasureCamp Los Angeles – May 30, 2026
  • MeasureCamp New York – March 29, 2026

Connect with Our Guest:

June Dershewitz – LinkedIn

Host name:

Katrin Ribant

Episode Credits:

Host: Katrin Ribant
Guest: June Dershewitz
Podcast: Knowledge Distillation
Episode: 11
Runtime: ~57 minutes
Release Date: 03/02/2026