[00:00:00.000 --> 00:00:09.440] [MUSIC]
[00:00:09.440 --> 00:00:11.400] Welcome to Knowledge Distillation,
[00:00:11.400 --> 00:00:15.680] where we explore how AI is reshaping the role of the data analyst.
[00:00:15.680 --> 00:00:20.080] I'm your host, Katrin Ribant, CEO and founder of Ask-Y.
[00:00:20.080 --> 00:00:25.480] Today, I have the perfect first guest to help us unpack this transformation.
[00:00:25.480 --> 00:00:29.760] Someone who literally wrote the book on AI agents.
[00:00:29.760 --> 00:00:35.400] Martin Kihn is an AI strategist at Salesforce, former VP at Gartner,
[00:00:35.400 --> 00:00:41.600] and the author of five books, including his latest, Agentforce.
[00:00:41.600 --> 00:00:45.160] But before we were both evangelizing AI,
[00:00:45.160 --> 00:00:48.760] Marty and I were both in the trenches of marketing technology.
[00:00:48.760 --> 00:00:52.760] Me at Datorama and Marty analyzing for Gartner and
[00:00:52.760 --> 00:00:55.560] talking a lot about Taylor Swift,
[00:00:55.560 --> 00:00:58.800] before it was really that much of a thing, actually.
[00:00:58.800 --> 00:01:00.000] I was early with Taylor.
[00:01:00.000 --> 00:01:04.960] It was, and I tell people this because now there's actually a book that came out.
[00:01:04.960 --> 00:01:09.080] Someone at, I think it was HBR or MIT, wrote a book,
[00:01:09.080 --> 00:01:11.320] The Marketing Secrets of Taylor Swift.
[00:01:11.320 --> 00:01:15.040] And I was there probably eight years ago,
[00:01:15.040 --> 00:01:17.120] talking about what marketers can learn from Taylor Swift.
[00:01:17.120 --> 00:01:20.960] And I actually did research around it as well.
[00:01:20.960 --> 00:01:24.720] And basically what I said was that Taylor Swift was infinitely personalizable.
[00:01:24.720 --> 00:01:26.920] People projected themselves onto her.
[00:01:26.920 --> 00:01:29.440] And that's what you should do to be a modern brand.
[00:01:29.440 --> 00:01:32.520] >> Yeah, I remember going to one of those presentations.
[00:01:32.520 --> 00:01:37.880] It was definitely ahead of its time and very, very interesting.
[00:01:37.880 --> 00:01:39.640] So yeah, thank you for that.
[00:01:39.640 --> 00:01:40.600] >> Of course.
[00:01:40.600 --> 00:01:43.800] >> So right around when Salesforce acquired Datorama,
[00:01:43.800 --> 00:01:46.600] Marty joined the strategy team at Salesforce,
[00:01:46.600 --> 00:01:49.120] where we were colleagues for a couple of years.
[00:01:49.120 --> 00:01:54.120] We've known each other for over a decade, actually way over a decade.
[00:01:54.120 --> 00:02:00.480] And honestly, few people see the data analyst evolution as clearly as Marty does.
[00:02:00.480 --> 00:02:01.840] So let's dig in.
[00:02:01.840 --> 00:02:07.400] >> By the way, I used to be a data analyst.
[00:02:07.400 --> 00:02:11.040] Before Gartner, I worked at Digitas.
[00:02:11.040 --> 00:02:15.320] It was basically an ad agency, digital advertising.
[00:02:15.320 --> 00:02:17.480] And I did measurement, and that was my job.
[00:02:17.480 --> 00:02:22.560] So I used a lot of Excel, to be honest, in those days.
[00:02:22.560 --> 00:02:24.320] >> Okay, everybody does, yes.
[00:02:24.320 --> 00:02:28.560] >> Yeah, but that was my job, making dashboards, things like that.
[00:02:28.560 --> 00:02:32.960] >> So last time we spoke was on your podcast,
[00:02:32.960 --> 00:02:38.200] Paleo Ad Tech, which I highly recommend to anybody who wants to learn
[00:02:38.200 --> 00:02:41.680] about the world of ad tech and how all of that came about.
[00:02:41.680 --> 00:02:45.760] So we talked about my Havas days and my Datorama days.
[00:02:45.760 --> 00:02:48.320] That was great, but now it's my turn.
[00:02:48.320 --> 00:02:53.600] So first, how does it feel to go from writing House of Lies,
[00:02:53.600 --> 00:02:57.880] which was adapted for Showtime, and consulting,
[00:02:57.880 --> 00:03:00.160] to now writing about AI agents?
[00:03:00.160 --> 00:03:02.160] That's quite an arc for a writer.
[00:03:02.160 --> 00:03:04.920] >> It's different, because the House of Lies you're referring to
[00:03:04.920 --> 00:03:06.520] was about my first job after business school.
[00:03:06.520 --> 00:03:11.600] I was a management consultant, and it was irreverent.
[00:03:11.600 --> 00:03:15.040] I wrote it when I realized I was not gonna be a management consultant,
[00:03:15.040 --> 00:03:17.320] that I was gonna leave the business.
[00:03:17.320 --> 00:03:21.600] So it was okay for me to say whatever I wanted to say within bounds.
[00:03:21.600 --> 00:03:25.360] So it was very irreverent and a bit of a satire.
[00:03:25.360 --> 00:03:28.240] And what I'm doing now is much more straightforward.
[00:03:28.240 --> 00:03:34.080] But I think that the writing process is pretty much the same, and also the tone,
[00:03:34.080 --> 00:03:37.560] the kind of the flow and the tone is sort of the same.
[00:03:37.560 --> 00:03:40.040] So it doesn't feel that different for me.
[00:03:40.040 --> 00:03:43.400] It's just I guess I'm holding back on the humor a bit.
[00:03:43.400 --> 00:03:45.840] I mean, agents can be funny, but they're not that funny.
[00:03:45.840 --> 00:03:49.160] >> [LAUGH] >> A little bit maybe.
[00:03:49.160 --> 00:03:50.560] >> [LAUGH]
[00:03:50.560 --> 00:03:51.920] >> And if I remember well,
[00:03:51.920 --> 00:03:55.000] you told me you wrote the entire book without AI.
[00:03:55.000 --> 00:03:59.040] >> Yeah, I mean, that's always gonna be true for me.
[00:03:59.040 --> 00:04:02.360] I actually am one of the few people who enjoy writing, and
[00:04:02.360 --> 00:04:06.480] a lot of people don't like it, and I think they embrace AI for that reason.
[00:04:06.480 --> 00:04:10.960] But I like it, I don't see why I would give it up to anybody else.
[00:04:10.960 --> 00:04:16.040] >> So when Salesforce acquired Datorama in 2018,
[00:04:16.040 --> 00:04:19.000] what did you think from your vantage point?
[00:04:19.000 --> 00:04:23.880] You were at Gartner then, and you were watching the martech consolidation.
[00:04:23.880 --> 00:04:28.240] Did you see what was coming with the data and AI convergence?
[00:04:28.240 --> 00:04:33.160] And most importantly, I would say, did you have a notion of how it would affect
[00:04:33.160 --> 00:04:34.480] the digital analyst role?
[00:04:34.480 --> 00:04:37.440] >> Yeah, I think something was definitely happening.
[00:04:37.440 --> 00:04:41.280] This was sort of the beginning of the big data era, or a few years into it,
[00:04:41.280 --> 00:04:42.920] with people talking about Hadoop.
[00:04:42.920 --> 00:04:45.880] And this is massive amounts of information.
[00:04:45.880 --> 00:04:48.840] And for an analyst, it's a different approach.
[00:04:48.840 --> 00:04:53.200] Before that, and people can't remember the before times,
[00:04:53.200 --> 00:04:54.960] it wasn't really that long ago.
[00:04:54.960 --> 00:04:59.160] But the problem was the scarcity of data, there really wasn't enough.
[00:04:59.160 --> 00:05:05.120] So you had to make statistical inferences, and it was all aggregate data.
[00:05:05.120 --> 00:05:08.960] And what happened in the big data era is that actually there came to be a lot more
[00:05:08.960 --> 00:05:13.320] data, it was coming real time off of social networks.
[00:05:13.320 --> 00:05:18.200] It was like the Twitter feed could be a source of data, your website, etc.
[00:05:18.200 --> 00:05:22.240] Every event on your website could be a source of information.
[00:05:22.240 --> 00:05:25.040] And so the scale exploded.
[00:05:25.040 --> 00:05:27.960] I think I remember a meeting, I don't know if I ever told you this.
[00:05:27.960 --> 00:05:29.080] [LAUGH]
[00:05:29.080 --> 00:05:31.660] I went to Land O'Lakes when I was at Gartner;
[00:05:31.660 --> 00:05:35.040] Land O'Lakes was a Midwestern dairy company.
[00:05:35.040 --> 00:05:36.640] >> I remember that.
[00:05:36.640 --> 00:05:38.000] >> They were a client.
[00:05:38.000 --> 00:05:39.160] >> Yes, I know.
[00:05:39.160 --> 00:05:43.920] And they did, it was the analytics team, and they showed their process.
[00:05:43.920 --> 00:05:45.960] And they had a very impressive command center.
[00:05:45.960 --> 00:05:51.000] So the command center, it basically aggregated their media data and
[00:05:51.000 --> 00:05:53.840] some of their own channels as well.
[00:05:53.840 --> 00:05:56.560] But it was sort of aggregated using Datorama.
[00:05:56.560 --> 00:06:02.160] They organized it using Datorama, and then they had it put into a data lake.
[00:06:02.160 --> 00:06:04.520] And they had dashboards.
[00:06:04.520 --> 00:06:08.280] And so they had a kind of an overview of all of their marketing efforts.
[00:06:08.280 --> 00:06:10.960] That was very impressive, and it was a small team.
[00:06:10.960 --> 00:06:12.560] And I thought, this is interesting.
[00:06:12.560 --> 00:06:15.800] So the key technology in there was Datorama, sort of sitting in the middle.
[00:06:15.800 --> 00:06:20.640] And it was the convergence of the data, the data inputs.
[00:06:20.640 --> 00:06:23.800] And then Datorama was doing the organization at the campaign level,
[00:06:23.800 --> 00:06:25.360] as you well know.
[00:06:25.360 --> 00:06:27.960] And then making it available to the team.
[00:06:27.960 --> 00:06:29.960] So it was sort of making data available in a way.
[00:06:29.960 --> 00:06:33.200] And I thought, that is new, that was kind of net new.
[00:06:33.200 --> 00:06:38.320] And that's sort of the space where the customer data platform came later.
[00:06:38.320 --> 00:06:43.120] >> Yeah, that was really Datorama's mission at the time, right?
[00:06:43.120 --> 00:06:46.880] It was all about, I mean, the main problem was really bringing all that data
[00:06:46.880 --> 00:06:49.000] together and organizing it.
[00:06:49.000 --> 00:06:52.440] And I feel like today, we've moved from that,
[00:06:52.440 --> 00:06:56.240] not that that isn't a problem, that is a problem, it's a structural problem, right?
[00:06:56.240 --> 00:06:58.080] There's complexity in it.
[00:06:58.080 --> 00:07:01.680] But today, we really moved sort of to the cognitive era, right?
[00:07:01.680 --> 00:07:07.760] Where it's really about using the data that you have in a way that is much more
[00:07:07.760 --> 00:07:11.760] semantic, that is about what does that data actually mean?
[00:07:11.760 --> 00:07:16.920] And how do you make your analysis meaningful for business decision making?
[00:07:16.920 --> 00:07:21.120] >> Yeah, every part of the analyst value chain,
[00:07:21.120 --> 00:07:26.960] the process of analytics, has improved with technology.
[00:07:26.960 --> 00:07:30.520] And I think the one good thing about the analyst or data analyst,
[00:07:30.520 --> 00:07:35.920] that particular role is that it was always extremely heterogeneous.
[00:07:35.920 --> 00:07:41.000] Data analysts were always people who used multiple tools; you never use just one.
[00:07:41.000 --> 00:07:42.960] And also, data analysts were
[00:07:42.960 --> 00:07:47.480] people who adopted open source early on, like R and
[00:07:47.480 --> 00:07:50.880] Python; they were the people who kind of got to know those.
[00:07:50.880 --> 00:07:52.840] And that was open source technology.
[00:07:52.840 --> 00:07:56.400] So they're people who use tools as they appear.
[00:07:56.400 --> 00:08:02.520] >> And that's definitely true now as well with LLMs; it's like an evolution.
[00:08:02.520 --> 00:08:07.400] >> Yes, as you said, basically, the tools have improved immensely, right?
[00:08:07.400 --> 00:08:14.120] And I feel like really, whereas it was really a question at some point of,
[00:08:14.120 --> 00:08:18.400] do you have good enough tools to wrangle this amount of data,
[00:08:18.400 --> 00:08:21.520] these type of queries, etc.
[00:08:21.520 --> 00:08:25.600] And the problem was really getting tools that were good enough.
[00:08:25.600 --> 00:08:30.480] I feel like the problem has now moved to those tools are good enough, right?
[00:08:30.480 --> 00:08:34.320] Your connection tools and your databases and your BI tools,
[00:08:34.320 --> 00:08:36.120] they really work well.
[00:08:36.120 --> 00:08:41.200] But the problem is in between all of those steps: connecting to data,
[00:08:41.200 --> 00:08:45.040] hosting the data, transforming the data for pipelines,
[00:08:45.040 --> 00:08:49.200] transforming the data for analytics use cases, building your outputs,
[00:08:49.200 --> 00:08:51.920] getting your outputs to your stakeholders.
[00:08:51.920 --> 00:08:56.360] All of these steps are handled by different roles, generally different people,
[00:08:56.360 --> 00:08:58.480] and definitely different tools.
[00:08:58.480 --> 00:09:02.120] And the memory and the context between those steps get lost.
[00:09:02.120 --> 00:09:07.280] And so you basically get in front of your stakeholder with your report.
[00:09:07.280 --> 00:09:10.880] The first question you're gonna get is, ROI, great.
[00:09:10.880 --> 00:09:13.080] What are you using for revenue?
[00:09:13.080 --> 00:09:16.680] What is your source and how do you get to that revenue number?
[00:09:16.680 --> 00:09:22.120] And now you have to go back to basically where does this revenue number come from,
[00:09:22.120 --> 00:09:25.720] retrace all of these steps, which is across people and tools.
[00:09:25.720 --> 00:09:30.560] And there's really no way to keep track of what happens at all of these steps,
[00:09:30.560 --> 00:09:33.200] because nobody can do documentation at that level.
[00:09:33.200 --> 00:09:35.240] That's just not something that happens.
[00:09:35.240 --> 00:09:40.760] And I feel like that's really where one of the benefits of AI comes in:
[00:09:40.760 --> 00:09:44.960] using natural language to generate the code, to talk to all of these tools,
[00:09:44.960 --> 00:09:50.080] and to keep a trace of what you're doing.
[00:09:50.080 --> 00:09:51.400] What do you think about that?
[00:09:51.400 --> 00:09:56.680] >> I like to look at it this way: we're in a situation now where
[00:09:56.680 --> 00:09:59.040] the tools, as we said, are getting easier to use.
[00:09:59.040 --> 00:10:03.920] And a concrete example would be, it wasn't that long ago when you really had to
[00:10:03.920 --> 00:10:08.720] know SQL and be a SQL power user to query data from databases.
[00:10:08.720 --> 00:10:11.320] That is still true.
[00:10:11.320 --> 00:10:13.000] I mean, people of course still use SQL.
[00:10:13.000 --> 00:10:18.480] But now you can go into something like a CDP and just use natural language and
[00:10:18.480 --> 00:10:22.800] describe a segment, and then SQL will be written for you by the machine.
[00:10:22.800 --> 00:10:27.880] And it takes that nasty step of putting together these very convoluted,
[00:10:27.880 --> 00:10:29.160] nasty queries for you.
[00:10:29.160 --> 00:10:30.480] So you can just describe it.
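To make that concrete, here is a minimal sketch of the kind of flow being described: a natural-language segment description is wrapped in a prompt, a hypothetical call_llm helper (standing in for whatever model or in-CDP assistant you actually use) returns SQL, and the analyst reviews it before running it. The table schema, the helper, and the prompt wording are all illustrative assumptions, not any specific vendor's API.

```python
# Sketch: turning a natural-language segment description into SQL.
# call_llm is a hypothetical stand-in for your actual LLM client or CDP
# "describe a segment" feature; the schema below is invented for illustration.

SCHEMA = """
Table customers(customer_id, country, lifetime_value, last_purchase_date)
Table orders(order_id, customer_id, order_date, revenue)
"""

def build_prompt(segment_description: str) -> str:
    # The analyst's job is to be precise: name the tables, the dialect,
    # and the exact definition of the segment.
    return (
        "You write ANSI SQL only. Use the schema below.\n"
        f"{SCHEMA}\n"
        f"Return one SELECT statement for this segment: {segment_description}\n"
        "Do not modify data. Return SQL only, no commentary."
    )

def call_llm(prompt: str) -> str:
    # Placeholder: swap in your actual model call. Hard-coded here so the
    # sketch runs end to end; returned text should be treated as untrusted.
    return (
        "SELECT c.customer_id FROM customers c "
        "JOIN orders o ON o.customer_id = c.customer_id "
        "WHERE c.country = 'DE' AND c.lifetime_value > 500 "
        "AND o.order_date >= DATE('now', '-90 day') "
        "GROUP BY c.customer_id;"
    )

if __name__ == "__main__":
    description = ("customers in Germany with lifetime value over 500 "
                   "and at least one order in the last 90 days")
    sql = call_llm(build_prompt(description))
    print(sql)  # Review the generated query before running it anywhere.
```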
[00:10:30.480 --> 00:10:32.520] So your focus, you as the analyst, is on:
[00:10:32.520 --> 00:10:33.680] what am I trying to learn here?
[00:10:33.680 --> 00:10:36.520] And you have to be very precise in your instructions.
[00:10:36.520 --> 00:10:39.040] So you still as an analyst need to know what you want to know.
[00:10:39.040 --> 00:10:42.160] And you also need to know the data sources even better than ever.
[00:10:42.160 --> 00:10:47.080] But the actual techniques like SQL writing, code writing are less important.
[00:10:47.080 --> 00:10:48.320] So the tools are easy to use, but
[00:10:48.320 --> 00:10:51.640] they're also getting harder to understand simultaneously.
[00:10:51.640 --> 00:10:55.440] And the reason is because they're built on neural networks.
[00:10:55.440 --> 00:11:00.640] Large language models are just really big neural networks,
[00:11:00.640 --> 00:11:03.640] and they are extremely large now.
[00:11:03.640 --> 00:11:06.920] You could have billions and billions, even trillions of parameters.
[00:11:06.920 --> 00:11:11.480] And all of them have a weight and a bias and they kind of feed into one another.
[00:11:11.480 --> 00:11:15.800] As human beings, if we have an input over here,
[00:11:15.800 --> 00:11:18.200] a question, and an output over here,
[00:11:18.200 --> 00:11:20.840] We're not gonna know exactly how that output happened.
[00:11:20.840 --> 00:11:22.360] That's why it's unpredictable.
[00:11:22.360 --> 00:11:25.120] So maybe we change the prompt a little bit to try to change the
[00:11:25.120 --> 00:11:26.600] output if it doesn't fit.
[00:11:26.600 --> 00:11:30.680] But what's happening in the middle is quite literally a black box.
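As a rough illustration of why that is, here is a toy forward pass: just a few weights and biases feeding into one another, exactly the structure described above but tiny. Even at this size, narrating how each individual parameter shaped the output is awkward; production models do the same thing across billions or trillions of parameters. The layer sizes and numbers are arbitrary.

```python
import numpy as np

# Toy two-layer network: every parameter is a weight or a bias, and the
# layers feed into one another (sizes chosen arbitrarily for the sketch).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)   # layer 1: 4 inputs -> 8 units
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=1)   # layer 2: 8 units -> 1 output

def forward(x):
    h = np.tanh(W1 @ x + b1)   # hidden activations
    return W2 @ h + b2         # final output

x = np.array([0.2, -1.0, 0.5, 0.3])   # "input over here"
print(forward(x))                      # "output over here"
# Even with only about 50 parameters, explaining exactly how each weight
# produced this number is hard; LLMs do this with billions of parameters.
```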
[00:11:30.680 --> 00:11:36.840] Now we can inform ourselves on the method, we can try to get smart about
[00:11:36.840 --> 00:11:41.240] what's going on, and we can have an intuition about whether the output fits.
[00:11:41.240 --> 00:11:46.080] But the process in the middle, as you said, the context setting,
[00:11:46.080 --> 00:11:48.400] all of that needs to be carefully monitored.
[00:11:48.400 --> 00:11:52.240] I think that the danger here is that we trust the output too much.
[00:11:52.240 --> 00:11:53.800] We shouldn't trust the output too much.
[00:11:53.800 --> 00:11:56.160] We should always be interrogating the data.
[00:11:56.160 --> 00:11:59.280] You can ask models how they came up with an answer.
[00:11:59.280 --> 00:12:02.280] You can just say, what are the three drivers of this trend that you're seeing?
[00:12:02.280 --> 00:12:03.560] And they will tell you.
[00:12:03.560 --> 00:12:07.280] You can ask them, how did you come up with this recommendation?
[00:12:07.280 --> 00:12:10.720] They will tell you, they work for you, you don't work for them.
[00:12:10.720 --> 00:12:13.880] And also using common sense and stuff like that.
[00:12:13.880 --> 00:12:19.280] So I think it's really an important part of the data analyst's job now to be
[00:12:19.280 --> 00:12:25.280] constantly checking and monitoring the output of what their tools are giving them.
[00:12:25.280 --> 00:12:30.960] Yeah, I mean, that's why we consider our core mission really to empower the AI analyst.
[00:12:30.960 --> 00:12:35.760] We're focused on digital analysts who want to wrap their head around using AI across
[00:12:35.760 --> 00:12:39.720] the entire span of the analytics process, so from data collection to transformation,
[00:12:39.720 --> 00:12:41.680] analysis, and the final step.
[00:12:41.680 --> 00:12:46.040] The step that the analyst would like to be the final step or hopes is the final step,
[00:12:46.040 --> 00:12:52.360] which is presenting your stakeholder with your output so they can make data informed decisions.
[00:12:52.360 --> 00:13:02.160] And I think the three main points that an analyst needs to upskill on
[00:13:02.160 --> 00:13:08.360] in order to become an AI analyst are: one, and you mentioned it, a deep knowledge of how
[00:13:08.360 --> 00:13:14.440] LLMs work, but also a deep knowledge of how different LLMs work differently, because they
[00:13:14.440 --> 00:13:17.040] all have their own personality, so to speak.
[00:13:17.040 --> 00:13:19.720] The second one is prompt engineering.
[00:13:19.720 --> 00:13:23.080] Prompt engineering is how you talk to LLMs.
[00:13:23.080 --> 00:13:28.360] I think as an AI analyst, you really need to understand prompt engineering very, very
[00:13:28.360 --> 00:13:29.440] deeply.
[00:13:29.440 --> 00:13:36.840] And that means that you need to understand how LLMs take your prompt and analyze and
[00:13:36.840 --> 00:13:39.160] pay attention to your prompt.
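A small sketch of what that looks like in practice: a prompt that separates role, context, task, and constraints so the model's attention lands on the parts that matter. The section headers and wording below are just one common convention for structuring analyst prompts, not a standard required by any particular model.

```python
# Sketch of a structured analyst prompt. The labeled sections are one common
# convention for steering an LLM's attention; adapt the wording to your model.

def analyst_prompt(question: str, table_docs: str, constraints: list[str]) -> str:
    rules = "\n".join(f"- {c}" for c in constraints)
    return (
        "ROLE: You are assisting a marketing data analyst.\n\n"
        f"CONTEXT (data sources):\n{table_docs}\n\n"
        f"TASK: {question}\n\n"
        f"CONSTRAINTS:\n{rules}\n\n"
        "OUTPUT: Give the answer first, then the query or calculation you used."
    )

print(analyst_prompt(
    question="Which channel drove the largest revenue lift last quarter?",
    table_docs="campaign_spend(channel, week, spend); revenue(channel, week, revenue)",
    constraints=["Use only the tables above", "State any assumptions explicitly"],
))
```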
[00:13:39.160 --> 00:13:43.840] And actually to that effect, I've created these little characters, the Foofies, that
[00:13:43.840 --> 00:13:48.560] help explain the mechanics of LLMs.
[00:13:48.560 --> 00:13:49.560] They are cute.
[00:13:49.560 --> 00:13:50.560] I've seen that.
[00:13:50.560 --> 00:13:53.200] You said it could be a pretty dry subject, right?
[00:13:53.200 --> 00:13:54.200] Very.
[00:13:54.200 --> 00:13:58.080] So I was thinking, how can I make it something where it's not just me talking about it because
[00:13:58.080 --> 00:14:00.720] people will just fall asleep?
[00:14:00.720 --> 00:14:05.120] And it's incredible what you can do today with generative video.
[00:14:05.120 --> 00:14:15.840] I've done all of those videos for almost no money, all of them generated mostly with Veo 3.
[00:14:15.840 --> 00:14:22.800] And I've gotten a tremendous response from people who are not even remotely in our field about
[00:14:22.800 --> 00:14:27.640] how much they understand better what LLMs are and how they work, which I think is a
[00:14:27.640 --> 00:14:30.840] really important thing in the world in general.
[00:14:30.840 --> 00:14:34.160] So that's point one, LLMs; point two, prompt engineering.
[00:14:34.160 --> 00:14:41.520] Point three, I think is really, if you're going to do analysis with AI, you're going to generate
[00:14:41.520 --> 00:14:43.400] a lot of code.
[00:14:43.400 --> 00:14:47.400] Reading code that has been generated by an LLM is not like writing code.
[00:14:47.400 --> 00:14:51.480] It's a completely different skill to read the code and understand really what the code
[00:14:51.480 --> 00:14:52.840] does.
[00:14:52.840 --> 00:14:59.520] And if you're going to run that code on your database, you need to understand how the code
[00:14:59.520 --> 00:15:04.440] is generated, how good or bad it is, and you need to get good at it.
[00:15:04.440 --> 00:15:11.640] So getting good at code generation as opposed to code writing, I think is a separate skill.
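One way to make that habit concrete, sketched below with Python's built-in sqlite3: before letting a generated query anywhere near real data, check that it is read-only, look at its plan, and run it against a small sample first. The checks shown are illustrative and deliberately simple, not an exhaustive review process.

```python
import sqlite3

def review_generated_sql(conn: sqlite3.Connection, sql: str, sample_rows: int = 5):
    """Basic guardrails before trusting LLM-generated SQL (illustrative, not exhaustive)."""
    lowered = sql.lower()
    # 1. Refuse anything that isn't a plain read.
    if not lowered.lstrip().startswith("select"):
        raise ValueError("Generated statement is not a SELECT; review manually.")
    if any(word in lowered for word in ("insert", "update", "delete", "drop", "alter")):
        raise ValueError("Generated SQL contains write/DDL keywords; review manually.")
    # 2. Look at the query plan to understand what it will actually do.
    plan = conn.execute(f"EXPLAIN QUERY PLAN {sql}").fetchall()
    # 3. Run it on a small sample before running it for real.
    sample = conn.execute(sql).fetchmany(sample_rows)
    return plan, sample

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders(order_id INTEGER, revenue REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 25.5)])
    generated = "SELECT order_id, revenue FROM orders WHERE revenue > 5"
    plan, sample = review_generated_sql(conn, generated)
    print(plan)
    print(sample)
```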
[00:15:11.640 --> 00:15:18.040] And most importantly, really is keep your critical thinking, like keep your wits about
[00:15:18.040 --> 00:15:19.040] you.
[00:15:19.040 --> 00:15:25.140] It's not magic, it's a tool and you need to become really good at using the tool.
[00:15:25.140 --> 00:15:31.020] And as you said, trusting the output blindly is generally a recipe for disaster.
[00:15:31.020 --> 00:15:38.900] In your book, you talk about giving agents strict rules, data sources, actions, and guardrails.
[00:15:38.900 --> 00:15:41.760] That's basically what we're doing with analytical context, right?
[00:15:41.760 --> 00:15:48.520] So what's your take on specialized analytical agents versus generalized AI, specifically
[00:15:48.520 --> 00:15:51.440] when it comes to digital analytical workflows?
[00:15:51.440 --> 00:15:58.640] Well, I think there's a lot of kind of loose terminology in this field.
[00:15:58.640 --> 00:16:01.440] AI in general, what does that mean?
[00:16:01.440 --> 00:16:03.400] It's become an umbrella term.
[00:16:03.400 --> 00:16:08.380] It used to be actually companies avoided using the word AI because it had a bad reputation
[00:16:08.380 --> 00:16:09.560] 10 years ago.
[00:16:09.560 --> 00:16:14.700] And now people are falling over themselves to put AI in the name of their company.
[00:16:14.700 --> 00:16:15.700] So it's changed.
[00:16:15.700 --> 00:16:18.360] The PR has really improved recently.
[00:16:18.360 --> 00:16:23.640] But I think in general, like what we say, I work at Salesforce, as you said, and we have
[00:16:23.640 --> 00:16:27.880] this phrase, don't DIY your AI, don't do it yourself.
[00:16:27.880 --> 00:16:32.680] Large language models, as they are, you know, they're very generic.
[00:16:32.680 --> 00:16:36.600] And as a consumer, I could go to ChatGPT and ask it to do a workout routine or, you
[00:16:36.600 --> 00:16:38.780] know, recipe or whatever.
[00:16:38.780 --> 00:16:42.580] And whatever it gives me is okay, I can tweak it, maybe it makes some mistakes, no big deal.
[00:16:42.580 --> 00:16:46.920] But if you're a company, it's, you know, in an enterprise context, it's completely different.
[00:16:46.920 --> 00:16:51.160] Your entire brand reputation is at stake with every single interaction with your customers.
[00:16:51.160 --> 00:16:53.840] So you have to kind of harden it.
[00:16:53.840 --> 00:16:57.920] And it has to be if you're going to be doing personalization at scale, which is the promise
[00:16:57.920 --> 00:17:01.160] of these agents, it has to be relevant.
[00:17:01.160 --> 00:17:02.160] So it has to be grounded.
[00:17:02.160 --> 00:17:05.960] And so, LLMs themselves aren't applications.
[00:17:05.960 --> 00:17:08.360] LLMs are infrastructure.
[00:17:08.360 --> 00:17:12.320] It's basically, you know, we think of it as like a platform, the LLM, and you've got to
[00:17:12.320 --> 00:17:14.400] build on top of it to make it relevant.
[00:17:14.400 --> 00:17:19.880] And a lot of things you got to build into it, the specialized models would be, you know,
[00:17:19.880 --> 00:17:24.760] levels of trust and governance, compliance, there needs to be auditability for what's
[00:17:24.760 --> 00:17:26.040] going back and forth.
[00:17:26.040 --> 00:17:31.960] There's, you know, focus on your particular industry, there would be some kind of an intelligence
[00:17:31.960 --> 00:17:32.960] engine.
[00:17:32.960 --> 00:17:37.240] So like a reasoning engine that's sitting on top of multiple LLMs.
[00:17:37.240 --> 00:17:40.620] And you mentioned earlier that LLMs themselves have different strengths.
[00:17:40.620 --> 00:17:45.880] So you should be able to select which one you're using, or, you know, they have small
[00:17:45.880 --> 00:17:46.880] language models now.
[00:17:46.880 --> 00:17:48.920] So you can train your own on third-party data.
[00:17:48.920 --> 00:17:54.000] And that can be very effective in certain kind of more focused areas.
[00:17:54.000 --> 00:17:58.660] And you also need to have support for structured and unstructured data, and retrieval-augmented
[00:17:58.660 --> 00:18:01.160] generation, all those kinds of processes.
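As a rough sketch of what "grounded" and retrieval-augmented generation mean mechanically: retrieve the most relevant pieces of your own content, then put them into the prompt so the model answers from your data rather than from its generic training. The scoring below is a deliberately naive word-overlap and the documents are invented; real systems use embeddings, access controls, and audit logging on top of this basic shape.

```python
# Naive retrieval-augmented generation sketch: word-overlap retrieval plus
# prompt assembly. This only shows the shape of the "grounding" step.

DOCS = {
    "returns_policy": "Customers may return items within 30 days with a receipt.",
    "q3_results": "Q3 revenue grew 12 percent year over year, driven by the loyalty program.",
    "brand_voice": "Always address the customer by first name and keep a warm tone.",
}

def retrieve(question: str, k: int = 2) -> list[str]:
    # Score each document by how many question words it shares (very crude).
    q_words = set(question.lower().split())
    scored = sorted(
        DOCS.items(),
        key=lambda kv: len(q_words & set(kv[1].lower().split())),
        reverse=True,
    )
    return [text for _, text in scored[:k]]

def grounded_prompt(question: str) -> str:
    context = "\n".join(f"- {c}" for c in retrieve(question))
    return (
        "Answer using only the company context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\n"
        "If the context does not contain the answer, say so."
    )

print(grounded_prompt("What drove revenue growth in Q3?"))
```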
[00:18:01.160 --> 00:18:06.680] So I think that the more specialized the model can be, as long as it's making the
[00:18:06.680 --> 00:18:12.240] output more personalized, and I come from a marketing context, so in marketing that means
[00:18:12.240 --> 00:18:18.440] more relevant to the consumer, the ultimate consumer on the end, then that's the way you
[00:18:18.440 --> 00:18:21.320] need to go as an enterprise.
[00:18:21.320 --> 00:18:28.720] But it's a long way from just a generic LLM to something that can be really adding value
[00:18:28.720 --> 00:18:29.940] to the business.
[00:18:29.940 --> 00:18:33.640] And so there's steps along the way that I think people don't appreciate.
[00:18:33.640 --> 00:18:38.160] A lot of companies now are surprised at the amount of work.
[00:18:38.160 --> 00:18:44.500] And it just seems so easy as a consumer to have a recipe or whatever.
[00:18:44.500 --> 00:18:52.320] But as a company, they're like, why can't I just productize and automate all my workflows
[00:18:52.320 --> 00:18:53.840] and all this stuff overnight?
[00:18:53.840 --> 00:18:56.120] And it doesn't really work like that.
[00:18:56.120 --> 00:19:02.320] The strength and the weakness of this new generation of AI is that it is not deterministic.
[00:19:02.320 --> 00:19:04.680] It's not just this, then that.
[00:19:04.680 --> 00:19:06.200] You don't know what the output is.
[00:19:06.200 --> 00:19:09.360] It has its own kind of agency.
[00:19:09.360 --> 00:19:10.640] And that's why it's good.
[00:19:10.640 --> 00:19:11.640] That's why we use it.
[00:19:11.640 --> 00:19:16.320] But on the other hand, that makes it a little bit harder to control.
[00:19:16.320 --> 00:19:17.720] I think that's true, right?
[00:19:17.720 --> 00:19:22.800] We're sort of in a little bit of this magical era, right?
[00:19:22.800 --> 00:19:29.560] This era of magical thinking where there's a confusion between getting a recipe for a
[00:19:29.560 --> 00:19:36.960] Thanksgiving casserole and having an application that will do actual work for you in an enterprise
[00:19:36.960 --> 00:19:37.960] context.
[00:19:37.960 --> 00:19:45.280] And that application layer is not really well understood, including issues of security around
[00:19:45.280 --> 00:19:48.720] it, which are really not well understood at all.
[00:19:48.720 --> 00:19:54.140] And reliability is really a challenge, especially for analytics:
[00:19:54.140 --> 00:19:56.800] how do you build something deterministic?
[00:19:56.800 --> 00:20:05.400] Because in analytics, no one needs random insights on approximately correct data.
[00:20:05.400 --> 00:20:10.840] That's just in my entire career, I have never been asked for that.
[00:20:10.840 --> 00:20:15.600] Occasionally you'll hit something brilliant, but not all the time.
[00:20:15.600 --> 00:20:18.920] Yeah, but who knows if it's correct?
[00:20:18.920 --> 00:20:24.000] Really, you have to run code on data, right?
[00:20:24.000 --> 00:20:27.920] So otherwise you're never going to trust the output.
[00:20:27.920 --> 00:20:35.620] And from that perspective, there really is a little bit of this magical thinking, like,
[00:20:35.620 --> 00:20:40.280] it will solve everything with no effort, without really understanding what's involved.
[00:20:40.280 --> 00:20:43.300] No, you have to still have a use case.
[00:20:43.300 --> 00:20:47.120] You have to know what use case you want to solve and you have to find a solution for
[00:20:47.120 --> 00:20:53.220] that particular use case and make sure that that solution is compliant with the different
[00:20:53.220 --> 00:20:55.180] aspects of your business.
[00:20:55.180 --> 00:21:00.760] And from that perspective, I sort of really don't want to talk about this, but let's talk
[00:21:00.760 --> 00:21:03.220] about it anyway.
[00:21:03.220 --> 00:21:07.080] The big "AI is replacing analysts" scare.
[00:21:07.080 --> 00:21:15.320] So you mentioned in Agentforce that AI agents can work as assistants, coworkers, and advisors.
[00:21:15.320 --> 00:21:21.200] Where in your opinion does the AI analyst role fit in this new world?
[00:21:21.200 --> 00:21:26.120] Well, on that topic, there's a chapter in my Agentforce book, Agentforce being Salesforce's
[00:21:26.120 --> 00:21:28.040] AI agent platform.
[00:21:28.040 --> 00:21:31.400] There's a chapter in there on the future of work.
[00:21:31.400 --> 00:21:37.040] And when I set about writing that chapter, in the beginning I really didn't know.
[00:21:37.040 --> 00:21:42.040] My opinion, to be honest, was these models are pretty good and I feel like maybe there's
[00:21:42.040 --> 00:21:44.480] going to be a lot of jobs at risk here.
[00:21:44.480 --> 00:21:48.920] But I did research and I read everything I could find on this topic.
[00:21:48.920 --> 00:21:53.740] And there've been a bunch of studies, not just McKinsey and Deloitte, but International Monetary
[00:21:53.740 --> 00:21:58.320] Fund and World Economic Forum and Davos, they talked about it.
[00:21:58.320 --> 00:22:03.360] And the consensus is, and it's really a consensus across all of these big studies of people
[00:22:03.360 --> 00:22:08.680] who must be smarter than me because they've written big studies, is that in fact, every
[00:22:08.680 --> 00:22:11.440] technology revolution brings with it two things.
[00:22:11.440 --> 00:22:14.100] One is a fear of job displacement.
[00:22:14.100 --> 00:22:18.080] So computers appear, internet appears, it'll be massive job displacement.
[00:22:18.080 --> 00:22:22.320] But then what ultimately happens is there are more jobs created.
[00:22:22.320 --> 00:22:26.040] First of all, it helps the economy.
[00:22:26.040 --> 00:22:28.640] So companies get bigger, so they need to hire.
[00:22:28.640 --> 00:22:31.640] But the difficulty is in trying to predict what the roles are.
[00:22:31.640 --> 00:22:36.120] And the example I would give there is that my entire career in the last 20 years, I was
[00:22:36.120 --> 00:22:40.360] at Digitas doing digital advertising, and now I'm at Salesforce in cloud software.
[00:22:40.360 --> 00:22:43.440] And none of this existed when I was in college.
[00:22:43.440 --> 00:22:47.140] And I was in college before the internet.
[00:22:47.140 --> 00:22:53.640] So literally these are new jobs, but not a job that I at the time could have thought
[00:22:53.640 --> 00:22:54.640] of.
[00:22:54.640 --> 00:22:56.840] You're asking about what the role of the digital analyst would be.
[00:22:56.840 --> 00:22:57.840] I think it's going to change.
[00:22:57.840 --> 00:22:58.840] I really do.
[00:22:58.840 --> 00:23:04.720] And I think in good ways, I'm very bullish on the future of the digital and the data
[00:23:04.720 --> 00:23:05.880] analyst role.
[00:23:05.880 --> 00:23:10.200] I think it's always, as I said earlier, a very flexible role.
[00:23:10.200 --> 00:23:15.260] So data analysts are people who understand the business, good ones, understand the business,
[00:23:15.260 --> 00:23:19.160] but also understand the tools and are open to using all kinds of different tools to get
[00:23:19.160 --> 00:23:21.040] answers to questions.
[00:23:21.040 --> 00:23:27.160] And the skill there is in trying to define the problem and determine the data sources
[00:23:27.160 --> 00:23:31.560] and also knowing if the answers that you're getting are directionally right or not.
[00:23:31.560 --> 00:23:34.120] There's a lot of intuition and common sense.
[00:23:34.120 --> 00:23:40.920] So I think that that higher level reasoning, orchestrating, choosing, selecting the tools
[00:23:40.920 --> 00:23:45.240] among those that are available, knowing the business well enough that you can ask the
[00:23:45.240 --> 00:23:49.000] right questions, all of that is still relevant and it's going to be even more relevant in
[00:23:49.000 --> 00:23:50.000] future.
[00:23:50.000 --> 00:23:56.120] I think what will go away is the parts of my job that I used to hate when I was doing
[00:23:56.120 --> 00:23:58.760] this, like building dashboards.
[00:23:58.760 --> 00:24:04.000] It was a very manual process, going and cutting and pasting data into Excel.
[00:24:04.000 --> 00:24:05.800] Oh my God.
[00:24:05.800 --> 00:24:09.280] And even just getting the data, it was a headache.
[00:24:09.280 --> 00:24:13.320] You had to get the file sent over and sometimes it was too big or in the wrong format.
[00:24:13.320 --> 00:24:17.920] Anyway, I won't get into that.
[00:24:17.920 --> 00:24:21.560] Our audience and us know enough about that.
[00:24:21.560 --> 00:24:27.520] But I think that one of the ways as analysts we can sort of see a little bit into the future
[00:24:27.520 --> 00:24:35.180] is we're in a way lucky enough that this disruption happened to software engineers before it's
[00:24:35.180 --> 00:24:37.840] happening now to data analysts, right?
[00:24:37.840 --> 00:24:44.520] The AI engineer is a role that has now existed for the whole of two long years, which in
[00:24:44.520 --> 00:24:50.520] AI time is two millennia of history.
[00:24:50.520 --> 00:24:57.360] And obviously, Ask-Y is a startup, so we have an engineering team, and we've seen very clearly
[00:24:57.360 --> 00:25:04.960] from the very beginning where we started hiring and how difficult it was to hire software
[00:25:04.960 --> 00:25:10.720] engineers that had any experience with code generation and understood how LLMs work, et
[00:25:10.720 --> 00:25:11.720] cetera.
[00:25:11.720 --> 00:25:17.720] And so we had to have a selection process that would look for people who had the potential
[00:25:17.720 --> 00:25:21.920] to switch, not necessarily had already switched.
[00:25:21.920 --> 00:25:28.000] Today and this is only one year later, we don't have to do that anymore.
[00:25:28.000 --> 00:25:36.440] Today it's a given for software engineers and I've written a lot about the software engineering
[00:25:36.440 --> 00:25:41.120] process versus the analytics process, how they're different and how you can't simply
[00:25:41.120 --> 00:25:45.880] take cursor and apply it to analytics because it just doesn't work for the workflow of an
[00:25:45.880 --> 00:25:46.880] analyst.
[00:25:46.880 --> 00:25:53.720] I really, really do think that something like this will happen to analysts and that the same
[00:25:53.720 --> 00:25:58.920] thing will happen in terms of, yes, you always need to be a good software engineer.
[00:25:58.920 --> 00:26:04.440] You still need to be a good software engineer in order to understand how to use the code
[00:26:04.440 --> 00:26:09.320] that you've generated and create an application that has all the different bits and pieces,
[00:26:09.320 --> 00:26:10.700] security, et cetera.
[00:26:10.700 --> 00:26:17.600] You still need to be a good analyst to understand what it is your stakeholder wants or needs,
[00:26:17.600 --> 00:26:22.640] what it is your business does, what are the steps involved in getting that answer, given
[00:26:22.640 --> 00:26:26.720] your setup, given your data layer, given your data structure, et cetera.
[00:26:26.720 --> 00:26:29.640] And then you have to orchestrate those steps.
[00:26:29.640 --> 00:26:34.000] And this is, I think, where a lot of the difference really is.
[00:26:34.000 --> 00:26:39.520] So we're focused on giving analysts repeatable workflows with AI.
[00:26:39.520 --> 00:26:40.560] We call that skills.
[00:26:40.560 --> 00:26:47.020] I think that once you package your workflows and you have an AI doing a lot of the steps
[00:26:47.020 --> 00:26:56.020] for you, your role as an analyst really changes from being an operator of a platform,
[00:26:56.020 --> 00:27:02.160] or in the case of a BI tool, I was joking about how setting up a dashboard
[00:27:02.160 --> 00:27:05.000] used to be a lot of clicks, right?
[00:27:05.000 --> 00:27:09.620] To select the variables and the charts, et cetera, a lot of clicks.
[00:27:09.620 --> 00:27:18.580] It changes to basically being an orchestrator of code that will control what is being done in the
[00:27:18.580 --> 00:27:24.440] different pieces of the analytics process so that you get to the output that you need.
[00:27:24.440 --> 00:27:30.620] Do you have a view of how that switch from operator to orchestrator will work in practice?
[00:27:30.620 --> 00:27:32.620] Do you see that around you at all?
[00:27:32.620 --> 00:27:36.140] Well, I think there's always been a difference.
[00:27:36.140 --> 00:27:40.220] I don't know how to say this.
[00:27:40.220 --> 00:27:45.740] There's always been a difference between good data analysts and the other ones.
[00:27:45.740 --> 00:27:46.740] Yes, that's true.
[00:27:46.740 --> 00:27:48.740] And that is very well said.
[00:27:48.740 --> 00:27:49.740] Thank you.
[00:27:49.740 --> 00:27:50.740] Yeah.
[00:27:50.740 --> 00:27:54.820] I mean, I don't want to, you know, if anybody out there is one of the other ones, which
[00:27:54.820 --> 00:27:59.200] I doubt, because you wouldn't be listening to this if you were.
[00:27:59.200 --> 00:28:02.820] But I think that, you know, this is a role where people look at data analysts and go,
[00:28:02.820 --> 00:28:04.220] oh, that's a low-level role.
[00:28:04.220 --> 00:28:06.020] But it's underestimated.
[00:28:06.020 --> 00:28:10.060] I think you can do more with a single person who's very good in that role than you can
[00:28:10.060 --> 00:28:13.300] anywhere else in the entire enterprise, I would say.
[00:28:13.300 --> 00:28:18.140] And when I realized that, I was at Gartner and I went to a company on the West Coast.
[00:28:18.140 --> 00:28:20.300] It's like a big computer company.
[00:28:20.300 --> 00:28:21.900] They sold hardware.
[00:28:21.900 --> 00:28:25.780] And there was one, they hired one guy there and he was a data analyst.
[00:28:25.780 --> 00:28:29.540] And he came up with a new way to segment their market so they could change their go-to-market
[00:28:29.540 --> 00:28:32.900] practice.
[00:28:32.900 --> 00:28:34.140] And it worked really well.
[00:28:34.140 --> 00:28:38.860] But it was just him and he took it upon himself to like ask the right questions and get the
[00:28:38.860 --> 00:28:41.660] right data, and he's like, oh, maybe we try this different segmentation method.
[00:28:41.660 --> 00:28:47.520] And he didn't get enough credit, in my opinion, because he actually turned that business around.
[00:28:47.520 --> 00:28:49.180] And that's just one data analyst.
[00:28:49.180 --> 00:28:52.860] So I think that that kind of a person who can come in and ask super smart questions
[00:28:52.860 --> 00:28:58.780] and apply the right tools and, you know, know what's going on will always be very valuable.
[00:28:58.780 --> 00:29:06.860] But there's no existing AI process that could replace such a kind of a broad orchestrator,
[00:29:06.860 --> 00:29:08.700] as you said, that kind of a thinker.
[00:29:08.700 --> 00:29:13.940] But it's the lower level, the analysts who are content to focus on a single channel,
[00:29:13.940 --> 00:29:19.340] single tasks that can be automated, who are content to do a single kind of a job and not
[00:29:19.340 --> 00:29:23.540] ask deeper questions.
[00:29:23.540 --> 00:29:25.260] Those people are in danger.
[00:29:25.260 --> 00:29:26.260] They're in trouble.
[00:29:26.260 --> 00:29:31.900] I am very optimistic about not just data analysts, but people in general,
[00:29:31.900 --> 00:29:36.020] because people are very adaptable human beings.
[00:29:36.020 --> 00:29:38.460] I think that's how we've survived.
[00:29:38.460 --> 00:29:42.220] And so whenever anyone asks me, oh, the ad agency is doomed,
[00:29:42.220 --> 00:29:44.060] my thought is, and you worked in an agency,
[00:29:44.060 --> 00:29:47.160] the ad agency is really just a bunch of people.
[00:29:47.160 --> 00:29:49.800] And they can see what's going on even better than we can.
[00:29:49.800 --> 00:29:53.220] And they can change what they do, change what they offer.
[00:29:53.220 --> 00:29:57.180] So the ad agency is not going anywhere, you know, it'll be around.
[00:29:57.180 --> 00:29:59.460] And it's just this adaptability, this constant change.
[00:29:59.460 --> 00:30:04.340] I was at a dinner in Chicago this week, and it was these parents talking about their kids
[00:30:04.340 --> 00:30:07.020] like, oh, I'm so worried, I don't know what to tell my kids.
[00:30:07.020 --> 00:30:08.220] Should they learn how to code?
[00:30:08.220 --> 00:30:09.300] Should they?
[00:30:09.300 --> 00:30:13.340] And I'm thinking, don't tell them anything, you know, they'll figure it out.
[00:30:13.340 --> 00:30:16.220] Whatever they do, two years from now, if there's something they need to know how to do, they'll
[00:30:16.220 --> 00:30:17.260] learn how to do it.
[00:30:17.260 --> 00:30:20.000] And then they'll kind of apply for those jobs and so on.
[00:30:20.000 --> 00:30:21.500] So I think adaptability is key.
[00:30:21.500 --> 00:30:23.500] Yeah, yeah.
[00:30:23.500 --> 00:30:28.820] And at our age, both of us are a testimony to that.
[00:30:28.820 --> 00:30:29.820] Yeah, that's right.
[00:30:29.820 --> 00:30:31.380] You got to keep learning.
[00:30:31.380 --> 00:30:36.100] Yeah, I mean, you and I, we've studied LLMs and that's in the past couple of years, and
[00:30:36.100 --> 00:30:38.980] that's not easy to understand how those work.
[00:30:38.980 --> 00:30:43.520] I mean, I challenge you, whoever is out there, try to figure out exactly how they work and
[00:30:43.520 --> 00:30:45.180] it's not easy.
[00:30:45.180 --> 00:30:46.180] It's not that simple.
[00:30:46.180 --> 00:30:48.460] And then try to explain it simply.
[00:30:48.460 --> 00:30:50.360] That's really, it's very interesting.
[00:30:50.360 --> 00:30:57.740] So if you think about ultimately this evolution, right, of tasks that can be automated, that
[00:30:57.740 --> 00:31:00.220] disappear, that is not new.
[00:31:00.220 --> 00:31:06.580] I think that what is probably new is the pace at which it's happening with LLMs.
[00:31:06.580 --> 00:31:09.420] It is certainly faster than anything else.
[00:31:09.420 --> 00:31:16.140] I still don't think that somehow this is the be-all of, you know, transforming everything
[00:31:16.140 --> 00:31:21.140] into the bots are going to do everything and humans are going to be obsolete.
[00:31:21.140 --> 00:31:22.940] I just can't see that.
[00:31:22.940 --> 00:31:27.500] By the way, you know, this rising unemployment, apparently now, I was just reading that and
[00:31:27.500 --> 00:31:30.060] every story about this said, "Oh, it's AI.
[00:31:30.060 --> 00:31:32.140] It's all caused by AI."
[00:31:32.140 --> 00:31:35.540] And I thought, well, you know, throughout my working life, there's been unemployment
[00:31:35.540 --> 00:31:39.020] and there have been periods when people are laid off and we didn't have any AI to blame.
[00:31:39.020 --> 00:31:41.760] There was something else, the interest rates were being blamed.
[00:31:41.760 --> 00:31:45.860] So I think it's going to be a scapegoat now for any kind of bad news.
[00:31:45.860 --> 00:31:52.060] And obviously it's a better story than, you know, we over-hired and now we have...
[00:31:52.060 --> 00:31:53.980] Or our companies are badly managed.
[00:31:53.980 --> 00:31:55.900] Yes, of course they're not.
[00:31:55.900 --> 00:31:57.820] No, no, no, no, absolutely not.
[00:31:57.820 --> 00:32:07.300] Given that example that you gave about that data analyst, it's a reality of the data analyst
[00:32:07.300 --> 00:32:13.340] role in most organizations, except if your organization's product really is analytics,
[00:32:13.340 --> 00:32:14.340] right?
[00:32:14.340 --> 00:32:18.820] Apart from that exception, the data analyst doesn't make decisions.
[00:32:18.820 --> 00:32:23.820] The data analyst supports a business stakeholder who makes decisions.
[00:32:23.820 --> 00:32:30.340] And it is true that it is very often the business stakeholder that gets the credit and not necessarily
[00:32:30.340 --> 00:32:36.620] the data analyst that had the brilliance, in the case of, you know, your data analyst there,
[00:32:36.620 --> 00:32:42.500] to find another way to segment the business, which, if it's actually working commercially,
[00:32:42.500 --> 00:32:43.500] that's huge, right?
[00:32:43.500 --> 00:32:44.500] Yeah.
[00:32:44.500 --> 00:32:46.540] It's absolutely huge.
[00:32:46.540 --> 00:32:56.560] I feel that for a good AI analyst who is able to understand how to use these tools well
[00:32:56.560 --> 00:33:04.460] in order to explore opportunities in data, like opportunities in analysis, it will just
[00:33:04.460 --> 00:33:11.740] make this something that is easier to get to, because you'll just be able to cycle through
[00:33:11.740 --> 00:33:14.300] scenarios faster.
[00:33:14.300 --> 00:33:21.300] Because I've always felt that the one thing that was hindering the creative
[00:33:21.300 --> 00:33:26.820] process in analytics is that it is very costly to test a new hypothesis.
[00:33:26.820 --> 00:33:29.580] Trying something takes a long time.
[00:33:29.580 --> 00:33:36.100] And that's where I think this orchestration aspect is very helpful because imagine you
[00:33:36.100 --> 00:33:40.740] have one analysis process, you have your data, you have it's organized the right way, et
[00:33:40.740 --> 00:33:45.900] cetera, and now you want to test six hypotheses with small variations.
[00:33:45.900 --> 00:33:52.780] You can run them in parallel in six tabs and they will call you when they need decisions
[00:33:52.780 --> 00:33:59.020] from your side, and you can now be a lot more effective at exploring and finding those gems.
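A small sketch of that idea: run several variations of the same analysis concurrently and collect whichever ones need a human decision. The hypotheses and the analysis function below are placeholders; in practice each run would execute a real data pipeline rather than a toy calculation.

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch: run several hypothesis variations of one analysis in parallel and
# collect the results for human review. The analysis itself is a placeholder.

HYPOTHESES = [
    {"segment": "new_customers", "window_days": 30},
    {"segment": "new_customers", "window_days": 90},
    {"segment": "lapsed_customers", "window_days": 30},
    {"segment": "lapsed_customers", "window_days": 90},
]

def run_analysis(params: dict) -> dict:
    # Placeholder for the real pipeline: pull data, transform, fit, summarize.
    lift = 0.1 * params["window_days"] / 30   # fake metric for the sketch
    return {**params, "estimated_lift": lift, "needs_review": lift > 0.2}

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(run_analysis, HYPOTHESES))

for r in results:
    flag = "REVIEW" if r["needs_review"] else "ok"
    print(f'{r["segment"]:>18} / {r["window_days"]}d: lift={r["estimated_lift"]:.2f} [{flag}]')
```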
[00:33:59.020 --> 00:34:04.860] Yeah, I think also underestimated in my opinion is the sort of role of like a presenter agent
[00:34:04.860 --> 00:34:12.700] or the UX component, because data analysis is numerical and it's quantitative, but the
[00:34:12.700 --> 00:34:19.540] way that quite often data driven decisions are spread through an organization, particularly
[00:34:19.540 --> 00:34:23.100] through the business side of the organization is through visuals.
[00:34:23.100 --> 00:34:27.220] It's always through powerful, useful visuals in some way.
[00:34:27.220 --> 00:34:32.580] Visual will convince people where numbers don't, even if it's exactly the same information.
[00:34:32.580 --> 00:34:37.020] And creating those visuals is something that I think AI is going to be very good at and
[00:34:37.020 --> 00:34:38.900] it'll be instantaneous.
[00:34:38.900 --> 00:34:43.500] And that was not always true, like trying to think about exactly the right way.
[00:34:43.500 --> 00:34:48.020] And so optimizing the kind of the visual display of information, if you will, is something that's
[00:34:48.020 --> 00:34:55.380] going to be increasingly automated, but I think very powerful for this role going forward.
[00:34:55.380 --> 00:35:01.420] I know that the part of my job when I did it that I liked the least was trying to come
[00:35:01.420 --> 00:35:03.900] up with the right charts and graphs.
[00:35:03.900 --> 00:35:07.260] I'm not very visual, you know; a lot of quantitative people really aren't.
[00:35:07.260 --> 00:35:09.940] So our UX skills are not so great.
[00:35:09.940 --> 00:35:16.940] I think nobody is, you know, equally strong across the entire process.
[00:35:16.940 --> 00:35:21.900] I also think this is where LLMs really power the AI analyst.
[00:35:21.900 --> 00:35:28.500] I'm thinking of it as like a full stack extension of the skill set where I'm never going to
[00:35:28.500 --> 00:35:30.020] be a data engineer.
[00:35:30.020 --> 00:35:38.620] I'm not good enough at SQL, but given a good LLM, I can generate code that I would not
[00:35:38.620 --> 00:35:42.700] have written by myself because it would just have taken too much time.
[00:35:42.700 --> 00:35:47.820] So I can move left of the stack, towards, you know, more
[00:35:47.820 --> 00:35:49.740] technical aspects.
[00:35:49.740 --> 00:35:58.060] But also nobody's equally strong in analysis, visualization, storytelling, translating insights
[00:35:58.060 --> 00:36:05.180] into different ways of presenting them to literally different people who will react
[00:36:05.180 --> 00:36:10.780] to different metaphors or people who hate metaphors or whatever it is, right?
[00:36:10.780 --> 00:36:13.520] Ultimately, you end up presenting to somebody.
[00:36:13.520 --> 00:36:19.460] You need to present to whoever these people are with what you know that will get them
[00:36:19.460 --> 00:36:25.380] to actually break through and understand what they're looking at, right?
[00:36:25.380 --> 00:36:27.980] And that or quite frankly, just business domains.
[00:36:27.980 --> 00:36:31.980] I mean, you asked me to do an analysis for supply chain optimization.
[00:36:31.980 --> 00:36:34.420] I have no idea what the key metrics are.
[00:36:34.420 --> 00:36:35.420] None.
[00:36:35.420 --> 00:36:38.180] But with a good LLM, I'll do something decent.
[00:36:38.180 --> 00:36:41.380] I'll have a notion of what we're talking about.
[00:36:41.380 --> 00:36:44.580] And you know, I'm an analyst, so like I'll be able to make it happen.
[00:36:44.580 --> 00:36:49.540] And so there's also an extension of the skill set to the right towards more of the business
[00:36:49.540 --> 00:36:50.540] side.
[00:36:50.540 --> 00:36:55.900] And I think that full stack extension is really something that AI analysts should embrace
[00:36:55.900 --> 00:37:00.540] to move away from that commoditization that you talked about.
[00:37:00.540 --> 00:37:05.940] And I think, well, you know, think about it: the LLM is not a person.
[00:37:05.940 --> 00:37:06.940] They don't have lives.
[00:37:06.940 --> 00:37:11.760] They're basically like somebody who, you know, a blank slate who went into a library and
[00:37:11.760 --> 00:37:15.140] read every single book, like literally every book in the library.
[00:37:15.140 --> 00:37:16.140] That's who they are.
[00:37:16.140 --> 00:37:17.940] But they don't have to negotiate with people.
[00:37:17.940 --> 00:37:23.100] They don't need to go around, you know, drive to the store and all this stuff that we do
[00:37:23.100 --> 00:37:27.740] as human beings that we take for granted and how to deal with other people, difficult people,
[00:37:27.740 --> 00:37:29.820] you know, not difficult people.
[00:37:29.820 --> 00:37:33.860] And so that whole element of being human, which is so important for marketing, it's
[00:37:33.860 --> 00:37:37.300] basically the message that we're transferring to our consumer.
[00:37:37.300 --> 00:37:43.380] We're connecting with them, and that is something that an LLM has to mimic based on what it's read.
[00:37:43.380 --> 00:37:49.180] But it's never going to be as convincing as if we do it or if at least we guide it, we
[00:37:49.180 --> 00:37:53.500] as human beings, because we will always be better at being human.
[00:37:53.500 --> 00:37:59.620] And so I think that this idea of common sense is underestimated, because as humans,
[00:37:59.620 --> 00:38:01.160] we just take it for granted.
[00:38:01.160 --> 00:38:05.380] But there's things that like, for instance, the first version of GPT, it wasn't good at
[00:38:05.380 --> 00:38:06.380] adding.
[00:38:06.380 --> 00:38:09.060] Like if you put in what's three plus three and pay, well, why didn't it know that it
[00:38:09.060 --> 00:38:12.340] knows, you know, the history of the French Revolution?
[00:38:12.340 --> 00:38:14.980] And the reason was because that's sort of common sense.
[00:38:14.980 --> 00:38:17.740] And I guess it didn't show up in enough books.
[00:38:17.740 --> 00:38:19.660] So yes.
[00:38:19.660 --> 00:38:22.380] So there is still hope for us humans, right?
[00:38:22.380 --> 00:38:25.700] Agentforce, you know, just came out in June, right?
[00:38:25.700 --> 00:38:27.820] I think it was this June.
[00:38:27.820 --> 00:38:29.220] Yeah, June.
[00:38:29.220 --> 00:38:32.580] So when you're thinking, I mean, you're a writer, right?
[00:38:32.580 --> 00:38:35.940] So I imagine you're going to continue writing.
[00:38:35.940 --> 00:38:38.660] Are you already thinking about your next book?
[00:38:38.660 --> 00:38:42.300] Is there a sequel to the agent revolution?
[00:38:42.300 --> 00:38:44.740] Are you going for a sci-fi novel?
[00:38:44.740 --> 00:38:45.740] What's that?
[00:38:45.740 --> 00:38:50.140] I don't think I would inflict my fiction on anybody.
[00:38:50.140 --> 00:38:51.980] Not sure about that.
[00:38:51.980 --> 00:38:54.460] Yeah, no, I always, I mean, I have a bunch of ideas.
[00:38:54.460 --> 00:38:58.220] I had the Paleo Ad Tech podcast, so I wanted to do a book based on that.
[00:38:58.220 --> 00:38:59.540] It's like the history of ad tech.
[00:38:59.540 --> 00:39:00.820] That would be great.
[00:39:00.820 --> 00:39:04.020] I think that would be really interesting to a small group of people.
[00:39:04.020 --> 00:39:09.500] So it'd be like a niche title, probably, because most people don't really care how ads are
[00:39:09.500 --> 00:39:11.500] served, you know.
[00:39:11.500 --> 00:39:12.500] Really?
[00:39:12.500 --> 00:39:14.140] They should.
[00:39:14.140 --> 00:39:19.260] But I got very interested when I was thinking about the future of work and looking at all
[00:39:19.260 --> 00:39:23.380] those very thoughtful pieces on where work is going.
[00:39:23.380 --> 00:39:26.020] And I got interested in trying to predict the future.
[00:39:26.020 --> 00:39:30.660] So I think I'm working on something that's sort of more of a like a futurist point of
[00:39:30.660 --> 00:39:33.020] view on where we're going.
[00:39:33.020 --> 00:39:34.020] And it's very interesting.
[00:39:34.020 --> 00:39:35.260] I mean, you have to do scenarios.
[00:39:35.260 --> 00:39:38.460] I don't think anyone can exactly know where we're going.
[00:39:38.460 --> 00:39:39.980] But there's a lot there.
[00:39:39.980 --> 00:39:41.220] And AI will change things.
[00:39:41.220 --> 00:39:44.860] I mean, the world of tomorrow will not be the same as the world of today.
[00:39:44.860 --> 00:39:46.660] But then it never is.
[00:39:46.660 --> 00:39:47.660] It never is.
[00:39:47.660 --> 00:39:48.660] No.
[00:39:48.660 --> 00:39:51.500] And yeah, I'm looking forward to reading that.
[00:39:51.500 --> 00:39:57.660] So you're at work on the book already, or you're like sort of in the phase where you're
[00:39:57.660 --> 00:39:58.660] thinking about it?
[00:39:58.660 --> 00:39:59.660] No, I do.
[00:39:59.660 --> 00:40:03.460] Well, I always do a lot of research first.
[00:40:03.460 --> 00:40:05.060] The writing part is actually easy.
[00:40:05.060 --> 00:40:09.940] Like the Agentforce book, I wrote it in about a month, but the research part took many months.
[00:40:09.940 --> 00:40:13.220] And the research part is basically assembling the facts and putting them in the right order.
[00:40:13.220 --> 00:40:16.740] So you could see how if I have that in front of me, doing the writing part would be much
[00:40:16.740 --> 00:40:17.740] easier.
[00:40:17.740 --> 00:40:18.740] Yes.
[00:40:18.740 --> 00:40:19.740] So I'm in the fact gathering.
[00:40:19.740 --> 00:40:22.160] Oh, I understand.
[00:40:22.160 --> 00:40:23.860] When I produce content, I do the same thing.
[00:40:23.860 --> 00:40:26.100] I have to have my structure in front of me.
[00:40:26.100 --> 00:40:29.140] And then wrapping everything around that structure, that's easy.
[00:40:29.140 --> 00:40:32.940] And the information is like the actual data points.
[00:40:32.940 --> 00:40:35.700] So where can people find Agentforce?
[00:40:35.700 --> 00:40:39.260] Oh, well, the best place to go to Amazon, amazon.com.
[00:40:39.260 --> 00:40:42.540] And I mean, it's available on any online bookseller.
[00:40:42.540 --> 00:40:45.180] And it's called Agentforce.
[00:40:45.180 --> 00:40:46.180] That's the name.
[00:40:46.180 --> 00:40:48.060] And then the author is me.
[00:40:48.060 --> 00:40:49.700] So it's easy to find.
[00:40:49.700 --> 00:40:55.620] And I also have a website: martinkihn.com or martykihn.com, either one.
[00:40:55.620 --> 00:40:58.380] Oh, you got both URLs?
[00:40:58.380 --> 00:40:59.380] Good.
[00:40:59.380 --> 00:41:01.580] Yeah, years ago I did.
[00:41:01.580 --> 00:41:06.500] There's another Martin Kihn out there in South America and we became friends because we would
[00:41:06.500 --> 00:41:08.860] sometimes get each other's email.
[00:41:08.860 --> 00:41:09.860] That's another story.
[00:41:09.860 --> 00:41:12.940] Luckily he was a hipster and his Instagram feed was so impressive.
[00:41:12.940 --> 00:41:17.380] He really improved my brand.
[00:41:17.380 --> 00:41:23.080] Well, Marty, it was really great.
[00:41:23.080 --> 00:41:27.660] Thank you very much for being Knowledge Distillation's first guest.
[00:41:27.660 --> 00:41:29.780] Yeah, thank you for inviting me.
[00:41:29.780 --> 00:41:30.780] I'm honored.
[00:41:30.780 --> 00:41:32.220] We'll talk to you soon.
[00:41:32.220 --> 00:41:33.460] Bye, Marty.
[00:41:33.460 --> 00:41:34.460] Bye.
[00:41:34.460 --> 00:41:37.860] That's it for our first episode of Knowledge Distillation.
[00:41:37.860 --> 00:41:44.840] If you're building your own AI analyst workflow, check out Ask-Y.ai where we're practicing
[00:41:44.840 --> 00:41:51.180] what Marty and I just preached about context engineering and intelligent analytics automation.
[00:41:51.180 --> 00:41:52.180] Thanks for listening.
[00:41:52.180 --> 00:41:53.900] And remember, bots won't win.
[00:41:53.900 --> 00:41:59.900] AI analysts will.
[00:41:59.900 --> 00:42:03.640] Thanks to Tom Fuller for the editing magic on this episode.
[00:42:03.640 --> 00:42:09.340] If you want to work with Tom, head to Ask-Y.ai and check out the show notes for his contact
[00:42:09.340 --> 00:42:09.780] info.