DATA SECRETS Podcast

The Missing Skill in Analytics: Data Product Management with Nadiem von Heydebrand (Ep 009)

Allegro Analytics Episode 9

In this episode of the DATA SECRETS Podcast, we sit down with Nadiem von Heydebrand, CEO and co-founder of Mindfuel, to uncover the real secrets behind creating business impact with data and analytics.

Nadiem shares why data teams struggle to demonstrate their value, drawing on stories from consulting and startup life... including why 1,900 out of 2,000 dashboards at one client were simply unnecessary. We discuss:

  • The difference between “output” and true business “outcome” in analytics
  • How data product management transforms tech output into customer-centric results
  • Why most organizations don’t generate (or properly use) their own data early on
  • The hidden dangers in data processes, like the paint shop model ruined by overlooked temperature data
  • How to build and structure a data team for maximum business value, whether centralized or distributed
  • What makes customer satisfaction the most universal KPI, and why click rates are losing relevance
  • How AI and data governance are reshaping enterprise priorities

From startup pivots to insurance risk modeling to KPI horror stories, Nadiem’s insights reveal how leaders must link every data asset to a real business impact... or risk getting lost in a forest of unused dashboards.

If you want to know how data teams, product managers, and business leaders can finally crack the value code, this conversation is a must-listen.

Connect with Nadiem:
🔗 LinkedIn: https://www.linkedin.com/in/nadiemvh/
🌐 Mindfuel: https://www.mindfuel.ai/

Follow the DATA SECRETS Podcast

📬 Get episode recaps & bonus insights at allegroanalytics.com/podcast
🕺🏻 Connect with Nathan on LinkedIn
📺 Watch every episode on our YouTube Channel

Nadiem von Heydebrand [00:00:00]:
From a support function perspective, we are just as good as the business is. We cannot measure or justify our value through the great number of dashboards we've produced. I had a client the other day, he said: we have 2,000 dashboards, of which 1,900 are unnecessary. Of which 1,900 we can switch off.

Nathan Settembrini [00:00:26]:
Your data has secrets, secrets that could change everything if you only knew where to look. Welcome to the Data Secrets Podcast, where we uncover how business leaders crack the code, find truth in their data, and turn insight into action. Today's guest is Nadiem von Heydebrand, a seasoned data practitioner, founder of Mindfuel, and creator of Delight, a software tool that helps companies manage the impact of data and AI. Nadiem, welcome to the pod.

Nadiem von Heydebrand [00:01:06]:
Nathan, thank you very much for having me. It's great to be here. I'm really looking forward to our conversation. I already enjoyed our pre-talks, and I think we share the same mindset. I'm really looking forward to discussing this with you, especially as the topic of value comes up day in, day out in my life these days. So I'm really looking forward to the exchange.

Nathan Settembrini [00:01:35]:
Yes, yeah, we align a lot on the impact of data efforts. Why in the world would you even go to all this effort of building analytics and BI and data warehouses and all this stuff, right, unless it's driving the business forward? So could you give our listeners just a brief intro to yourself and your company?

Nadiem von Heydebrand [00:01:59]:
Yeah, happy to do so. So hi everyone, my name is Nadiem, I'm the CEO and co-founder at Mindfuel. I'm actually located here in Munich, Germany, so for those of you listening from the other side of the pond, in Europe. Over a decade ago I studied computer science, and I focused my entire career, my studies, on data and analytics. I started hands-on as a data scientist, climbed the career ladder in a consultancy, became a team lead in data science and machine learning, and after a couple of years shifted from hands-on work in the direction of data strategy. So my focus was then on building up teams instead of leading the team itself. Classical data strategy, or data and AI strategy, focuses on what we would call operating models: building the organizational part of it, the processes, the roles and responsibilities, the architecture. I was doing this for major enterprises, and at a certain moment in time I was faced with a question by one of my customers, who asked me: Nadiem, I gave you budget year over year to facilitate data analytics in our organization. When can I get my money back? I mean, the question was more than fair. It was a huge effort, a huge project, a big team. We're talking more than 200 people who had been hired to build up what you would call a Center of Excellence, or back in the day they called it a Data Lab or Data Unit. So this question was more than fair from his perspective: okay, what is the value in all of these efforts, or how much return do we contribute to the business? Because as a data analytics department, we used to be a support function. We'll come to this discussion maybe later on as well. I see a big problem in the world: data teams are always questioned, and at the same time very puzzled and challenged to justify their value contribution. And with this problem in mind, we founded Mindfuel. My co-founder and I both come from the tech world.
He had been a great product manager in his prior career. So the best idea we could find was bringing data and product management into one discipline. We call it data product management. We launched Mindfuel always with a business perspective on that topic, not necessarily coming from the architecture side of the house, but asking: how can data analytics teams demonstrate their business impact? Fast-forward five years, and we have a platform for data and AI impact management, which is really the one management platform helping data analytics leaders and teams demonstrate exactly this business contribution. We put all our energy and all our research into fixing the problem: helping teams manage two worlds. On the one hand, what does the business truly need, what are the business demands? And on the other, which assets have we built: dashboards, machine learning models, data marts, BI environments like data warehouses. We can link these two worlds and demonstrate how much business value every single asset contributes to the business.
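The "two worlds" linking Nadiem describes can be sketched in a few lines of Python. This is a hypothetical toy model, not Delight's actual implementation; all demand names, asset names, and value figures are invented for illustration.

```python
# Hypothetical sketch: business demands on one side, built data assets
# (dashboards, models, data marts) on the other, with an estimated value
# attached to each link. All names and numbers are invented.
from collections import defaultdict

# (business demand, data asset, estimated annual value contribution in EUR)
links = [
    ("reduce churn",    "churn dashboard", 120_000),
    ("reduce churn",    "churn ML model",  250_000),
    ("forecast demand", "sales data mart",  80_000),
]

all_assets = {"churn dashboard", "churn ML model", "sales data mart",
              "legacy KPI report"}

# Aggregate the value contribution per asset.
value_per_asset = defaultdict(int)
for demand, asset, value in links:
    value_per_asset[asset] += value

# Assets with no link to any business demand have no demonstrable value --
# candidates for the "1,900 dashboards we can switch off" pile.
unlinked = all_assets - set(value_per_asset)

print(dict(value_per_asset))
print(unlinked)  # {'legacy KPI report'}
```

The point of the structure is the direction of the question: value is attributed per asset only through its links to business demands, so any asset without a link is immediately visible.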

Nathan Settembrini [00:05:38]:
That's amazing.

Nadiem von Heydebrand [00:05:39]:
Yeah.

Nathan Settembrini [00:05:39]:
And for the listeners, I got a demo of Delight, and it was delightful. I saw the value immediately. It's a very intuitive but also extremely powerful tool.

Nadiem von Heydebrand [00:05:56]:
Thank you.

Nathan Settembrini [00:05:57]:
So the way that you described it to me was, you know, product managers and software developers have tools, right, that they use to manage their stuff, but data teams really don't.

Nadiem von Heydebrand [00:06:12]:
Right, exactly. I mean, you know, Nathan, nothing looks easier than a realized idea. This is the favorite quote of my co-founder. And I think what we have developed is something completely new, because we're building a new category for this topic of data and AI impact management. As you said, tech people or development teams typically have their tech environment, their tools and products and systems where they work, whether task management systems, data management platforms, data catalogs, or whatever. But we provide a management software on top of that, which behaves more like, you could imagine, a portfolio management software, keeping track and overview of what the teams are actually working on. So our product is for product managers, for portfolio managers, for business people. We onboard a lot of business people who contribute their demands and use cases to the data team. I think the best comparison, if you want: we are the management system for data and analytics teams, just like HR teams have their HR tools, or sales teams have their sales tools, to keep track and overview of what they are actually doing. Imagine sales still worked with Excel spreadsheets, or HR still worked with Excel spreadsheets. Data analytics seems to work with Excel spreadsheets when it comes to their master data management. And this is where we come into play. We are replacing these scattered tools: Excel, PowerPoint, Miro, Confluence, Google Docs, Google Sheets. Everything you need to keep track and overview of what you're actually working on, we bring it all together in one platform to manage the value of it.

Nathan Settembrini [00:08:10]:
Yes. Okay, so I'm actually curious. Going back in your story to when you were doing consulting, can you tell us a story about how this light bulb went off, where you said: hey, I must stop consulting, and I must go build this software tool because the world needs it?

Nadiem von Heydebrand [00:08:30]:
I think the light bulb for me was the moment when I realized what the difference between output and outcome actually is. My co-founder, who used to be a product manager, always told me: Nadiem, you guys in data, you always think in output. Especially you as a consultant. You have a project to deliver on. This project has a start date and an end date. You finalize the scope, you deliver on the scope, and then you're off. But this is the output thing. You're delivering output. As a product manager, my job is to constantly generate outcome for the recipient of my service, which is my customer. So I was thinking back to my stakeholders in my projects, and to the specific moment when that group CIO challenged me on how much value all of this had contributed. How much money? When do I get my money back? He was actually asking for the outcome we generate for the business, and not for how many use cases we had developed, which is output. And in that moment, when Max, my co-founder, said to me, Nadiem, data teams have to think in outcome, not in output, I realized data teams have no clue about product management in general. Not only about the discipline of product management, but also about best practices and frameworks. And I mean, there are so many great thought leaders in product management out there in the world, especially in the U.S., by the way. So what we did is we brought this mindset into the world of data. And when I figured out that this is actually the key to the lock, to unlock the topic of value, I realized: okay, this is a business idea. I didn't know how the solution should look. We are also a startup; we really started only with the problem statement. We had no clue at all what the solution could look like. And here comes another quote into play which I truly love: you have to fall in love with the problem, not with the solution.
Because the solution pivots a lot over the years. But I think what we've been successful with is that we always kept to the problem statement we want to solve.

Nathan Settembrini [00:11:00]:
A similar quote is: marry the mission, date the model. So you have this mission of outcome over output, and the model that you use to help people get there could change, and absolutely will change, over time. Cool. So I imagine some people are not super familiar with the concept of applying product management to analytics. Could you just briefly unpack that for us?

Nadiem von Heydebrand [00:11:31]:
Yeah, I'll try to keep it very simple. Product management is a discipline with a lot of cool techniques and tools where you learn how to build great products. A dashboard, for me, is a product. A data warehouse is a product. So we as data people actually build products, but we do this with a very technically oriented mindset. The goal of data product management, from my perspective, and I'm open to discussing this in a bigger round, is bringing these product mindset principles into the world of data, making data products more customer-centric, and making sure that we deliver the outcome for the consumer. In our case that would be the user of the dashboard. I think this is the overall idea of what you would call data product management. We could deep-dive now into the disciplines and the techniques and all that stuff, but the overall focus is doing the right things versus doing things right, which is another great principle in product management. A product manager focuses on doing the right things, while an analyst, an engineer, a developer focuses on doing things right. They make sure that we build the solution right, the right technical component, while a product manager always makes sure that we're doing the right things for the customer, for the consumer, for the recipient of the service. The one last example I really like in product management: if I build you a presentation with 200 slides in PowerPoint, it could be that you only like two of the slides. So the output is 200, the outcome is two, because only the two slides were valuable for you. And I could have also come with just the two slides; then the outcome would have been 100%. This is actually the idea of product management.

Nathan Settembrini [00:13:42]:
Awesome. Love it. All right, so let's get into the data secrets part of our show here, and let's start with your own data secrets. As a business owner, your company throws off data. So what's one data-driven insight that has reshaped how Mindfuel operates or grows?

Nadiem von Heydebrand [00:14:12]:
I mean, this is a super good question. And when I think about it now, the challenge of a startup sometimes is that you don't have enough data, or at the beginning you have no data at all. So the first job is to generate data and to generate insights. And we at Mindfuel, and maybe this belongs with the bad stories later on, really tried from the very beginning to build up a data warehouse, to build an analytics environment, because we always thought: we are a data company, we're building solutions for data people. How shameful would it be if we didn't have our own data lake and do analytics on top of it? But if you don't produce data, you don't need a data lake, right? So for us the question was truly: what are the right data-generating processes we actually need, and how do we capture data to do analytics, or take decisions, on top of it? For me, the most insightful data-driven decisions I take today are everything around our funnel, from the top of the funnel to the very bottom. How do automated sales methodologies work, how do the different motions we're trying in sales and marketing perform, what are our conversion rates, and how do they scale? I think having a clear understanding of these metrics is super critical for a startup like us, because investors will ask: where do you have to invest next? Which is the right channel to invest in to scale the entire system? And for this you need proper data. We were lacking this data in the very beginning, for a very long time, and it took us a while to build this entire environment.

Nathan Settembrini [00:16:18]:
Yeah, I mean, it makes sense, because when you're small, you're running the whole company on a couple of key systems. You probably have a CRM, some system to bill people out of, something to track your finances. Not much.

Nadiem von Heydebrand [00:16:34]:
Not a whole lot. You run your company on assumptions. Assumptions and hypotheses and convictions, these are the currencies you run your company on in the very first years. Because even if you generate data, it's not reliable, because it's a small number of observations, right? If two customers say, oh, I really like that product, how significant is a sample of two? So you need time to generate data, and you need time to generate good data. Now we are coming into an age where we have enough data to learn from, but it took us a while, to be honest.

Nathan Settembrini [00:17:23]:
Yeah, yeah. For example, in the CRM, if you have a small number of deals and you're trying to understand how long it's taking you, from beginning to end, to close a deal. Well, yes, the numbers are small, but also: do you have the right processes in place, where deals are actually entered when they're created, consistently? Because the measurements are...

Nadiem von Heydebrand [00:17:52]:
And you're changing your processes constantly. I mean, we changed our sales process, I think, 15 times. And each of these 15 times, I started at zero again when it comes to data generation, because the data the former process generated is not comparable anymore to the data we're generating now. You pivot so often. And I think it's not only startups, but also major enterprises; I've seen this in my career. The data-generating process is a very tricky piece. And this is where the feedback loop is super important, how you also optimize the processes. In today's AI world, this is one of the most crucial topics in general, not only in the startup world, but also for major enterprises.

Nathan Settembrini [00:18:43]:
Yeah, for sure. Was there a time where the data told you one thing and your intuition told you something else and you had to choose to go with one or the other?

Nadiem von Heydebrand [00:18:54]:
Yeah, I mean, in my job, to be honest, I have this situation constantly. I would consider myself a data-driven person. I truly believe in data, and I think I can also assess data when I see it. If somebody presents me a report with, I don't know, a deviation or variance or whatever, or I see the number of observations and I know it's only 12 observations, I know how to qualify and assess the result and take a decision on it. But especially with everything around our go-to-market numbers, even if the data tells me that we should go left, I sometimes turn right. Don't get me wrong, but sometimes it is experience, and it's intuition about what I truly think data leaders want and how data leaders think, and maybe the data we've generated is not as solid yet as it should be. So yeah, it has happened.

Nathan Settembrini [00:20:21]:
That's why we call it data-informed decision making: you were informed by the data, but you sensed that it was either collected wrongly, or that the right decision was potentially the opposite.

Nadiem von Heydebrand [00:20:38]:
And I think this also really comes down to the domain. I would most probably always follow the data when it comes to our finance department. I think our finance lead does a really great job, and I don't think I have to turn right when he recommends left, because he always has very good arguments when it comes to financial data. So it also really comes down to which domain we're talking about and which type of data we're discussing.

Nathan Settembrini [00:21:09]:
Yeah, that makes sense. All right, let's take a turn into the data horror story. This is maybe a story from your career, either before Mindfuel or more recent, where there was a painful data mistake: a bad metric, a wrong call, a hidden problem, everything crashing down. What's your data horror story?

Nadiem von Heydebrand [00:21:36]:
I'm not 100% sure I would call it a horror story in the sense that it finally crashed the airplane or something like that, that the outcome of the decision was that bad. But of course I've seen crazy stories in my career, and one of my favorites was at a manufacturing company, where a prediction model was supposed to assess the quality of a painting machine. You know, when you have to paint the parts in the production line. These machines are very sensitive. They need a very clear instruction of how thick or thin the paint should be in the machine so that the process works very smoothly. And if the parameters change, you try to avoid failures, rejects of the part, so that parts are not getting painted the wrong way, or too thin, or too thick, because of the density of the paint. We had a prediction model up and running to predict the right moment in time to maintain that machine, to make sure an operator goes there, checks the machine, and confirms the process is still working. We had a very good accuracy of the model. Then we started the production line processes, and for some reason we had a lot of failing parts. While we were producing, the number of parts that were failing was really high, and we didn't know why, because we couldn't find any significant signals in the data. A couple of weeks later, we realized that it was summertime, and the manufacturing facility was in southeastern Europe. Purely due to the fact that there was no air conditioning in the facility, the entire temperature of the building had changed. And nobody was measuring this data point. So we had influences from outside of our model, which we could not measure, which we could not see.
And this was the reason why the entire model failed at the end of the day: the temperature in the facility was just way too high for the paint process.

Nathan Settembrini [00:24:30]:
Were you able to add that as a variable in the model?

Nadiem von Heydebrand [00:24:34]:
Yeah. The moment we found out what the problem was, we captured the temperature as a parameter, put it into a database, and of course used it. As simple as that; the solution was quite simple. Identifying the problem was quite tough, because I had also never been on site. You're getting the data from somewhere in the middle of nowhere, and you just see the data, but you've never been to the facility. Maybe there's a good recommendation in there: wherever you have a data project, always observe the data-generating process with your own eyes. If you're in the manufacturing industry, go through the production line and follow every single step of how the product is produced. Ever since, for whatever project I've done in manufacturing, I've always asked for a guided tour to better understand the process.
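The paint-shop lesson can be sketched as a toy example. This is purely illustrative: all names, thresholds, and temperatures are invented, and the real system was a trained prediction model, not hand-written rules. The point is only that a predictor blind to an unmeasured driver (ambient temperature) keeps saying "OK" once that driver shifts.

```python
# Toy illustration: a quality predictor that ignores an unmeasured driver
# (ambient temperature) versus a revised one that includes it as a feature.
# Names, thresholds, and temperatures are invented for illustration.

def predict_part_ok(paint_viscosity, ambient_temp_c=None):
    """Predict whether a painted part will pass quality control."""
    viscosity_ok = 80 <= paint_viscosity <= 120
    if ambient_temp_c is None:
        # Original model: temperature was never captured, so it can't react.
        return viscosity_ok
    # Revised model: the newly captured temperature parameter is included.
    return viscosity_ok and ambient_temp_c <= 30

# Winter, climate within range: both versions agree the part is fine.
print(predict_part_ok(100))                     # True
print(predict_part_ok(100, ambient_temp_c=18))  # True

# Summer in a facility without air conditioning: the temperature-blind
# model still predicts "fine", while the revised model flags the failure.
print(predict_part_ok(100))                     # True
print(predict_part_ok(100, ambient_temp_c=38))  # False
```

The fix itself is trivial once the variable exists in the data; the hard part, as the story shows, is discovering that the variable was missing at all.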

Nathan Settembrini [00:25:37]:
Yeah, I mean, yeah, the context matters in a big way.

Nadiem von Heydebrand [00:25:42]:
Absolutely.

Nathan Settembrini [00:25:43]:
All right, let's go back into a couple more little data secrets. Think about stories where you helped a client, maybe back in the early days, and the data that was surfaced helped them generate a big impact.

Nadiem von Heydebrand [00:26:00]:
There have been a lot of great cases, I would say, where the team and I have really been able to demonstrate the business impact of data analytics. One great example was in a reinsurance environment. One of our customers, also here at Mindfuel, works on climate risks, so they try to predict the impact of climate on overall industry sectors and to give indications especially for physical assets like buildings or infrastructure. So imagine you are an insurer and you want to decide what the insurance policy for a house should be. They typically try to model the climate risk to get a better understanding of what the impact of climate will be on the risk of house damage due to climate catastrophes.

Nathan Settembrini [00:27:23]:
Yes, and floods and fires and that kind of thing.

Nadiem von Heydebrand [00:27:26]:
Floods, fires, and all this stuff. So this became a normal KPI in the insurance industry these days. And I think we've been able to design not only the impact story around that; we used our entire Mindfuel methodology of identifying opportunities for that risk management platform. What are the specific use cases we want to drive, which dashboards, which data assets do we have to build to unlock this value, and, for every single opportunity we collected to build that product, how much business impact can we generate for the reinsurance, but also the insurance company? So our entire methodology and framework helped the company a lot to identify the right use cases and to prioritize. And the product itself, which they are selling in the market as of today, became super successful. They broke even on the investment and are creating a lot of additional revenue today with this risk platform.

Nathan Settembrini [00:28:38]:
Yeah, yeah. So you've helped a lot of big companies build out or design a center of excellence for data analytics, or a data squad, or a data pod, or whatever you called it. I'm curious: how do you think about designing a team such that it optimally generates impact and insight for a company in general?

Nadiem von Heydebrand [00:29:13]:
Every data team, regardless of the size, whether we're talking about three people, 30 people, or 300 people, should come in with a mindset of value. Because in the classical organizational setup, data teams are typically support functions. There is a core business, and the data team is a support function. The core functions come to the data team and say: I need a dashboard, I need a model, I need an agent, I need help, can you help me with my data? From a support function perspective, we are just as good as the business is. We cannot measure or justify our value through the great number of dashboards we've produced. I had a client the other day, he said: we have 2,000 dashboards, of which 1,900 are unnecessary. Of which 1,900 we can switch off. So one mindset is clear: we are just as good as the business is. We define our success through the result of the business. Now, back to your question: how should you set that up? Depending on the size, if it's a small team, maybe you have a product owner, or a designer, or a business analyst. These people need a good understanding of what the business actually needs and wants, and how to drive value for them. They are our customers, our clients. If your team grows to a certain size, let's assume you have 15 people in your data organization, you grow from 5 to 15, then all of a sudden it makes sense to centralize this capability of what I would call product management. Some others call it customer success; I've heard analytics translator as a role; whatever you want to call it. A person who builds the bridge between the data team and the business team. Imagine a customer success manager, someone who's really into the business problems, understanding the business needs and trying to drive value for them. Because again, the value of the data analytics department comes through the value contribution to the business.
In most organizations, data teams are rather small, and then this is part of the role of the data leader. He goes to the sales lead, he goes to the marketing lead, and they talk on a leader level: how can I help you? The bigger the team grows, the more it makes sense to establish data product management as a discipline and as a dedicated role. I know a lot of companies that have four or five data product managers in a data analytics department, and I even know organizations that have dedicated data value managers, establishing value management as a discipline, where someone says: my job is to make sure everything we're building for the business is generating value, and I'm monitoring and keeping track and overview of the business cases we are driving for that. They then work closely together with finance or controlling. So I think, regardless of the size of your organization, if the mindset is there that you are just as good as the business succeeds, then you're doing a lot of things right.

Nathan Settembrini [00:32:31]:
From my point of view, I am curious. I kind of led the witness a little bit with that question, suggesting that we're going to establish a centralized analytics team. I've also seen companies that have a hybrid of a central team and distributed analytics professionals throughout the business. And sometimes there's no central hub; it's all just distributed. Why would an organization choose one or the other? Have you seen that one is great and the other is terrible? How do you think about that?

Nadiem von Heydebrand [00:33:09]:
To say I've seen it all is maybe too much, but I've seen all of these different types of operating models. What I can say is: a decade ago, everyone talked about centralized teams. It was very classic to have a centralized team. Then, over the turn of the decade, a lot of organizations decentralized, because they said the centralized team creates a lot of bottlenecks, they don't understand us, we need people in the business. So everyone decentralized. The problem with decentralization is that you duplicate a lot of effort, you duplicate a lot of investment. Imagine you fully decentralized and every department or every function built up their own data lake. That makes absolutely no sense. So what you try to do is centralize core capabilities, and I've seen different nuances of how much you centralize and how much you decentralize. To be honest, from my point of view, we are on track to come back to a centralized model. There's still the hub-and-spoke model idea, and I think everyone has established something similar to that. But concentrating efforts centrally becomes more and more popular, because you want to keep track and overview of your investments and budgets these days. You want to make sure that you have clear governance and an overview of where you put your money. So you'd rather have one centralized governance function or department taking care of the most important processes and methodologies, and then the teams can build decentrally, but the steering works centrally. For us at Mindfuel, this is perfect, because we are facilitating exactly that model with our platform. We help teams build up their use cases decentrally and keep their portfolios decentrally.
But we also provide a view for central teams, aggregating all these portfolios to see what everyone is doing and to add this value perspective on top. So great question, and great point.

Nathan Settembrini [00:35:26]:
I appreciate your insight. Yeah, I've seen a bunch of different methods as well. It is interesting to think about the history: 10 years ago, what did the tooling and the infrastructure look like, right? You required tons of expertise. Fifteen, twenty years ago, lots of hardware, and hardware expertise, right? A big investment to stand up servers and that sort of thing, so centralization makes a heck of a lot of sense. And then, as tools have gotten easier to use, with Tableau and Power BI and Snowflake and things like that, now the business users, the edges of the business, can start to do some of their own analytics, versus submitting a ticket to IT to create this report for me.

Nadiem von Heydebrand [00:36:20]:
I mean, we're talking about data secrets here, right? One secret ingredient is definitely speed. The faster an organization can adopt data and analytics, the more money they make or the more costs they can save. So self-service is a core principle of speed. At the same time, I can share from my own perspective: keeping track and keeping an overview of how much money you generate through data and analytics becomes very tricky in a decentralized operating model, unless you have a very good governance structure and communication so that everyone informs everyone about what they do. And that typically doesn't happen. We observe customers, also in our own customer portfolio today, who have built the same data product three or four times in different domains. This is exactly what you want to avoid: building the same asset three or four times in different domains because the domains don't talk to each other. So communication, transparency, and overview are key if you want to facilitate these decentralized efforts.

Nathan Settembrini [00:37:33]:
Yeah, for sure. All right, let's pivot to our KPI Corner, our Metric Mystery Theater. This is our lightning round where we talk about KPIs and metrics that you've seen, adopted, or experienced with clients. So let's start with: what's one KPI that you think every company should track? A fundamental KPI. It doesn't have to be...

Nadiem von Heydebrand [00:38:01]:
I mean, now I have to talk from a product management perspective, and when my co-founder hears this, he will definitely nail me down on it. But I think it's customer satisfaction. Whatever metric you want to use to measure customer satisfaction specifically, whether it's your NPS or your customer satisfaction score or whatever you want to take for that. Because for every company in the world, if we stay on a very galactic, global...

Nathan Settembrini [00:38:31]:
Perspective, there is a customer and they must be happy.

Nadiem von Heydebrand [00:38:35]:
They only survive if the customer is happy.

Nathan Settembrini [00:38:38]:
That's right. Yeah, I like that.

Nadiem von Heydebrand [00:38:41]:
This would be the one.
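[Editor's note: for readers curious about the arithmetic behind NPS, which Nadiem names as one way to score customer satisfaction, here is a minimal sketch. The survey responses below are hypothetical, not from the episode.]

```python
def nps(scores):
    """Net Promoter Score from 0-10 survey responses:
    percent promoters (9-10) minus percent detractors (0-6)."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# Hypothetical survey: 5 promoters, 3 passives, 2 detractors
responses = [10, 9, 9, 10, 9, 8, 7, 8, 5, 3]
print(nps(responses))  # → 30
```

NPS ranges from -100 (all detractors) to +100 (all promoters); passives (7-8) count in the denominator but in neither group.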

Nathan Settembrini [00:38:42]:
Yeah, that's a good one. What about a vanity metric? Something that, you know, feels good but is kind of meaningless in the end.

Nadiem von Heydebrand [00:38:54]:
I think here my marketing team will start to hate me. But the vanity metric I'm really questioning these days is any kind of click rate or click-through rate in our digital environment, in the marketing space. Purely due to the fact that there is so much AI out there, so many bots, so many tools that are destroying these metrics these days.

Nathan Settembrini [00:39:30]:
Yeah.

Nadiem von Heydebrand [00:39:31]:
If I see an ad campaign and I see its click-through rate, I cannot trust it. And if this metric is super high and we say, oh, we have a super great click-through rate or opening rate or whatever...

Nathan Settembrini [00:39:48]:
Yeah, poof.

Nadiem von Heydebrand [00:39:49]:
I'm skeptical about that these days. Not because our marketing team isn't doing a great job... let me phrase it that way here for the recording. No, it's more that the world out there becomes really complex when you want to measure these types of KPIs. So I would not double down on these vanity metrics.

Nathan Settembrini [00:40:11]:
Yeah, I mean, as you said that, I was thinking, wow, that's really powerful. Because if you had an agent installed in your inbox that was reading every email and deciding, does Nadiem need to see this, and if so, let me go ahead and draft a response... it's taking all these marketing emails and reading them, and I imagine that would count as an open, and then deciding, oh, this is a marketing email, and putting it in some folder that you never actually look at. Right, exactly.

Nadiem von Heydebrand [00:40:40]:
So welcome to my world.

Nathan Settembrini [00:40:44]:
What about... I love this one. The weirdest KPI or the funniest KPI that you've come across?

Nadiem von Heydebrand [00:40:52]:
Yeah, funny. I really like that one. I was at a startup meetup the other day with some other founders, and one came up with the NPS, but this was the Net Pitch Survival: how many pitches do you survive past the first question from your customer's CFO before you get crushed. So I really like that one, the Net Pitch Survival, and they're counting. Yeah, I think it's a good one.

Nathan Settembrini [00:41:30]:
That's hilarious. What's your North Star metric at Mindfuel?

Nadiem von Heydebrand [00:41:35]:
I mean, as a SaaS company, of course I have to say ARR. This is the metric. It's a lagging metric, to be honest, so it's never 100% reliable at the moment you look at it, because it lags so much. But of course it's the metric I think every SaaS company looks at. Another metric I'm really curious about at the moment is cost per demo booked: how much investment does it take to book a demo with a customer? This is one I'm looking at together with my executive assistants, where we really browse through the channels and check what investment we need to get new demos in. So this is a key KPI I'm looking at at the moment. It's much more leading and sits in the middle of the funnel, where it helps me a lot to steer upstream or even downstream in the funnel.

Nathan Settembrini [00:42:39]:
Yeah, yeah, yeah. So it's basically like the top half of the customer acquisition cost without factoring in what happens after the demo.

Nadiem von Heydebrand [00:42:48]:
Exactly, yeah.
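[Editor's note: cost per demo booked is just channel spend divided by demos booked from that channel. A minimal sketch with hypothetical spend and demo counts, not Mindfuel's actual numbers:]

```python
def cost_per_demo(spend_by_channel, demos_by_channel):
    """Marketing spend divided by demos booked, per channel and blended."""
    per_channel = {
        ch: spend_by_channel[ch] / demos_by_channel[ch]
        for ch in spend_by_channel
        if demos_by_channel.get(ch)  # skip channels with zero demos
    }
    total_spend = sum(spend_by_channel.values())
    total_demos = sum(demos_by_channel.values())
    return per_channel, total_spend / total_demos

# Hypothetical monthly numbers
spend = {"linkedin_ads": 6000, "events": 9000, "outbound": 3000}
demos = {"linkedin_ads": 12, "events": 10, "outbound": 8}
per_channel, blended = cost_per_demo(spend, demos)
print(per_channel)       # {'linkedin_ads': 500.0, 'events': 900.0, 'outbound': 375.0}
print(round(blended))    # 600
```

Tracking it per channel is what makes the metric "leading", as Nadiem describes: you can shift budget toward cheaper channels before the lagging ARR number moves.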

Nathan Settembrini [00:42:49]:
That's good. Awesome. Well, I'm really excited about these last couple of questions as we land the plane, because I just really appreciate your thought leadership in this space. How do you see AI changing the way that you, your customers, or businesses around the world are approaching data?

Nadiem von Heydebrand [00:43:13]:
The whole AI topic, the whole AI bucket, is still not fully uncovered yet. It's still not unlocked, and I'm really curious to see where this will all lead. What I can say is that AI is in the pocket of every senior executive these days, because everyone has ChatGPT on their phone, and this is a completely different game compared to, I don't know, self-service BI. It was definitely not the case that every senior executive had Tableau on their phone. Now that it is so tangible and so close to our day-to-day life and how we experience AI, companies have realized that they have to approach their data in a completely new way, because otherwise AI will not work for them. I think there's no enterprise out there these days that is not working on an AI strategy at the moment, and this implicitly also influences their data strategy and the way they approach data, from the data-generating process through transformation, storage, and cleansing. Everything is influenced by the way we think about AI. We could discuss that topic for another hour, but what I would really focus on is that data quality and data governance are in a reviving age now. In the last 15 years, nobody wanted to talk about data governance, nobody was interested in data quality, and data stewards were frustrated in their role because nobody was actually listening to them. I think data governance has become a crucial piece nowadays to drive the whole topic around data and to support AI in the long run. So to every data governor or data steward out there: it's your moment now.

Nathan Settembrini [00:45:23]:
Nice. Any final pieces of wisdom for leaders who are trying to unlock business value from their data?

Nadiem von Heydebrand [00:45:32]:
I share this constantly, also in my talks: data has no value. Data per se has no value. The value comes with the use case. The business use case holds the business case, and then we discuss which data we use to unlock that business case or use case. So data itself has no value per se. What you have to do is link the business problem and the business's use case with the data asset you're developing. A dashboard alone has no value. The dashboard unlocks value the moment it is used in the business and drives a business process. Then we have value from data. Other than that, we're just doing data for the sake of data. So we always have to link these two worlds. And this is the core idea behind why we started Mindfuel: we bridge these two worlds, all of your use cases and all of your data. If we link them, we can show the value. As simple as that.

Nathan Settembrini [00:46:46]:
Amen. That's our thesis as well. We're not building software, but we're taking that same approach to building dashboards and data processes and all of that. Man, we're so aligned in that ethos. That's amazing. I appreciate that. All right, so let's land the plane. If people want to learn more about you, Mindfuel, Delight, how should they reach out? How can they learn more?

Nadiem von Heydebrand [00:47:14]:
So number one, ping me on LinkedIn. As simple as that. Find me, Nadiem von Heydebrand, on LinkedIn; there's only one. Connect with me, find us on our website, www.mindfuel.ai, and of course find me at events. I'm traveling all over the world visiting events and speaking at a lot of them. Whenever you see me, just come over and talk to me. Last but not least, you can find further information about Mindfuel at our booth whenever we have an exhibition stand at these events as well.

Nathan Settembrini [00:47:51]:
Awesome. Well, thank you, Nadiem. It's been a real pleasure. Thanks to our listeners and watchers. We publish the video of these episodes on YouTube and Spotify, and we'd love it if you would like, comment, and subscribe, especially commenting on YouTube. If you have questions or thoughts, Nadiem and I will both be in there reading every comment and responding. So yeah, we'd love to connect with you. Thanks again, Nadiem, and to all of our podcast listeners: keep searching. The answer is there. You just have to dig it up. That's the pod.

Nadiem von Heydebrand [00:48:37]:
Thank you very much, Nathan, for having me.