Ep. 95 | Designing Data for Business Decisions

This week Brian O’Neill, founder of Designing for Analytics, joins Allison Hartsoe in the Accelerator. To design for analytics means thinking through the myriad human behaviors that support a successful outcome. From planning to process to production, designing for analytics is all about the right way to support decision making.

Please help us spread the word about building your business’ customer equity through effective customer analytics. Rate and review the podcast on Apple Podcasts, Stitcher, Google Play, TuneIn, iHeartRadio or Spotify. And do tell us what you think by writing Allison at info@ambitiondata.com or ambitiondata.com. Thanks for listening! Tell a friend!

Podcast Links:

Is the design of your ML solution, data product, or analytics application preventing customers from seeing the full value of your technology?

Designing Human-Centered Data Products (Seminar)

Bios, Photos, Social Media

Full Transcript

Allison Hartsoe: 00:01 This is the Customer Equity Accelerator. If you are a marketing executive who wants to deliver bottom-line impact by identifying and connecting with revenue-generating customers, then this is the show for you. I’m your host, Allison Hartsoe, CEO of Ambition Data. Each week I bring you the leaders behind the customer-centric revolution who share their expert advice. Are you ready to accelerate? Then let’s go. Welcome everybody. Today’s show is about the process of designing analytics data for decision making. And to help me discuss this topic is Brian O’Neill. Brian is the founder of Designing for Analytics, and he helps companies apply human-centered design to data science and analytics. Brian, welcome to the show.

Brian O’Neill: 00:52 Greetings! How are you?

Allison Hartsoe: 00:52 Good. Now, this is a fairly unique space, and I think the idea of designing for analytics can go in a lot of different directions. Can you just give us a little bit of your background and what this actually means when you say designing for analytics?

Brian O’Neill: 01:05 Sure. At the core of the problem is data, and everyone’s trying to turn it into insight and information, and there are different elevations of that. So typically, when you think about the words design and analytics, I think most people jump to data visualization as the thing we imagine in our heads. And that is what I would say is maybe the lowest-level elevation at which you can approach it, right? Because it assumes you already have the right information and the right data, and the problem is now just visualizing it, right? If you go up a level from there, you might come into the world that we call UX, right? User experience, which has more to do with understanding the work, the jobs, the tasks, the attitudes, the goals of the people doing the job using this data. And from there, our perspective starts to widen: we start thinking about workflows, what a day in the life of this person is like, what they are sitting down and trying to do with the analytics, and their challenges may or may not be just a database problem.

Brian O’Neill: 02:02 We may find out we don’t have the right information, or it’s not presented at the right time in their job or their workflow, or they have to coordinate with another person, right? Then you go up another level, and you might have a customer experience with the data, but you also have the business or the organization, and in some places that’s the same thing, right? An internal business sponsor has a tech team and a data team who provide solutions to them. That’s a little bit different than, say, a software company that has a data product or analytics, where the product management or the chief product officer or the business is trying to create a business out of their platform, or whatever it may be, but then they also have the customer experience with their platform, so you need to merge these two interests, right? On the one hand, maybe they’re like, we really want people to depend on the platform and be there all the time, but on the other end, the customer is really like, I only want to go in there when I have to.

Brian O’Neill: 02:53 I don’t want to look at your numbers and stuff unless I have to. And then, as you know, you can go up another level, and you start getting into ethics, right? Especially the way that we talk about AI and data science, we’re now talking about how automation and AI are affecting groups that may not even be at the table as a stakeholder, right? So there are these different elevations there, and I’m interested in helping people with all of them. And I would say that base level plus the two elevations up from it, that’s where the sweet spot is for the work I focus on: helping people align their analytics and data products with particular business objectives and making sure they resonate with customers. And the large majority of the time, it’s not just a database problem; it’s usually that they don’t have any idea what someone’s going to do with the analytics.

Brian O’Neill: 03:41 They just kind of theoretically know that this number may be helpful. Maybe if we had the average over last year, that might help us. It’s a throw-it-at-the-wall-and-guess kind of approach. Instead, ask: what if it was this, how would you react to it? Get feedback on early prototyping and understand at what point someone says, whatever, I don’t care, I wouldn’t use that. And then work backwards from the point where they say, oh, that’s amazing, I would totally use this, this is really helpful. It’s figuring that stuff out as early as possible before we do lots of technical work and invest a lot of time building a solution that maybe no one will use or care about.

Allison Hartsoe: 04:16 From what I hear, that’s the whole AI industry right now: inventing tools that no one’s really sure how they’ll be used.

Brian O’Neill: 04:24 Yeah, there’s a really low success rate. One of the top articles on my website, which is maybe sad, is one I update every time another survey comes out about success rates, with what used to be BI projects 10 or 20 years ago, then big data projects, and now AI. It’s basically the stats on success rates with these, and this is mostly talking, I think, about companies using data for internal purposes, internal analytics, insight, that kind of thing. On average it’s 15 to 20% success rates, and it’s continuing now with AI. It’s really low, which is just amazing to me. I’m kind of just amazed that people’s departments are still getting the same amount of funding and that there’s not a higher turnover rate. There are some, I’ve heard some mumblings, no quantitative data about it.

Brian O’Neill: 05:10 Just that the gravy train may be ending. Another consultant I was talking to was saying, you know, if you’re a VP or director of analytics, or in data science, you’ve got about two years to show an impact. And if you don’t show an impact, there’s a chance that team is going to just get wiped and they’re going to bring someone else in. And my guess is that, you know, we’re still in the hype cycle. Maybe we’re on the downward part of the hype cycle with AI. And at some point it’s just going to become like, yeah, we have an accounting department; it’s like oxygen.

Allison Hartsoe: 05:38 Of course we have analytics.

Brian O’Neill: 05:38 You know, you just expect it’s gonna work, and it’s no longer going to be just, go hire some PhDs, pray, you know, and go do some math in the closet, that kind of approach. You do need the people with that technical experience. But, like, I’m getting ready to run a seminar, and the point of my training seminar is that it takes more than just technical knowledge to deliver successful solutions to people, especially if they don’t have a stats or a math background. You need to relate the output of this work back to someone who doesn’t speak stats and doesn’t know that stuff.

Allison Hartsoe: 06:11 Right. So let’s talk more about that, because I actually heard a similar metric from Jen Stirrup when she was talking about AI and uses of AI. When you say the director of analytics has like two years to show impact, what she had said is that people tend to move around within two years. So let’s say they’ve got two years to show impact, and they’re coming up on month number 20, and they’re like, oh, I haven’t really shown any impact. The combined effect is they start looking around, or they get headhunted into the next role, and then everything that they’ve done, even if it’s good work, gets thrown up in the air and somebody else has to catch it, which again creates this process of not actually moving the ball forward, or makes it very difficult to move the ball forward. So whether they’re being kicked out for not showing impact or whether they’re just moving along because the industry is so hot.

Allison Hartsoe: 07:02 Either way, I think we have this discordance in using data for internal purposes to get to success, just as you pointed out. But along those lines, I have to ask if data scientists should really be responsible for the adoption of their work, because it’s so much effort sometimes to get to the right answer. Once they reveal it, I know a lot of data scientists feel this way, like, oh, I’ve got the answer, and then they trot it around, but the organization doesn’t eat it up. They just kind of go, yeah, that sounds good. Or maybe they try to get it out, but it doesn’t stick. So why is there this problem of data scientists dropping answers at people’s feet and people just saying, I don’t know if I want to use that?

Brian O’Neill: 07:48 Well, that’s simple. They dropped a technical answer in front of people, a mathematically correct answer. What they did not provide is value in a solution that people are willing to adopt. I would say this is not uncommon, because what you’re talking about is what I call thinking about outputs versus outcomes, and the more your team is focused on producing outputs and not thinking about whether they create outcomes, that’s a killer right there. The secret is to figure out what the desired outcome is and to align the outputs with that, and you may find out that it’s not enough to just come up with the model. I actually just had someone on my podcast yesterday; we were talking about this. He’s from a design company and runs the data science group there, and they were talking about how the first deliverable was a spreadsheet with a bunch of names in it.

Brian O’Neill: 08:34 I forget what they called it; maybe I thought it was a list of sales prospects to call or something like that. Actually, no, it had to do with training ambassadors: they were trying to model who in the company are considered the people that others go to for help. They’re trusted, like ambassadors in the company for information, and they were trying to do this at scale, using some data science to identify these people for a training initiative, and the first output was just an Excel spreadsheet. The problem was that it didn’t express how the system came up with this list of names. And so the managers had a really hard time swallowing the list. A lot of stuff went into it; it wasn’t just some random thing. There were some heuristics, there was some logic behind it, and it was just too opaque. So that’s a great example of where the math part may have been correct, but you didn’t figure out what someone was going to use, what they would be willing to actually engage with and depend on.

Brian O’Neill: 09:30 And so through the process of prototyping and approaching it as a redesign, they realized that the managers actually needed some explainability to understand how the system came up with this. And secondly, they needed to be able to adjust the names that were appearing, whether that was simply removing certain people using some last-minute human judgment or whatever it may have been. The point here also is that in this situation, there was probably never any requirements document that said we must be able to remove and add people willy-nilly, and we must be able to edit the parameters of whatever. Why?

Allison Hartsoe: 10:03 Do you think they would have said that, though, if they had been asked? They’d be like, oh, I think I want this metric, I think I’d want that metric. People will always give you an answer, but not necessarily what they actually need.

Brian O’Neill: 10:13 Right. So my point there is that there was a latent problem here which wasn’t known until the entire solution was done. And this is partly where design and prototyping can help, because we could have come up with a list of fake names just to start the process of prototyping. And I imagine, had they approached this with a different method where they didn’t wait and build out an entire predictive model to come up with a list of names, they probably could have teased out this requirement that A, there needs to be an explainability feature, and I’m not sure what data science features were in the model, but.

Allison Hartsoe: 10:45 The human side.

Brian O’Neill: 10:46 Yes, exactly. Talking about, like, an XAI package: we’re going to need this, or else no one’s gonna use it. The black box approach isn’t going to work here. And two, there is still a need to adjust names, maybe for some qualitative reasons they want to get rid of someone in this list that wasn’t modeled in the data, whatever it may have been.

Brian O’Neill: 11:05 I’m pretty sure you could have probably gotten to those needs much earlier, such that when the solution landed, it didn’t land with a thump and that was the end of it.

Allison Hartsoe: 11:13 The sound of a dry tome hitting the table. Oh yeah, let’s begin.

Brian O’Neill: 11:21 You know, so that’s my perspective. And the original question here was whose job is it? My question to my clients a lot of times is, well, whose job is it to make sure this stuff is right? And this is a big problem right now: the business thinks it’s the technology and data teams’ job, and the data teams think it’s the business sponsors’ job. And again, now we’re talking about internal analytics and data science, not so much tech companies. But they’re thinking, you know, business sponsor, hand me some reasonable problems to go solve. Don’t just say, I would like an order of machine learning with a side of AI, and let us know when we have that.

Brian O’Neill: 11:54 That’s a terrible thing. So in my work, I like working with technical people and engineers and data people, and I feel like there is a place to meet in the middle here. I do think that, from my studies and research in this area, there’s definitely a need for more data literacy on the part of business sponsors. But the last line of defense for the customer here is the person that’s writing the code and making the stuff. And so my feeling is I would like to help that audience learn how to make better stuff while the business catches up, or while the business sponsors figure out that they need training in this area. I think there’s room for the data scientists and analytics people to learn how to do this better. And actually, the leaders I’m talking to, this is something they’re also hiring for.

Brian O’Neill: 12:34 They expect a senior-level data scientist or analytics person to be asking these questions, that there need to be more conversations happening, that there needs to be more prototyping; they want to move with more agility. They don’t just want the technical skill set; that’s kind of a base requirement. And eventually the market will get more skilled talent in this area, such that you don’t just choose between, well, they have the chops, but yeah, they can’t talk to anyone; in fact, don’t let anyone else interview them because they’ll probably hate them. But I’m making gross generalizations here. I’ve met some really talented and very interesting people with advanced degrees in math and particle physics and all these things.

Allison Hartsoe: 13:10 Now, I hear what you’re saying. There was an interview I heard a while ago with Pamela Peele, who runs the University of Pittsburgh Medical Center’s data systems, and we’re talking like $1 billion of data assets, machines, and people, just a huge system. She said there were two roles that were the most valuable on her team. Role number one was what she called the factory foreman, the person who kept the jobs moving through the system like a factory. The other role that was incredibly valuable is exactly what you’re saying; she called it the data storyteller, but that title is almost trite compared to what it really was. It was the person who was the bridge between business and data and was actively making sure that business needs were being met, although she never used the word prototyping, which I think is a really key concept here. Do you have more structure or a process around how people think about prototyping?

Brian O’Neill: 14:08 Yeah, well, I want to say two other things just on that, if I could. So there’s a new title. You have data storyteller, which to me is a very specific thing; I feel like that’s aligned with outputs that have been created and conveying those outputs, and that’s different to me. Then there’s this new role that I believe came out of McKinsey, and I love the behavior and the need for the role, but I hate the title. It’s called the analytics translator. It’s just terrible, because it sounds like, after you create this behemoth piece of crap, you go out and have the translator translate it to all the rest of the people that actually need to use it. And that suggests that the first thing is definitely not going to be usable, but then we’ll translate it, like Mandarin to English, you know?

Brian O’Neill: 14:49 And you’re talking to a bunch of English speakers, you know. So I hate the title, but the role is really important. And what I feel like I see here, having come from the startup world and technology world, is that they’ve taken the product manager’s role and called it an analytics translator, because they’re maybe not creating a commercial software product, but they’re probably creating some type of software output. It’s really the role of good product designers or good product managers, and occasionally you have this hybrid person that’s doing both product and design. They’re sitting in that intersection of data, technology capability, the business, and the customer, and they’re talking to all those people. It’s kind of hub and spoke; they’re right at the center of that to make sure projects are aligned with value and all this.

Brian O’Neill: 15:32 So you might look it up, if your listeners are interested. I just read an article saying this is apparently the next huge hot job; data science is currently really up there, but this analytics translator is really going to be needed. And I’m kind of not surprised, because we keep focusing on building outputs and not creating outcomes. And so now the business is saying, we really need someone to come in here and make sure that this stuff is usable and that we have outcomes, and it doesn’t land, you know, the book hitting the desk again. So that’s kind of my take on that.

Allison Hartsoe: 16:01 Yeah, maybe we can invite the audience to come up with some new names for that role in the comments, because that one has clearly got to go. That’s the worst title ever.

Brian O’Neill: 16:11 Yeah, I mean, the ship may have sailed on that. Personally, I like data product manager; that’s the term I prefer. But whatever you call it, I care more that the behaviors are happening. Analytics translator sounds like a kind of low-level role that’s not going to have strategic impact, and by definition it’s strategic. Right.

Allison Hartsoe: 16:29 Thanks everyone. This ends the first half of my discussion with Brian O’Neill from Designing for Analytics. Please join us next week when we have the exciting conclusion of this very rich and powerful discussion. Thank you for joining today’s show. This is your host, Allison Hartsoe, and I have two gifts for you. First, I’ve written a guide for the customer-centric CMO, which contains some of the best ideas from this podcast, and you can receive it right now. Simply text AMBITIONDATA, one word, to 31996, and after you get that white paper, you’ll have the option for the second gift, which is to receive the Signal. Once a month, I put together a list of three to five things I’ve seen that represent customer equity signal, not noise, and believe me, there’s a lot of noise out there. Things I include could be smart tools I’ve run across, articles I’ve shared, cool statistics, or people and companies I think are making amazing progress as they build customer equity. I hope you enjoy the CMO guide and the Signal. See you next week on the Customer Equity Accelerator.
