Ep. 96 | Designing Data for Business Decisions Part 2
This week Brian O’Neill, founder of Designing for Analytics, joins Allison Hartsoe in the Accelerator. Designing for analytics means thinking through the myriad human behaviors that support a successful outcome. From planning to process to production, designing for analytics is all about the right way to support decision making.
Please help us spread the word about building your business’ customer equity through effective customer analytics. Rate and review the podcast on Apple Podcasts, Stitcher, Google Play, Alexa’s TuneIn, iHeartRadio or Spotify. And do tell us what you think by writing Allison at info@ambitiondata.com or ambitiondata.com. Thanks for listening! Tell a friend!
Allison Hartsoe: 00:01 This is the Customer Equity Accelerator. If you are a marketing executive who wants to deliver bottom-line impact by identifying and connecting with revenue-generating customers, then this is the show for you. I’m your host Allison Hartsoe, CEO of Ambition Data. Each week I bring you the leaders behind the customer-centric revolution who share their expert advice. Are you ready to accelerate? Then let’s go. Welcome everybody. Today I have the second half of my discussion with Brian O’Neill from Designing for Analytics. In our last episode, we were talking about how you design the analytics process for better decision making and better outcomes. Let’s get back to the discussion. Okay, so back to the question about the process. Is there a certain prototyping process people should be walking through in order to make this kind of iteration between outcomes and outputs?
Brian O’Neill: 01:04 Sure. It depends on the situation, of course, but one of the things that I like to champion is working in low fidelity and working fast. Is it this, or is it that? Did you mean this, or did you mean that? What would you do with this? It’s working in a fidelity that allows you to get feedback as quickly as possible, and you don’t always need data to do that. You asked earlier, is this the role of the data scientist? Well, I don’t know, but they’re going to be on the hook to build the real thing, so I would think you’d want to be involved with what is eventually going to become your homework, and you’d want to figure out, the first time around, what someone’s actually going to be really excited about using. So to understand these workflows upfront, you do need to do some pre-work before you get into prototyping.
Brian O’Neill: 01:48 And by prototyping, I’m usually talking about something at the level of whiteboards and markers and people working together in a room. I’m not talking about a meeting; I’m talking about what I call a design jam. This is a place to get visual, to sketch workflows, to understand things — you’ve got a parking lot, and there are random ideas coming out. It’s not a clean and orderly process. It’s a creative process to tease out needs and possible solutions without investing a huge amount of time and money into any one thing. Before you can get to that, though, you need to understand who matters, right? Who has a stake in this project? Who is a stakeholder, and who is a user? Those are not necessarily the same things, but you need to go through some process to develop some empathy and figure out what people’s pains and problems look like.
Brian O’Neill: 02:31 How are they going to be open or resistant to using this process? Are we going to require training and even if you don’t solve all those problems, you can. That team that’s doing this work to figure out the solution can start to escalate back to the business stakeholders that, yeah, we’ve got this model or we figured out, we have a Tableau dashboard that we’re coming up with. We know that such and such group is not going to touch this with a 10 foot pole unless it’s in a mobile app because the sales team is on the road all the time on their phone, they are not going to bring their laptop just so they can open up a 10 page PDF report and that’s all that we have in Tableau. So just know like we need some, we need a mobile web designer to come in and help us make this tool or whatever work on a phone or it needs to be integrated with HubSpot or I don’t know what it is, but the point was you learned through this process, it’s like, wow, this one team just lives in their phone all day long, and you’re like, everyone’s talking about Tableau dashboards or whatever the heck it is, and you just see this thing.
Brian O’Neill: 03:29 Just get that information escalated to the right party, even if it’s not, quote, your responsibility. That could be the make-it-or-break-it moment, right? Otherwise it’s the right information at the wrong time in the wrong form.
Allison Hartsoe: 03:41 Oh, that’s great, but you know, this process also strikes me as somewhat myopic or insular. If I channel Steve Jobs for a minute, he used to quote Henry Ford all the time: if I had asked people what they wanted, they would have told me they wanted a faster horse. So how do you keep this process aligned with what people can’t quite see, while still meeting that requirement for the right information in the right format?
Brian O’Neill: 04:06 Well, I would say two things. One, I’m going to shoot myself if I actually said that — I don’t think I said what people want. I don’t care what people want. As designers — and I say this when I do training — you’re not there to give people what they asked for. You’re there to figure out what is needed, and there are probably latent needs that are not stated on the surface, and we have to dig for those. Part of that comes through training people to ask good open-ended questions and probing questions, to dig into the dark areas and the vagueness and the silence, and to figure out what is really driving this person to do their job this way or that way. How are they going to use this data or not? You can’t know for certain until you provide some type of solution or prototype, but I would say there’s a huge opportunity here that’s largely missed, because people are providing what their stakeholders ask them for, and they assume that that ask is informed.
Brian O’Neill: 04:56 It’s like going to the doctor, my arm hurts. Please provide me with the cast, and that can you imagine if the doctor came out and just said, great, let me get my cast materials. Here you go, slap it on. Here’s your bill. Have a great day. Of course not right. They’re going to do it diagnostic. They’re going to figure out what you need and the recipe maybe you need to go home, stretch your legs out, rest for a little bit, take a couple, eat some apples and call me in two weeks. I’m not giving you a cast. You don’t need a cast, you know?
Allison Hartsoe: 05:22 Exactly. Okay, I got it.
Brian O’Neill: 05:24 But you can figure out what people are going to react to with this prototyping process, right? So I don’t particularly like the Steve Jobs, Apple methodology — the genius-design methodology and all of that. I don’t like the black-turtleneck design, the kind of magic that happens in the corner. I’m really big on this: if you’re making any choices about what’s going into your software, your solution, then you’re effectively doing design work. So you can either do it intentionally, with more information about how to do it well, or you can just say, I’m not a designer, I don’t have any creative skills, whatever — I’m just going to do whatever. Fine.
Brian O’Neill: 05:58 But you’re basically a designer, whether you like it or not. So let’s make you a good one. Like, let’s learn how to do it right and getting visual solutions into people’s hands early can help you figure out, Oh, it’s this. Oh, when I thought we were going to get this thing from my side, I just assumed it was going to be integrated into HubSpot. So I have to login to Tableau and then go to reports this quarter one, change it to pivot table, line graph, and then I query this thing, and then I get the list of sales prospects. Is that what you’re saying? Yeah. That’s how you’re supposed to do. Okay. You know I don’t have a laptop on when I’m on the road. Right? Like I just go to HubSpot, and it tells me what to do like, right? And it’s like you can immediately see there, there’s a disconnect like we’re not even talking about, and this is like, this isn’t a database problem, right?
Brian O’Neill: 06:43 This is a completely different problem. It’s not about the final visualization, which itself may be fine. It’s about all the stuff that it’s the context is completely wrong, right? The salesperson doesn’t want to be in this other tool. They want to stay in their HubSpot or their CRM, you know, and so you could tease that out very early.
Allison Hartsoe: 06:59 It also strikes me that there might be a bit of a human psychology challenge when you’re having these conversations. Do the visual solutions that you put in front of people help them come forward a little bit, because they feel like they’re part of the process as well?
Brian O’Neill: 07:17 Oh, absolutely, that’s part of it. This is why I also tell people — they say it’s hard to get other people’s time, and sometimes it is hard to get customer feedback — but a lot of times what happens is, when you start telling this person that we depend on your information to make this thing work, they realize, wow, my voice matters, and you’re building this thing kind of for them. A lot of times they become more than happy to participate in the creation process, because they feel like they’re getting a custom solution that’s made for them. Now, I’m painting a rosy picture; it doesn’t always happen. Sometimes it’s, I don’t have time for that, just give us something. And I hate to say it, but some places have to go through a complete failure before they realize: I told you we needed to have this department involved, or else this project was not going to go right, and now you can see what we’re talking about.
Brian O’Neill: 08:04 They’re not using it. There’s no business value. So next project they need to be early on in the prototyping phase with us and the research fate, we need them involved. It’s not a lot of time. I find that’s usually not the case. I find it’s more the case when especially cross-departmental teams when they understand what an impact they can have, like how important their feedback is and this gets it back to like how you talk to people and you data scientists really opening up and bringing things down and saying things like, okay, what does an accountant do? I don’t know what an, I have a data science degree. We’re here to help you. I don’t know what the answer, like figure out which accounts payable things to go after for lack of payment or whatever. I don’t know what it is, but just getting to that point where you can open up and say, what is it?
Brian O’Neill: 08:45 What do you do all day when it comes to accounts payable. Can you take me through your process about how you figure out who to call and chase payment down cause they haven’t paid you on time? Like what does that look like? Really opening up yourself to that person. Realize like, Whoa, okay, this person really has no idea what my job is like, and I need to help him understand it. Now this, the accountant realizes you’d completely don’t know my world, but you’re interested in it, and you clearly want to get this right. And over time you kind of meet in the middle, they’re gonna learn something about data. The data person is going to learn something about the accounting field and that domain. And hopefully, we arrived through participation together, not the throw over the walls methodology. We learned together what it’s like through empathy to be an accountant and that’s partly what the role of this, the data teams is in my opinion, are the data, the analytics translator or whoever your title is, someone better be doing this work, or you’re going to hit that 10 to 15% success rate, which that’s not a good batting average in any game except soccer.
Allison Hartsoe: 09:41 Do you think in general, like most data science folks are not high in the people skill area and is this more of a unicorn type role where somebody can balance those two sides where they can be empathetic and really understanding or digging into what somebody does without putting them on the defensive versus somebody who’s like just firing bullet questions like an engineer sometimes does. Is this a unique role, or can anyone with a data science background be coached to understand this and do this?
Brian O’Neill: 10:15 Personally, I think anybody can learn how to do this, and I’m not just talking about data scientists. This is also for the analytics field, the traditional analytics field as well. This happens in a lot of different roles. I don’t think this type of approach is being used all the time. And the other thing I would say about this is there are different levels of engagement, right? These with this in terms of how do you use this design process to get better outcomes. But I would say this, you don’t have to be good at it. It just needs to get started with it. And there are some foundations and some principles to get going with it, but you don’t. Let’s talk about, when we talk about prototyping, we talk about doing usability testing or customer evaluation, right? We need to go figure out if the solutions worked.
Brian O’Neill: 10:55 You can get a Ph.D. and human factors and learn how to run very scientific usability studies that are completely controlled and all of this. The reality is, and most business settings, you don’t need to run a perfectly curated study. You just need to start getting feedback, and over time you’ll learn how to not ask bias questions and lead the witness and some of these things. But it’s not rocket science, but you can learn a ton just getting, having a script like we need. Let’s get five questions about what we want to learn from this design to see if it’s going to stick or not. You know, and here are the questions, and here’s the pass-fail criteria. If we’ll all agree as a team they passed if this happens, they failed if this happened, now let’s go figure it out. You don’t need a Ph.D. to do that kind of work.
Brian O’Neill: 11:38 And I’m about encouraging the behavior cause I think any of this if you get any of these different phases of the design process going, you’re probably gonna start seeing some return on it. And it just goes up with the more that you decided to integrate it. I mean obviously you can bring in your ex, if you have a design team, a data, if there’s data scientists or analytics people here consider basking the design group, the UX group. If they can come like bring a person into your next project. You don’t even have to really know. You can just say, I heard that on this show that like we would have a better outcome if we had a designer and I don’t really know, but do you have someone we could use? And it may start with that cause you don’t really know. But I can guarantee you that if you get the right person in there, they’re not going to be saying, what do you want me to do?
Brian O’Neill: 12:19 They’re going to naturally, they’re like glue going into cracks, right? They just kind of start seeping into all these holes and start gluing things together. And that’s part of what the design thing is as well. So you can do that. You can go to external, whatever it is. So training, bind your internal team, go find an external vendor to help you out with it. But to me, it’s how fast do you want to move? How much do you value it? How much are you worried about risk, right? If it’s a low-risk project, then yeah, let your, maybe your data people can do it. And if it’s a high-risk project, well maybe you want some expertise to make sure you don’t screw it up, you know?
Allison Hartsoe: 12:48 So let’s bring that into what I do first, second, third. If I really liked this idea, can we structure it into step one, step two, step three?
Brian O’Neill: 12:59 Sure. At the highest level, I would say there’s kind of only three phases to this. And one important thing to state here is that this process is not entirely linear. So even though it’s one, two, three, it could be one, two, three, two, three, two, three, one, one, one, three, two, three, two, three, one. You know what I’m saying?
Allison Hartsoe: 13:16 Spoken like a true analyst.
Brian O’Neill: 13:20 So at the highest level, there’s what used to call requirements gathering, but I would say it’s getting to problem clarity. And this largely comes through the process of doing research with customers and understanding who the stakeholders are, both business stakeholders and customers. Because there can be multiple tiers, right? Of people involved through that process of doing some contextual research with these people. And I mean like one on one or two on one conversations with the right people. You come back together, you look at what were the common problems that we heard, and we get a clear statement of what the problems and needs are that we need to solve for, which may not be what the original business owner asked for. There could be some gaps here. And so you need to reconcile that.
Allison Hartsoe: 14:03 That point about one on one or two on one. Is that purposeful so that you’re not developing group think by having ten people in the room?
Brian O’Neill: 14:10 Exactly. And by two on one, I usually mean a note-taker who’s silent and a facilitator who is asking the questions. And one of the core things here is kind of this 80 20 rule of talking 20% of the time and listening for the other 80% of the time. And this is the other thing, even if you’re like, I’m maybe introverted, or I’m not really a people person. Well, guess what your job is just to ask the question and then shut up. Like that’s mostly what it is. And just asking probing questions and asking open-ended questions, which means questions that don’t begin with do, which give you a yes or no response. Typically we’re looking to dig in a little bit more. So this is something that we talk about in my training and like here, example questions or I have an article, we can put it up if you want. It’s questions to get going on data projects, and they’re kind of just a general list of questions about data stuff. But, and then from there you can get.
Allison Hartsoe: 14:58 I saw that on your site. It was really good. We will link to that in the show notes.
Brian O’Neill: 15:01 Yeah. And there’s a whole process there. Yeah, so I kind of put this empathy problem clarity and the last stage being ideation, which is generating as many ideas as possible at the lowest, not creating solutions. Just it’s literally as a team coming together and throwing together. How might we solve this? I kind of put all those into one phase. You could call them three phases, empathy, problem definition, and this ideation phase as three. But I kind of think of it as one. And so if that’s one, then the second phase would be your prototyping. And this is where most places are spending most of their time, right? It’s the making of stuff, making the sausage. So this is anywhere from sketching to visual comps or prototypes. So we’re talking about something like screenshots of what Tableau might look like, but not writing the code, not necessarily building out any data pipelines or anything like that.
Brian O’Neill: 15:51 It’s just working in a fidelity that’s enough to get some useful input back from the customer. And then there’s the testing and evaluation phase, right? Does this solution actually help with the problems that we came up early on? Remember that thing like we have to align that back with the solution and then test against that to figure out is there a likelihood that someone’s going to use this? So those are kind of the brief steps or five if you want to call it. But again, you don’t necessarily go one pass through on the project and check off each phase and it’s done. It’s much more like prototype test. Realize that you completely don’t understand what the actual need was. Go all the way back to the beginning. Oh my gosh, we need to be talking to this team. I had no, I didn’t even know this team existed, let alone they’re a linchpin in this whole process.
Brian O’Neill: 16:34 Like if they don’t slide the dial to seven and push this button, then the next team who we are building the solution for can’t do anything. Oops. And so now you’re like, okay, there’s another team involved, and this is where you can start to bring these surfaces to your stakeholder. If you’re not, the leader on the project simply says, why is this behind? Because no one ever said that sales operations or whatever, and marketing had to provide this ingredient in the cake like they’re not even mentioned anywhere. But we found out that if they don’t get involved, then the next team can’t do anything. That’s why this project is delayed. Do you want us to put something out without their feedback? Because we have a good feeling from our research that this is not gonna happen. They’re just not gonna use it because they don’t have the green light from marketing or whatever. I’m just riffing here, but you get the point, you don’t know this stuff. If you just work in an isolation, build a thing and then serve it up at the end and pray that they’re going to use it. This is when it lands with like, that’s nice, but the model doesn’t get into production
Allison Hartsoe: 17:29 or what’s worse is they try to use it, and it creates a lot of frustration, and you don’t end up creating a strong relationship where analytics is serving the business. It becomes almost like the IT department, where you’re always just trying to get something you need but not getting it.
Brian O’Neill: 17:44 Right. One other last thing on this is I would just say if you got one thing out of this process is there’s a thing called journey mapping, which maybe your audience is familiar with, but it’s a really great tool for, there are two kinds of it. There’s journey mapping, which is not simply modeling out the customer experience, which again, this may be an internal customer, but kind of like from the earliest phase all the way to the end. It’s just visually mapping this out like horizontally how someone works through their job, what kind of information they need at different points, et cetera. There’s also more of an aspirational mapping, which is here’s what it could be, right? This is the future version of that thing, and from there, you can say, here’s where the data science kicks in. We’ve been able to skip these four steps here because now we have a model that will predict this or whatever.
Brian O’Neill: 18:26 The point here isn’t run out and do journey mapping. The one thing I wanted to convey here was thinking horizontally and kind of temporarily about the fact that even though you may build a thing which is a model or it’s a Tableau thing or a software application, that output, it’s going to be experienced over time and there’s this temporal process, right, of someone doing something, and then maybe they go to someone else and then they take the ball. It’s like a relay race. You need to be aware that there’s a race track to begin with and that there might be multiple people on it and if you just spend some time figuring out where on the race track is this going to fall through the cracks, find those problems as well, and they may not be data problems, they’re probably people problems, but just even beginning of definition of the problem and the need will help you kind of get out of your box of being just a maker of things and solutions and see the bigger picture about, Hey, there’s five problems here. One is the data science and modeling problem. The other four here are going to be adoption problems, right? Belief, trust, like all these other kinds of things. So just think horizontally. That’s just timeline. Think about this journey and the fact that it’s not just a once and done open the spreadsheet. There are the numbers, end of story my work has done on to the next thing. It’s probably not that clean.
Allison Hartsoe: 19:38 Angels sing, and everybody says, wow, here it is.
Brian O’Neill: 19:43 Exactly. My mind completely blown. Unfortunately, it’s doesn’t usually go like that. It’s a spreadsheet with one column. You know.
Allison Hartsoe: 19:52 I can totally respect the iterative process in this, and yet I can also see that there can be really tough issues to get through in that iterative process. Somebody needs a tool, and the stakeholder doesn’t want to provide the tool, or somebody needs a new person to help support the process. And is there a way that you get people to look at the bigger picture to say, if you don’t have this, you won’t get X and really level their focus back to the outcome in order to get through those problems?
Brian O’Neill: 20:22 Well, so this gets back to the definition of a successful project. And so this probably is part of this empathy problem definition phase. Sometimes there’s, it’s almost like being in a sales process, right? With an internal person. But if no one can collectively define what a successful outcome is and how that will be measured, good luck landing with success. Right. And so one thing I would suggest here is sometimes we’ll use what’s called like benchmark use cases and projects. So this is a, early on and you may need to tweak these again cause it’s not a linear, but you come up with some set of you, and I’m assuming your audience knows what use cases are, but you have some kind of like list of benchmark use cases, and you get agreement from the stakeholders that would you agree that the way we would evaluate this project, not just did you hit the date or not, right?
Brian O’Neill: 21:09 It’s also can someone get through this list of tasks at an 80% success rate? Would you say that’s the definition of success and get that collective understanding because now what you’ve committed to is we have to build something, and we have to evaluate it against this, or else we cannot objectively say this project was successful or not. We simply don’t know. And that’s a way to kind of get that alignment out of it. And it’s a really good way to just force the whole team to realize like, Hey, this use case involves three departments and like yeah it does. So if they’re not at the meeting, we have a really, we’re back at the 10 to 15% good luck prayer like version of software, you know, building products, you know. So that would be this kind of benchmarking would be one way to do that and mushy try to get out of the mushy.
Brian O’Neill: 21:52 If it elevates the customer experience, that’s word salad mush, you know I’m sorry but like anybody can bend that into saying yep we hit that Mark way too mushy there. Tried to get some, there should maybe be some numbers in here. We did this by this much over this time we reduced X by Y. Not all design things can be quantified like that, and it may be hard to even measure them. And I would also say if you can’t come up with a number because no one’s ever done it before, what a low number in as a steak, just get agreement. Let’s just say it’s 20 cause no one knows it doesn’t matter. We have no basis to compare it to. Let’s just put a number in, and then we’ll do the first test. We’ll see how far off we are, and then we’ll make a judgment about what the goal should be. Don’t get lost in coming up with the perfect number, and it’s like, Oh my God if it’s not 65 or higher, I’m screwed. No, like you’re missing the point. The point is that you even care about measuring it and that you want to make an improvement on it, but you need a baseline to get going, and the exact number probably doesn’t matter because no one knows what it should be anyway, so don’t spend a lot of time on it.
Allison Hartsoe: 22:53 That makes perfect sense. Brian, this is really fantastic. Now, you’ve mentioned a couple of times training. If people want to reach out to you, how can they get in touch?
Brian O’Neill: 23:02 Oh, sure. My email is just brian@designingforanalytics.com b r i a n, in terms of the training, I do have an online seminar coming up, so if you just go to designing for analytics.com/seminar you can learn more about that. There’s an early opt-in there to save half off is it’s the premier edition and early 2020 of that seminar so you can get on the early access lists for when registration opens.
Allison Hartsoe: 23:23 When does the early opt-in end?
Brian O’Neill: 23:25 It will probably end at the end of 2019.
Allison Hartsoe: 23:28 Oh good. Okay, so there our listeners have a little time, right. Well, Brian, this has been just really revealing a whole area that we don’t often think about in our depth of data science and model and everything else we talk about related to the customer. I really appreciate you spending this time with us this morning and enlightening everyone to the human side of decision making and really landing those projects so that they are useful. I think that’s what everybody wants at the end of the day, so as always, links to everything we discussed are at ambition data.com/podcast. Brian, thank you so much for joining us.
Brian O’Neill: 24:02 Yeah, it’s been great to come on. Thanks for having me.
Allison Hartsoe: 24:05 Remember, when you use your data effectively, you can build customer equity. It’s not magic, just a very specific journey that you can follow to get results.
Allison Hartsoe: 24:16 Thank you for joining today’s show. This is your host, Allison Hartsoe, and I have two gifts for you. First, I’ve written a guide for the customer-centric CMO, which contains some of the best ideas from this podcast, and you can receive it right now. Simply text ambitiondata — one word — to 31996, and after you get that white paper, you’ll have the option for the second gift, which is to receive The Signal. Once a month, I put together a list of three to five things I’ve seen that represent customer equity signal, not noise — and believe me, there’s a lot of noise out there. Things I include could be smart tools I’ve run across, articles I’ve shared, cool statistics, or people and companies I think are making amazing progress as they build customer equity. I hope you enjoy the CMO guide and The Signal. See you next week on the Customer Equity Accelerator.