Ep. 100 | The Evolution of Customer Analytics at Electronic Arts

This week it’s our 100th episode! Zachary Anderson, SVP and Chief Analytics Officer at Electronic Arts, joins Allison Hartsoe in the Accelerator. Zack has been on the vanguard of customer analytics innovation, improving both products and customer satisfaction. He and his team continue to grow and focus on what motivates players, what they like in games, and how to make games better. But it’s not easy to bring the whole company along. Zack shares his insights on how to do it.

Please help us spread the word about building your business’ customer equity through effective customer analytics. Rate and review the podcast on Apple Podcasts, Stitcher, Google Play, Alexa’s TuneIn, iHeartRadio, or Spotify. And do tell us what you think by writing Allison at info@ambitiondata.com or ambitiondata.com. Thanks for listening! Tell a friend!

Podcast Links:

CLV Transformation with Zack Anderson


Allison Hartsoe: 00:01 This is the Customer Equity Accelerator. If you are a marketing executive who wants to deliver bottom-line impact by identifying and connecting with revenue-generating customers, then this is the show for you. I’m your host Allison Hartsoe, CEO of Ambition Data. Each week I bring you the leaders behind the customer-centric revolution who share their expert advice. Are you ready to accelerate? Then let’s go. Welcome everyone. In today’s show, we’ll check in with a well-known CLV leader to see how it’s going. Zach Anderson is the SVP and chief analytics officer at Electronic Arts. He’s been featured in Wharton case studies, in Pete Fader and Sarah Toms’s latest book, The Customer Centricity Playbook, and at numerous conferences, including our own Customer Centricity Conference, where he shares his experiences about building up a substantial practice of analytics and insights at EA, which I just find very generous. Zach, I run into you all over the conference circuit, and thank you so much for sharing that and for just being such a great advocate out there. Welcome to the show.

Zach Anderson: 01:14 Thanks. It’s great to talk to you this morning.

Allison Hartsoe: 01:15 So it’s been about a year and a half since you were on the show last. I can’t believe we’ve been going that long. It’s great. And this, of course, is our 100th episode, so I’m very excited about that. Thank you.

Zach Anderson: 01:28 Thank you. I’m honored to be on for the hundredth episode, so that’s great.

Allison Hartsoe: 01:31 Could you start by telling us a little bit about what your team is doing now just to set the context and then we’ll dive in?

Zach Anderson: 01:38 Yeah. The teams continue to grow and evolve. There’s lots of work around what motivates our players to play our games (I’ll say players instead of customers; that’s what we refer to our customers as) and how they think about our brands. We’re looking at behavioral data, trying to figure out what they’re doing in our games, what they like there, and how to make the games better and more engaging. And then I think lately we’ve been shifting more and more to how to utilize the data to directly impact the player experience through recommendations, matchmaking, and friend-finding. Those types of activities have really grown over the last year and a half. I would say that algorithmic improvement of our product itself has really become a bigger and bigger focus of the work we’re doing.

Allison Hartsoe: 02:24 It’s interesting, because that reminds me of the conference where you talked about how long it takes. It took you about seven years to climb what is our customer-centricity curve and to go from that very small slice of data that you were getting out of the Xbox 360 and the PS3, which I’m sure seems like eons ago, and turn it into player-centric data. So is the algorithmic improvement what you would say is the signature of companies who are at this higher level of success? In other words, you can’t do it unless you’ve done the precursors.

Zach Anderson: 02:57 I mean, I think it depends on the business. You certainly see some upstart, fully digital businesses (digital natives, I would say) build that kind of algorithmic optimization of the customer experience into their initial products. I think it’s been a little longer for us. We’ve had examples of that, but with a 10,000-person company and 30 products being developed, it takes a long time to make it systematic. And so for us, the seven years has really been about bringing the whole company along on that journey, which definitely has taken time. We certainly have studios within our portfolio that were able to capitalize much faster, but it depends on product cycles and everybody’s attitude towards analytics and data and CLV. It takes a lot longer to get the mass of a large company moving that way. So I think it really depends on where you start, how many people have to change the way they’re thinking, and how many products and processes have to change to get there. At EA, it’s definitely been seven years, and it’s still a battle. There are still older products in the market that launched, you know, at the beginning of our journey that we’re just now starting to think about retrofitting, because they’re big enough portions of our portfolio.

Allison Hartsoe: 04:08 When you say it’s still a battle, is it that you’re having to convince certain people or certain studios why they might like to look at the data in a different way, or is it not so much about convincing people but about the churn of people who used to be there and are now gone?

Zach Anderson: 04:25 It’s all those things, and sometimes you even have people convinced, but the investment just doesn’t really pay out in terms of bringing an older product up to speed to be able to generate the data that we typically use. There’s a lot of difference within our portfolio; there end up being a lot of different variations. It’s kind of a giant program-management task to figure out where the studios are individually on their journey, where their products are at, and then how my staffing and investment of time and resources matches that. It’s a fascinating operational question just to drive the change, which is maybe something else that’s changed over the last few years: I’m much more self-aware of the scale of change and the speed at which all of our business units can actually make that change. Initially, I would say I thought that if I showed them a great idea, everybody would follow immediately, and there’s just a lot more logistics to driving large organizational change than I had probably anticipated originally. But it’s been a fun journey for me to learn that.

Allison Hartsoe: 05:27 That makes me laugh because I think all analysts come at it from that perspective like, wow, here’s a nugget in the data! This is awesome! And then you kind of take it to the people, and you’re like, yay, look what we found! And then the real process kicks in. Is there a signature for understanding what that post-finding process is going to be like? You mentioned that in certain studios the juice isn’t worth the squeeze; it takes too long to get the telemetry or the other things in place. But are there other signatures that tell you this isn’t going to move even if we bring them this great insight?

Zach Anderson: 06:02 I mean, it depends on the nature of the insight. And there’s a couple of things going on. We’re certainly in a creative world. You could think of us as a CPG or an engineering-bred product company, but the reality is we’re actually a creative company. And so how you come with the ideas, where the company wants to make its bet, and ultimately what the product itself is makes a big difference for us. We have some products that are not live services; they’re just a great narrative game with a story. Actually, we’re launching one in the next few days called Jedi: Fallen Order, which is just a straightforward narrative Star Wars story. It has some telemetry in it, but not nearly as much as, say, our FIFA game, which is a massive, many-hundred-million-dollar live service that is changing the game and the product itself every week, almost every day.

Zach Anderson: 06:56 And so those two things operate very differently. When I go to the Respawn team that built Jedi: Fallen Order, we had several conversations about what the aim is for this product, and what the aim is for the overall business in understanding the consumer journey through this product. And we set much simpler expectations about how much telemetry to build, even simple things like, are we going to build dashboarding for this game or not, or are we just going to try to drive some insights? We focused much more on the consumer-insights and UX-research part of my organization to help them develop the game, rather than on the telemetry and the optimization and the messaging and all of those kinds of things in the game. Sometimes I think about my group like a consulting organization: we have to pick the right products for each of our studios based on their position on the curve, their product setup, what their customers expect, and how much contribution the knowledge of their game will make to the overall portfolio of our products and our understanding of our customers.

Zach Anderson: 07:53 I’d say at one point maybe I was a little more of a zealot: everybody had to have maximum telemetry, and then we’d have all these insights and tools in order to make every game better. I think over time I’ve just become much more realistic and flexible about how much effort we should put into this. Based on the insights we’re going to get, what’s the minimum set of tasks that I can ask the business to do to get the maximum return for each of those businesses? It’s a much more complicated calculation than I ever thought it was.

Allison Hartsoe: 08:24 But I like the way that you’re thinking about it because obviously, from the consulting perspective, that’s always the analysis we’re trying to do. But the way that you’re breaking it out by product makes me wonder how that affects the player calculations. Are there really two different types of players, or are the products grouped in such a way that they’re driving different player bases in different ways? Maybe the company doesn’t really have 30 products; it has two groups of products where the buckets fit together, or something like that.

Zach Anderson: 08:53 I mean, obviously I’ve been a huge advocate over time, and probably the one thing, if I could do anything at EA, would be to help our game developers and our marketers understand the heterogeneity of our customers. There is a massive set of differences in motivations, play styles, and engagement with our products across all of our games. And so none of our products actually has just one or two sets of customers. They usually have 10 or 15 or 20, or 20 million, depending on how you want to cut it. So for Jedi: Fallen Order, we’re still capturing, for the most part, the data about how far people get through the game, how they engage with it, and obviously whether they install it or not. So we are able to calculate the basics of lifetime value for those customer sets. That contribution still exists, and that’s part of our portfolio.

Zach Anderson: 09:40 They need to contribute that, and that helps us understand what the players are doing and how they’re interacting with our products. But we don’t necessarily need to be triggering really detailed messages within that game, which would require a whole different set of telemetry. And so it really just differs by product. In general, you could say live services and non-live services is a good cut; that certainly makes one split. And then how complicated the in-game economy is, how complicated the product is, how many customers are in it, how much heterogeneity there is in that customer base in terms of different styles of play: all of those things contribute to how we think about what the correct set of telemetry for a game is.
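As a rough illustration of the "basics of lifetime value" Zack describes, here is a minimal sketch of a discounted-retention CLV calculation. This is a generic textbook formulation, not EA's actual model, and the margin, retention, and discount figures below are purely hypothetical.

```python
# Minimal, illustrative CLV sketch: expected lifetime value as per-period
# margin, decayed by a geometric retention curve and discounted over time.

def simple_clv(margin_per_period: float,
               retention_rate: float,
               discount_rate: float,
               periods: int) -> float:
    """Sum expected margin per period, weighted by survival and discounting."""
    value = 0.0
    for t in range(periods):
        survival = retention_rate ** t       # chance the player is still active
        discount = (1 + discount_rate) ** t  # time value of money
        value += margin_per_period * survival / discount
    return value

# Hypothetical inputs: $5/month margin, 80% monthly retention,
# 1% monthly discount rate, a 24-month horizon.
print(round(simple_clv(5.0, 0.8, 0.01, 24), 2))
```

In practice a heterogeneous player base would get a separate estimate per segment (or per player), which is exactly why the segmentation discussed above matters.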

Allison Hartsoe: 10:18 And so when you have so many variables coming in, and obviously with CLV we love the concept of heterogeneity, it starts to make a human’s head spin when we get beyond seven clusters or so, or seven, 14, whatever the magic number is. But I always think about seven from my journalism background, where on billboards you would only put about seven words because that’s all anybody could grasp at one moment.

Zach Anderson: 10:40 Yeah, that’s fair. Carlos Ghosn said that you should only ever have four KPIs, and I worked for him at Nissan. So.

Allison Hartsoe: 10:46 So does this mean that you have to be algorithmic in order to really respect that heterogeneity, and if you are algorithmic, how do you get humans who love heuristics to trust that?

Zach Anderson: 10:58 Yeah, there’s a couple of answers to that. In the evolution of our algorithmic delivery of, say, recommendations in a game, we’ve built a lot of player journeys that do take advantage of pretty detailed data. Even if you’re just plotting those out directly, you can map a pretty complicated journey for a very large population, and you can utilize a lot of that data. In fact, I would say the richness of that data enables better development of hypotheses by, say, our product managers or live-service producers to figure out what those player journeys are. One of our most successful products financially, but also one of our largest in terms of customer base, is our FIFA games. And one of the things that’s been interesting is they’re one of our top studios for using recommendations, really trying to get players to the fun, to the part of the game that they want to play, quickly, and driving engagement through that.

Zach Anderson: 11:58 One of the things I’ve seen over time is that they were an early adopter of the technology and the telemetry, and they’re adding new pieces of telemetry, or hooks, to the game based on the player journeys they want to manage. That’s actually become the fastest-growing aspect of their telemetry: what are the triggers they need to find to be able to test the hypotheses around how they could drive better engagement? And it’s really an interesting case if we were to talk about CLV. They’re improving the CLV for their game increment by increment, by improving each of those journeys and making it more seamless for the customer over time. At the same time, they’re also using all of that data to improve the potential offers, or the potential parts of the game, that will engage players, and they’re just getting better and better at it every year.

Zach Anderson: 12:43 It’s really quite amazing. And the telemetry is going along with that: the desire for insights, the sophistication, the player journeys that they’re building, the use of algorithms to optimize those player journeys. All of it’s moving together in this really virtuous cycle, which is quite amazing. And with it, their long-term financials, their player satisfaction, kind of everything, is just getting better and better. It’s a great story for us as a company. It makes me quite proud, and I’m quite amazed at that team and how much they’ve taken advantage of it over time.

Allison Hartsoe: 13:12 What strikes me in that comment is so much about the virtuous cycle, because there’s solving for the big problems, which is the low-hanging fruit. But then after that, it sounds like in this case they got into the new normal, where they really saw it as a continuous cycle of form a hypothesis, run the test, and bring that through into the game. Is that right? They’re just constantly optimizing?

Zach Anderson: 13:36 Yeah, I think they really have. They have this really positive cycle of experimentation, automation, and personalization, with player journeys as the means to achieve that, and that’s just really accelerating the business, accelerating the player engagement and happiness with the product. I would say overall there are actually two cycles, though, that are quite interesting. There’s a testing, automation, and personalization cycle, which is quite quick (it can be on a daily or weekly cadence for us), and then there’s a larger cycle outside of that, which is developing new features, in our case, and doing things in the product to improve the features that already exist. That larger cycle is also being driven by the experiments, by the player behavior in the game, by understanding the attitudes and motivations of those players, and by looking at where people drop off (kind of churn analysis), and then driving the product to be improved on more like a monthly basis for small changes but quarterly and annually for large changes. Both of those cycles are moving together but at different time frames, which is pretty amazing.

Zach Anderson: 14:45 I like that kind of dual cycle and thinking about how you allow for that. It’s interesting; I would say we got earlier adoption out of the short cycle. If I broaden out to not just FIFA but across our portfolio, we’ve had earlier adoption of the experimentation and testing, the short cycle, on our mobile products and the smaller products, but it’s been harder to get the long cycle going there. For the larger products with annual releases, or releases every two years, we’ve done better with the long cycle, but the short cycles are harder to pick up, which is kind of interesting. And then when we get them both going, like on FIFA, it’s really a great cycle. Actually, I just have to push a little bit here and there, but mostly I have to try to keep up with the demand for the analytics, data science, and research services that that group needs; they just want more and more all the time.

Allison Hartsoe: 15:37 I think what’s so amazing about that dual cycle relates to the professor that I think you studied under, the Nobel Prize-winning economist Lloyd Shapley, who was very influential and, in fact, probably created most of the mathematics underneath game theory alongside Nash and some of the other famous economists early on. He talked about the mathematical study of conflict and cooperation. And what I think is interesting is you basically have this conflict and cooperation on multiple levels. You’ve got it within the game as you’re doing analysis, you’ve got the dual cycles, and then you’ve got the human side: people within EA who are taking up the analytics and making demands for more data science. So how do you handle all of that human behavior?

Zach Anderson: 16:27 Yeah. Where to start on that? I tell people quite often that the basic equations I used studying with Dr. Shapley (and another one of my advisers, Jack Hirshleifer, both at UCLA), especially the models with power relationships in them, ended up being logit equations. And a lot of the modeling we still do is based on logit equations, various types of distributions. So at the gameplay level, it’s interesting that we’re doing that, and now, of course, we’re doing it at scale. I mean, much of my work was on small games, like three-person games with different power relationships, and obviously we’re now studying hundreds of millions of players and tons of different interactions. So the computational needs are much larger, but we’ve increased the staff, at least, to be able to keep up with it.
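For readers unfamiliar with the logit form Zack mentions, a minimal sketch of a binary logit choice probability looks like this. The utilities here are hypothetical placeholders, not anything from EA's models.

```python
import math

def logit_prob(utility_a: float, utility_b: float) -> float:
    """Binary logit: P(choose A) = 1 / (1 + exp(-(U_a - U_b)))."""
    return 1.0 / (1.0 + math.exp(-(utility_a - utility_b)))

# With equal utilities, the choice is a coin flip.
print(logit_prob(1.0, 1.0))  # 0.5
```

The same S-shaped form generalizes to multinomial choice and underlies many of the distribution-based behavioral models he alludes to.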

Zach Anderson: 17:18 I think the interesting one that you raised, though, is really the internal one, and I don’t know if I do it explicitly, but I certainly think about building coalitions and how different parts of the organization respond, given their loci of power, and how we motivate people. I analyze, I guess subconsciously at least, what’s the best way to move the organization forward. Game theory, for me personally, was such a good teacher of negotiations and market dynamics, and the microeconomics and pure math underneath it were really fundamental to my becoming a great analyst, or a good analyst at least, when I was younger. So it’s definitely influenced me in lots of different ways. One of the first classes that I took with Dr. Shapley early in my graduate work at UCLA was over in the math department, and he would routinely come in, say a couple of simple words of greeting to us, and then start writing on the board from memory, working through a set of equations he was going to teach us for hours at a time that day.

Zach Anderson: 18:24 There were only three of us in the class, so we sat at a big table and looked up at these boards. And one day he’d come in, done this, been going for about, I don’t know, an hour and a half or something, and he just stopped, turned around, and walked out of the room. And the students, we were all sitting there looking at each other like, what’s going on? So we diligently waited, and a half-hour later he walks back and goes to the board. He’s got a paper in his hand and says, you guys aren’t paying attention. I made a mistake an hour ago, and none of you caught it. So he’d collected the paper that he wrote in the ’50s that covers this equation and gone back and fixed it. He realized he had made a mistake much earlier than we did. That was kind of a classic Shapley story of how the classes went. But it was amazing, just to see the way his mind worked, and the sheer number of papers and theories that he had written along with Nash and Shubik and Robert Aumann in the early days of game theory. It was such a great pleasure to study under him.

Allison Hartsoe: 19:20 Do you think they would have made even more progress with today’s compute power, or are the theories themselves strong enough that it doesn’t matter how much more data you throw at them?

Zach Anderson: 19:31 That’s a great question. I actually think about that a lot. I wonder, and I’m kind of checking myself here, so in my mind the question is: with all the data that we have, do we have to have hypotheses? Because essentially what they were doing was building a hypothesis and then constructing a set of equations around it to represent that hypothesis and build a nice model of the world. But now we have so much data, and reinforcement learning techniques among many others, that allow us to be much more bottom-up in the way we construct our descriptions of the world, or the human behavior, that we’re trying to study. My bias is still often to want to have a hypothesis first and then go into the system and do your specification-search work after you have documented and know what your hypothesis is.

Zach Anderson: 20:19 But, you know, increasingly I think that’s not necessarily a correct assumption. One of the things that I’ve been thinking about a lot lately is that one of the reasons these machine learning models are so powerful is that they’re able to uncover patterns that human analysts can’t or won’t pick up even after thousands of hours of studying the data. And so then the question is, should we really interrogate that machine learning model to determine whether it’s sound by forcing it to explain itself to the human who would never have understood it in the first place? I’m starting to come to the belief that if we can build these models and then test them to see whether they work or not, then maybe we don’t need to understand the exact pattern it’s picking up, and that we’re actually limiting the effectiveness of the technique by requiring the model itself to be interrogated. But that’s a really hard thing for me to come to, actually.
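The idea of judging a model by whether it works, rather than by whether a human can read its internals, can be sketched with a toy holdout evaluation. The data, the threshold "model," and the numbers here are purely illustrative.

```python
import random

random.seed(0)

# Toy data: one feature per example; the true (hidden) rule is x > 0.5.
xs = [random.random() for _ in range(1000)]
data = [(x, int(x > 0.5)) for x in xs]
train, holdout = data[:800], data[800:]

# "Opaque" model: grid-search a decision threshold that best fits training data.
best_t = max((t / 100 for t in range(100)),
             key=lambda t: sum(int(x > t) == y for x, y in train))

# Judge the model purely by held-out accuracy, not by inspecting it.
accuracy = sum(int(x > best_t) == y for x, y in holdout) / len(holdout)
print(accuracy)
```

The point of the sketch is the evaluation step: the holdout score tells you whether the learned pattern generalizes, even if you never articulate the pattern itself.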

Allison Hartsoe: 21:15 Yeah, in a sense it’s like dynamic snowflakes, right? The patterns are constantly shifting and changing, especially in a live model, where we’re not just going to set it and forget it. So the limit is the human brain.

Zach Anderson: 21:28 Yeah, that’s right. It’s just interesting to me. I think, trained as a game theorist, I certainly want to have a model, and I think in terms of relationships within the business, between our players and the games themselves, as a structural kind of equation set. But I think that actually dates me and maybe makes me not as good an analyst now as I was before. When compute power was precious, you had to have a good hypothesis before you went and used the computer, and now, because compute is so cheap, you just don’t have to have the same strength of hypothesis.

Allison Hartsoe: 22:00 I love where our conversation went today. If we were to summarize the learnings that you felt were most valuable over the last couple of years, whether it’s about moving the organization forward, different ways to think about dashboarding, or new models versus hypothesis-driven models, how would you summarize the things people should be thinking about now as they try to drive their organizations with better customer-centric, player-centric data?

Zach Anderson: 22:28 I mean, for me, the last two years have really been about scale: not just having a good insight (or many, even) about how the business should operate differently, or about our players and how to build our games or operate our business, but really thinking about how to scale those insights, scale the change, scale management, and drive new patterns of operation into our business. And I think if I had been thinking about scale earlier, we would have achieved it earlier. My biggest two conclusions over the last year and a half or two years have been that I should have foreseen the scale at which we were going to be operating, based on the scale of our business, and that I should have let go, as the leader, of trying to generate the insights myself and started thinking about the program of change and scale that we needed to develop.

Zach Anderson: 23:23 If I had gotten there earlier, I think we would’ve gotten there earlier as a company. So that’s a learning for me. The second thing, which I kind of mentioned, is really learning flexibility in the way that we operate and being a lot less dogmatic. I thought the zealousness with which I drove the change and the insights into the organization was the way to get there, but I think what I’ve become is much less dogmatic, recognizing that that’s not always the best way. And the third thing, which a lot of organizations at the scale of mine are challenged with, is that we’ve become, over time, a little bit of a bottleneck in how we deliver data and insights back to the company. So we’re really working hard to keep our analysts doing the great work they’re doing and contributing to the business the way they do, but also to make more people in the business able to derive insights from the data, from the behavioral data and the attitudinal data, without an analyst sitting next to them or delivering a presentation. That’s also a pretty big challenge.

Zach Anderson: 24:23 It requires a lot of process changes, sometimes people changes, and technology changes that are all now possible, but they’re another kind of set of accelerants that take a big program to do.

Allison Hartsoe: 24:36 Yeah, that alone I think is a huge challenge. We always go back and forth between how much you should expect people to be able to self-serve out of the data and how much you need an analyst to ride alongside. And it’s like driving, right? It’s a series of small adjustments that eventually gets you in the right direction.

Zach Anderson: 24:55 Yeah, and it’s a hard one. When I go out and speak, a lot of people want to talk about self-service tools and what technology to put in. The technology is important and evolving, and you certainly need to stay up on that; there are lots of increasingly great tools to do it, and you have to be flexible in order to adapt and fit them into your system. But at the same time, the human changes, the thinking changes that it demands, are actually a much harder problem to solve than selecting a technology.

Allison Hartsoe: 25:23 I completely agree. All right, Zach, if people want to reach you or I don’t know, if you need more people on your team, how can they get in touch?

Zach Anderson: 25:32 The best way to get in touch with me is to shoot me an email and reference this. My email is zanderson@ea.com, very easy.

Allison Hartsoe: 25:39 Excellent. As always, links to everything we discussed, and I think I’m actually going to link to a little bit of Lloyd Shapley’s background because that’s just such a rich, interesting piece of information.

Zach Anderson: 25:51 Yeah.

Allison Hartsoe: 25:51 Yeah, I’ll link to that in our notes at ambitiondata.com/podcast. Zack, it has been such a pleasure watching you go on this journey and checking back in with you. Thank you for being part of our hundredth episode today.

Zach Anderson: 26:05 Yeah, my pleasure. It’s such a fun thing. It’s great to work with you, and I’m sure I’ll see you at another conference or with one of your clients or something soon.

Allison Hartsoe: 26:12 Thank you. Remember, when you use your data effectively, you can build customer equity. It’s not magic. It’s just a very specific journey that you can follow to get results. Thank you for joining today’s show. This is your host, Allison Hartsoe, and I have two gifts for you. First, I’ve written a guide for the customer-centric CMO, which contains some of the best ideas from this podcast, and you can receive it right now. Simply text ambitiondata, one word, to three one nine nine six (31996), and after you get that white paper, you’ll have the option for the second gift, which is to receive the Signal. Once a month, I put together a list of three to five things I’ve seen that represent customer equity signal, not noise, and believe me, there’s a lot of noise out there. Things I include could be smart tools I’ve run across, articles I’ve shared, cool statistics, or people and companies I think are making amazing progress as they build customer equity. I hope you enjoy the CMO guide and the Signal. See you next week on the Customer Equity Accelerator.
