Ep. 80 | Customer Trust Through Privacy

This week Aurelie Pols, a DPO or data protection officer at customer data platform mParticle, joins Allison Hartsoe in the Accelerator. Aurelie is based in Europe at the forefront of privacy law-making and interpretation. In this episode she connects privacy and trust, including how GDPR and the California Consumer Privacy Act are raising new questions around data governance, corporate transparency and a consumer’s right to be forgotten.

Please help us spread the word about building your business’ customer equity through effective customer analytics. Rate and review the podcast on Apple Podcasts, Stitcher, Google Play, Alexa’s TuneIn, iHeartRadio or Spotify. And do tell us what you think by writing Allison at info@ambitiondata.com or ambitiondata.com. Thanks for listening! Tell a friend!

Podcast Links:

West Wing Short List

In the matter of Google/DoubleClick – Dissenting Statement of Commissioner Pamela Jones Harbour

Democracy

Ethics Advisory Group Report 2018

What is privacy by design? A deeper dive into this GDPR requirement

Full Transcript:

Allison Hartsoe: 00:01 This is the Customer Equity Accelerator. If you are a marketing executive who wants to deliver bottom-line impact by identifying and connecting with revenue-generating customers, then this is the show for you. I’m your host, Allison Hartsoe, CEO of Ambition Data. Each week I bring you the leaders behind the customer-centric revolution who share their expert advice. Are you ready to accelerate? Then let’s go. Welcome everyone. Today’s show is about building customer trust with privacy, and to help me discuss this topic is Aurelie Pols. Aurelie is a DPO, and if you haven’t heard that term before, it stands for Data Protection Officer and it’s a term you should definitely know, and she works at the customer data platform mParticle. Aurelie, welcome to the show.

Aurelie Pols: 00:55 Thank you for having me, Allison, wonderful to be here.

Allison Hartsoe: 00:58 We met years and years ago in the digital analytics space, and I have just watched your career with awe. What I’ve noticed is that you are really amazing in the deep ways that you think about this space. So can you tell the listeners a little bit more about how you were originally drawn to the topic of privacy?

Aurelie Pols: 01:17 Yes, absolutely. Well, it’s actually a funny story, and I have to confess that it’s because of Rob Lowe. You might remember the movies we watched many, many years ago, St. Elmo’s Fire, but basically, Rob Lowe talks about privacy in The West Wing, one of my favorite American shows, and it was once he started talking about privacy that I got interested and decided that it was time to start looking at this thing called GDPR that was evolving. But before that, actually, a first alarm bell rang a bit in my head. As you mentioned, I have a background in digital analytics, and it was when Google acquired DoubleClick back in 2007, that’s a couple of years ago now. And it’s also where I first heard the names of some people I have actually met along the way in privacy today. So that was kind of the shift from purely digital analytics towards data protection and privacy, but my background is actually in econometrics and statistics.

Aurelie Pols: 02:26 So I love data, and I went to the Internet, you know, like anybody else who loves data, because there was a lot of data there. So long story short, I co-founded my first startup with my ex-husband. We were one of the first Google Analytics certified consultants in Europe, and we sold that company in 2008, and we moved to Spain. And basically, I exchanged that startup for two wonderful kids who are now a bit grown up. They still dig privacy, actually. And I watched this regulation called the GDPR evolve, and this was actually brought about by two amazing women, Viviane Reding and Neelie Kroes, who is also Dutch like me. So you know, the GDPR came into force in 2018, right? Actually, the ink was dry in 2016, so it typically takes two years before enforcement. And, um, negotiations had been going on for about five years, with some 5,000 or more amendments for the GDPR.

Aurelie Pols: 03:32 It’s really an incredible story of how the GDPR came about. They even made a movie out of it, so you can actually find a movie called Democracy, which talks about how it came about. It’s really interesting.

Allison Hartsoe: 03:50 We will link to that in the show notes. Wonderful. So tell us a little bit about what your team specifically does as a data protection officer. What do you think about on an everyday basis?

Aurelie Pols: 04:03 So today I sit between mainly two different worlds. Uh, certainly digital data, because this is where we all come from, as data protection officer at the customer data platform mParticle out of New York, but I still spend quite a lot of time in Brussels, mainly talking about ethics. So maybe just a bit about the ethics part. It comes through the fact that I was part of the Ethics Advisory Group of the European Data Protection Supervisor.

Aurelie Pols: 04:31 Ethics is something that has been on the table for the past years; we started in 2016. They asked us, actually a bunch of philosophers and a couple of data geeks, me amongst them, um, to set the ground for upcoming ethics discussions. And so the initial question was, now that the GDPR is there, what’s next? And this is what we’re currently living in, all these discussions about AI, about ethics, about global rights and things like that. Um, so we published a paper following this. There was also a big conference in October of last year where Apple spoke, Facebook spoke. It was really, really interesting to hear. And from there, other work in Brussels, amongst which being part of the European Center for Privacy and Cybersecurity out of Maastricht University, where I also teach in their DPO courses. So that’s one hat where, well, despite not living in Brussels anymore, I spend half my time still in Brussels.

Aurelie Pols: 05:35 And the other half is actually being data protection officer for mParticle, where I report to the CEO and work together with the project management office to, first of all, make sure that we are compliant, or as compliant as possible, with the GDPR. So the thing is that the GDPR, it’s a regulation. It’s a law. It’s not a technical requirement. So you spend a lot of time trying to understand what it means to be compliant. It’s really a very interesting collaborative effort from the entire company to make sure that we are as compliant as possible. And after over a year we are now moving more towards this idea of competing on privacy, um, with initiatives such as, for example, OpenGDPR, which is an open-source framework to pass data subject requests through the digital ecosystem. As we all know, the tools are linked, and data is being transferred not only through cookies but also server-to-server connections and things like that.
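
To make the idea of passing data subject requests through an ecosystem a bit more concrete, here is a minimal Python sketch of fanning one request out to downstream partners. It is illustrative only: the field names, endpoints, and helper functions are assumptions and do not reflect the actual OpenGDPR schema or mParticle’s API.

```python
# Hypothetical sketch: forward a data subject request to downstream partners.
# Field names and endpoints are illustrative assumptions, NOT the OpenGDPR spec.
import json
import uuid
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class DataSubjectRequest:
    request_id: str           # unique id so every partner can report status on it
    request_type: str         # e.g. "erasure", "access", "portability"
    subject_identities: dict  # e.g. {"email": "..."} — the identifiers to act on
    submitted_at: str         # ISO 8601 timestamp


# Assumed downstream partner endpoints that also hold data about the subject.
PARTNER_ENDPOINTS = [
    "https://partner-a.example.com/dsr",
    "https://partner-b.example.com/dsr",
]


def build_request(request_type: str, identities: dict) -> DataSubjectRequest:
    return DataSubjectRequest(
        request_id=str(uuid.uuid4()),
        request_type=request_type,
        subject_identities=identities,
        submitted_at=datetime.now(timezone.utc).isoformat(),
    )


def forward_request(dsr: DataSubjectRequest) -> None:
    payload = json.dumps(asdict(dsr))
    for endpoint in PARTNER_ENDPOINTS:
        # In a real system this would be an authenticated HTTP POST, and the
        # per-partner response (pending / completed) would be tracked.
        print(f"POST {endpoint}\n{payload}\n")


if __name__ == "__main__":
    forward_request(build_request("erasure", {"email": "jane@example.com"}))
```

The design point is simply that one request carries a shared identifier, so every tool in the chain can confirm it acted on the same erasure or access request.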

Allison Hartsoe: 06:39 Now I assume we can link to your paper that you mentioned, I think you call it Towards a Digital Ethics, and maybe a link to OpenGDPR that you just talked about would be nice as well. So we’ll add those links into the show notes.

Aurelie Pols: 06:52 Yes, it’s on GitHub. OpenGDPR is there on GitHub. It’s open for comments. Um, and I shared it with Facebook also a couple of weeks ago as they shared their data portability project as well.

Allison Hartsoe: 07:05 Excellent. Excellent. So let me just ask you this question. A couple of weeks ago I was on vacation, thank God, yay. And my friend gave me this book that I had not read, and usually, I’m reading books about technology and, you know, our industry because it moves so quickly. But she recommended this book, which was fairly fictionalized, and it’s called The Circle. And it basically walks somebody through the breach of trust and privacy by a fictional tech company. Though I have to say, it sounds a lot like some of our big Silicon Valley companies. And so it really got me thinking about how we trade privacy for convenience. Every day I share my location with Waze, and Google tells me if a store will be closed when I might arrive. And even the local grocery store has a giant camera in my face at self-checkout. So in our field where we’re thinking about customer analytics, we always want more data. We always want to try to understand that customer journey in order to make the marketing more timely and targeted, you know, more relevant. But that’s really gotta be balanced with privacy. So how do you think companies, and you mentioned a lot about different countries, so maybe even countries, how should they be thinking about privacy?

Aurelie Pols: 08:24 Well, I think generally speaking, um, first of all, it should be a holistic position from companies to say, well, you know, there’s this thing called GDPR where suddenly the risk of using data is very different. The fines before the General Data Protection Regulation were a maximum of half a million euros in Spain and in the United Kingdom. This now moves towards maximum fines of 4% of global turnover or 20 million euros, whichever is higher. You might have heard of the intention to fine announced this week by the Information Commissioner’s Office in the UK for British Airways, and we are talking about an estimated 183 million pounds. And Marriott also, around 99 million pounds. So suddenly, this risk changes. The risk equation changes. What we could say before the GDPR, under the directive, was, well, even though we might not be doing the right thing, you know, if we get a fine, it’s not that bad.
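
As a quick worked example of the fine ceiling Aurelie describes, the sketch below just applies “4% of global turnover or 20 million euros, whichever is higher.” It is an illustration of that arithmetic only, not guidance on how supervisory authorities actually size fines in practice.

```python
# GDPR maximum-fine ceiling: 4% of global annual turnover or 20M EUR,
# whichever is higher (illustration of the arithmetic only).
def gdpr_max_fine(global_turnover_eur: float) -> float:
    return max(0.04 * global_turnover_eur, 20_000_000)


# A company with 100M EUR turnover is capped at 20M (the flat floor dominates);
# one with 10B EUR turnover is capped at 400M (the 4% term dominates).
print(gdpr_max_fine(100_000_000))      # 20000000.0
print(gdpr_max_fine(10_000_000_000))   # 400000000.0
```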

Aurelie Pols: 09:30 Something else that has really changed, and not only in the GDPR, even the Federal Trade Commission talks about this, is the fact that when we talk about personal data or PII, it used to be, we don’t collect PII, so it doesn’t apply to us. The thing is, the scope of what personal data or PII is, is getting broader. And the GDPR, for example, talks about IP addresses, cookies, MAC addresses and things like that. So we have to start taking note of what the GDPR and also the ePrivacy Regulation are about, and so I think awareness is typically the first step to start thinking about this, because the risk actually exists. And it’s interesting when you say, Allison, you know, we used to collect everything and try to figure out what to do with it later. A lot of companies are starting to move away from this, and they ask, what can I collect?

Aurelie Pols: 10:32 And that I can explain, because what the GDPR does is reinforce the rights of data subjects. So citizens, you, me, or children or parents or family or friends, they can complain and say, hey, I don’t agree. And we’ve seen this evolution also with technology, where people kind of find certain things a bit creepy. So do we really want to go towards, let’s collect as much as possible and let’s see what we do with the data, or do we want to make sure that in the long run, we use data in a qualitative way to support business decisions and also, certainly, foster trust with our customers? So it’s interesting to see, we go back to this idea of customer lifetime value that some of our, you know, statisticians have loved for many years but haven’t really seen, certainly in digital.

Allison Hartsoe: 11:26 There are so many fascinating points in what you just said. And I want to circle back to a couple of things. One is the scope of PII broadening. I think in general I oftentimes see technologies in the martech space that are reliant on some things that we might consider PII. So, for example, there was a company that I saw recently that was connecting your IP address from your tablet or your mobile phone with what you were watching on TV at the time so that they could more or less close the loop on whether ads were actually being viewed by people or not. And to me, that seems like an area where people might not know their privacy is being breached, but which is highly at risk, under this broader PII definition, of not being able to do this kind of tracking.

Aurelie Pols: 12:18 Yup, absolutely. And this is also where there is not a similar view between the two continents we represent here, in the sense that in US legislation the idea of PII, which typically is a single variable, is defined by each US state. Um, US privacy legislation is based on this idea of notice and consent: I tell you what I do, and you agree, or you don’t. It’s a take-it-or-leave-it stance. European data protection is grounded in this idea that if you process personal data, it should be lawful. So we have pushed a lot on this idea of consent, but there are actually six ways to make processing lawful under the GDPR; the ePrivacy Regulation, that’s something else. But lawful processing basically puts the responsibility within companies to say, well, if you want to use this data, the new oil, you have to make sure that you do that in a lawful way. It’s very different from this idea of, I tell you what I do, and you agree, or you don’t. So notice and consent versus lawful processing. And also, in Europe, it is grounded in this idea of human rights. I have a right to privacy within the Charter of Fundamental Rights. It is a very different way of looking at it.

Allison Hartsoe: 13:39 I guess that’s where the “I have a right to be forgotten” comes in. I have a right to tell you that you can’t process my data even if I have given it to you before.

Aurelie Pols: 13:48 Absolutely. So this idea, for example: we’re so focused on consent, but it’s not only about gathering consent, it’s also about allowing you to withdraw your consent if you change your mind.

Allison Hartsoe: 13:59 Right. And I think that’s where the California Consumer Privacy Act goes. But correct me if I’m wrong there.

Aurelie Pols: 14:05 Yeah, so the CCPA is going in a very similar direction to the GDPR, though there are differences, obviously, certainly in the way the legislation is kind of created in the US. I work a lot with external US legal counsel; even they talk about the sausage-making when it comes to US legislation. I would not want to comment on that. I just compare European legislation to the Sagrada Familia in Barcelona, which is a cathedral that was built by Gaudí and is continuously under construction but is very strong. So on the one hand, you have the cathedral; on the other, you might have sausages. So yes, it’s interesting to see that US states, so California, Vermont, New York, are pushing legislation in very specific areas and sectors, and, well, certainly the California Consumer Privacy Act takes over a lot of concepts and ideas from the GDPR and is more grounded in rights than before.

Allison Hartsoe: 15:10 Would it make sense, if a company is trying to figure out, let’s say they’re not based in California, let’s say they’re based in New York or in a different state, would it make sense for them to comply with the California Consumer Privacy Act, or try to, as it’s getting figured out, so that they can engender more trust amongst their customers? Is that a good place to start?

Aurelie Pols: 15:32 Yeah, I think, you know, it’s also where basically ethics comes around, and certainly this idea of corporate social responsibility for your company. How do you position yourself from an ethical perspective, from a responsible perspective, with respect to data, while possibly also ensuring that you are compliant? Um, and um, you know, there’s a big battle going on at the global level about these discussions. Which law, which privacy law should I take? Should I fork my data setups because I need to be compliant with the CCPA, I need to be compliant with Vermont, uh, I need to comply with, uh, the Japanese privacy act or an African country’s law that’s coming up, and how am I going to do that? Or can I decide to take a baseline somewhere and move it up a notch? So the thing is, in the US we’re kind of waiting for a federal law. I’m happy to wager on whether this is going to come out in 2019, ’20, ’21, or ’22, but it’s increasingly complex, I think, for American companies to make sure that they do the right thing. And I think the first thing, certainly for American companies, is to make sure that they understand which legislation is coming out in which states and map that to their customers, to say, well, there is a risk that somebody might complain in Albuquerque about a point we never thought about.

Allison Hartsoe: 17:08 Well, let’s come back to customers and think about when companies are calculating customer lifetime value on their customer base. Is that an area for transparency, or is that an area that’s so fluid maybe it shouldn’t be transparent?

Aurelie Pols: 17:25 Well, I think customer lifetime value is also part of the intellectual property of companies. So it’s not only about privacy, it’s also about, well, your own sausage-making in terms of data internally. Um, and I think if companies want to come out, uh, or be more transparent, it’s not an easy question. You know, you come out with a word, oh, companies should be more transparent. It seems so easy, but it’s really not. Um, because what are you going to communicate to your customers? You know, I’m very happy to be Iberia Gold, for example, because I fly so much, but if I was classified as, you know, overworked mother who doesn’t eat enough vegetables, I’m not sure I really want to know that. These are also, you know, the questions that are coming up, for example, in terms of predictions or trying to help people with health data.

Aurelie Pols: 18:26 Should we, you know, if we predict that you have an 80% chance of having some form of a horrible disease, should we tell you, yes or no? How far should we go in terms of transparency? So I think companies should really think about this idea of, yes, you want to obviously identify your best customers so that you spend your resources on them, because it makes sense. It’s a win-win relationship. But does this mean that you need to be overly transparent with those that you will serve less well? Maybe not. And so trying to find that balance between the two.

Allison Hartsoe: 19:04 I think that makes sense. And I think what most companies, uh, at least some of the forerunners, think about is, if we tell someone where they are in the customer base, does it help them become a better customer? You know, if I knew what a high-value customer did and looked like, could I make a clear decision about my relationship to that company? And to me, it starts to beg the question of, is the company operating almost like a person with which you want to build a relationship, or maybe you don’t, or is the company operating as a company around intellectual property, and this is mine, and we’re not going to share that with you. And I feel like the lines are blurry.

Aurelie Pols: 19:47 Yeah. I think that obviously, you know, companies have obtained personhood; their legal status has evolved through time. But we should certainly not forget that companies have, certainly in the US, fiduciary duties towards their shareholders as well, to make sure that they maximize profits. Um, so I think it’s more a matter of addressing and making sure that you optimize your resources related to those customers you consider most profitable, uh, while possibly not undermining those that you might not serve as well. Uh, after all, it’s also a choice of the customer to say, well, I’ll spend more money with this company, or I don’t. Obviously, it’s about trust, which is a very human trait. But I think I would ground this also in a certain form of responsibility, where we should never forget that there are obligations to shareholders for companies; fiduciary responsibilities are there whatever you do.

Aurelie Pols: 20:53 So let’s not kid ourselves, it’s not going to be about that. The way I often see it, on all sides, is I’m always surprised, you know, how data is being used for very short-term, uh, visions of optimization. I think you can use data in order to support a longer-term vision, and I think you can actually use data also to support this idea that you are as compliant as possible with privacy legislation. And in that sense, thinking, oh, we should get rid of the data because it’s not compliant, is maybe also not the way to think about it. Maybe you want to use that data to actually highlight how good you are with respect to compliance obligations.

Allison Hartsoe: 21:41 I love what you said there, especially the longer-term vision for data. Are there examples of where companies have either been compliant or maybe have made missteps that everybody can learn from?

Aurelie Pols: 21:54 I think the classical example, and I think some of my friends would typically say, well, we’re not sure this actually happened, is, you know, the Target case we heard many, many years ago, about the data teams trying to define whether somebody was pregnant. And apparently, I still don’t know if the story’s true, but the fact that the father of a 15-year-old was very, very upset with a Target manager because he declared his daughter could not be pregnant. Besides the creepy aspect and things like that, I think what’s really important to understand here as well is that by predicting a health state like pregnancy from your shopping behavior, under the GDPR you actually cross a line, and I think a lot of people who work in data science don’t understand that. If I use shopping behavior and I predict whether I prefer banana yogurt or strawberry yogurt, that doesn’t really have a lot of consequences.

Aurelie Pols: 22:58 It might be personal data, but I’m not touching upon what we consider to be sensitive data or special categories of data. Once I predict somebody’s pregnant, suicidal, or has a health state that is very specific, actually, I can’t just do it like that. So I need to ask individuals for consent to make sure I can use that data. And this is what is often forgotten within the story of Target: it crossed a threshold. We need to understand that certain data points, even though they are personal data, are totally acceptable; others are slightly more tricky. Which is also why, for example, the GDPR, um, mentions facial recognition, biometric data, because we are moving along, you know, technology continues to evolve, and it will touch our lives more and more.
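
The distinction Aurelie draws, ordinary predictions versus special categories that need explicit consent, can be expressed as a simple governance check in a data pipeline. The sketch below is a minimal illustration under that assumption; the category list echoes the kinds of data she mentions, but the function, field names, and consent store are hypothetical, not any real product’s API.

```python
# Illustrative governance check: predicted attributes that fall into GDPR
# "special categories" require explicit consent before use. Names and the
# consent store are hypothetical assumptions for illustration only.
SPECIAL_CATEGORIES = {
    "health",               # e.g. predicted pregnancy or disease risk
    "biometric",            # e.g. facial recognition templates
    "ethnicity",
    "religion",
    "political_opinion",
    "sexual_orientation",
}


def may_use_prediction(attribute_category: str, user_consents: set) -> bool:
    """Allow ordinary predictions (e.g. a yogurt-flavour preference); require
    explicit consent for anything in a special category (e.g. a health state)."""
    if attribute_category not in SPECIAL_CATEGORIES:
        return True
    return attribute_category in user_consents


# Predicting a flavour preference is fine; predicting a health state is not,
# unless the individual has explicitly consented to that processing.
print(may_use_prediction("flavour_preference", set()))  # True
print(may_use_prediction("health", set()))              # False
print(may_use_prediction("health", {"health"}))         # True
```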

Allison Hartsoe: 23:51 I couldn’t agree with you more. One thing that absolutely, more or less, freaked me out, and there are very few things in the data space that really cause me concern, but this particular one had to do with applying AI to the emotions on someone’s face. And so coming back to that example I mentioned earlier, where my local grocery store has a camera right in my face, the application of AI behind that to identify, am I a happy shopper, am I an unhappy shopper, am I healthy, am I not healthy, is highly concerning, because I don’t know what they’re collecting. There’s no sense of trust there. And it actually pushed me out of the store, only because I understand more about what could be done with that data. So even though they might not be using it in ways that are nefarious today, the possibility that they could do that in the future without my consent was too much for me to go back to that store. I don’t know if you’ve heard examples like that, but have you heard examples of other companies toying with facial recognition in a way that really requires more consent? And is that part of the special category of data that you mentioned?

Aurelie Pols: 25:02 Yes. So facial recognition is actually listed within the GDPR as a special category of data. Um, there was a story a couple of months ago, or maybe a year ago, it was British Airways, ah, they have been in the news quite a bit lately, um, who did indeed, uh, use facial recognition, uh, when you boarded a plane. Um, I think it was at Heathrow. And what they did was they basically put a banner out there to say, hey, you know, smile at the camera, you’re being recorded. And what’s interesting also is that there are more and more, I would call them kind of privacy freedom fighters, and lots of them are friends of mine today as well, who took pictures of this and said, well, this is unacceptable, because basically the banner said, well, you are being recorded; if you don’t want to be, go to the counter and let us know.

Aurelie Pols: 25:55 And, well, I’m sorry, but biometric data, facial recognition, is opt-in. It’s something for which you need consent before you actually process the data, and not the other way around. Um, so we’re seeing more and more, I would say, power to the people. Um, I think this activity at Heathrow lasted maybe a couple of hours before it was actually taken down by British Airways. Um, and there are more examples of this, people starting to regroup, uh, and to, you know, share their knowledge of the law and go in front of the courts to go against certain decisions. And, um, we might remember something called Safe Harbor many years ago, um, which was the framework that allowed transferring data from Europe to the United States. And what basically happened was that there was this, at that time, law student called Max Schrems who, following the Snowden revelations, said, well, I don’t think that if Facebook pushes data to the United States, my fundamental rights are being upheld.

Aurelie Pols: 27:06 And so he asked Facebook for his data, and they gave him a big, big, big pile of printouts. I don’t know if you saw those pictures many years ago. And it went all the way up to the European Court of Justice, which decided to declare Safe Harbor invalid. That was in October 2015. Um, and so for a couple of months, lots of companies were kind of asking questions: what do we do now? And this is when Privacy Shield was actually born, a couple of months later, pushed by the Commission. Um, two days ago, on the 9th of July, um, Max Schrems was in front of the Court of Justice, the European Court of Justice, again for the same thing: standard contractual clauses, Privacy Shield, things like that. So we really see people starting to better understand what is at stake when it comes to technology, and we see groups, companies, and not-for-profits, Privacy International, noyb (None of Your Business), um, pushing courts to make decisions about what is acceptable and what is not. So they don’t influence the law, they influence the interpretation of the law. And I think as people better understand what the toolboxes of legislation are, this dynamic of big tech doing whatever it wants will change slowly but surely. But it takes education.

Allison Hartsoe: 28:33 Well, I think what you’re hitting on is the fact that it’s a cultural change. And I think you might’ve mentioned this to me years ago, and apologies if I can’t remember the example exactly, but I think it was when the camera first came out, people didn’t want their photographs taken. There was a lot of concern, cultural concern, about the use of photography and cameras, and over the years it became more and more acceptable for people to have their photograph taken. But now that pace of change is moving so quickly, and the culture and the laws perhaps are having a harder time keeping up. Is that true? Is that a fair assessment?

Aurelie Pols: 29:12 Yeah. So that story is actually where one of the first proposals for privacy legislation came out in the United States, asking for more sunlight and transparency, by somebody called Brandeis, a judge. I think also, generally speaking, we can say partially that the law has problems catching up with technology. On the other hand, you know, there are companies whose motto is move fast and break things. They possibly broke democracy. We could have a conversation about that as well. But one has to remember, I always tell my students, the law is not a set of technical requirements. Um, and there are a lot of things within the GDPR that address a lot of issues we face today and will face in the future. But it’s a toolbox, and it’s not specific to technology, and it will never be, because it’s legislation. Um, and there is an entire logic.

Aurelie Pols: 30:15 So this cathedral I talked about, it allows for users to exercise their rights. So for example, something that’s very different with the GDPR today is that all complaints sent through to the supervisory authorities (each country has one; the ICO is the one in the UK) have to be taken into account. So they have to follow up. That wasn’t the case before. So in a sense, the toolbox is there; power to the people to make that toolbox work. Now, you have to use the tools. If you prefer not to, and you say, well, I accept to download this app, and you use this and things like that, it is your choice. But you have a way to make sure that these choices, if they are not in your interest, are covered by the law. And for example, one of the articles within the GDPR, Article 22, which talks about automated decision-making, including profiling, is there to make sure that decisions are not made about our lives without us being able to ask questions.

Aurelie Pols: 31:25 Now, the big question today is how to interpret this. How far would that transparency go in terms of how that data is being processed? Certain companies have said, hey, I’ll just give, you know, my source code to anybody who asks. That doesn’t really help the consumer to better understand why his credit has been refused. So we still have to refine this. There is a right, and you have a toolbox to exercise that right. Now, how is this going to work? How am I going to understand, from a company, why they made certain decisions, and how is the company going to define what kind of information they bring in front of customers, consumers, citizens, so that they can make the best kind of decision?

Allison Hartsoe: 32:15 Is that point about automated decision making and profiling something that companies have been fined for yet, or have the fines been relegated to perhaps just the storage of information, the storage of PII?

Aurelie Pols: 32:31 So there is like a gradient of these rights, where some of them are kind of obvious. Um, the right to be forgotten, you mentioned it, for example, because, you know, we’ve been playing around with this for a while. Automated decision-making is still one of those where we have these discussions about what does it mean, what should it be, and things like that. There haven’t been fines about this. Um, so the recent fines, for example by the ICO, have mainly been around data breaches. And to be honest, it’s the less complicated part of it. You know, it’s the security part of data protection, which has a lot more, how should I say, history than certainly data protection and privacy. So no fines as to that yet. It really depends on who’s going to come to say, well, I understand that this company made this decision about me, and I have no recourse to make sure this is done in my best interest or not.

Aurelie Pols: 33:36 So we’ll still have to see how this is going to evolve. Other rights: the right to be forgotten, the right to deletion, rectification. There’s one of them, which is also new within the GDPR, which has been tackled and where the best in class are the Facebooks and the Googles: the right to data portability. So certain rights are better represented than others, but obviously, profiling is absolutely subject to, um, to GDPR fines. Talking to a lawyer a couple of months ago, it was funny because he said profiling is a very dirty word in European legislation. Yeah, okay. But you know, tell me what you’re doing with your data. Let’s have a conversation about what you’re actually doing.

Allison Hartsoe: 34:24 Yeah. And I think that makes sense, because when you profile someone, it does sound like a dirty word, but at the same time it’s the tradeoff for convenience, and convenience builds trust. So the fact that I can, you know, be a high-value customer and breeze right through the boarding process may outweigh the privacy that I sacrifice. It almost reminds me of comedy in a way. Not that that situation is funny, but in comedy, there’s a rule that something has to be less offensive than it is funny in order for the joke to work. So if it’s more offensive, then you just offend the audience, and everybody’s unhappy. But if you can kind of find that tipping point where it’s not too offensive, but it’s just offensive enough, then it’s funny. And I wonder if the same thing happens in privacy, where you’re willing to trade enough. Like you said, it’s a gradient. There’s a certain amount that you need in order to make the services and the good things that are happening in technology run, but at the same time, if you cross that line, perhaps with healthcare predictions like you mentioned, then you’ve broken the rules.

Aurelie Pols: 35:34 Yeah, no, I totally agree. And I think the comparison with comedy is a great example. Thank you for that. I’ll be using that. But certainly, this balance between personalization and fundamental rights, and how far we go, what is acceptable and what is not, will also depend probably on culture. Certain things will be acceptable in certain countries and possibly not in others, where also it’s possible that education might need to basically bridge the logic of all of this. I remember a couple of years ago, um, when suddenly I got, you know, my flights automatically into my calendar, I thought that was really creepy. Now I’m kind of used to it, so I actually expect it; back then I was surprised by it. I also get certain, you know, I call them ghost flights, where basically I’m booked on them, and I don’t know where they come from. So that happened a couple of times, and it was interesting because I called the airline, it was Brussels Airlines, and I was like, I don’t understand, because I’m not on that flight, so I don’t understand where this comes from. And it was interesting because they answered, they said, well, it’s not us, it’s a company in Romania.

Aurelie Pols: 36:46 So that was like, well, yeah, you should maybe, you know, verify your data flows, because there’s something rather wrong here. So I don’t mind not boarding flights. I would mind if I, you know, would undergo surgery I never asked for. And this is also, you know, this idea of lawful processing, being responsible for these data flows. Um, it’s really also about data quality. And we should not forget that. I think the GDPR is the best excuse in the world to actually get going on documentation, data governance, master data management, you know, projects we’ve heard a lot about through the years but which, you know, had problems getting funded and things like that, but that also support customer lifetime value. So I think this is part of getting back to, you know, this balance between personalization and compliance obligations, data protection rights and things like that.

Aurelie Pols: 37:49 I think that the extreme example of over-personalization can be found in the insurance industry. And the reason I say this is because the very concept of insurance is based on this idea that risks of individuals are bundled together. It is the group that makes insurance work. If you go towards hyper-personalization, what happens to the concept of insurance? And so this is kind of, you know, the extreme swing of this pendulum, to say, well, this might happen. What do we do then? And so from there, I think it’s important for companies to understand, okay, personalization, absolutely, it makes life so much easier when I don’t have to park my car at the airport, I just walk in, I’m whisked through as a, you know, VIP and things like that. But to this positive there might be a negative aspect that we need to think about, but where these use cases have not surfaced yet, because it’s impossible to think about everything all the time.

Aurelie Pols: 39:04 And this is where the mechanisms of the law allow consumers to, well, bring forward, you know, the uses of their data, to say, well, this was not in my interest or the interest of the group, or things like that, hence I go to court. This is also how best practices are being created and things like that. I think, generally speaking, in terms of culture, there’s more and more tension about what we consider to be acceptable or not, and that we kind of lose, you know, this sense of, well, this might actually go wrong, is this a good idea? And I think this is where, if companies do automate a lot, they should have this person sitting at a desk somewhere, with possibly a phone and an email address, who listens to potential complaints from people, or explanations of how these data flows might have created harm for them, and sees if they don’t need to readjust the way they think about data. So, you know, this term scale, everything is about scale and scalable, and I have pressure behind me all the time saying, oh, you should come up with a product, a blog. I appreciate scale, I understand it. But as I said at the beginning of our conversation, this is about making sure we don’t lose our humanity, making sure that we remain individuals and that technology serves us, and not just the most powerful, the richest, I don’t know.

Allison Hartsoe: 40:39 That makes sense, Aurelie. I love how you said that. So if I’m a company and I want to think about preparing for the future, in addition to perhaps having somebody who listens to the data flows and is monitoring what’s happening from the voice of the customer, what else should I be thinking about? Or maybe, what should I be doing first, second, third, in order to not get caught off guard, in order to prepare in a logical way?

Aurelie Pols: 41:09 Yeah, absolutely. So, um, to be slightly more concrete, um, something I also tell a lot of my students is, I don’t ask my teams to read the GDPR. I think they would go nuts, and we would have discussions for hours and hours. I think, generally speaking, reading the Charter of Fundamental Rights is a good idea. So this idea of fundamental rights: if you’re a data scientist, you know, you’re not allowed to discriminate, you’re not allowed to torture. So the fundamental rights are something we should keep in mind just to make sure we go in the right direction. Then if we’re talking specifically about compliance with respect to evolving privacy legislation, uh, this week we had a discussion about privacy legislation evolving in Africa; a lot is changing there as well. I would say the first thing is to take a look at your regional scope: which laws apply to my business?

Aurelie Pols: 42:04 Are most of my customers in California, in Vermont? Do I address the European markets? Should I keep my data in Russia? Russia does not want data to come out of its borders. Um, so these kinds of questions. What might my obligations be in Singapore, Japan, South Korea, China? Privacy legislation is something that is evolving globally. So don’t think it’s just Europe and the US. And once you know the laws, you have to define what your responsibilities are for your data flows. So it means that we talk of controllers and processors under the GDPR; the CCPA talks of three groups: businesses, service providers, and third parties. So it’s kind of similar. So at mParticle, we’re building CCPA compliance for our customers on top of what we did for GDPR, but it’s not exactly the same.

Aurelie Pols: 43:00 So it will be interesting to see how the California Consumer Privacy Act will evolve, and also, as I mentioned earlier on, how companies will fork their data endeavors or choose a specific baseline. It really depends. Um, so typically, for example, at mParticle we are data controllers for human resources, finance, marcom activities. And I have to confess the latter is a conundrum, because it’s a visible risk, and, to be honest, we’re far from perfect, but we’re evolving. Um, and on the other side, for the platform, we remain within the remit of the data processor. Um, I always say we hold the line, so it’s important to make sure we remain within that scope, while we also build for our customers to help them show compliance, supporting their accountability obligations. So once you have an idea of who you are, what your obligations are for which data flows, you can start defining what would be needed and how to hedge for risk.

Aurelie Pols: 44:05 It’s one of the reasons why I stripped away Google Analytics from our CMO. He wasn’t happy. And it’s why I also reach out to the engineers on a weekly basis to chat about new features, product evolutions, things like that. I have an open-door policy for our customer success teams. The entire company goes through GDPR training. We have dealt with data retention periods. Now we’ve been through SOC 2 and ISO 27001 certification. It’s all to reduce the risk for us and make sure we do the right thing for us and our customers, basically. So that’s, you know, your three-point checklist to start, and every company is different, and as the lawyers would say, it depends. The devil is in the detail.
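
Data retention periods like the ones Aurelie mentions ultimately come down to a very small rule: drop records older than the window you committed to. The sketch below illustrates that idea only; the record shape and the one-year window are assumptions for illustration, not mParticle’s actual retention policy.

```python
# Minimal sketch of enforcing a data retention period: keep only records
# collected within the retention window. The record shape and the one-year
# window are illustrative assumptions.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # assumed retention period


def purge_expired(records: list[dict], now: datetime | None = None) -> list[dict]:
    """Return only the records still inside the retention window."""
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= RETENTION]


records = [
    {"user_id": "a", "collected_at": datetime.now(timezone.utc) - timedelta(days=30)},
    {"user_id": "b", "collected_at": datetime.now(timezone.utc) - timedelta(days=500)},
]
print([r["user_id"] for r in purge_expired(records)])  # ['a']
```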

Allison Hartsoe: 44:53 yeah, this is so true, but at least we need to start. And the customer trust is what is at risk. And for companies that really want to grow their customer equity, pulling that customer trust through I think is an essential part of it. And knowing or at least understanding the basics of the privacy policies, the base within the countries, the and states that you operate as well as the charter of fundamental rights is kind of the starting gate, right? That’s where we begin.

Aurelie Pols: 45:22 Absolutely. And then you build according to, well what you consider to be acceptable risk levels. So supporting that customer lifetime value and building that trust, hopefully through time.

Allison Hartsoe: 45:34 Aurelie, this has been such a fantastic conversation. If people want to reach you, what do you think would be the best way for them to get in touch?

Aurelie Pols: 45:42 Well, to be honest, LinkedIn is what’s being used the most for the moment. I’m not sure why, though. Just check me out on LinkedIn, and I’m happy to take on introductions and connections as long as you explain where you come from. If you don’t, either I ignore you, or I send you an email back asking why we should connect. Um, so it’s up to you to make up stories. Storytelling is important as well. Um, I get questions through Twitter as well. Um, so I’m pretty open, and it’s not very complicated to reach me.

Allison Hartsoe: 46:15 Fantastic. As always, links to everything we discussed, and there were quite a few in our show today, are at ambitiondata.com/podcast. Aurelie, thank you for joining us today. I’ve really enjoyed our conversation.

Aurelie Pols: 46:28 Thank you for having me, and thank you for listening, Allison. I know, I know, I can talk for hours about this. So thank you very much for your questions and your comedy comparison. I will use that over the next months, I promise.

Allison Hartsoe: 46:41 Fantastic. Remember everyone, when you use your data effectively, you can build customer equity. It is not magic. It’s just a very specific journey that you can follow to get results. Thank you for joining today’s show. This is your host, Allison Hartsoe, and I have two gifts for you. First, I’ve written a guide for the customer-centric CMO, which contains some of the best ideas from this podcast, and you can receive it right now. Simply text ambitiondata, one word, to 31996, and after you get that white paper, you’ll have the option for the second gift, which is to receive the Signal. Once a month I put together a list of three to five things I’ve seen that represent customer equity signal, not noise, and believe me, there’s a lot of noise out there. Things I include could be smart tools I’ve run across, articles I’ve shared, cool statistics, or people and companies I think are making amazing progress as they build customer equity. I hope you enjoy the CMO guide and the Signal. See you next week on the Customer Equity Accelerator.
