Ep. 128 | Customer Data Privacy: Who Has The Right To Use It? with Richard Whitt
This week Richard Whitt, President of GLIA Foundation and former Corporate Director of Strategic Initiatives at Google joins Allison Hartsoe in the Accelerator. GLIA Foundation’s mission is to better align modern day markets, technology, and political institutions with the greater public good. Richard brings over 30 years of experience in technology, law, policy, and strategy to this conversation about all things data privacy.
Please help us spread the word about building your business’ customer equity through effective customer analytics. Rate and review the podcast on Apple Podcasts, Stitcher, Google Play, Alexa’s TuneIn, iHeartRadio, or Spotify. And do tell us what you think by writing Allison at info@ambitiondata.com or ambitiondata.com. Thanks for listening! Tell a friend!
Allison Hartsoe: 00:00 This is the Customer Equity Accelerator. If you are a marketing executive who wants to deliver bottom-line impact by identifying and connecting with revenue-generating customers, then this is the show for you. I’m your host, Allison Hartsoe, CEO of Ambition Data. Every other week I bring you the leaders behind the customer-centric revolution who share their expert advice. Are you ready to accelerate? Then let’s go! Welcome everyone. Today’s show is about customer data, privacy, and who has the right to use it. To help me discuss this topic is Richard Whitt. Richard is the President of the GLIA Foundation. Richard, welcome to the show.
Richard Whitt: 00:44 Thanks so much, Allison. It’s great to be here.
Allison Hartsoe: 00:46 Tell us a little bit more about your background, what the foundation is, and what the word GLIA means. These are all such important questions as we start our discussion.
Richard Whitt: 00:56 Sure thing. Well, my background is in technology law, policy, and strategy. I’ve been in this space for over 30 years. Most recently, I was at Google as their Corporate Director for Strategic Initiatives, which meant I got a chance to look at a lot of really amazing cutting-edge technologies and think about their medium- and long-term implications for society. So it was a really unique opportunity to have that function of looking out over the horizon. Before I was at Google, I spent 12 years at MCI Communications, where I helped found their internet data policy group and got a chance to meet and work with Vint Cerf.
Richard Whitt: 01:28 One of the fathers of the internet, which was a really tremendous opportunity. He’s still a friend and mentor of mine, which is such a treasure. And before that, I worked in private practice as a lawyer for the very early pre-internet companies: CompuServe, Prodigy, EDS. This was the late eighties, early nineties, before the web really got going.
Allison Hartsoe: 01:46 I remember those. I remember people would make artwork out of them because the companies were trying so hard to get people online, and you’d get a disc in the mail.
Richard Whitt: 01:56 That’s exactly right. Yeah, that’s right. And we didn’t have membership names; we had numbers. I think they were like 15-digit numbers we had to memorize every time we logged in. So obviously we’ve come a long way since then. I’ve really been drawn to this whole topic from watching the evolution of the internet, which Bob Kahn and Vint Cerf helped get going in the mid-seventies, and then the World Wide Web, which Tim Berners-Lee started working on in the late eighties.
Richard Whitt: 02:20 By the mid-nineties, it really became this commercial force in the world. I watched it grow, watched the notion of the client-server relationship take hold, with all of us as users of this network. And then in more recent years, I think many of us have seen with some trepidation the rise of the platforms, the multi-sided platform companies, for whom we are the users and, in some ways, the objects of these relationships. It’s actually the advertisers and the marketers and the others who utilize their platform and pay them money for the data and insights they receive. And over time, the notion of us as users became winnowed down. We started to become more passive. We were recipients of these really great technologies and services without a whole lot of opportunity to exercise our own autonomy and agency through the web.
Richard Whitt: 03:02 So I left Google two and a half years ago, in part because I was increasingly concerned about the direction that Google and some of these other platform companies were taking. I’m now a fellow with the Mozilla Foundation, but as you mentioned, I’ve also started my own foundation called GLIA. GLIA is the ancient Greek word for glue, and I chose it for a couple of reasons. One is the old saying that trust is the social glue, right? The thing that binds us together as human beings, as societies, as marketplaces. And I could see that trust slipping, not just on the web but, as many of us have witnessed, in the modern institutions of the day: social, political, economic. It felt to me like there’s an opportunity here for the web itself, and the technologies and commercial relationships we create on top of the web, to bring back some of the trust that I believe has, unfortunately, been slipping away.
Richard Whitt: 03:47 The other reason I like the term GLIA is that it was a name given about 120 years ago to certain cells in the human brain. We’re all familiar with the neurons, right? We’re all very proud of having these lightning-fast neurons that make us think fast and do amazing work. But there are these things called glial cells. When scientists first recognized them in the brain in the late 19th century, because they act on chemicals and not on electrical impulses, they looked inert, and nobody could figure out what they did. So they came up with a theory: okay, they must glue the neurons together, that must be their primary function, so let’s call them glial cells. Well, that was more or less the status quo for about a hundred years. And then just recently, really in the 21st century, neuroscience has uncovered an amazing number of things that glial cells do in terms of promoting, enhancing, protecting, and repairing all the neural structures. So in some ways, they are, in fact, a life support system for the brain. And I like this idea of building an ecosystem that could be somewhat analogous: a digital life support system for those of us in society, all of whom, I think, could use more support, right? Protection, promotion, enhancement of ourselves and our digital selves on the web. So that was another connotation I liked in choosing the word GLIA.
Allison Hartsoe: 04:55 Oh, very nice. And I love what you said about promotion, enhancement, protection. That is such a nice match. But let’s tie this to something we’ve all been talking about recently, and that’s the infamous Netflix documentary The Social Dilemma, which basically posits that some of the largest tech companies, probably all of them, have used our data not just to push us in a particular direction, which we all kind of expect from advertising, but to actively create these addictions. To many of us, this really feels like a big breach of trust, and it should. But it’s further concerning because, at least from our point of view, we think about how companies are using customer data. And if the tech giants were doing this years ago, we know that the everyday companies we interact with could be just a few steps behind. So the question I have for you is, has the horse left the barn? Is it too late for us to reclaim anything in our data when it’s all spread out there so aggressively? And if so, how can we trust companies with that data?
Richard Whitt: 06:00 Well, I guess I would answer that: let’s see, no, and it depends. So no, the horse has not yet left the barn, although it’s definitely gearing up to bust through the open doorway and into the fields. The web is still an adolescent. We have to remember that it’s actually quite young; it’s only been around commercially, as I said, since roughly the mid-nineties. So our experience of it is still fairly new and fresh. I think one of the things the platform companies have taken advantage of is utilizing the web in a way where we basically get a bunch of free stuff in exchange for the data they extract from us.
Allison Hartsoe: 06:30 Clarify just briefly: web 1.0 was what we originally had, with individual websites, but web 2.0 is more about platforms and networks and integration. Is that correct?
Richard Whitt: 06:42 Yeah, exactly right. And that’s the way a number of analysts think about it. The notion of the client-server relationship is the basis of the web. We are, in fact, on the client side, and the servers, out in what eventually became the cloud, are where all the activity happens. That’s where the websites are, and our web browsers take us to the websites, where the interactions occur. In the web 2.0 configuration, Google was really the first one to figure this out when they matched up ads and search back in 2000, roughly 20 years ago, but Facebook and many others followed suit. The deal was they could create a platform. And this is an economic term, not a technology term. It goes back to the ancient Greeks again: it’s the agora, right? The place you go where buying and selling and bartering and trading happen.
Richard Whitt: 07:22 And in this case, the platform companies figured out that they were able to extract data from users, data that would be very valuable to advertisers and marketers and others in terms of the insights it gives. So valuable, in fact, that the platform companies can afford to give us all these amazing services and applications essentially for free. Yes, for free, of course, in scare quotes. The “free” there is the freedom to have them know what you ate for dinner last night, what size your jeans are, and a whole bunch of other things, right? I think many people were okay with that until recently. The Social Dilemma, I think, provides a really great community service here, because we didn’t understand the depths of this. There was this conception that you take my data, you get some insights from it, I get my free stuff, and that’s all good.
Richard Whitt: 08:07 And The Social Dilemma demonstrates quite effectively why it goes beyond that. It’s not just about understanding us as who we are; it’s about influencing us. The word that comes up again and again in the documentary, and I think with good reason, is manipulation. In fact, in the writing and analysis I’ve been doing over the past couple of years, I employ a term I call the SEAMS cycle, S-E-A-M-S. The notion here is a feedback loop that’s constantly being created, multiple times during the course of a day, in your interactions with the web. The S is for surveillance: the devices in your environment that are studying you, watching you, waiting for something to happen. Think of the Alexa sitting in the living room, always sort of on standby, right?
Allison Hartsoe: 08:48 Always on. Always listening and every now and then popping up with an answer that you didn’t ask for.
Richard Whitt: 08:52 Yeah, exactly right. And if you have a daughter named Alexa, it becomes even more awkward. The E is for extraction: actually pulling the data from you, from the environment. The A is for analysis: that’s the computational side, those algorithmic operations, which again the documentary demonstrates with the three gentlemen standing there trying to figure out how to extract the maximum value out of the interactions and the data they draw from them. And then the M of the SEAMS cycle is manipulation. When I originally came up with that two years ago, I thought manipulation sounded like quite a strong term and some people might push back on it, but I’m now seeing it everywhere. And Shoshana Zuboff, who wrote The Age of Surveillance Capitalism, did a masterful job of demonstrating the depths of it: it’s not just taking our data and selling us things we might want. It’s trying to sell us things they’re not sure we want, or that they want to influence or manipulate us into wanting. And that selling is not just about goods and services. It can be about political viewpoints. It could be about your vote. It could be a whole number of things. These interactions, in some ways, are trying to define you. And so that has been the real concern.
Allison Hartsoe: 09:54 It’s through all this experimentation that they’ve found how to really effectively influence people about much more than goods and services. That’s where all of us kind of said, oh my gosh.
Richard Whitt: 10:06 Right. It’s one thing to watch an advertisement on television, or even in the movie theater, or see ads in your newspaper. That’s a relatively passive experience. Their ability now is to make this a two-way experience, one that they control. And I think Tristan Harris put it well: it’s not that we’re reaching a singularity where the computers take over from human strengths. We reached a point much earlier than that, where they are, in fact, taking advantage of our human weaknesses. They know us psychologically better than we know ourselves. And that, I think, creates some real grave risks for us, as a matter of being in the marketplace, but also for the social fabric we inhabit and the political systems we try to build for ourselves. There are so many implications here. So, on the one hand, there’s time to change this; the horse hasn’t left the proverbial barn yet. But on the other hand, there’s a lot of machinery that’s been embedded in place over the last decade or two with the platforms, and we need to figure out how to unharness it and allow us as human beings to have more recourse.
Allison Hartsoe: 11:03 And I want to talk about the machines a little bit, because we often think about this as algorithms. Being an algorithmic leader means that you can really use the data behind everything your company is touching in order to know your customers better and serve them better. But I often think about this as a bit of a U-shaped curve. There’s a certain point where you kind of have to stop pushing so hard, because even though we can drive sales higher, we may have a responsibility to the customer equity, to the goodness of our customer base, to sustainability, to the environment, to all the ESG concerns, in order to maintain the quality of our company, not just the sheer volume of sales.
Richard Whitt: 11:47 Yeah, absolutely. And that’s, I think, another challenge created by the rise of the platform companies on the web: they have essentially inhabited this notion of userhood. In their minds, we are not technically customers or patrons or clients; over and over again, we are their users. The true customers and clients and patrons are the advertisers and the marketers and the retailers on the other end of the platform. And because we were always “users” of the web, we just maintained that status on the platforms. So they don’t really view us as being part of what I would consider a healthy, give-and-take commercial relationship, right? They see us as a resource to be mined. One of the things I think we need to come to grips with is that the web has really taken a turn. It’s this ultimate, incredible market-based system, yet it’s leaving behind all the conditions that make for a healthy marketplace: willing buyers, willing sellers, a meeting of the minds, and treating us, again, not as users but, as you’re saying, as customers. In a true commercial relationship, you don’t push the boundaries, right?
Richard Whitt: 12:52 Because you want to maintain a long-term relationship that is sustainable and benefits both sides. When you’re a user, the boundary-pushing you mentioned on the U-curve is probably way out on one side or the other. It’s not at the place where, in most normal commercial transactions, somebody trying to sell me something would back away because it feels like they’re going past a certain point. Those points don’t seem to exist online.
Allison Hartsoe: 13:16 Do you think the paradigm is the same? We’re talking about the platform companies largely and the big tech players. Is the paradigm the same for a corporation that maybe doesn’t have as much data to work with? And they don’t have an advertising model. They’re basically selling products. Does the same problem of treating the customers as a resource still exist?
Richard Whitt: 13:38 I think it can. This is my own personal viewpoint, but I think when corporations get to a certain size and scale, it is harder for them to see the people they are working with and serving on a daily basis as actual customers and clients, as opposed to points on a spreadsheet. So there is a certain, I don’t know what the term would be exactly, a sort of withdrawal, a pulling back from the human dimension of being somebody’s customer when you have that kind of scale. And when that is then attached to the web’s ethos, the SEAMS cycles I’ve talked about, I think it exacerbates that notion of us being abstract entities to them rather than flesh and blood.
Allison Hartsoe: 14:17 Yes, I completely agree with you here. One of the things we talk about all the time is the distancing language that’s used to express data and analytics. I’m not a person; I’m an eyeball or an impression or a view, things that are not human, not people-describers.
Richard Whitt: 14:35 And that gets to the obsession with data itself, right? We talk about data as if it’s this thing sitting on a ledger somewhere that makes you some money. Data, I would say, has become a four-letter word. From an economic standpoint, it is a very unique resource. It is not a fixed asset. If you think about it, yes, there are certain things about ourselves, like our identities, my Social Security number, my credit card number, certain aspects of me that are fixed and obviously very sensitive, that I don’t want to be sharing. But so many other things about us are about our experience, about things that are meaningful to us. So, for example, the fact that I bought that pair of shoes last week on the web: there could be a dozen reasons why I bought those particular shoes at that particular time.
Richard Whitt: 15:16 All that is known is that it’s a data point. And that data point is affixed to me as if it’s somehow meaningful about me as a human being, and it may be, but it also may not be. So I prefer to think about data as much more of a flow of experience and meaning. The term I use is a life stream, a digital life stream, which I think gets us away from this idea of a fixed thing, this asset, this resource, again, to be extracted from the environment. And if you talk to economists, they will say, putting aside for a moment that notion of the flow of data, that data is a combination of a couple of unique elements. One, it’s non-fungible, which means essentially it’s unique. There’s no data point that’s exactly the same as another data point, because the context around it is always different.
Richard Whitt: 15:58 And that context often gets lost in the data stream when companies are trying to take a look at us. Second, it’s what they call partially excludable, which means you can use technologies and other things to prevent or limit people from getting access to it. So it’s not completely out in the world; it’s something you have some control over in terms of access. And third, and I think by far the most important: it’s what they call non-rivalrous. A non-rival good means that many people can benefit from it multiple times over. So you can share your data with 20 different parties, say, individuals, companies, governments, and they can all make use of it, which actually enhances the value. In some ways there’s a multiplier effect: the more data you share with more people, the more value is generated, mutually, not just for you but for all the people you share it with.
Richard Whitt: 16:44 And when you look at a platform company, or other large retailers, or others online, they have a very narrow conception of you. They want to know as much about you as they can, so they can take that away and try to sell you something. But if you could tap into that non-rivalrous nature of the data, of you as an individual, as a member of society, as a family member, a friend, part of a larger group, all those aspects of the data shared with multiple parties, you get more value from it, as do they. So society benefits from data being this non-rivalrous thing. And the irony, again, to me, is this: the more data protection laws you put in place, the more privacy legislation you pass, the more safeguards you add to ensure that people are fully protected against data leakage. That’s good from a certain perspective, but socially it’s actually not so great, because the value of that data could be immense for all kinds of reasons, both economic and non-economic.
Allison Hartsoe: 17:35 And I love that concept, because it ties so closely to the tribes concept, with companies acting more as a magnet for different tribes of people. When you have non-rivalrous data, you’re able to see which tribes am I attracting, who is really in my customer base, and where can I find them. They’re not just a name, address, and phone number, or a DMA. They are the life stream of activities. And maybe at a certain point they’re mine and they’re appropriate for me, and then at another point, maybe they’re not.
Richard Whitt: 18:06 Yeah. And if I’m the person who paid Google to put that ad for those shoes online, and I bought those shoes, they’re happy. But that was sort of a one-off. What do they do next? Well, typically, and what I see in my own experience, I see those same shoes follow me around the web for weeks and weeks afterwards. And it’s like, okay, that’s not good for me. It’s wasting my time. It’s wasting your money. Wouldn’t it be better if you knew that I was very happy with the shoes and, in fact, would love to find some matching jeans? How do we create that connection? Right now, you can’t really do that online, because the platform sits in the middle and essentially sifts all these insights for its own benefit. The retailers get some of it, but they don’t get the full value, and nor do I as the user slash customer or client. So this notion of non-rivalry also means that if I can lower my barriers, if I feel comfortable, if I trust you in a relationship, so it comes back again to this idea of trust, then I’m willing to share more data. That data becomes more valuable to both of us. And then we’re off to the races, and you can create a much better commercial set of interactions, even a true relationship, not just these one-offs we see online.
Allison Hartsoe: 19:05 But I want to pick up on this concept, where you’re talking about the platforms making the match, as opposed to where else I might place my trust. If I’m a consumer and I need to broker my data, there have been suggestions of, oh, I can put my data on the blockchain and get a dollar per transaction. Is that the right avenue for consumers to be thinking about? Or is there another way to go about trust and creating all the benefits of sharing data with limited downsides?
Richard Whitt: 19:34 Yeah, that’s a great question. So I think we’re shifting here to seeing that the web today is not just a problem; there is also an opportunity, right? There’s a lot of value to be unlocked here. Frankly, and again ironically, we all think the platform companies have it all figured out. They’re making money hand over fist, their stock is through the roof, and yet they are still basically stuck in a business model from roughly 20 years ago. And we’re all seeing the weaknesses in it. We’re all seeing that they’re not making the true human connections that could, in fact, unlock more value for everybody. So my conception of this is: we are now in the 21st century, and we need help. Back to the glial concept, we need someone we can trust, and we need that kind of support level.
Richard Whitt: 20:12 And what I’ve found compelling in my research, and what I’ve now been doing some writing and talking about, is going back to the common law of fiduciaries. The fiduciary concept goes back roughly to the Middle Ages in England, and also Europe, and it is very interesting. It’s about uneven commercial relationships, where one entity has power over somebody else. That power can come from a certain expertise, it can come from certain confidences that are shared, it could come from a certain socioeconomic status; there are a variety of ways that power can be exhibited. But the bottom line is, when you have power over somebody and there’s an attempt to create a commercial relationship, the one with power has certain duties to me. The two commonly cited duties, as a matter of law, are the duty of care and the duty of loyalty.
Richard Whitt: 20:57 And examples of fiduciaries are actually all around us in the analog, real world. We have doctors, we have lawyers, right? We have certain financial advisors. Not all of them, some of them try to escape it, but there are certain ones who in fact advertise openly: we are a fiduciary, we hold ourselves to a higher standard, right? Librarians, interestingly, treat us as patrons. They zealously guard our library records. You may remember, after 9/11 there was an attempt to get access to certain library records, and the librarians were like, no way. We have a fiduciary duty of loyalty to our patrons. When they check out a book, we are not going to tell the US government what that book is. So that’s a similar kind of example. The point is, we have these entities today in our real-world lives, and it even goes further, to things like dry cleaners, right?
Richard Whitt: 21:41 They have a certain bailment obligation to us as the custodians of our garments, things like that. When you hand your car to the valet to park at a restaurant, you trust he’s not going to drive away with it, because he has a bailment, as they call it, another common law duty. The point is, we have all these duties in the real world, and none of these duties in the digital world. We have nobody acting, as I would put it, as our digital agent online to, as you said, protect, enhance, and promote us, the very concept of having a support system for us. We have none of that. We are completely on our own. And so I believe we should be taking some of these lessons from fiduciary law and the common law and importing them into the digital space.
Richard Whitt: 22:17 And, in fact, creating a whole new breed of professionals, digital agents, who give us a duty of care, which means don’t harm me, and a duty of loyalty, which means promote my best interests without having any conflicts of interest. These are high duties, right? But this is the same kind of thing: you go to a doctor because something hurts; you go to your lawyer because you’re being sued; you go to a certain financial advisor because you’ve got some convoluted arrangement you want to make to sell something. These are all important, sensitive relationships where you’re relying on their expertise and giving them your confidences. We’re doing the same thing online; we just don’t call it that. When we’re sharing our data, whether it’s deliberately given or surreptitiously taken from us, that is sensitive information flowing to these entities. And so I feel there should be an opt-in situation where we have companies and others who want to work with us as fiduciaries, where we then get the full benefit of being a customer or a client to them, no longer just the user.
Allison Hartsoe: 23:09 Are you saying that the company can take on the mantle of that duty of care or that a third party should take on that duty of care? Like maybe a 21st century FICO?
Richard Whitt: 23:21 Yeah, it could go either way. You could have an entity decide that it wants to step up to be a fiduciary and have that direct relationship with its customer, with some sort of duty that’s agreed to contractually, or maybe as part of a professional association.
Allison Hartsoe: 23:37 Like a doctor with a hospital.
Richard Whitt: 23:37 Exactly. Yeah. Or you could have a third-party arrangement. So the company might still treat you as it normally would today, as a customer, but then there’s some other entity, some other fiduciary. The data trust is one example people point to, which is a similar idea: collectivizing my data, pooling it, and then having third parties get access to it. There’s a variety of arrangements, which I think is also really exciting. For as many different ways as you can imagine having these online connections and interactions turn into relationships, there are different examples from the common law, fiduciary law, bailment law, trust law, et cetera, that could all fit in, and fit quite nicely, because these were developed over hundreds of years in all kinds of human situations. And the notion that the online world is that much different really isn’t true, right? It’s the same power and control and access dynamics online as offline. So, no surprise, I think these common law principles work quite well when you put them in the online setting.
Allison Hartsoe: 24:29 So I want to take up one issue. We’ve talked about the power of the fiduciary having a duty of care and loyalty, but what I didn’t hear there is transparency. Is there also a requirement to be transparent about what data is known about a particular person? Particularly from a company lens: I think companies often wonder, should I tell people what I know about them, or am I opening up a mess?
Richard Whitt: 24:58 Yeah. The short answer is yes: there’s a duty of transparency, a duty of good faith, a duty of confidentiality. These are what are seen as the so-called secondary duties, because some scholars have suggested that between care and loyalty you cover the whole waterfront; there’s not much left beyond those two. Others have said, no, you should have a separate, clearly enunciated set of duties around these other aspects: being confidential, being transparent, acting in good faith. I think there are a few others. But either way you look at it, those are clearly understood as being part of that kind of relationship.
Allison Hartsoe: 25:31 And so what is your personal opinion? If I were running a company today and I said, okay, we are going to really make a play for good customer relations, and to that effect, when you log into your account, we’re going to show you everything we know about you, kind of like Facebook did after Cambridge Analytica. Is that the way companies should be trending, with or without a third party in place?
Richard Whitt: 25:54 Yeah, I think that certainly would be a step forward. Again, it comes down to what the customer or consumer or client wants. Some may not have any interest in that, right? So you don’t just say, oh, here’s everything we’ve got about you. But for those who have an interest, I think being fully transparent about the data that’s collected can be a very useful tool and also a way to start building further trust, because it all comes back to trust. Trust is a sort of endless cycle; you don’t just start with it, you have to build it over time. So opening yourself up in that way, being transparent, is a good first step. One of the nice things is that while fiduciary duties come from the law, you can be a fiduciary, you can step up to those obligations, without necessarily importing the whole legal system into it. It’s what they call a private law regime. You could do it through contract. You could do it by being part of a professional association of, let’s say, digital fiduciaries, where you all agree these are the principles you’ll abide by, and you put them on your website, and it’s all very clear.
Allison Hartsoe: 26:48 But this association doesn’t exist yet. Right?
Richard Whitt: 26:52 Not yet. I have some near-term hopes that something like that can get off the ground, because I think the time is right. I think there’s room for this. I think there’s appetite for this. If I can be involved in some small way, that would be great. But I think companies don’t need to wait on that. They don’t need to wait on an act of Congress. They don’t need to wait on their secretary of state to announce a new kind of corporation that takes these principles in. They can just start doing them now and post them to the world; let them put their users on notice. And I think those acts by themselves could really start to build some of these trust levels, where they could start talking about this mutual exchange of data that creates value.
Allison Hartsoe: 27:25 So along those lines, just yesterday, World Wide Web inventor Tim Berners-Lee announced through his startup Inrupt that an enterprise version of the Solid privacy platform is now available. It’s supposed to allow large organizations and governments to build applications that put their users in control of the data. Is that what you see as a solid step forward? Is that an example of a fiduciary?
Richard Whitt: 27:51 It’s a technology platform that very easily could include a governance mechanism that is basically a fiduciary, right? They could call it a fiduciary. They don’t do that today, but the intention behind it clearly is that this is a new form of technology platform intended to let people have much more control over their data, to keep the data localized within a localized cloud, so they have much more control and access over it. It’s actually an example of what I call edge tech: technologies built specifically for empowering people at the edge of the network. In this example, Sir Tim’s project is basically what he calls personal online data stores, or pods. So your data is sitting in this certain location nearby you.
Richard Whitt: 28:29 And so, by itself, you have more control over who can access that. But there are other forms of these kinds of edge tech. One that I’m really fond of was developed by Sandy Pentland at MIT, called OPAL, an open algorithms concept. You can almost pair it with Solid: if your data is local, what Sandy’s project would do is have the computation move to the data. Today on the web, as you know, everything is the other way around. All of our data goes to the web and, in fact, sits on the web. And we can all remember the different data breaches that have happened over the past three, four, five years. Equifax was one of the big ones, right? They were fined hundreds of millions of dollars. And my thought when I first heard about it was, this is kind of crazy. Equifax needs my data maybe an average of three times a year, when they’re running a credit report on me or something.
Richard Whitt: 29:13 They don’t need it sitting on their very open, vulnerable server farms 24/7, where it becomes what they call a honeypot for all these cybersecurity hackers to swoop in and try to get at. So why can’t we create a situation, which I think Solid plus OPAL would allow, where they move their computation to where my data is? They come through the appropriate access points; my fiduciary, or my agent, or whoever, gives them the appropriate thumbs up. They come through, they access whatever data they need to run the credit report, but then they leave the data where it is. They take back essentially the insights, the insights necessary to complete the report. And those insights don’t have to be sensitive; they can just be enough to say your score is 807 or whatever it is. So they go back with what they need to run their business.
Richard Whitt: 29:55 But the actual underlying sensitive data stays with me. So that’s an example of what I call an edge-pull technology, where I’m pulling the computation to where my data is. You run the algorithms, they go away, and I am all the more protected behind the fully encrypted wall around myself, and they have what they need to do their business. So these are the kinds of ideas around edge tech. If you combine that with a fiduciary who helps you manage it, then you have that much more power and control over your data. And again, that power and control gives you the ability to say, I’m going to now make myself vulnerable to certain entities, to share these insights, share my data with you, because I have a level of trust that it’s going to go to a good place and end up benefiting me. So the two-step dance within the GLIAnet project is the fiduciary, the governance structure of an entity willing to take on those duties, and the technology, the edge tech. In this case, personal AI is another one we can talk about if you like, where I now have much more control over my data and my digital self online.
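[To make the edge-pull pattern Richard describes concrete, here is a minimal Python sketch, under stated assumptions: the raw data stays in a personal store, a fiduciary approves who may query it and for what purpose, and only the derived insight leaves. The names used here (PersonalDataStore, Fiduciary, credit_score_query) are hypothetical illustrations, not actual Solid or OPAL APIs.]

```python
# Hypothetical sketch of "edge-pull": the computation visits the data,
# and only the insight (a score) goes back. Not a Solid or OPAL API.

class Fiduciary:
    """Stands in for the trusted agent with duties of care and loyalty."""
    def __init__(self, approved_parties):
        self.approved_parties = set(approved_parties)

    def approves(self, requester, purpose):
        # Loyalty: only parties and purposes the person has opted into.
        return requester in self.approved_parties and purpose == "credit_report"


class PersonalDataStore:
    """Holds sensitive records locally; never hands out the raw data."""
    def __init__(self, records, fiduciary):
        self._records = records        # raw data never leaves this store
        self._fiduciary = fiduciary

    def run_query(self, requester, purpose, query):
        if not self._fiduciary.approves(requester, purpose):
            raise PermissionError(f"{requester} denied for '{purpose}'")
        # The computation comes to the data; only its result goes back.
        return query(self._records)


def credit_score_query(records):
    """Toy stand-in for a credit scorer: sees records in place, returns a number."""
    on_time = sum(1 for r in records if r["paid_on_time"])
    return 300 + round(550 * on_time / len(records))


pod = PersonalDataStore(
    records=[{"paid_on_time": True}, {"paid_on_time": True},
             {"paid_on_time": False}],
    fiduciary=Fiduciary(approved_parties={"equifax"}),
)
print(pod.run_query("equifax", "credit_report", credit_score_query))  # e.g. 667
```

[The design mirrors the two-step dance in the conversation: the approval logic lives with the fiduciary, the pod keeps the raw records, and the requester walks away with only the number it needs to run its business.]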
Allison Hartsoe: 30:52 It sounds like what I’m sensing here is a bit of a critical mass challenge, where I’ve got my own company data and I’ve got my individual data, and all of these are very small nodes, discrete data points in my digital life cycle. But you only really get power out of a network effect when you’ve got a lot of critical mass, from the adoption of the edge tech, from the use of OPAL, from basically what a platform can bring. Are we seeing the evolution of the big platforms, or are we seeing perhaps an opportunity for a third party to come in and become the next Google or the next Facebook?
Richard Whitt: 31:31 Yeah, so there are some signs. In an interview last year, Mark Zuckerberg was actually asked the question about being a fiduciary, and he said, well, we think we already are acting as a fiduciary on behalf of our users. The fact that he used the term users, and the fact that nobody really believed him, I think shows that didn’t go very far. But it was interesting that, at least in his mind, that is something they have already looked at or are exploring. I think much more interesting in the near term: about a month ago, there was an article in The Economist reporting that one of the things Alphabet, the holding company for Google, is looking at as a way of moving ahead beyond its existing business model is to become a data fiduciary on behalf of its users.
Richard Whitt: 32:10 So I find that fascinating. And I think if they went about it in good faith, that could, overnight, completely change the conception of what this thing is, really elevate it in people’s minds as a truly viable way of doing business. Those are just data points; we don’t really know for sure. But I also think there’s room now for smaller players to get involved. As we talked about earlier, in some cases they’re the ones who are closer to their customers. They know them better. Perhaps they’re not at the same distance and remove that some of the larger corporations might be. So they might be the ones primed to take advantage of this new approach, where again they could get access to data that the larger platform companies may not be able to. And particularly with data protection laws like GDPR in Europe, and now CCPA and CPRA in California, the notion of owning and controlling access to someone else’s data is becoming expensive.
Richard Whitt: 32:58 It’s becoming a compliance risk. So if you can run a business where, in fact, you don’t need to touch the data, where you’re just getting the insights you need to turn around and do some good things on behalf of that customer, that’s a place the platforms right now are not equipped to go. They can certainly try to take it on, but I really think it’s much more intriguing to think about the various ways that existing retailers, online and offline, financial institutions, and ISPs, for example, could step into this breach, this opportunity, and say, we’re going to take this on and become your digital fiduciary, and also give you this really cool technology at the same time.
Allison Hartsoe: 33:32 So let’s say they did that. And I understand the transactional relationship that might exist between the fiduciary and the company trying to do more with the data, especially in that third part of your model, where the give-and-get sharing of the data provides much broader benefit. Is it possible for a company or a senior executive to go back to their board and say, we’re going to quantify the value of customer trust? This is so important, and it’s a concept I think every company would easily agree is critical to their brand, but is it possible to quantify it?
Richard Whitt: 34:07 That’s a great question. I’ve been seeing more reports come out in just the last year or two where researchers are trying to do just that: identify a metric by which you can measure trust. I look at it in two ways. One is the upside of trust: if you have more trust, what that does in terms of potentially more revenue for you, ways of extending your business model, expanding the product line. The downside is that when you don’t have the trust, it hampers your ability to do some of those things. And that gets me back to the compliance piece, I guess because I’m a lawyer by training. I think the idea of owning and controlling someone else’s data will become more and more expensive. It’s going to become a cost, between all the different compliance regimes you have to put in place and, on top of that, all the fines that come out of breaches and misuses of data. There’s going to be more and more of that, increasingly in the United States, not just in Europe. So it’s both the value of the trust that I think is worth exploring and the downside of owning and controlling someone else’s data.
Allison Hartsoe: 35:06 Risk. Yeah. And I’m sure, you know, from the legal perspective, any company, when they’re trying to size up the risks, that’s always a big, hot issue.
Richard Whitt: 35:12 Right. I think here you can actually look in both directions. There’s a benefit and a cost, and both together, I think, make the notion of trust something of value on your balance sheet, something that’s worth it.
Allison Hartsoe: 35:25 Right. So Richard, let’s say that I run a company or I’m a senior executive at a company. What should I do first? I’m interested in this concept, and I want to start exploring these ideas and seeing how they fit my company. Where should I begin?
Richard Whitt: 35:39 Well, as I said, I think there’s some good research being done out there on just this question of the value of trust. So I would endeavor to get your best people to pull together some solid reports on what that looks like, and then get your general counsel to write a company report on the risks and challenges of compliance and enforcement around data protection laws and all those kinds of things. Again, coming back to the formula you mentioned before, it’s really about exploring those two worlds. And then I think it’s about getting creative around what it means to inject these degrees of trust into the existing relationship. You mentioned one earlier, right? Becoming transparent about the data you’re collecting about somebody. I think that kind of step is an important one that instills trust and also starts to take you down the road of becoming a fiduciary.
Richard Whitt: 36:21 And this is the part that I think is the ultimate end game: to agree, either on your own terms or maybe using an ecosystem of third parties, as you mentioned, that somehow you get these fiduciary duties of care and loyalty injected into these interactions, because I think that’s the way to raise the trust. It’s also the way that these duties, as they play out over time, open up access to the data, open up access to these insights, right? As you’re saying, it’s not about the data; it’s the insights, the information, the knowledge you gather from the data that help me. So the further you can go down the road of assembling that data into meaningful terms, the better for you. And really, if you walk down the street and there’s one doctor who is licensed as a physician under the appropriate professional code, and the doctor right next door has nothing like that on the door, I know which one you’re probably more inclined to walk into. Similarly, I think over time it becomes known that these fiduciary duties and obligations and relationships create zones of trust that people are more attracted to, and the evidence that people will go there and use those services becomes yet another set of data points around the value of it.
Allison Hartsoe: 37:25 I think there’s also something hidden in your last comment, which has to do with what we didn’t talk much about: the algorithms. When you talk about insights and the fiduciary duties of care and loyalty, transparency, and other things like that, it seems to me that you’re putting an ethical lens on top of the outputs of the algorithm. A lot of times an algorithm will come up with interesting things, but people don’t always think about: should we? Can we? How do we? And then the ethical piece is like, oh, over there in the corner there’s Bob saying something we should probably consider, but let’s go.
Richard Whitt: 38:02 Right, dusting off his Aristotle’s Ethics and saying, well, maybe we should think about this. Yeah, ethics should be baked in from the outset, not at the end. That is another issue with the web: if the technology can do it, someone is going to find a way to do it, and the ethical side of it goes off to the wayside. One thing I’ll mention real fast, since you talked about algorithms: I’m a fan of the idea of a personal AI. This is an AI that resides on my device and, in fact, interacts with other AIs. So, in addition to having an entity of human beings of some sort supporting you, and maybe other kinds of technologies like data pods and identity layers, you can have an actual algorithmic, machine learning-based system trained on you that interacts with all of that stuff for you.
Richard Whitt: 38:38 And because it’s being programmed and managed for you by your fiduciary, you can trust it, right? Because it’s going to do things that benefit you. It’s not going to be the Alexa sitting in your living room waiting to listen for something so they can try to sell you what they want. It’s you walking into your living room and this AI telling Alexa, shut up for the next four hours, it’s family time, right? So it gives you more control over your digital environment. I’m excited about the prospects there, and there are things happening at universities and startups where, I think in the next three to five years, we’re going to see a rise of these kinds of digital agents. Accompanied by these governance structures and new ways of thinking about data, I think there really is the potential for a whole new evolution of the web, maybe a web 3.0, that’s staring us in the face.
Allison Hartsoe: 39:14 That’s perfect. I love that note, Richard; that’ll be our ending note. Thank you so much for joining us today and bringing together these concepts of the fiduciary and the way we should be thinking about customer data and sharing and trust, all of these great elements. I personally can’t wait to have my own personal AI to manage all these other pieces around me. Just a side note: I’m married to a person who’s very technical, and every time I have to out-architect the network just to turn the TV on, I would love to have an AI that did all that for me.
Richard Whitt: 39:48 Great. Well, the line forms behind me, but as long as your husband does, then you’re in good hands.
Allison Hartsoe: 39:54 As always, links to everything we discussed will be at ambitiondata.com/podcast. Richard, can I put a link over to GLIAnet and any other specific papers you want me to include?
Richard Whitt: 40:05 Yeah, please: www.GLIA.net, that’s G-L-I-A. And the Twitter handle is RichardSWhitt, S as in Sam, all one word. I thank you again, Allison. This was really fun.
Allison Hartsoe: 40:15 Thank you, Richard. Remember everyone. When you use your data effectively, you can build customer equity. It is not magic. It’s just a very specific journey that you can follow to get results.
Allison Hartsoe: 40:27 See you next time on the Customer Equity Accelerator.