
Season 5, Episode 5: Technology Changing What It Means to Be Human with Mary Lacity

February 17, 2022  |  By Cindy Moehring


Host Cindy Moehring and University of Arkansas professor Mary Lacity chat on this new episode of the BIS podcast. Join in as the two talk about the technologies changing what it means to be human, metaverse experiences, ethics surrounding digital health statuses, and the new age of "surveillance capitalism."


Episode Transcript

Cindy M.  0:03  
Hi, everyone. I'm Cindy Moehring, the founder and executive chair of the Business Integrity Leadership Initiative at the Sam M. Walton College of Business, and this is the BIS, The Business Integrity School podcast. Here we talk about applying ethics, integrity, and courageous leadership in business, education, and most importantly, your life today. I've had nearly 30 years of real-world experience as a senior executive, so if you're looking for practical tips from a business pro who's been there, then this is the podcast for you. Welcome. Let's get started.

Cindy M.  0:41  
Hi, everybody, and welcome back for another episode of the BIS, The Business Integrity School. In this season, season five, we are discussing all things tech ethics, and it is a riveting season; I can tell you that already from the conversations we've had. And today, to add to that conversation, I'm really excited to tell you that we have our own Dr. Mary Lacity, who is a professor in the Information Systems Department at the Walton College. Hi, Mary, how are you?

Mary Lacity  1:09  
Hi, Cindy. I'm fantastic. Thank you for having me. 

Cindy M.  1:12  
Yes, well, we are very fortunate to have you here. In addition to being a professor of information systems, Mary is also on my academic advisory board for the Business Integrity Leadership Initiative, where we're doing a lot of fun work in tech ethics. Mary is not only a professor here for us; she is also one of the most prolific authors and researchers we have at the university. She has written over 30 books. Her most recent one is Blockchain Foundations for the Internet of Value, and her publications have appeared in Harvard Business Review, Sloan Management Review, MIS Quarterly Executive, IEEE Computer, Communications of the ACM, and many other outlets. So Mary, we are just really lucky to have you here. I've talked to you before about how you possibly fit all that in, and I know you keep a very structured schedule when you're writing, so that you can get that in and still do all of the teaching and the other things you do, like serve on my academic advisory board. So can you just start by telling the audience a little bit about who you are?

Mary Lacity  2:21  
Okay, 

Cindy M.  2:22  
What you're doing, and how you fit all this in? It's really amazing.

Mary Lacity  2:27  
Okay, well, I don't think I'm unusual; this is just the life of a professor. I don't think I'm any more distinguished than anyone else on our faculty. We all do research, we all do teaching, and we all do service. It's just the nature of the beast. And I know you rattled off this big, long bio, Cindy, but that's because I've been a professor since 1992. I've been around for a while, so you can accumulate books and publications over time. And why am I a professor? I do come from a family of teachers. My mother was a teacher, my father was a teacher, my older sister was a teacher. So I always say that teaching is the family business in the Lacity household.

Cindy M.  3:11  
Oh, fun. Very nice. Well, you've been some other places before you came to the University of Arkansas, too. You spent time as a visiting professor at MIT, the London School of Economics, Washington University, and Oxford. How did some of those experiences shape what you've now brought to us at the Walton College?

Mary Lacity  3:31  
I feel like I've been so fortunate and lucky. It was always research driven: I'd be working on a research topic and happen to find somebody at that university who would invite me to come for a year. The Oxford one, which is any scholar's dream, came about because I had met Professor Leslie Willcocks, and he was doing work on outsourcing in the United Kingdom while I was doing work on outsourcing in the United States. So we thought, wow, what if we got together, pooled our data, looked at this theoretically, and developed papers for practitioners? That was just one example. And then probably the most recent one, at MIT, came about because I was really interested in studying this emerging world of blockchains and whether organizations were getting ready for it. I have some longtime colleagues at the MIT Center for Information Systems Research, CISR, and they wanted to complete a project exploring how enterprises are preparing for blockchains. So I did that in 2017.

Cindy M.  3:56  
Wonderful

Mary Lacity  4:13  
And then this position, the Director of the Blockchain Center of Excellence, opened up in 2018, and I applied for it. I've been here for three and a half years, and it's been great!

Cindy M.  4:41  
A perfect fit. You brought a wealth of information and experience, and you also have this passion that I obviously share with you about ethics and integrity: how does that fit into business? And particularly, given your interests, how does that fit into tech? I've always described tech as the next greenfield, if you will, for the topic of ethics, which has been around forever: the philosophical question of what it means to be ethical, what it means to act with integrity, and then the practical business side of that as well. But with tech, which is emerging and blowing up literally every day and becoming so much a part of our lives, there's a whole host of new questions that haven't been answered, right? You can have some frameworks, you can maybe come up with some principles, and you can think about that. But the actual application of those to these new and sometimes yet unknown technologies is really where I think the fun is: figuring out how we help guide humanity as we become almost technologically dependent. You wrote a really cool article about this a couple of years ago, which I want to start by talking about, because it asks big questions, like: what do we think the balance should be between humanness and technology? What does it really mean to be human? It's an article you co-authored with a couple of other folks called "Can Computer Based Human-Likeness Endanger Humanness?" So let's just start with: what were you all exploring in that particular article?

Mary Lacity  6:18  
Okay, yeah, thank you. Delighted to talk about that paper. I wrote it with Jaana Porra and Michael Parks, two longtime friends and colleagues from the University of Houston. It really came about, a couple of years ago, as we started to see more of these agents in our homes, right? You've got these devices that talk to us, in our cars and on our phones. We're just adopting these technologies, and we need to be more reflective about it. What was really starting to irritate me is when they were pretending to be human beings, when we anthropomorphize their programming logic. Computers would say things like, "Oh, I'm so sorry for your wait!" No, you're not. You're not sorry for my wait. You don't have any human emotions. You were programmed to utter that sentence, but you don't have any empathy or feeling. So we started looking at what design principles we could come up with. I looked at something called speech act theory, and Jaana is really into the philosophy of humanness. And we basically came to this: we can program computers to interact with us, but they shouldn't pretend to have human emotions. They shouldn't pretend to have psychological states. They can't be sorry, they can't have regret, they can't have empathy. It's fine if we want these machines in our lives, but I personally don't like it when we anthropomorphize them, as if they're really showing human empathy when they're not.

Cindy M.  7:45  
Yeah, now that's a big word, anthropomorphize. What does that actually mean? Is it them acting as though they have emotions and speaking as though they interact with us like a human would, like, "I will do that for you next, as soon as you tell me to restart," or something? Is that really what it's all about?

Mary Lacity  8:04  
It's when we take anything that's non-human.

Cindy M.  8:06  
Yeah 

Mary Lacity  8:06  
And we ascribe human attributes to it.

Cindy M.  8:09  
Yeah, yeah. There's so much of that, I feel, happening with technology. The line between where humanness begins and ends and where the technology picks up really does get blurred when you start anthropomorphizing the technology. So what are some of the dangers that can come from that? What did you find in your article?

Mary Lacity  8:33  
Okay, well, first, because I'm a professor, I'd like to show both sides. What I just said is not a fact; it's an ethical position that we're taking. There are other people who will say, "Oh, we should be anthropomorphizing our technology devices." We looked at all kinds of technology that seemed crazy to us, and "crazy" is probably a value-laden word. For example: if older people are lonely, develop a robot that talks to them so they're less lonely. So there are people who will say we need to anthropomorphize our technologies, and that it can lead to good outcomes. I just feel like we need to keep having these conversations, and we shouldn't automatically do it without being more thoughtful and reflective. So I did want to express that this is my opinion.

Cindy M.  9:24  
Yeah, yeah, yeah, I totally get it.

Mary Lacity  9:26  
Okay. So I think what Jaana and Michael Parks and I are concerned about is: what is it to be a human being?

Cindy M.  9:33  
Right. 

Mary Lacity  9:33  
What is it to have an authentic relationship? What is human bonding? What's a friendship? What's a romance? When we're relying so much on technologies, I just don't know what's going to happen to our humanity. So I just want us to be more thoughtful and think about these things while we're designing them. Because what often happens, and we could talk about this with medicine or with science and technology, is that we can do it, but when do we ask "Should we do it?" and "How should we do it?"

Cindy M.  10:04  
That's right. 

Mary Lacity  10:05  
And you and I have talked about this before, Cindy, that product development teams need to have some kind of ethical

Cindy M.  10:11  
Yeah 

Mary Lacity  10:11  
aspect during development,

Cindy M.  10:13  
Right 

Mary Lacity  10:14  
And during use, right? Educating people on the use. We need to more closely marry our development and our use with our ethics. Ethics just can't come after the fact.

Cindy M.  10:23  
Right

Mary Lacity  10:23  
It's too late. 

Cindy M.  10:24  
It's too late. 

Mary Lacity  10:25  
It's too late. So that's what we're trying to argue in that paper.

Cindy M.  10:30  
Yeah. But you could almost get the sense that, where we are right now in 2021, we're looking at many instances where it's almost too late. You wonder if those questions were really asked and answered appropriately at the front end. A lot of even large tech companies, as you and I have now learned through many of our conversations, have just recently started to recognize that they needed that voice at the table during program design and development. And it needed to be a point of view that wasn't just somebody's tenth thing they were thinking about, but actually part of their job description, actually something you compensate people for, so that people understood the importance of bringing those questions to the table.

Mary Lacity  11:16  
Yeah, I'm proud of some of the companies that we're working with and how thoughtful they're being. And some of it was prompted by the unintended consequences of technology.

Cindy M.  11:26  
Yeah, 

Mary Lacity  11:26  
Right, no company wants to put out something that ends up exposing terrible bias in their technology. So they're starting to look through their innovation cycles: we need feedback loops, we need to be able to pivot, and we need to have these more thoughtful design sessions about what the ethical implications could be.

Cindy M.  11:47  
Yeah, and the "should" question isn't asked. Everyone gets very excited about what can we do, what can we do,

Mary Lacity  11:52  
 What can we do, yes. 

Cindy M.  11:53  
and how can we make this technology do more, and there's not enough stopping and pausing to think about the should. So let's talk about these digital assistants for a bit, and about the humanness of the design. What kind of problems can that create for children?

Mary Lacity  12:09  
Well, first of all, children have great imaginations. They anthropomorphize everything, right? When you see little children playing, that's what they're doing, and I think that's very healthy. But we have to be careful about how much technology we're imposing on our children. Many computer games, for example, are designed to be addictive, so when children are playing them, they're getting little happy hormones in their brains.

Cindy M.  12:33  
Yeah. 

Mary Lacity  12:33  
So parents really have to intervene on how much screen time their children have and what kinds of technologies their children are interacting with.

Cindy M.  12:40  
Yeah 

Mary Lacity  12:40  
Yeah. These are conversations we've had for a long time. But I think they're really going to become even more important, as the next levels of technologies that come out are going to be more mobile. We can talk about things like the metaverses that are coming, where it's almost like you're walking around like a Terminator: one of your eyeballs might have a screen, and that's your connection to the artificial reality. The artificial reality and the physical reality are transposed together, and it's mobile. So it's not like you're just sitting still with a little console playing a game; our whole world might become much more a combination of a physical and an augmented reality at the same time.

Cindy M.  13:27  
Yeah, so let's go there. There's a lot of conversation these days about the metaverse.

Mary Lacity  13:31  
Yes 

Cindy M.  13:31  
Some people

Mary Lacity  13:31  
Yes

Cindy M.  13:31  
I've talked to have some understanding. Others say, "I've heard that word, but I don't even really know what it means." Can you help us understand it? What is a metaverse?

Mary Lacity  13:40  
Yes, well, it's a very general term. The term has been around for a while, but it is this idea that I just described: we will create avatars, and we can live in these digital worlds. Now, we've had this for a long time; Second Life has been around for a long time. But what's different about this next version, and this is where the blockchain comes in, is that we're seeing models where, instead of a single company owning this virtual reality, it's really managed by a decentralized community. Kind of like the Bitcoin of the metaverse.

Cindy M.  14:25  
Okay. 

Mary Lacity  14:25  
So I'll give you a good example, probably the one that's most interesting: it's called Decentraland, and it runs on Ethereum, which is a public blockchain. When it was launched, there were only 90,000 plots of land. Keep in mind, this is a virtual world, right?

Cindy M.  14:39  
A virtual world, okay. There's 90,000 plots of land. 

Mary Lacity  14:41  
There's 90,000 plots of land. 

Cindy M.  14:42  
Okay 

Mary Lacity  14:42  
So early people in there used digital currencies called MANA to buy these plots, and some of them are now going for $100,000, because it's a fixed, scarce asset. There are only 90,000 plots, so now companies are thinking, "I want to be in this; I want the plot of land in the center." And if you own a plot of land, you can build whatever you want on it. You could build something constructive, like a little school, or a virtual art gallery, or you could build a pornography site. Because once you own it, if you own the private keys, you completely control it. So it is democratized, it is decentralized, and it relies on all of our new technologies, like blockchains and non-fungible tokens. And increasingly we'll have more interactive headset devices here. So it is very much early days, but that's the excitement I think you're hearing about the metaverse. And then of course there's Facebook's big announcement about the metaverse, and as a large tech company, theirs is going to be a centralized model. I'm having a hard time seeing what the

Cindy M.  15:53  
Yeah 

Mary Lacity  15:53  
business model would be for a big tech company there. We'll have to see; it seems as though there will be new business models because of the metaverse.
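To make the ownership mechanics concrete, here is a minimal Python sketch of a fixed-supply land registry in the spirit of what Lacity describes. It is an illustration only, not Decentraland's actual implementation: real parcels are ERC-721 tokens on the Ethereum blockchain, and transfers are authorized with cryptographic signatures, for which the plain secret strings below are a stand-in.

```python
# Toy model of a fixed-supply virtual land registry (illustrative only).
# Real Decentraland parcels are ERC-721 tokens; secret strings here stand
# in for the private-key signatures that authorize transfers on-chain.

TOTAL_PARCELS = 90_000  # fixed supply: the scarcity described above


class LandRegistry:
    def __init__(self) -> None:
        # parcel id -> (owner name, owner's secret "key")
        self._owners: dict[int, tuple[str, str]] = {}

    def claim(self, parcel_id: int, owner: str, key: str) -> None:
        """Assign an unclaimed parcel; supply can never exceed TOTAL_PARCELS."""
        if not 0 <= parcel_id < TOTAL_PARCELS:
            raise ValueError("no such parcel: supply is fixed at 90,000")
        if parcel_id in self._owners:
            raise ValueError("parcel already owned")
        self._owners[parcel_id] = (owner, key)

    def transfer(self, parcel_id: int, key: str,
                 new_owner: str, new_key: str) -> None:
        """Only the current key holder can move a parcel; no central
        company can override this, which is the 'decentralized' point."""
        _, current_key = self._owners[parcel_id]
        if key != current_key:
            raise PermissionError("only the key holder controls this parcel")
        self._owners[parcel_id] = (new_owner, new_key)


registry = LandRegistry()
registry.claim(42, owner="alice", key="alice-secret")
registry.transfer(42, key="alice-secret", new_owner="bob", new_key="bob-secret")
```

The two properties the sketch captures are the ones discussed above: scarcity is enforced by the protocol itself (the 90,000 cap), and control rests entirely with whoever holds a parcel's key.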

Cindy M.  16:01  
Yeah, yeah. And the Wall Street Journal actually, a couple of days ago, had a really interesting piece, "24 Hours Living in the Metaverse," where one of their young journalists went out with, well, you have to have a pair of glasses, I think it's the Oculus Quest 2. Do you have a pair of those, by the way?

Mary Lacity  16:20  
No 

Cindy M.  16:21  
For the first time, I'm thinking about it; I'll put that on my Christmas list, by the way.

Mary Lacity  16:24  
Okay 

Cindy M.  16:24  
But I'd better hurry up so that it can get here with all the shipping delays. But anyway, that was 24 hours of living in the metaverse. What I found interesting is that there are the cryptocurrency, kind of far-out, almost playful examples, and she had some of those. But then there was also: you can go into a meeting and actually sit and be in a meeting. You can do fitness classes, she was showing, through the metaverse, which makes you feel like you're really there. You can climb up Mount Everest and feel like you're really there. But what I heard this morning, which I thought was just kind of interesting too, and I hadn't thought about, is that there is some belief that our avatars will end up being smarter than we are. Because we're teaching them, put it in the business context of being in meetings, how we think and how we feel about certain things, and they will learn about us. So they'll end up being smarter, and they don't get tired, right? So essentially, your avatar could be there doing business for you, longer and arguably better than you could as a human being. Should we go there? Let's come back to the basic ethical question: not can we, but should we?

Mary Lacity  17:43  
Well, I think this is where our ethical principles have to come in, right? First of all, we get excited about these new technologies and we jump in. Cindy, even I have made mistakes in the past. One of the things I really, really regret is that I did my DNA test with a well-known company, right? And how they hook us is that everything's free upfront, or really low cost, practically free. That's why we use all these tools, right? Why we go to these browsers and type in things. We're giving away all of our data and leaving digital crumbs everywhere, because the services are free and we love the services. But we don't often thoughtfully think about the risks and the unintended consequences of all that use, where we are leaving our digital crumbs across different platforms. So I think we need to be way more thoughtful about what we do. I don't want to say it's going to be all good or all bad; we just have to make sure we understand the intended and unintended consequences of using these technologies. Right.

Cindy M.  18:56  
Right, right. And from a platform and hardware perspective, there's Apple's ability, which they've now built into their latest software update, to let you know which apps can track you and what they're doing. It puts that in front of you, as opposed to you having to find it in the privacy policy, so that you say upfront, "Yes, I want the app to be able to use my information," for marketing purposes, targeting purposes, whatever it wants to use it for, and you now have the ability to say no. But folks of our generation, let's say, seem to care about that a little bit more than, at least, my kids do. What do you think that is? Is it just back to this evolutionary difference in the way we're going to be communal as humans, this crossing of humanity and technology, or is it just that the younger generation isn't as informed yet? What do you think it is?

Mary Lacity  19:50  
I think it's the risk profile. Our risk profile changes over the course of our lifespan, right? Think about it: you're a mother, I'm a mother. The two-year-old who runs across the street without looking has no sense of a risk profile. It takes children a long time to see the negative consequences of things, and sometimes they suffer terrible negative consequences before they think about it, like being cyberbullied, right? They don't think about that until it happens to them. So I think the risk profile definitely changes over time, and it's our job as parents to protect our children. That's why I think it matters to have limitations on the amount of time a child spends on different devices and with different technologies. There's great value in going out, disconnecting, and playing in your backyard, and I don't want us to lose that humanness. I'm not saying, well, obviously I love technology, I wouldn't be a technology professor otherwise, but I love thoughtful uses of technology.

Cindy M.  20:52  
Right, right. And you're right, your risk profile changes, and probably as you get older you become a little wiser as to the ways of the world. So it's incumbent upon us, not just as parents, I would say, but as professors and instructors and those working in higher ed, to continue to try to get our younger students to understand what their risk profile is now and that it could change over time. So let's talk about one more big topic: surveillance capitalism. It's another term thrown around a lot. What does that term actually mean?

Mary Lacity  21:29  
I'm going to hold this one up; this is by the fabulous Professor Zuboff from the Harvard Business School. Now, this is not a light read, okay?

Cindy M.  21:40  
Now you've got me back for picking nice big thick books.

Mary Lacity  21:44  
She's fantastic, and she's got other more accessible outlets too. Her concern is big tech companies using our personal data and making money off of it: the monetization of our activities, and how they surveil everything that we're doing online. That's her big thesis: big tech has gotten too big, and as for our privacy, we sign those privacy agreements, but we don't control or own our data. And that's one of the reasons why I fell in love with blockchains, and why I fell in love with these kinds of decentralized models. Because in those worlds, we control our data. We own it, touch it, control it, decide who sees it, and if we want to make money off of it, then we would make money off of it. It's going to change all kinds of business models. So for example, instead of a big browser making money off of advertising: all right, I'll watch your advertising and you compensate me with some crypto. All these business models are going to eventually change. We always say the technology comes first; the changing of organizations and business models can take quite a bit of time. But that's why I'm excited about it, because it can possibly address her quite legitimate concern about surveillance capitalism.

Cindy M.  23:05  
Let's just stop there for a minute; I want to make that point even a little more explicitly. So you're right: I have heard it said for years now that we are the product; it is our data. We're entering all the information and we're just offering it for free, giving it away. But like you said, it's all free and we want the services, so we give them our data. But there's value in that; they're monetizing it. So to your point, if a company wants to monetize my data, or your data, or someone else's data, then it's like, okay, wait a second: if you want my data, you can have it, but it comes at a price, potentially, right? And then there's that bargain, that negotiation: all right, if you want my data, then you're going to have to pay for it. But do you think we could actually change the thinking that much when, right now, we've gotten so comfortable with everybody just doing it and doing it for free?

Mary Lacity  24:00  
Well, we're doing it for free, but we are getting services, right? I mean, we are getting fantastic services. At our fingertips: you type something into a browser and you're getting great information. So there is an exchange there. It isn't

Cindy M.  24:16  
like putting in your data so that they can use it.

Mary Lacity  24:19  
Right, they're just watching what you're doing; that's where the surveillance comes in. But that's why there are new models. There are browsers out there, like the Brave browser, that are based on blockchain technology and not owned by a centralized company. Again, adoption is not the same as the brand-name ones, but there are options out there.

Cindy M.  24:42  
Wow. Wow. So the question is whether or not that's going to become more and more the norm, compared to the way it is today with the browsers. Very interesting. Okay, so we can't have this conversation about technology without bringing it back to an issue that we're still dealing with today, almost two years later, which is COVID. There have been, and still are today, many ethical questions arising around how the world has chosen to deal with this and track it: to know whether or not somebody had a vaccine, and to know whether or not it's actually a valid vaccine, if you will. And you've done some work in this space around digital health passes. What is a digital health pass? How can it help answer this ethical question of: are you vaccinated or not, and is it valid?

Mary Lacity  25:42  
Okay, so let me just say that, again, ethics have been driving this, which I'm very proud of. We belong to the Good Health Pass Collaborative; there are over 150 organizations saying that if we are going to track our, quote, COVID-19 health status, and we are going to design digital apps that sit on our phones and show it, then we need to do it in an ethical way, in a way that protects your healthcare data, because we've got HIPAA compliance and those sorts of things here, and of course there's GDPR in Europe. So I am pro digital health passes, if they follow ethical guidelines. Now, what are some principles of a good digital health pass? One is that it actually pulls a valid healthcare credential from, say, your state immunization records. With some of them we've seen, it's like, "Oh, I feel fine today," and then you get a green pass. So one principle is that you can trust the data, that it comes from an authentic source. The other is that I get to control it: I decide who gets to see it, and I decide what they get to see. Do you really have to see where I got tested and which vaccine I got, or do you just need to know I don't have an active COVID virus? We call that the principle of data minimization. So that's the technology on the one hand; you also need the regulation on the other side. Where I've seen this work really well is in the city of New York, because New York City says before you can go into a restaurant or a gym, you have to prove that you don't have an active COVID virus, which could be a vaccination, a test, etc. New Yorkers walk around with the Excelsior Pass, and it's so easy because it's just on their phone. I had to walk around with multiple pieces of paper, right, since I'm not a New York resident, but they had pulled all of that from the New York State immunization records. So that's an example of what we call closing the trust diamond: the governing authority, the issuers of the credentials, the holder of the credentials, and the verifiers, like the people at the restaurant who have to verify it. That's an example of a good design.
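Here is a minimal Python sketch of the data-minimization flow described above. The function names (issue_pass, verify_pass) and the shared-secret HMAC "signature" are illustrative assumptions, not how the Excelsior Pass actually works; production health passes use verifiable-credential standards with public-key signatures. The shape of the idea is the same, though: the issuer signs a minimal claim derived from the authoritative record, and the verifier learns only a yes/no answer.

```python
# Illustrative sketch of the "data minimization" principle: the verifier
# (a restaurant door check) learns only a boolean, never the underlying
# medical record. The HMAC shared-secret "signature" is a simplification
# standing in for real public-key verifiable credentials.

import hashlib
import hmac
import json

ISSUER_KEY = b"state-health-dept-secret"  # stands in for the issuer's signing key


def issue_pass(immunization_record: dict) -> dict:
    """Issuer derives a minimal claim from the authoritative record and signs it."""
    claim = {
        "holder": immunization_record["name"],
        "covid_clear": (immunization_record["vaccinated"]
                        or immunization_record["recent_negative_test"]),
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    sig = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}


def verify_pass(health_pass: dict) -> bool:
    """Verifier checks authenticity and sees only the boolean -- not which
    vaccine, where, or when. That data never leaves the issuer."""
    payload = json.dumps(health_pass["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, health_pass["sig"])
            and health_pass["claim"]["covid_clear"])


record = {"name": "Ada", "vaccinated": True, "recent_negative_test": False}
door_check = verify_pass(issue_pass(record))  # True: entry allowed, nothing else revealed
```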

Cindy M.  27:49  
So in that example, though, maybe I didn't understand this part: you have to be a resident of New York in order to use it? You can't use it if you're from another city?

Mary Lacity  27:58  
Yes, because they're pulling the vaccination and testing records from New York State, and actually New York City has its own immunization database too. So if you're in those immunization databases, then you can use the Excelsior Pass. I don't know if it means you have to be a legal resident, but if you're in those databases, then yes, you can use the Excelsior Pass.

Cindy M.  28:25  
Right. I mean, it's so much better. Like you, I'm still walking around with my little card that, if it gets wet, is probably just going to disintegrate, or if I put it in the washer and the dryer, I'm not going to have it anymore.

Mary Lacity  28:40  
And it's also very easy to forge those things. I mean, even when I got my test, I could have walked off with a stack of them while they were just sitting there. So you can forge those very easily. But if you're pulling the official record from an official database, then you can have some confidence in the authenticity of the data.

Cindy M.  29:00  
Yeah, yeah. And I think even Walmart has now come up with a digital health pass, if you've gotten your vaccine at Walmart. Where and how that's used, I still think, is a little unclear, and it hasn't necessarily been accepted in a lot of different places. Like you're talking about with New York: they pull the records from New York, and it's valid if you're in New York. So it seems like there's a lot of room for growth here. A lot of it is just disconnected; it doesn't feel like it's widely accepted broadly around the world yet, or that there have been enough connections made among the different digital health passes that are out there.

Mary Lacity  29:37  
It really does depend on jurisdiction. In some jurisdictions you have governments that are mandating this; some states have mandated, no, you can't have them. That's why I said New York works: because of the legislation, and the fact that the government paid for this digital health pass and did the data pulls to make sure everything's good. They made it convenient and easy for citizens to use; I mean, the app is very, very easy to use. The only tricky part is you have to do what's called identity binding. You know, what's to prevent me from walking around with somebody else's phone? So when you are at a restaurant or a gym, it's not enough to just look at the digital health pass; they also ask for something like a driver's license to do the match. We call that identity binding. So we've got to do that too. Okay.
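The identity-binding step can be sketched the same way, continuing the hypothetical pass format from the example above: cryptographic verification proves the credential is authentic, while the match against a photo ID proves it belongs to the person presenting it.

```python
# Minimal sketch of "identity binding": a pass only proves the credential
# is authentic, so the verifier also matches it against a photo ID to
# confirm the phone's holder is the credential's subject. Field and
# function names are illustrative, not from any real app.

def identity_bound(health_pass_claim: dict, photo_id_name: str) -> bool:
    """True only if the authenticated pass belongs to the person standing there."""
    return health_pass_claim["holder"].casefold() == photo_id_name.casefold()


# Usage at the door: verify the pass cryptographically first, then bind identity.
claim = {"holder": "Ada", "covid_clear": True}
assert identity_bound(claim, "ADA")        # driver's license matches the pass
assert not identity_bound(claim, "Grace")  # borrowed phone is rejected
```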

Cindy M.  30:28  
Well, this has been a fascinating conversation. You and I, in particular, could probably go on for days talking about it. But before we leave, for the audience's sake: I know you showed us the one great book about surveillance capitalism, but in case somebody wants to go a little bit deeper, beyond your article or that book, do you have some other good recommendations on resources?

Mary Lacity  30:48  
I do, and I'm going to make them very friendly and accessible, Cindy. The one thing I think all of our students should watch is The Social Dilemma. It's available on Netflix, and you'll see Professor Zuboff in a little soundbite there. It will sensitize them to the digital crumbs they are leaving on all their social platforms, and it's very accessible. Then the other thing, and you're going to laugh that I recommend this: I recommend that people watch Star Trek: The Next Generation. It still holds up; we just went through the whole series again. What I love about it is that so much of technology fiction is a dystopian look at the future, and this is not. It's really saying, you know, there's no money in Star Trek, because we've evolved beyond money to where we promote human dignity and human values, and it's all about self-actualizing, and there's great equality that you see. I love the Prime Directive: don't give technologies to civilizations that haven't evolved enough to know how to handle them. So anyway, I love The Next Generation. I think it presents a world where we can see ethical uses of technology that uplift mankind.

Cindy M.  32:06  
Yeah. And that's the way it should be if we're going to be fully evolved as humans, at least in my opinion, and I think, from an ethical perspective, yours too. Those are two great suggestions, so let's leave it there: watch Star Trek if you have some time, or The Social Dilemma, either one. I think those are great; I'll pick that up over the holidays. Mary, thank you so much for this conversation and for sharing your time and your thoughts and your wisdom with the audience. I know that everybody will take a lot away from it, and it was a great conversation. Thank you. Thanks for listening to today's episode of the BIS, The Business Integrity School. You can find us on YouTube, Google, SoundCloud, iTunes, or wherever you find your podcasts. Be sure to subscribe and rate us. You can find us by searching "the BIS," that's one word, T-H-E-B-I-S, which stands for The Business Integrity School. Tune in next time for more practical tips from a pro.

Cindy Moehring is the founder and executive chair of the Business Integrity Leadership Initiative at the Sam M. Walton College of Business at the University of Arkansas. She recently retired from Walmart after 20 years, where she served as senior vice president, Global Chief Ethics Officer, and senior vice president, U.S. Chief Ethics and Compliance Officer.




