
Season 2, Episode 10: Interview with Kirsten Martin

November 12, 2020  |  By Cindy Moehring



Our guest, Kirsten Martin, started her career in engineering and is now a Professor of Technology Ethics at the University of Notre Dame's Mendoza College of Business.

Kirsten is a nationally recognized expert in privacy, technology and corporate responsibility. She's published numerous articles on privacy and the ethics of technology in leading academic journals including the Journal of Business Ethics, the Harvard Journal of Law and Technology, the Journal of Legal Studies, Washington University Law Review, Journal of Business Research, as well as in practitioner publications such as MISQ Executive.

Kirsten serves as technology and business ethics editor for the Journal of Business Ethics. She's also a member of the advisory board of the Future of Privacy Forum, a fellow at the Business Roundtable Institute for Corporate Ethics, and a highly regarded expert speaker on areas including privacy and the ethics of big data.


Episode Transcript:

00:00 Cindy Moehring: Hi, everybody. I'm Cindy Moehring, the founder and executive chair of the Business Integrity Leadership Initiative at the Sam M. Walton College of Business, and this is The BIS, the Business Integrity School podcast. Here we talk about applying ethics, integrity and courageous leadership in business, education, and most importantly, your life today. I've had nearly 30 years of real world experience as a senior executive, so if you're looking for practical tips from a business pro who's actually been there, then this is the podcast for you. Welcome, let's get started.

[music]

00:38 Cindy Moehring: Welcome back to another episode of The Business Integrity School. We're in season two, and I am so fortunate to have with me today Professor Kirsten Martin. She is a professor of technology ethics at the University of Notre Dame's Mendoza College of Business. Hi, Kirsten, it's good to have you here.

00:54 Kirsten Martin: Hi, good to be here.

00:55 Cindy Moehring: Great. Kirsten is a nationally recognized expert in privacy, technology and corporate responsibility. She's published numerous articles on privacy and the ethics of technology in leading academic journals, including the Journal of Business Ethics, the Harvard Journal of Law and Technology, the Journal of Legal Studies, Washington University Law Review, and the Journal of Business Research, as well as in practitioner publications such as MISQ Executive. Kirsten serves as technology and business ethics editor for the Journal of Business Ethics. She's also a member of the advisory board of the Future of Privacy Forum, a fellow at the Business Roundtable Institute for Corporate Ethics, and a highly regarded expert speaker on areas including privacy and the ethics of big data. And that's gonna be what we're talking all about today. [chuckle] Kirsten also co-authored a forthcoming book with Ed Freeman and Bobby Parmar, titled The Power of And: Responsible Business Without Trade-Offs. Kirsten earned her BS in engineering from the University of Michigan, and her MBA and doctorate from the University of Virginia's Darden Graduate School of Business. Quite an accomplishment there.

02:03 Kirsten Martin: Oh, thanks.

02:05 Cindy Moehring: Many things you've been doing, and we are very fortunate to have you here today.

02:07 Kirsten Martin: Thank you very much, it's good to be here.

02:09 Cindy Moehring: You're welcome. So Kirsten, we are talking all about the future of business ethics in this season two. And in my opinion, you are really on the cutting edge of that, with your background and experience and where you've chosen to focus. So just take a few minutes and talk with us about how you ended up getting an engineering degree but then focusing on ethics and tech. How did you arrive at where you are?

02:39 Kirsten Martin: Oh, that's a good question, because it wasn't purposeful in a lot of ways. So I went to Michigan and got my Bachelor of Science in Engineering. I, like a lot of high schoolers, loved math and I loved physics, but really math, and my dad was an engineer from Michigan. We were living in Pennsylvania at the time, and so, as a good alumnus of the University of Michigan, he thought it was the only place to go, and I believed it was the best place in the world, which, of course, I passed on to my own children. He also said, "You really should think about engineering." And I knew a good friend of ours whose daughter was an engineer at Northwestern, and so it all seemed completely reasonable for me to go and study engineering at Michigan. And so I did industrial and operations engineering there. And then when I graduated, I went and did systems consulting work for Andersen Consulting. I really coded and designed systems, and then implemented large ERP systems into corporations, and I was in pharmaceuticals, and mining, and telecommunications, and all over the place. And I did that for about four years. And I always knew that I wanted to go back and get my MBA, because I knew that I didn't really want to code forever, or do those types of projects forever, although it was a great first job, 'cause I learned a ton.

04:02 Kirsten Martin: And then I went to Darden to get my MBA, and this is back in '97 to '99, and that's where I had Ed Freeman as my business ethics professor my first semester there. I took a couple more classes with him my second semester. I knew then what I wanted; I was like, "I love school." I saw my business ethics professors, really all my professors at Darden, and I thought, "You know, I could do that. That is something I could do. I could be a professor at a business school." And then I applied and got into the Darden School PhD program, but by then I was already moving; I was going up to DC. So I went up there for a couple years. I then went back to get my PhD after I worked for a couple years in telecom, and these were the fancy years of web hosting and building out big web hosting platforms, so that was what I was doing in their strategy group. And so it's funny, web hosting is just so commonplace now, but this was back in '99, 2000, 2001.

05:06 Cindy Moehring: Yeah, different then.

05:07 Kirsten Martin: It was all different, it was kind of cutting edge at the time. And then I went back and got my PhD at Darden. And when I was there, they were great. They said most PhD students have to take... So business schools, you might have already talked about this, I'm not sure, but business PhDs are a little bit different than a lot of other fields, in that the master's does not count toward the PhD. Your MBA coursework is so different from the coursework that you have to take for your PhD, which is all theoretical, whereas an MBA program is really like a trade school, where you learn accounting to be an accountant, not the theory of accounting or the theory of economics. But instead of making me take statistics, they said, "Well, you probably already have that from being an engineering undergrad. Don't worry about learning statistics again." I was able to go and take a bunch of classes in the engineering school with someone named Deb Johnson, a professor there in science and technology studies, which is really like engineering ethics, where they talk about technology and the ethics of technology.

06:10 Kirsten Martin: How do we think about technologies being value-laden and having an impact on the world, and being designed with certain attributes that will then force humans to do certain things? It was this mixing together of learning how people talk in a very nuanced way about technology, and then realizing that the business community was the one actually making that technology, and we didn't have a similarly robust discussion about the ethical implications of the technology that we were putting out into the world. We're really good on the environment, or the strategy that we're taking, or the way that we treat humans in the organization, what our relationships are like with suppliers, what our suppliers are doing and whether you're responsible for that. But we weren't really talking about the goods and services that we were making, other than the Ford Pinto case and a few others, like the Tylenol case, where there were huge blow-ups. And so that was the marrying of those two things.

07:04 Cindy Moehring: In the past, we've been exploring a Harvard Business Review article written about 25 years ago by Andrew Stark, titled "What's the Matter with Business Ethics?" He had said at the time, 25 years ago, of course, that he thought business ethics, the way it was being taught then, was too general and too philosophical and too theoretical to really be useful. You've been exposed to the whole field now within the last 25 years. Do you think that's still the case today at all, in any way?

07:39 Kirsten Martin: I guess my issue would be, I don't know how other people are teaching it. I don't have a survey of the field as to how people teach, 'cause I will say the way people teach is very different than the way people do research. And so even though I'm meeting all these people at conferences and they're very nuanced about how they talk about their scholarship, that's not to say that they actually teach in the same way. They could go in and be lecturing about Aristotle and, for all I know, never talk about an actual case. In my experience, there's a lot more thirst for cases, where you start with an actual case and make people work through the case to get to the theory. So start with the actual situation, ask the students to put themselves in that position, ask what they would do and get them to debate it.

08:24 Kirsten Martin: And then in the aftermath of that case discussion, talk about what the theories were that they were using, whether they knew it or not, and how they illustrate major arguments that will be made in business. The people that don't go by business ethics rules or the business ethics philosophies are pretty apparent, because they're on the front page of the newspaper or they're fired. There's a joke that finance gets you hired and business ethics gets you fired. And even mistakes that people make in finance or operations or accounting, unless it's some form of fraud or harm, you're usually not fired for, if it's just an accounting error or something like that. So I think that there's a lot in other fields that comes to rest on people in business ethics, if that makes sense.

09:13 Cindy Moehring: Yeah.

09:13 Kirsten Martin: So when the accountant commits fraud, we don't blame the accounting professor; we look to the Ed Freemans of the world and say, "What were you teaching him? Why didn't you teach him not to commit fraud?" I think that is one issue with the constant focus on business ethics not teaching the "right way."

09:30 Cindy Moehring: So let's talk for just a few minutes about a couple of articles that you've written. One in particular was in MISQ Executive lately, and it was titled "Ethical Issues in the Big Data Industry." In that article, you examined big data as an industry in and of itself, and the issues that arise from re-selling consumers' data to the secondary market for big data. Big data, as you explain it, combines information from diverse sources to create knowledge, make better predictions and tailor services. So let me ask you first: whenever I think about ethics and big data, the first case or issue that comes to my mind is Facebook and Cambridge Analytica and how all of that played out. So tell us a little bit more, if you wouldn't mind, about your article, talk about that case in particular, and share with us what some of your observations and recommendations in the article were going forward.

10:29 Kirsten Martin: So that article was to make two levels of arguments. The first one was, we need to start thinking about these companies that buy and trade information, from the front-facing website, to the trackers that are on the website, to the data aggregators that are sucking up from the trackers and recombining the data, and then passing it off and selling it to an ad network, who then places ads or tries to manipulate us on a website that we're later looking at.

10:57 Cindy Moehring: Right.

10:58 Kirsten Martin: We need to look at that more as a supply chain, just like we do every other piece of goods and services that is passed along and manipulated along a supply chain. And when we think about a supply chain, there's a whole framework of supply chain responsibilities and who's accountable for something that goes on in the supply chain. There's an open discussion as to whether or not someone like Nike is actually responsible for the problems in their supply chain, and whether or not they actually just turned a blind eye, or maybe even created incentives for the poor treatment of their workers. Same with Apple, more recently, with Foxconn. So we have a pretty decent understanding and thought process around supply chain ethics, and what I was asking at one level was that we need to start thinking about that with the information supply chain: we need to think about whether or not a front-facing website, take the Washington Post, should be responsible for the third-party trackers that it's allowing to be on that site to track me when I'm on it.

11:52 Kirsten Martin: And so we don't think about them being responsible for those third parties, but maybe we should start having them be responsible. So who is at the other end? Who's responsible for the ads that I'm seeing, or for my being manipulated, as with Facebook and Cambridge Analytica? Who is responsible for that interaction? And what I was trying to say is, I think those front-facing gatekeepers that keep us watching that screen should be held more responsible for the third parties they're allowing in. So that's one level of the big data industry that I was concerned about.

12:21 Cindy Moehring: Yep.

12:21 Kirsten Martin: So the other level... And you can see that playing out more and more, I have to say. There was just an article yesterday about campaigns in 2016 hyper-targeting African-Americans to suppress the vote. But that level of manipulation isn't shocking to anyone in the industry; they just know that happens all the time. They're able to place apartment ads only for a certain race or ethnicity group, even though that's against the law. But that's possible within a lot of social networks. So that information supply chain, and pushing out to the "gatekeepers," as I call them, the front-facing, consumer-facing companies, and asking them to be held responsible, is one level. At a second level, my concern with the general industry was that the companies, those data aggregators that are behind the scenes, we have no relationship with them. We don't know who they are. Unless you read an FTC report, there's no way for you to know who those third parties are. There is no consumer pressure on them to stop their bad behavior. There's no way for us to withhold our business and say, "I don't like you, Acxiom, a data aggregator, and I don't like what you're doing with our data." And that's odd, in that they're able to gather and suck up data with abandon, without any regard to whether there is a privacy expectation on that data or whether it was gathered with our consent. There's no pressure on them to make sure that's done at all.

13:55 Kirsten Martin: They can just gather up the data; in fact, there's an incentive for them to actually gather the data, 'cause they can actually make money off of it, and it doesn't really matter if it abides by our expectations or not. And I liken it to... And this takes a little bit of a jump, but I liken it to the mortgage crisis back in 2008, in that you had all these banks that had an incentive to kind of suck up mortgages without any regard to whether or not they were well documented, whether or not they were underwritten well, whether or not the person actually had income; they just needed mortgages to fill the tranches of these collateralized debt obligations and credit default swaps. And so in the beginning, there was almost a push. I went and got a mortgage, or we refinanced my mortgage, and then the mortgage holder would say, "I gotta sell this thing, who wants it?" And then they turned around and gave it to somebody.

14:47 Cindy Moehring: Right.

14:48 Kirsten Martin: And eventually the profits from creating those CDOs and CDSs became so great that there was almost a turn to a pull, to say, "We need more mortgages. Can you go out and get more mortgages? I don't care how you get them, just get those mortgages." And so we got worse and worse mortgages, with no-doc loans, no real income, people buying three or four houses. And there was this great demand for mortgage originators to actually go and get mortgages no matter what. You can almost see the same thing happen in the very beginning with tracking all these people who were going online; people would turn around and say, "I've got this data, anyone want it?" and the data aggregators were like, "Okay, yeah, sure. That looks great. I'll take that data." And so they bought the data or they put trackers on their website. And eventually that backend became so profitable that the data aggregators were like, "Look, we'll do whatever you need: become a weather app, a flashlight app, a scan-a-PDF app." It's just a front. They're just trying to gather your data, get your contacts, your photos, your GPS. All these apps are just kind of fronts to be able to gather your data however they can.

15:54 Kirsten Martin: They're not actually concerned with being able to scan a document, give you the right weather, or be a flashlight app; they're really just there to gather your data. And so, in the same way that it used to be a push, where people would just give over the data and then ask, "Does anyone want it?" Oh, these data aggregators do. In the same way, we now have almost a pull, where the data aggregators just want more and more data, more and more fine-tuned data about us, to be able to paint a broader picture, and it's actually placing a demand for more and more of our data. And that's why you see more apps and more websites gathering a wider range of our data that they don't even need. So there's no need for that data. But right now, there's no pressure on them; there's not even... At least back in the 2008 crisis, there actually were people that had to go and check. The people that check the collateralized debt obligations and credit default swaps, they were audited on a regular basis.

16:43 Kirsten Martin: Now, they didn't work very well, so I'm not saying that that was perfect, but there was at least somebody asking questions. Right now, those data aggregators can do whatever they want with that data. They can sell it to an ad network, they can sell it to a foreign country. So anyone can use it to manipulate or target us however they want. And that's the way I saw it back in 2015: it was the same type of unregulated gathering of data to be used without any market pressure, not from stockholders, not from employees and not from consumers, which are the main touch points on businesses where the market might make a correction, so to speak, for their bad behavior.

17:22 Cindy Moehring: So what do you think now, five years after writing that article, about this state of affairs? And what do you think ethically should happen with the data aggregators?

17:35 Kirsten Martin: What I argue is that the oddest issue right now is that if an economist just came down from the sky and plopped into our world, he or she would say, "How did these folks get your information if you didn't give it to them?" That's the weird thing, because back in the '60s and '70s, when different approaches to the economics of privacy and the economics of information were first being put out there, obviously the way information could travel was really, really different. They just made certain assumptions. There's a group of people from Chicago that would argue, "You know, if you disclose the information, we're just gonna assume that it's free for someone to use," because they made the assumption that it's always easier to ask the person for the information. No one's gonna buy or sell information. They'd literally say that, because it's too expensive and the machines would be too big. Like, who could ever hold that data? Nobody could. The oddest thing right now is the question: how did these companies, data aggregators, ad networks, get our data when we did not give it to them? What we need to do is add back the safeguards that every other company has in place when it shares data with another company, and say, "We need to have those companies audited."

18:49 Kirsten Martin: So I would recommend that they are actually audited with GAAP-like principles around data integrity, that they have to answer once a year or once a quarter as to what they're doing with the data: who do they buy and sell it to? How is it being used? Is it in the best interest of the individuals on which they hold data?

19:05 Cindy Moehring: Right.

19:06 Kirsten Martin: So that's one, and that would add a big cost to the data aggregators. There could be some data aggregators who just say, "I can't do that, it's too expensive because of what I do, I don't wanna explain, and so I'm just gonna go out of business." And in my mind, that's okay. If they don't wanna be audited, they should just stop. And there would have to be a government group within, say, Treasury or Commerce that would actually go and do that type of work. The FTC could do it as well. There are different theories of that.

19:32 Cindy Moehring: Right.

19:32 Kirsten Martin: So that's number one: put a bigger cost on those data aggregators. But the second is that I think those gatekeepers, the front-facing firms that I go to, like the Washington Post and the New York Times, should have to be responsible and say, "We've looked into all of our suppliers, they actually abide by these data integrity principles, and so we only allow in third parties that we've approved to get your data," and they have to affirmatively say that. And it's funny, that actually is done in other instances. So for example, hospitals are actually held responsible for the third parties that they bring into the hospital. We have a relationship with the hospital; we don't have to separately make sure that the third parties they bring in to give us medical care are kosher, are okay. We just make sure the hospital is actually in charge of that relationship, because we're there and we have to be. Universities bring in people from career services. We don't have to vet every single company; the students are held within the trust of the university, and the university vets those companies before they come in and interview. So this idea of being responsible for third parties isn't odd. It's something that we've done in other areas of life, especially when the consumer can't be held responsible for what those third parties are doing, and that customer-facing firm or company is in the best position.

20:50 Kirsten Martin: So those are two things that I recommend. One is consumer-facing companies need to be held responsible and actually affirmatively say, "I've checked out these third parties. They abide by these principles and so, you can check them out yourself, but I'm saying that they're okay and they abide by the principles that I want." And then those third parties that are behind the scenes, I think they need to have a data ethics professional that responds to audits on a regular basis. And that would make them track how they're using the data, which they don't right now.

21:17 Cindy Moehring: So let's switch gears just a little bit and talk about ethics and AI. You wrote another article about that recently as well, in MISQ Executive, about how those two live together. What you say in that article, essentially, is that artificial intelligence and algorithms are biased, just like other decisions that humans make. So the question is, as we rely more and more on technology and algorithms, what can we do to make them less biased? Because there have been a lot of examples of things going really badly wrong with algorithms, like the Uber self-driving car that crashed because it didn't recognize the human form outside of the crosswalk. You know, the Black man who was charged with a crime that he didn't commit because of faulty...

22:05 Kirsten Martin: Right, in Detroit.

22:05 Cindy Moehring: Facial recognition technology. Or the Apple credit card: who knows for sure whether or not they were making appropriate credit determinations, but there were arguments out there that they weren't, with men and women who had the same backgrounds, but the men got more credit than the women did. So what do we do? What did you have to say about that in your article, and where do we go from here?

22:27 Kirsten Martin: I think some of the mistake is... Well, I shouldn't use the word "mistake," because I use the word "mistake" in that article all the time. So part of the issue that we're dealing with right now is that we see these types of mistakes as anomalies; we're almost surprised that they are occurring. I think, just like every other human decision... 'Cause it's really just an AI-augmented decision; there are still humans that are working at the company... These are gonna make mistakes. The problem that we have as a business is that they're made faster, and they're also made so the impact is larger, and it's more likely they'll get caught. So that's part of the problem. So imagine a whole bunch of people asking for credit, taking the Apple case... Or an easier one might be whether or not you get insurance, just to pick on the insurance industry. Imagine there were a whole bunch of insurance agents, all barely coordinated.

23:20 Kirsten Martin: They have the same policy manual, but they're all spread out across the nation. If you have a few that are biased, it may be a problem, but there are also gonna be some that are not that biased, so it's gonna... Not exactly wash out, but it's not gonna be so prevalent. The problem that we have is that what this does is centralize the bias and codify it, so now everyone's on the same page as to what the "right answer" is. And that's why we see instances of this so much more: because it's more uniform. Everyone's getting the same treatment. So we should at least step back and realize that AI is not this panacea to actually get rid of bias.

23:57 Kirsten Martin: In fact, there was an HBR article that literally said, "You want less biased decisions? Use AI." And I was just like, "But that's the wrong answer," only because it makes us think the decision doesn't need governing. When we have humans, we're constantly checking to say, "Did we let the right people into the university this year? Were the right people hired? How well did we retain these people?" When it's human, we always know it's fallible, so we're constantly looking over our shoulders and thinking, "Are we making the right decisions? How can we check to make sure that we made the right decision?" So in some ways, that article was just asking for the same level of governance that we have over humans: you've got to have the same level of governance over AI, and be thoughtful about the computer program that you're putting in, to say, "Okay, mistakes are gonna happen. Which mistakes do we prefer? Do we prefer a false positive, where I hire someone that I shouldn't have? Or do I prefer a false negative, where I let someone go that I should have hired?" And that changes based on each context. Sometimes, even in hiring decisions, at the early stages you wanna let more people in, for diversity and inclusion. But at the very end, you really don't wanna hire the wrong person. So in the end, you'd better drift far toward the false negative, to say, "Look, I let the wrong person go, but I'm okay with that."

25:12 Kirsten Martin: So it just is to say that mistakes are normal, and we should expect and predict mistakes and just assume they're gonna happen. The question is more, "How do I identify a mistake when it occurs and make sure that I have an apparatus to check? How do I judge whether or not it's a good mistake, whether I prefer it to the alternative? And then finally, how can I correct the mistake so it doesn't get into this vicious feedback loop where the mistake is compounded over and over again?" Cathy O'Neil had a great book on this called Weapons of Math Destruction, M-A-T-H destruction, and a main takeaway of that book was that sometimes this AI, feeding on itself through machine learning, can actually create bigger problems than we had before, because the biased decision is feeding back in to train the algorithm, and then it goes on from there. The main takeaway was that mistakes are normal, they're expected. An unethical decision is one where those mistakes are not governed, so we have to talk about the governance of those decisions, not treat making a mistake as what's wrong. Mistakes are common and predicted, and so we just need to talk about the governance of them.

26:21 Cindy Moehring: Well, it certainly is an evolving field. And you, again, are right at the apex, I would say, of leading into the future. So if I were to ask you what three topics would describe what you see as the future of business ethics, what do you think they would be?

26:39 Kirsten Martin: I think that's a good question. So this, I will say, is in both scholarship and in practice: I think there's going to be a backlash towards more marginalized stakeholders. And what I mean by that are stakeholders that are legitimate, that have claims that are legitimate, but have almost no power and almost no relationship with the company. And so these would be people like victims of revenge porn, people who are cyberbullied, teens that were using vaping products. These are folks that really don't have a lot of power in the world, yet are disproportionately affected by the actions of business.

27:24 Kirsten Martin: And I don't think our current conversations cover them; I just did a quick search. The victims of Gamergate are a great example from back about 10 years ago, where women in the software industry were being really bullied online, with real threats going after them, and the platforms and the companies with which they worked just said, "Oh, what's to do? There's nothing really we can do about this." There were no laws on the books at the time, so they didn't really have much of a say. They had legitimate claims against the platforms and the companies that they worked for, and yet there was a hands-off approach. If you look in the business ethics scholarship, no one has ever mentioned Gamergate. It's never come up in scholarship. And so I think that's another area: there are continually people affected by technology that aren't consumers. Does that make sense?

28:16 Kirsten Martin: That aren't giving money to the technology. And we, within business ethics and business schools and in practice, have a hard time thinking about them. We don't know how to deal with it, because in the past, the only thing we could think about was: if I drove a car and it accidentally hurt someone on the street, I had to worry about that, but that was actually the driver's fault, unless I had faulty brakes and then there was a tort claim where someone sued me. That's the level at which we worried about other parties being injured. The only other time it came up was with pollution and the communities around a company.

28:49 Kirsten Martin: And we were slow on that too. We, within scholarship and within practice, were slow to be able to think about, "How do I think about the community when I don't have a financial relationship with them?" I think that's one place. That's a general area, but I do see it as a place where business and business schools and business ethics struggle with having the theories to deal with it, if that makes sense. We usually assume... Now, this isn't what Ed Freeman meant with stakeholders. Ed Freeman actually would include those people as stakeholders, 'cause they were influenced by the technology. However, a lot of people who use stakeholder theory think that there has to be some sort of financial relationship with the other party. And so we've gone away from, I would say, the original ideas around stakeholder theory in that way. So that's one: marginalized stakeholders. Although it encompasses a lot.

29:40 Cindy Moehring: It does.

29:42 Kirsten Martin: Yeah, I think automation in general, including facial recognition, is gonna be big. And I think business's relationship with the government is gonna come up. That is a big piece of concern for a lot of employees. I'm just thinking about how Palantir just got a lot of flak because they have a relationship with ICE, where they helped with the detention of immigrants coming across the border. I think we within the business ethics community, and maybe professors in general, aren't as attuned to what 22-year-olds are talking about, but when 22-year-old engineers are talking about where they wanna work, they're not mentioning Facebook and Google anymore. They're not mentioning Palantir, you know what I mean? There's almost a backlash against those groups, somewhat for their relationship with the government, and so that'll be interesting to watch. So, I would say those are three...

30:39 Cindy Moehring: Yeah, those are really good. Well, this has been a wonderful conversation. I feel like we could talk for hours, because there's just so much information here to digest and think about. So let's end on a fun question that I like to ask everybody: What are some recent shows or books or podcasts, if you've got them, that you've been watching or reading or listening to, that are fun but also have this ethical, or combined tech-and-ethics, lens to them?

31:09 Kirsten Martin: Oh yeah. Okay, so I can talk about the ethical lens. The one that I have been enjoying over quarantine is actually Grey's Anatomy. I have been going back through Grey's Anatomy. I never watched it before, but there's a group of us in my family, three of us, that all watch it together, and it is interesting to see the dilemmas that come up and how we will debate whether or not somebody did the right thing working for that hospital.

31:37 Cindy Moehring: Well, that's a good one. I have to admit, I haven't watched Grey's Anatomy, but I think I may pick it up.

31:40 Kirsten Martin: It'll suck you in.

[laughter]

31:42 Cindy Moehring: And there's lots of seasons, so it's a really good COVID one to do.

31:47 Kirsten Martin: Exactly, and they're still making them, so I'm not even caught up yet, so I have plenty of time.

31:52 Cindy Moehring: Okay, very good. Well, Kirsten, this has been fabulous, and congratulations to you for your new role there at Notre Dame. I look forward to seeing what's gonna be happening with the Tech Ethics Center there, and thanks for just joining us today. It's been a great conversation.

32:06 Kirsten Martin: Oh, thank you, I really appreciate it. Thank you for your work.

32:08 Cindy Moehring: Alright, thanks. Bye bye. Thanks for listening to today's episode of The BIS, the Business Integrity School. You can find us on YouTube, Google, SoundCloud, iTunes, or wherever you find your podcasts. Be sure to subscribe and rate us, and you can find us by searching The BIS, that's one word, T-H-E-B-I-S. Tune in next time for more practical tips from a pro.

 

Cindy Moehring is the founder and executive chair of the Business Integrity Leadership Initiative at the Sam M. Walton College of Business at the University of Arkansas. She recently retired from Walmart after 20 years, where she served as senior vice president, Global Chief Ethics Officer, and senior vice president, U.S. Chief Ethics and Compliance Officer.




