
The Sam M. Walton College of Business

Season 5, Episode 1: Ethical Agility & Technology Perspectives with Tom Creely

January 20, 2022  |  By Cindy Moehring


In the first episode of Season 5 of The BIS, Cindy visits with Thomas E. Creely, director of the Ethics & Emerging Military Technology Graduate Certificate Program at the US Naval War College. Join their conversation to hear about Creely's journey from Navy chaplain to leading tech ethics education, the need to update technology laws, and the differing perspectives on technology's applications.



Episode Transcript

Cindy M.  0:03  
Hi, everyone. I'm Cindy Moehring, the founder and executive chair of the Business Integrity Leadership Initiative at the Sam M. Walton College of Business, and this is The BIS, the Business Integrity School podcast. Here we talk about applying ethics, integrity and courageous leadership in business, education, and, most importantly, your life today. I've had nearly 30 years of real-world experience as a senior executive, so if you're looking for practical tips from a business pro who's been there, then this is the podcast for you. Welcome. Let's get started.

Hi, everybody, and welcome back to another episode of The BIS. The topic for this season is all things tech ethics related, and boy, do we have a great lineup for you this season. To start with, my first guest today is Tom Creely. Hi, Tom. How are you?

Tom Creely  0:58  
Hello, Cindy. How are you?

Cindy M.  1:00  
I'm good. It's great to see you. Let me just tell you all, although this will be published later, we're recording this on Veterans Day, 11/11/21, and there couldn't be a more perfect day than today to record this session on tech ethics with Tom. Tom is actually a retired Navy chaplain, so Tom, Happy Veterans Day to you, and thank you for your service. In his Navy career, Tom was at sea, he did some shore tours, he was overseas, and he spent time with the Marine Corps as well, so he has very varied experience there. Now, on this end of his career, Tom is the creator and director of the Ethics and Emerging Military Technology Graduate Certificate Program at the US Naval War College. In this role, Tom works with the Joint Artificial Intelligence Center, the Defense Innovation Board, the Cyberspace Solarium Commission, and the National Security Commission on Artificial Intelligence, among many other things. Tom Creely was also an ethics adviser to DARPA's Project Cornucopia. He's also an affiliate of the Jahn Research Group at the University of Wisconsin-Madison, where he facilitated the creation of FoodSourceUSA.com in response to food waste during COVID-19, which was super important. So we are just very fortunate to have you here with us today, Tom, and like I said, there couldn't be a better day to have this discussion with you. I wondered if you might just start by sharing with our audience a little bit about your background and how you got from being a Navy chaplain to now leading a graduate certificate program in tech ethics at the US Naval War College.

Tom Creely  2:52  
Okay, yeah. Back in the late 90s I got into a Ph.D. program in the humanities, focused on the ethics of technology. I've always had an interest in technology, even though I'm a philosopher by trade. And I will say it was more of a calling to go in that direction, because the ethics of technology is a totally new discipline that has really come up in recent decades. I guess you could say it started in the 1950s, after World War II, as a result of the impact of technology during the conflict and the military-industrial complex. I started studying and researching, and it just became an addiction for me, and I continued it. I finished my PhD, which was on internet privacy for children at that time. Then I started teaching at the University of South Carolina, then Georgia State and Kennesaw State, but no one really wanted to hear about the ethics of technology. Any time I mentioned the words "ethics of technology," they would look at me like, what planet did you step off of? And so I finally got to the War College and was invited to come up and be part of the ethics program there. I worked with the provost, who was a scientist, and he liked the idea, the concept, so he said, go for it. And so I did, even though I got some opposition from some people there, because they were not familiar with this whole concept. You know, I started the program about six years ago, and in the last three years it has just exploded, with our environment and a number of things that you mentioned, like the artificial intelligence commission, which came out with its final report to the President and Congress back in the spring. And it has expanded beyond government, working now with outside organizations, and certainly, you know, universities like the University of Arkansas and others wanting to know about and also promoting this whole concept of the ethics of technology.

So it's been fun. It's been a lot of work. And so we established the Ethics and Emerging Military Technology graduate certificate, and as of last week we've had 39 people complete the program. It is competitive, and it is above and beyond their master's degree at the Naval War College. The topics they research as well as discuss in the program are artificial intelligence, nano, neuro, bio, social media, data, you name it, anything that can be weaponized. And we also look at it from a humanitarian point of view, at how technologies can help in assisting humanitarian efforts. We have a whole humanitarian section at the Naval War College.

Cindy M.  6:11  
Yeah, and one thing I think we probably ought to make clear for the audience is that you don't have to be in the Navy to attend the Naval War College. Your program is open to mid-level and senior-level diplomats in the federal government and from the intelligence branches, so there are a number of people. Now, I would imagine demand is just completely outstripping supply on this topic for you.

Tom Creely  6:39  
You know, the biggest problem we have is bandwidth to take care of expanding the program, and so that's why it's selective and competitive.

Cindy M.  6:47  
Absolutely. Yeah. Well, let's dive into this topic, and I want to start with a definition of technology from a chapter in a book written by a guy named Ian Barbour, and I'll let you tell everybody who that is. The definition of technology, as he describes it, is the application of organized knowledge to practical tasks by ordered systems of people and machines. I think that last part is really super important. So that's his definition of technology, and then he goes on from there to really spell out three different views of technology. I'm really interested, Tom, in what your view is here, and whether or not you agree with Barbour, but let me just lay out the three views that he discusses in this chapter. One is that tech is seen by some people as a liberator: a source of higher living standards, arguably improved health, better communications, all of those things. The second position is that tech is a threat, and people of that persuasion believe it leads to alienation from nature, environmental destruction, the mechanization of human life, and loss of human freedom. His third view is that some people see tech as an instrument of power: technology is actually ambiguous, and its impacts vary according to the social context in which it is designed and used, because it's both a product and a source of economic and political power. Now, not to give away the punchline, and you can absolutely disagree with him, but Barbour would say that technology is that last one, an instrument of power. But I'm interested in your views. What do you think? Is technology a liberator, a threat, or an instrument of power?

Tom Creely  8:44  
I think it is all three of those. As a liberator, we certainly have benefited from the development of medical technology to cure diseases, treat illnesses and perform surgeries, and it has certainly been important in cleaning our environment. Also, take for instance the internet, which we're working through right now: it has given us the ability to communicate and have this program, as well as connect with people around the world. That has given tremendous freedom to a lot of people who have been bound by their particular geographic areas and lack of communications. And the internet is pretty much free, in a sense, if you look at it that way, so it has been a great equalizer across the globe, giving people a voice, and therefore it does provide liberation for people in many different ways, as I have just described. And certainly, as you know, for professors and people running an ethics center like yourself, being able to research, there's power in being able to write and think about the emerging topics that we have to deal with in technology as well. The second part is a threat. It's a threat in the sense that this is so much distributed (and this goes back also to power, distributed power) that people all across have access to it; people are empowered by the technology that we see, particularly with the internet. But you also have threats from the technology that people are using, in artificial intelligence for example. We look at technology as neither good nor evil in itself; it depends on how the user uses it, for good or for nefarious purposes. Certainly, artificial intelligence is important. For instance, the Army has 55 million pathology slides that they're digitizing in order to have disease prediction, what led to diseases and how to treat them. That's good for society.

At the same time, artificial intelligence can be used nefariously, in order to hack into systems. For instance, there was the shutting down of the Colonial Pipeline here, which was quite dangerous and came within just a few hours of shutting the whole East Coast down. So that's also indicative of power, but it's the threat. And we continuously see other hacking occurring in businesses and organizations on a daily basis; our own personal bank accounts are at risk as a threat from technology.

Cindy M.  12:05  
So can you explain that one for the audience for just a minute? Our own bank accounts can be threatened; technology can be a threat to our bank accounts. Is it because of the hacking?

Tom Creely  12:14  
Oh, absolutely. Yeah. I mean, I'm a perfect example. My wife and I, our account has been hacked numerous times, and we've had to change our account numbers numerous times, and even had money stolen. I mean, they are so good at the hacking that you have to stay on top of your personal business on a daily basis, and maybe sometimes an hourly basis, in order to protect yourself. You know, another threat, as we have all witnessed with COVID-19, is the biological issue that we've had to deal with, and look how that has changed our lives and affected the economy, the supply chain, and many other factors. Those threats will continue to be there. And then it's a power: the power of people to use its distributive power across the globe, and for other people to take advantage of it. Look at the different groups of people through social media who are protesting, and it's causing social disruption in our country. And also, guess what, our adversaries are seizing that power as well, to use against us through these disruptive technologies, in what we call hybrid conflict. They are playing some of these different ideological groups and political groups against one another. So at the center you have the United States, with its socioeconomic life moving and going on through the disruption, but yet on the outside you have these players that are interjecting their viewpoints, manipulating, and working these opportunities.

Cindy M.  14:10  
Yeah, yeah. War in the future is going to look very different than, I think, people's images of soldiers on the front lines, you know, the Civil War, World War One, World War Two. The world has advanced, and so has the way countries fight for power and dominance, and technology can be weaponized in that way. So that brings me back to the definition of technology, that last phrase, where it talks about technology as the application of organized knowledge to practical tasks by ordered systems of people and machines. I think that "of people and machines" is really where you start to get into some of these tech ethics issues. So I'm just gonna throw out the question: what do you think it means to be human, moral, ethical in the age of technology? That's a big question, I know.

Tom Creely  15:10  
Big question, and we could talk about it for the rest of this episode. What does it mean to be human in an age of technology? It's something we have to continuously consider every day that we do research, the mind-bending research in the Ethics and Emerging Military Technology program. Technology and humanity are inextricably linked, completely combined, and we cannot separate ourselves from it. I don't know many people who suffer from technophobia, the fear of technology. Almost everyone has some kind of device, whether a phone, tablet or computer, even if you're controlled by a government that has a lot of technology. Most of us are lovers of technology; we have technophilia. And the technophilia is that we embrace the technology without examining and questioning the ethical implications of that particular technology.

Cindy M.  16:17  
That's true, right? Especially the younger generations; they just seem so anxious to embrace it without really a lot of consideration. So talk about that. Did you call it technophilia? Where we just have this love...

Tom Creely  16:29  
...of technology, the love of technology.

Cindy M.  16:30  
The love of technology.

Tom Creely  16:33  
Now, I will say, to back it up, the professions of engineering and medicine have had medical ethics, bioethics and engineering ethics in place for a long time. But when you look at the issues on social media platforms, or the metaverse we talked about earlier, those have come to fruition and into use with very little question or examination or reflection on the ethical implications. What harm might it cause?

Cindy M.  17:04  
Right, right.

Tom Creely  17:08  
Yeah, so you know, how quickly do we run out and get a new phone or a new tablet as soon as it comes out?

Cindy M.  17:16  
Mm hmm. With the latest and greatest features, right? Yeah. So one thing that comes to my mind when you mention medical is how technology can be used for good in medicine, but even in the medical field it could be used with ill intent as well. What about biohacking? What is that? Can you explain it, and how might it be used inappropriately?

Tom Creely  17:42  
Yeah, biohacking in a very simple form is changing your diet and changing your exercise regimen in order to have better health. It is human enhancement, human improvement. I have a friend who has cancer, and his do-it-yourself biohacking is working quite well for him, and that's a positive way to biohack. But I think the greater problems are occurring with other people who buy CRISPR gene technology kits off the shelf, off the internet, for a few hundred dollars, and they set up experimental genetic splicing and hacking with these kits and inject these genetically modified genes into their body. Oh my, what can they create from that? And there are certainly questions about the whole issue with, you know, the COVID-19 thing that we've experienced as well. There's certainly a threat there, and I recommend people look at the film that came out about a year ago called Citizen Bio, and I believe it's on YouTube. It showed just regular people who have these kits doing genetic engineering in their basement, in their garage, at home, with no sense of direction on ethics and morality. You know, I've talked to some people about this. About four years ago, a scientist who was really into this said, well, they really don't have that technical expertise; they're not that advanced, to be able to create something that could be nefarious, that could be harmful. Yet most recently what I have read is that that has progressed and people's skills are getting better, and there is a concern about the risks being created by people who do not have the skills, the accountability or the ethics to use such genetic engineering.

Cindy M.  20:08  
Right. And I would imagine nation-state actors may also be using that for nefarious purposes now, without really considering the ethics of it. Nation states as well as non-state actors. Yeah, you're right, both. Which creates huge national security implications for the country. So, you know, Facebook has been in the news so much lately, as have a lot of the other big tech companies. In fact, when one of their former directors of monetization, Tim Kendall, testified before Congress last year, in 2020, he said that Facebook very quickly figured out that extreme and incendiary content translated to unprecedented engagement and profits. He also noted that internally they were aware that the social media services that he and others had built had torn people apart with alarming speed and intensity. So when I think about that and put your lens, an overlay, on top of it in terms of national security, and we talked a little bit about how social media is being used not only to tear us apart from the inside but also by those outside who are tearing us apart: what do you think are some of the overarching moral and ethical impacts of social media being used in that way?

Tom Creely  21:35  
Well, first of all, I look at the user and the information that is being collected from each of us. Someone asked Jaron Lanier a few years ago, what are you going to give us? And he responded, we're not giving you anything; you're the product, you are what we sell. And so every keystroke is recorded and analyzed and stored as data, and they develop their algorithms based on what we see online and what we type into social media, and therefore they target us. This all has a psychological aspect behind it. There is particular concern for young people, for children who have addictions to the internet, and as adults, you know, we have that propensity to pick up the iPhone to see what that ring was. And so this is a power (going back to Ian Barbour's comments) that is working behind the scenes, of which we have no knowledge, and it is impacting and influencing people of all ages. That is a real concern. You know, lately young girls and women have been affected by the whole self-image issue on TikTok, and it has psychological effects. So that is a concern. And then, at the local level, how does that affect families? How does it affect relationships with one another? It is power. I know Tristan Harris, the social media ethicist featured in The Social Dilemma (a very good film to watch), talks about how it's specifically designed to be addictive. Now, I would also suggest people look at the Wall Street Journal's Facebook Files, which have been going on for about four or five months now. It is an absolutely astonishing revelation of what they're pulling out: supporting human trafficking, supporting dictators. And yet they also have the issue of censoring certain people here in the United States, or language that they deem inappropriate.

I can give a good example of that from my Ethics in the Military: A Multidisciplinary Approach class. I use Triumph of the Will, a propaganda film produced by Leni Riefenstahl. She was probably the most prominent propagandist in modern history at the time of World War Two and the Nazi regime. It's a powerful film, and it shows Hitler in a messianic depiction. YouTube has banned it, so I cannot access that film in order to show it in my class for discussion.

Cindy M.  25:05  
Wow. Even for educational purposes?

Tom Creely  25:08  
It's no longer on YouTube. So, you know, one of the problems I have with this social media is that we have this whole free playground that everybody can play on, but yet now we have people who are beginning to censor what is said and what is placed upon social media. The question I have is: what ethical frameworks or values are the people making these judgments viewing all of this through? Is it ideological, or is it truly based upon ethics: deontology, utilitarianism, virtue, justice and rights? And so there may be quite a large gap there in people's abilities, skills and knowledge in making that kind of judgment.

Cindy M.  26:00  
Right. And to me, the lenses you just described, the different ethical frameworks, are things that cut across, if you will, government, the nonprofit sector, and the business sector. Everybody needs to be pulling in the same direction, preferably with some high-level general framework that is generally agreeable. Again, going back to Barbour, I do think it may depend: technology is somewhat agnostic, and it may depend on the particular use and the particular situation. But the point is stopping to think about it. Back to what you said, we have technophilia: everybody's really addicted to it, excited to have it, but how many people are really stopping to think at the beginning, at the creation of it, the testing of it, the use of it, and injecting true ethical considerations? I do know some companies now have formally given that title and position to individuals and are compensating them for it. Salesforce is one I know; they have some tech ethicists on board. Microsoft is another one that I know does that. But that, again, is very new. It's happened in the last three years, kind of when you said there was this explosion. And I think about how long tech has been in our lives and how advanced it was up until three years ago. When the horse is already out of the barn, it's a little hard to put it back in. So, you know, even Facebook started with really good intent, and I'm not saying they don't necessarily have a good purpose and intent, but when something grows so fast, without the proper processes, procedures, controls and monitoring in place, it can get away from us. Technology is a great example of how it can just get beyond where we are, and it's hard to put it back.

And it's mind-boggling to me that the whole tech ethics discussion wasn't really on anyone's mind until about three years ago.

Tom Creely  28:02  
In fact, we do not have a gap between our ethics and technology. Instead, we have an abyss between our ethics and the exponentially increasing innovation of technology, and that abyss is getting wider and wider. That's why people say, you know what, we need to make laws, we need to look at Section 230. We need to change Section 230, and that needs to be looked at.

Cindy M.  28:28  
What is Section 230, just for the audience?

Tom Creely  28:31  
Section 230 is, I guess you could say, a special dispensation for big tech companies to do whatever they want. They're not considered communications companies, where they would have accountability, so they have free rein. That definitely needs to be changed in order to rein in some of this power that they have. But it's going to take a concerted effort of ethics education, continuous education, to be able to grapple with and do the mind-bending work on these ethical complexities, like my students in the EEMT program do. As I say in my seminars, you know, technology plus politicians equals disaster.

Cindy M.  29:22  
Because laws and regulations, that's a process that's designed to be a little bit slower, very deliberative, very thoughtful. But meanwhile you've got technology, which is changing basically at the speed of light. So this abyss that you talk about, what I hear you saying is that it really can't be filled by laws and regulations, which will always lag. The only way to bridge, if you will, across that divide is with ethical considerations in business, the military and government: anybody who's developing or using the technology proactively filling that void and hopefully being that bridge.

Tom Creely  30:03  
Exactly. And that requires a lot of work. Let's just take artificial intelligence. Looking at the ethics of that requires constant testing and evaluation in order to keep up with the biases it may produce, intentional or unintentional. You don't know exactly what's in that black box: the AI did something, but we don't know how it did it.

Cindy M.  30:32  
Right. And deep neural learning is more than the human brain can even comprehend.

Tom Creely  30:40  
Absolutely. You think, is that ethical? Well, it's where we are, and we have to deal with where we are, because we can't, you know, take the horse back to the barn now. Artificial intelligence is proliferating; there's constantly something new every day that we see. And this is where I say you need ethical agility. Maybe you see technology strictly as a power, or a liberator, or a threat, or you see it from a deontological view, or utilitarian, or virtue, or justice or rights. You have to have the agility to move between these different ethical frameworks in order to examine it. We have a tendency to be in one category (I'm pretty much a virtue ethicist), but we have to train people, teach people, to be able to take those different views and examine a technology through these different lenses, because it's terribly complex, right?

Cindy M.  31:51  
Terribly complex. I mean, a simple way to explain that would be self-driving cars, right? If you take the utilitarian view, you go for the greater good. So if the question is kill one or kill five, the car would make the decision to kill the one. But is that the right decision? If you don't think it through in each of those contexts and arrive at a solution, then it's not going to be a good one, right?

Tom Creely  32:14  
At the same time, AI is going to be used as a decision-making assist, so that it can help make those decisions with what we call ethical AI. My good friend Wendell Wallach, who is an expert on AI ethics, thinks that the AI of the future will be more moral than human beings.

Cindy M.  32:37  
Really? Explain that one.

Tom Creely  32:39  
Well, because we are driven by emotions, and I think we can see an example in the internet and social media emotionally hijacking people, because people don't think; they are not discriminating about what they see, read or believe. Therefore emotivism becomes the driver. Emotivism is an ethic of emotions, an overriding emphasis of emotion over our cognitive abilities to rationally think about things. And because we allow that emotion to come into a situation, it may be more ethical for the machine to make decisions. This will also be used in military operations, because we're competing with the Chinese in the race for AI, along with Russia, and as you know, Putin said that whoever wins at AI will conquer the world. It's definitely a race. So it becomes a matter of necessity that we be competitive with other countries and state actors, as well as non-state actors.

Cindy M.  33:56  
Yeah, absolutely. The work you're doing is vitally important to our national security, and I can't thank you enough for spreading the education to everyone who's able to participate in your program; I wish you just had more room for it. But you know, one of the things that you and I have talked about before, and you sent over, is this timeline of the history of technology: when it kind of started and where you can see the advances over decades and decades. But that cycle is getting faster in terms of the technologies, and they're becoming more disruptive, I would say. So in your class, do you model out and talk with the students about the potential disruptive technologies of the future, and which ones that you've looked at worry you the most?

Tom Creely  34:51  
Well, certainly biology, the biological issues that are arising. And one of my students (we just picked up a whole new cohort) has raised some questions for his paper on blockchain technology and cryptocurrency. He raised questions in his proposal, which I read last week, that no one has researched at all. All of a sudden you think, okay, these are the areas you really need to be concerned about, and now this blockchain technology and cryptocurrency comes up, because there is now global competition to have a dominant cryptocurrency. The problem with some of his questions is finding another expert who can help answer them and explore the topic, because this will definitely be good input for national security policy papers when complete, and he is with the military. But he's a way-out-there thinker. As I say, we do a futurizing exercise: how do we look at the future of the disruptive, the absurd, the black swans, the things outside the normal patterns of trends?

Cindy M.  36:23  
Right, well, right. I mean, those are what are going to get us, right? We have to kind of...

Tom Creely  36:27  
...get in the mind of an adversary and see what those risks are.

Cindy M.  36:32  
Mm hmm, and model that out. Wow, interesting. Well, Tom, this has been just a fascinating conversation. I do think we could talk about this for at least a week and probably never stop, but I think we'll have to leave it there. Let me ask you one last question for audience members who may want to go a little deeper on the topic. In addition to the good documentaries and movies you've already mentioned, are there any other resources, any books you might want to recommend that you've read recently, or maybe a good podcast series on the topic, or a good paper, anything you want to call out?

Tom Creely  37:07  
Well, P.W. Singer's LikeWar and Burn-In are very good. LikeWar is about the power of social media and its threat to the United States and to the globe, because social media has become, in a sense, weaponized and has global power. And Burn-In is his novel about robotics that is futuristic, but yet it is today, a little bit of both, and it is a gripping novel that gives you an insight into where we are and where we're headed. Another one I would recommend is Yuval Noah Harari's Homo Deus, "man to god." In the latter part he predicts a history of the future, and he says we're moving from surveillance to dataveillance, and the new religion is Dataism. Dataism means that only data matters. Your personality, your spirit, doesn't matter; it is only what is analyzed, assimilated and put together that gives a picture of you, in the new religion of Dataism. It is quite interesting and enlightening, if you want to look at what he has written. He is an Israeli professor and an advisor on strategy to the Israeli military services.

Cindy M.  38:34  
Wow, those sound like great resources. And we'll have to leave it there and ponder this question about Dataism, and how that intersects with what it means to be human in the age of technology.

Tom Creely  38:48  
Well, basically, you lose your humanity, because you design your own destiny by what you type into your keyboard, by all the data that's collected on you through government, through medical systems and any other data collection. You know, it begs the question: what data needs to be collected, and what should not be?

Cindy M.  39:12  
Right. Yeah, it sure does. Wow, fascinating conversation, big questions, not ones that we can solve today. But it is a continuous journey of education, so I appreciate you being on that journey with us, quite frankly leading the way, and participating in this podcast series with me today. And thank you again for your service on this fantastic Veterans Day.

Tom Creely  39:37  
Yes, thank you, Cindy. I appreciate it very much. It's been a pleasure to be here.

Cindy M.  39:41  
Awesome. Thanks, Tom. Thanks for listening to today's episode of The BIS, the Business Integrity School. You can find us on YouTube, Google, SoundCloud, iTunes or wherever you find your podcasts. Be sure to subscribe and rate us. You can find us by searching "the BIS," that's one word, T-H-E-B-I-S, which stands for the Business Integrity School. Tune in next time for more practical tips from a pro.

Cindy Moehring is the founder and executive chair of the Business Integrity Leadership Initiative at the Sam M. Walton College of Business at the University of Arkansas. She recently retired from Walmart after 20 years, where she served as senior vice president, Global Chief Ethics Officer, and senior vice president, U.S. Chief Ethics and Compliance Officer.




