Season 5, Episode 4: The Weaponization of Information with Peter Singer

February 14, 2022 | By Cindy Moehring


Cindy Moehring chats with Peter Singer, strategist at New America, Professor of Practice at Arizona State University, and principal at Useful Fiction LLC. They cover topics such as social media as a battlespace, the weaponization of information, and strategies for entities to combat online misinformation.



Episode Transcript

Cindy M.  0:03  
Hi everyone. I'm Cindy Moehring, the Founder and Executive Chair of the Business Integrity Leadership Initiative at the Sam M. Walton College of Business, and this is The BIS, the Business Integrity School podcast. Here we talk about applying ethics, integrity and courageous leadership in business, education, and most importantly, your life today. I've had nearly 30 years of real world experience as a senior executive. So if you're looking for practical tips from a business pro who's been there, then this is the podcast for you. Welcome. Let's get started.

Hi, everybody, and welcome back to another episode of the BIS, The Business Integrity School. Today we have with us a very special guest, P.W. Singer. Hey there, how are you?

Peter Singer  0:50  
I'm doing well. Thank you for having me.

Cindy M.  0:52  
Absolutely. Let me tell you all a little bit about this incredible guest we have with us today, and then we'll dive right into our topic about all things emerging tech ethics. So P.W. Singer is a strategist at New America. That's a think tank. He's a Professor of Practice at Arizona State University, and he's a principal at a company known as Useful Fiction. He is a New York Times bestselling author who has written a number of books, and he has been described in the Wall Street Journal as the premier futurist in the national security environment. He's also been named by the Smithsonian as one of the nation's 100 leading innovators. The list goes on and on. But I think you get the idea that we're talking with somebody very special today. And today, we're going to dive into a particular topic, social media. Pete has actually written a book on that topic called LikeWar, which explores how social media has changed war, politics, and really the world we live in today. It's been named by Amazon and Foreign Affairs as a book of the year, and it was described by Booklist as required reading for everyone living in a democracy or anyone who aspires to do so. That's our audience. So I think that's a great place for us to dive into it, Pete. Why don't you just start by telling all of us how you got interested in this topic of the future of tech and how that's going to affect our world, and how that led you to where you are today?

Peter Singer  2:18  
Oh, goodness. Well, first, let me begin by thanking you for that incredibly kind introduction. I've always been drawn to, it's interesting, both future issues and in particular technology issues, and that cross of the real and, frankly, the fictional. I was the little kid that, this'll date myself, but you know, had Star Wars bedsheets. But I've also been, from that early period, a history buff, and I think it's those two things coming together. So what are the trends that break the rules of how we thought the world worked? But in turn, how are they echoes, riffs, rhymes off of what happened in the past? What can we learn from the past and apply to that new trend or technology? And so, whether it was a past book looking at robotics or LikeWar looking at social media, that sort of link has been part of it. And frankly, maybe one other element of it is that I'm a bit of a curmudgeon. I've been drawn to topics that, at the time, felt to me really important, and yet, for whatever reason, people, in particular the powers that be, weren't getting it. I'd feel like, this is so obvious, this matters, or this is so obvious that it's going to play out this way. And yet there was some sense of denial about it, often because it just didn't fit the way we thought the world worked, or because there was a kind of built-in bias around it where, for example, maybe people were invested in it, and so they could only portray it for the good. And so I'd be the curmudgeon going, well, but actually, there's this part that might be for the bad. Or on certain issues it would be, no, that doesn't matter, and I'd be like, don't you see, it really does matter, or it's going to matter even more.

Cindy M.  4:18  
Yeah, and that leads right into the whole issue about social media and LikeWar, which we'll talk about in just a second. So let's talk about your book and what it was. You wrote this book called LikeWar. I know it was written a couple of years ago, not that long ago, but it's still really relevant today, given the rise of the Facebook Files and the criticisms of Instagram and how it's affecting, you know, young girls, and really just how it's affected our lives in general, with the elections and, you know, democracy in general. So I think your book is worth revisiting in this episode of the podcast. Your book was about social media, how it's become weaponized, and what we need to know about it. So let me just stop there and ask you, why did you write that book? And, you know, two years later, back to what you said about being kind of a curmudgeon and the topic not getting enough attention, what do you think about it now? Is it getting enough attention?

Peter Singer  5:13  
The starting point for that project was actually a series of conversations that I had with a young colleague, Emerson Brooking. This is way back in the day, a period of, gosh, 2012-ish, and we're talking about this new technology space of social media. And if you recall, in terms of its consequences, it was either framed as something kind of light and airy, it didn't matter, it was just for fun, it was just for social relationships, right? I mean, look, Facebook starts as a way to rate who in your dorm was hot or not; that's the origin of all this. And then it starts to become a little bit of big money. But basically it was kind of like, this is all just light and airy. Or it was framed as something that was only for the good. So if you go back in time, you had Facebook's advertisements that now almost sound kind of dystopian. Quote, "the more you connect, the better it gets." Really, that was their tagline back then. The New York Times had an article on the democratizing power of social media, how it's going to make the world more free, make everyone more happy, more peaceful. And we were sort of asking, is that really the case? Is that what we've seen in the past with past technologies? You can find similar discourse around, like, the telegraph or the radio or whatnot. But also, hold it, there are bad guys in the world. Our project started with a look at, well, how are the bad guys using this technology too? That's what the research project was, you know, and we started looking at places like Iraq and Syria. But what happened is, the more we looked at how groups were using social media, we kept seeing this replication of similar approaches, where, for instance, ISIS's top recruiter was copycatting Taylor Swift, or, to flip it, Lady Gaga fans were copycatting what Russian military intelligence was doing against everything from Ukrainian soldiers to US elections. We saw corporations and teenagers using the very same practices online. And of course, the effect of all this was both good, the Ice Bucket Challenge, and bad, ISIS propaganda. And so it led us to this realization that social media, yes, was a communication space; yes, it was also a marketplace; but simultaneously it was a battlespace, and it was a battlespace where we were seeing what we called likewar. So if you think of cyberwar as hacking networks, which again has hit everything from governments to corporations, likewar was hacking the people on the networks by driving ideas viral through, you know, likes and shares and sometimes lies, and in the same way it was affecting everything from governments to corporations to individuals. That's what the LikeWar project was about; that's how it started. And yeah, it continues to resonate today. Whatever topic you care about, whether it's politics on the international level or domestic politics, or you care about public health, or you care about your corporation's brand, or its marketing, or what your employees are thinking and doing, all the way down to your customers to, guess what, your kids: everybody's shaped by, targeted by, and engaging in likewar, whether they know it or not.

Cindy M.  9:24  
Yeah. Okay, so I have to stop for just a minute and have you say a few more words about Taylor Swift. What tactics was ISIS learning from what Taylor Swift was doing? I know a lot of people in our audience are fans of Taylor Swift.

Peter Singer  9:39  
So I joke, and this is for anyone with a military background, and maybe some business school readers have heard of Clausewitz. Clausewitz was the Prussian officer slash military strategist who wrote this book called On War. It's sort of the foundational military strategy book; if you're in any military academy you have to read it, and a lot of business people read On War and the like. And I joke that Taylor Swift is the Clausewitz of likewar. And what's so interesting is that she literally, while she's a young teenager, writes down her strategy for how she understands this emerging area of social media and how she's going to use it to achieve her goals: to, you know, basically get her music out there, get a record contract, and ultimately become, I think she was the youngest, I want to say, self-made multi-millionaire. I mean, whether you're judging it because you like her music or you just respect her business. And she lays it out in this essay. Essentially, it lays out her understanding of everything from how you build an online following to the importance of feeding them, I'd have to go back and find the exact quote, but something like feeding them a daily diet of tiny surprises. So keeping people engaged, utilizing that following for your goals, engaging with them and lifting them up. Unfortunately, people learned from that, including some of the recruiters for ISIS. Junaid Hussain was a recruiter for ISIS who used many of the very same techniques. The flip side, and I think this is maybe a more interesting and challenging story for a lot of people working on the business side, is what the Lady Gaga fans learned from Russian military intelligence. They basically looked at what Russia had done to essentially sabotage US democracy and target elections, in the US but also in over 30 other democracies, basically by using a mix of everything from false front accounts to driving misinformation and false stories viral, putting lies out there. And as part of getting people to believe the lie, posing as trusted figures and/or trying to create movements around trusted figures. So during the election it was, you know, false information about this or that, but Russians posing as an American veteran, or a grandmother, or, you know, whatever people tend to trust more. The Lady Gaga fans were troubled when her first ever movie, A Star Is Born, was going to come out, and other corporations had the temerity to have their rival movies come out the same weekend. And so the Lady Gaga fans openly discussed online: what the Russians did, we could do that too. One rival was the movie Venom. And so they posed as everything from people who had supposedly seen this other movie and would push false stories about it, to concerned parents worried about violence in movies like this horrible movie Venom that's out there. And much like what the Russians achieved, the key was not just to drive online discussion of the target. There's a great data point that illustrates the power of likewar: over 90% of professional journalists use social media to decide what stories to cover or not, whether it's a radio producer or a newspaper reporter. So in this case, the fans got coverage of their false movements on local radio shows and the like, because people would see this discussion trending. So you would have people listening to the radio hearing about it, even though, you know, it wasn't the case. And it's the same thing the Russians were able to achieve.

Cindy M.  14:26  
It's so interesting to listen to you describe these kind of different themes coming together in different universes that folks may never really have thought were alike in many respects, right? Likewar, and you're talking about things that Russia did to, you know, affect US elections and democracy, and those same tactics being used on the business side. I mean, nobody, I think, would have thought before that those two worlds would have intersected in that way, with the same kind of tactics. And something else you said, I think, is really powerful for folks to think about, and how to use it the right way and not the wrong way, which is that it's not just what's online, right? I mean, talking about the misinformation during the 2016 election, you point out in your book that something like half of the US population ended up being reached by things on Facebook, and there were almost 288 million, I think, tweets on Twitter. But it didn't stop there. To your point, it got picked up because, like, 90% of journalists see what's trending on social media, and it just gets, you know, broader and broader in terms of distribution. So that brings me back to a question that I'm hoping you can answer first, Pete: what is it that makes some of those things go viral, and some not?

Peter Singer  15:39  
Yeah, it's a great question. And it was interesting, when we looked at information that went viral, again, whether it was something positive, and I go back, maybe just because I like the rhyme of it, to the Ice Bucket Challenge, so positive, or to ISIS propaganda, or, you know, Russian messaging, or vaccine mis- and disinformation, positive or negative, we found that consistently there was a series of attributes that the information more likely to go viral would have, and importantly, veracity was not one of them. So one of the rules of likewar is the lesson that virality trumps veracity. That doesn't mean that the truth can't go viral. It can. It's just that it's the speed, it's the reach, that gives information its power. But again, when we looked at these instances of something going viral, for the good or for the bad, from a corporate marketing campaign to disinformation, there were common elements. We break it down in the book, but one, for example, is authenticity, the feel that there is something real about it. But real doesn't mean reality; it means operating in an online authentic manner. So I would use the example of Wendy's: the online account Wendy's is not actually Wendy, right? But that account had high scores of authenticity compared to Hillary Clinton, who is a real person. A lot of it is not just about how one speaks, but the back-end process behind it. So this was, again, a parallel of the military and the business side, sort of the corporate and political side: when we looked at breaking down the messages put out there, what was the chain of events behind them? The Clinton campaign back in the day had, I want to say, something like 16 different people all weighing in on a single message. You would get lowest-common-denominator approaches. It's slow in its turnaround. So a real person, and yet the online messaging did not feel real. The US military had a very similar problem in counter-ISIS information in the early part of that, where they had, if I recall, a team of like eight colonels, almost none of whom spoke Arabic, debating back and forth on the appropriate message, at the same time that individual ISIS members are just tweeting out there about everything from their terrorism to their cats. So, like them or not, they felt real. Another element is the idea of building up an online movement around you, where you're leveraging that followership to push your message out. Another example, and this was a really good insight, there was a parallel between Russian misinformation, the Trump campaign, which did this very well, and certain corporations that do it very well, and it is to understand that every message is both a weapon and an experiment. As a weapon, a tool, it has a goal; it has to seek to accomplish something. But it's also an experiment: it sends back data about its success or failure with that individual target and everyone in the world who's like that target. And then you take that experiment, refine, experiment again, refine, and if you can rapidly iterate that, you'll be much better off. Entities that don't do that, that just kind of fire it out there and hope for success, will not succeed. There's a whole range of lessons like this. I mean, we could spend all day talking about it.
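For readers who want to see the shape of that "weapon and experiment" loop in concrete terms, here is a minimal, purely illustrative Python sketch of the measure-and-refine cycle described above. The message variants, the simulated engagement model, and the function names are all invented for this example; a real campaign would pull its metrics from a platform's analytics rather than a simulation.

```python
import random

# Hypothetical sketch of the "every message is a weapon and an experiment"
# loop: send variants, measure how each performs, keep the winner, iterate.
# The variants and the engagement model below are invented for illustration.

MESSAGE_VARIANTS = [
    "Plain factual headline",
    "Headline framed as a question?",
    "Headline with an emotional hook!",
]

def simulated_engagement(message: str) -> float:
    """Stand-in for the real click/share/like rates a platform would report."""
    base_rate = 0.02 + 0.01 * message.count("!") + 0.005 * message.count("?")
    return max(0.0, random.gauss(base_rate, 0.005))

def iterate_messages(variants, rounds=5):
    """Each round, score every variant and carry the best performer forward."""
    best = variants[0]
    for _ in range(rounds):
        scores = {v: simulated_engagement(v) for v in variants}
        best = max(scores, key=scores.get)
        # The "refine" step: in practice a person (or model) rewrites the
        # winning variant before the next round; here we just keep testing.
    return best

if __name__ == "__main__":
    print("Best-performing variant after iteration:", iterate_messages(MESSAGE_VARIANTS))
```

The point of the sketch is only the loop itself: message out, data back, refine, repeat, which is the pattern Singer says separates entities that succeed in this space from those that "fire it out there and hope for success."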

Cindy M.  19:48  
Oh, yeah, yeah. Well, I'm sure you cover that topic a little bit more in your book as well, so people can always go there to find a little bit more. But let me ask you a question. I always like to help the audience understand what they can do to improve in this space, whether you're a company or an individual. Let me start here, with digital literacy, and maybe that's not quite the right term, but being literate about what you're seeing and whether or not it's appropriate to share it further, or, you know, whether you're checking the veracity. Because, back to what you said, sometimes it's the speed, it's not necessarily what's true, and we have some responsibility as individuals, and companies do too, to not just, you know, continue to share misinformation. So what is the state of digital literacy? And what can folks and companies do to improve on that?

Peter Singer  20:41  
Yeah, so let's break it down into, overall, what can we do or should we do, and then get down into the individual part of it. Overall, whether it's the US government or your corporation working in this space: first, very obvious, but it's funny how many don't have it, you need a strategy. Many entities, whether it's the US government or corporations, have a public affairs strategy, or they have a cybersecurity strategy, you know, don't hack my networks, but they don't have a strategy for this space. And the important part of a strategy is that it lays out: what are my goals? What are my measures? How am I funding it? Who has which areas of responsibility? And what actually happens, and again, I can speak better about the US government than individual companies, is that we have a lot of activity, but it's not well coordinated, and certain elements of it are ad hoc, in response. The US government, for example, had a Russia working group that brought together the intelligence community and the FBI, and we also had an election working group. But, you know, the threat is not just Russia, and the issues of mis- and disinformation are not just about elections; they've hit pandemics too. Also, the entities that you want involved are not just the FBI or the intelligence community; you need the State Department, you need the judiciary, you need the Department of Education. So, strategy. The second part, again whether you're looking at the corporate side, is not just what's my strategy, but how do I think through my responsibilities in this space? And that is something that, to be frank, too many of the technology companies have not discussed, in terms of not just what are my responsibilities, but who do I believe I am versus who am I truly. Too many of the tech companies, the platform companies, imagine themselves as solely technology creators, as opposed to, as I said before, running not just a communication space, not just a marketplace, but, like it or not, a battlespace right now. And that brings very different sets of responsibilities. One of the most important: consistently, when companies get into trouble, and it creates bad consequences for society, it's because they waited for the bad thing to happen instead of gaming it out. How might bad guys misuse my product? And/or how might good guys and gals misuse it unintentionally? I'll give a more specific example of this. Facebook, setting aside all the sort of deliberate things: when they pushed out video streaming, they were stunned and had no process in place for when terrorists live streamed their mass killings, or teenagers live streamed their suicides. Anyone who knew anything about terrorism and teenagers could have said, hey, this is something that is going to happen, let alone might happen. What are your processes? What's your thought process, instead of being surprised by that? And we've seen more recent examples of this. Twitter had a policy that, you know, they thought was for the good, and every single expert was like, but this is what's going to happen: far-right Neo-Nazis are going to misuse it in this way. And then it happens, and it's like, oh yeah, sorry about that. So what are your roles and responsibilities? But now we get to the final thing that you brought up, which is us as individuals. And what we need is something that's been called cyber citizenship skills, and it brings together three areas.
One is digital or media literacy. This is basically understanding how this space works. What's the difference between fact and opinion? How do algorithms work, and how do they shape what you see or not? What is evidence? The second area is the civics and citizenship discourse, which is basically not what do you know, but how do you behave online. Don't be an internet troll, don't be a cyberbully; the effect that the internet has on our democracy and other people, that kind of behavioral discussion. And then there's a third area, which is threat awareness. How are malicious actors trying to target you? Traditionally, that's been a cybersecurity discourse: don't click that link. But it also needs to cover mis- and disinformation: how are they trying to manipulate you? And the key here is that you need all three of those. It can't be enough to be able to tell fact from opinion if you don't know how people are trying to manipulate you, even sometimes with facts. Or you can know fact from opinion, but if someone's being a troll, we're still in a bad place. And the challenge is that in the United States, for the most part, we don't teach those skills across the roughly 11,000 school systems we've got. Now, to end on this point, it feels very daunting. On the other hand, other democracies do teach them, and they're much better off against these issues. Estonia did not break up Facebook. Estonia did not make Russia stop attacking it. And yet, because Estonia has brought its strategy and its teaching up to this issue, it's far more resilient against these problems, and it has a healthier democracy and a healthier society. Again, whether you care about public health and pandemics, or democracy, or just your kids doing a search for who built the pyramids, we need these skills.

Cindy M.  27:06  
Yeah. Wow, how interesting. So there is so much to learn in that particular space, and, like you said, the US can actually learn by looking at some others. It sounds like we have a lot to do there, and it's not really happening today. So, Pete, you are so deep into this issue, and such an incredible expert, and I always like to leave my audience with some great resources. Your website is one of them; you've got all kinds of great resources there. But where does somebody like you go for additional information that you would recommend to the audience, if they want to go a little deeper on this particular issue?

Peter Singer  27:43  
So on the topic of likewar, I'll steer you to two locales that hopefully are useful. One, and I want to be clear about this, is actually social media itself. In no way, shape, or form am I saying stop using it, or, you know, the kind of old-school "just say no." Most of us don't have that choice, nor would we want to take it. So what we've done, myself and my co-author Emerson Brooking, over the last several years, is mark information that we thought was useful with the hashtag #likewar. It might be an article on X, Y, or Z, it might be something related to a prediction from the book that came out, it might be an interesting study. So the hashtag #likewar doesn't just take you to the book; it's things by all sorts of different researchers and the like. The second is a set of resources that was built in partnership with Cyber Florida, which is the cybersecurity education program for Florida, the National Association for Media Literacy Education, and New America, which is the think tank that I work at. We teamed up to build what is essentially the cyber citizenship portal. It's on OER Commons. The best way to find it, rather than trying to, you know, get it directly, is basically to Google "cyber citizenship New America" or "cyber citizenship Cyber Florida." What we've built there is a portal that brings together over 100 different teaching tools on this topic of mis- and disinformation. So not ones that we've built, but ones that have been built by researchers and companies everywhere from America to Austria. And we've even catalogued them, so they're marked for everything from which of these are free or not to age appropriateness, because, you know, something that's appropriate for a fourth grader is going to be different than something for a 12th grader or an adult or whatnot. So it's basically a portal that you can go into and get everything from educational games to curriculum resources on this topic. I'll put in a plug: we built this as part of trying to help the Florida education system. You might have heard there's more than one state in the United States, so it would be great to have these kinds of resources for teachers in all of our different states. And again, we're not saying this is the exact way you have to teach it; it's a series of tools that can be pulled, either by teachers or anyone else, to learn from.

Cindy M.  30:47  
Sure, sure. Yeah. I mean, cyber citizenship, I think, is what we're calling it, and it's such a new field that any resources at all are fantastic to have. So, Pete, thank you so much for your time with us today, and for sharing your wisdom and your knowledge, and for taking us into the good and the bad of social media and where we can go to learn more about our responsibilities around cyber citizenship. I appreciate it very much. Thank you.

Peter Singer  31:16  
Thanks so much for having me. All right. Bye.

Cindy M.  31:22  
Thanks for listening to today's episode of The BIS, The Business Integrity School. You can find us on YouTube, Google, SoundCloud, iTunes or wherever you find your podcasts. Be sure to subscribe and rate us and you can find us by searching TheBIS. That's one word, t h e b i s, which stands for the Business Integrity School. Tune in next time for more practical tips from a pro.

Cindy Moehring is the founder and executive chair of the Business Integrity Leadership Initiative at the Sam M. Walton College of Business at the University of Arkansas. She recently retired from Walmart after 20 years, where she served as senior vice president, Global Chief Ethics Officer, and senior vice president, U.S. Chief Ethics and Compliance Officer.