#20 Foreign Influence Operations with Natalie Thompson

As US election drama reaches a climax, the Nerds tackle headlines around foreign election interference. Joining us is special guest Natalie Thompson from the Carnegie Endowment’s Partnership for Countering Influence Operations. Join us for a discussion of what influence operations are, what states are using them, and how to stop them.

Click here to read Natalie Thompson’s article on Countering Influence Operations

HUNTER:

Welcome back to Geopolitics Rundown. There’s less than a week to go before the U.S. presidential election, and the whole world is watching with bated breath to see who will come out on top. Some countries, though, have taken to more than just watching and are actively attempting to sway the election one way or the other through a variety of technological and social means of nebulous legality.

The topic of today’s show is influence operations, a field of increasing relevance in our day to day lives. As information becomes easier to access, the truth becomes harder to find. Here to help us understand influence operations is our amazing guest, Natalie Thompson.

Natalie Thompson is a research assistant with the Technology and International Affairs Program at the Carnegie Endowment for International Peace. In her role, she supports Carnegie's Partnership for Countering Influence Operations and conducts research and analysis on a variety of topics, ranging from social media content moderation standards to models for collaboration between technology companies and academic researchers. Natalie also holds a bachelor's degree in mathematics and political science from Kalamazoo College. Thank you for coming on the show today, Natalie.

NATALIE:

Thank you so much for having me. It’s a pleasure.

HUNTER:

So to kick things off, why don't you tell our listeners who aren't familiar: what are influence operations, and how do they work?

NATALIE:

Sure, happy to. We define influence operations as organized attempts to achieve a specific effect among a target audience. They can be organized by a variety of different actors. When we think about the election interference context, we're talking about nation states all the time, but civil society organizations, or even commercial entities like advertisers that want people to buy products, engage in influence operations too. The idea is that influence is neither inherently good nor bad; it's up to us to define the contours of acceptable influence. And people can use a variety of different tactics, techniques, and procedures when they attempt to change somebody's attitudes or behavior. These techniques can range from overt advertising to covert propaganda or the covert financing of an entity, and they can include things like strategic hack-and-leak operations. There's a wide variety of tactics that can be used, but the general idea is that you have an entity or an individual trying to achieve a specific effect with a target audience.

HUNTER:

So on the state sponsored side, which countries are pushing the envelope and developing new tactics?

NATALIE:

Oh, I don't really know about new tactics, but I'll say that generally we talk about three states very often in this space: Russia, Iran, and China are the big three that are pretty active. So I'll take each of them in turn.

Russia is very active and has been the subject of a lot of conversation since 2016 as a result of their efforts to interfere in the U.S. presidential election at that time. What the Russians are trying to achieve is a broader goal of undermining U.S. foreign policy and security by amplifying existing tensions within the country and divisions within our domestic population, essentially to create chaos and sow discord. They use a variety of different techniques to do so, including laundering false or misleading information through state-sponsored media outlets or even through other sources. So there was a story recently about them co-opting American and other journalists, real journalists, to write stories for an organization called Peace Data, which was peddling conspiracy theories and other kinds of misleading information alongside very legitimate reporting.

They also use fringe sites or groups to generate content and then spread it across social media, and that's where the talk about Russian troll farms comes in. That sometimes involves false online personas or paid advertising to get people to engage with content. And then they're also engaged in hack-and-leak operations; one of the big stories around 2016 was the effort to compromise the Democratic National Committee and take that information and distribute it for political purposes. So those are the Russian tactics, goals, and strategies: they're very much interested in sowing discord and using a variety of different means to get there.

Then Iran is another one that gets talked about a lot. They're a smaller player, but they're still very much trying to gain an edge in favor of their regional goals, and Iran is also very much oriented around the goal of national preservation and control over access to information. Emerson Brooking over at the Digital Forensic Research Lab recently wrote a really great paper about the history of Iranian influence operations, detailing their use of sockpuppets since 2010 to launder information and their attempts to distort narratives, especially regarding the role of Israel or the role of the United States in broader international security. They are very much trying to project a kind of exaggerated moral authority. One example of that we saw earlier this year: after the killing of General Soleimani, there was a lot of pro-Iranian propaganda circulated on social media.

HUNTER:

Yeah. We actually talked about the killing of Qassem Soleimani in our episode Maximum Pressure. So it's interesting that you bring that up.

NATALIE:

Yeah, it was, I think, a really interesting example, one where there was a lot of uncertainty about how Iran was going to respond. The kind of propaganda push that we saw was probably the least of the concerns that could have materialized, but it was very much trying to portray the killing as an unjustified act of U.S. aggression and to spread pro-IRGC and pro-Soleimani propaganda on social media.

So then, and I apologize this is a very lengthy answer, coming to China. China is very much involved in a long-term and strategic influence operation related to shifting values around the world and trying to position itself as a kind of global leader, in the same way that the U.S. wants to be seen. A lot of their influence operations are responses to portrayals of things like their handling of the COVID-19 crisis, trying to reorient that conversation toward the U.S.'s failures in dealing with the pandemic as opposed to the origins in Wuhan, or dealing with how other countries or other outlets are reporting on the repression in Xinjiang and trying to shift that conversation back to one where China has moral authority and leadership. So you see there the use of government social media accounts to portray the CCP very favorably; public diplomacy efforts in places like standard-setting bodies, to really influence the conversation about the technical infrastructure that should underpin the information environment; and really aggressive use of social media accounts, and in some cases text messages or private social communications, to spread propaganda surrounding the coronavirus.

Those are the three big players, state-wise, in the influence operations space.

HUNTER:

One of the things that's been in the headlines recently has been foreign election interference, specifically by Russia and Iran, which you mentioned earlier. Do you think that public discourse in the United States about foreign influence operations is commensurate with the threat to the country's democratic institutions? Or do you think it's overblown?

NATALIE:

This is a really great question, and Lawfare actually is running a series of short pieces where people are taking positions on either side of the issue. I highly recommend that, because there are some really good takes in there. The thing that I often come back to, because I can see this very clearly both ways, is that right now we don't have a ton of evidence one way or another to support the position that it is a threat so grave that we should be devoting this much attention to it. And I don't say that to suggest there's no evidence that interference is happening. There certainly is. But one of the things that I think we struggle a lot with in our research is that there is very little by way of a robust evidence base for measuring the effects of an influence operation or the effects of countermeasures. So we just need more data in order to make the claim about effects and make it based on the evidence.

I think because of that, there is a certain way in which the public conversation has become swept up in this, and we haven't really figured out a way to talk about it responsibly while keeping in mind the fact that we don't have this evidence base. So where I land on it is that domestic interference campaigns and the spread of false or misleading information in that domestic context are actually a lot more damaging and significant. And it's harder there to perceive where the line between acceptable and unacceptable influence is.

So there's a quote by Masha Gessen that I really like, describing the Russian threat as a crutch for the American imagination. In a lot of ways, there is so much truth to the idea that certain kinds of influence go outside the bounds of what's acceptable, but that threat kind of pales in comparison to the false and misleading information that is spread all the time domestically, by people at the highest levels of government. So the bigger threats that we should be paying attention to are things like attacks on journalists within the country, government leaders that flout norms of responsible behavior, and people being allowed to abuse positions of power with impunity. Those are the kinds of things that I think deserve as much, if not more, attention than the foreign threat, while we take stock of what we can actually say about the effects and what we would want to know in order to talk about them.

HUNTER:

It's interesting that you talk about disinformation in general, and specifically disinformation that comes from people in power, because it seems like the United States and other Western democracies are uniquely vulnerable to these kinds of influence operations because of the freedom of discourse that we have. Why do you think we've been unable to deal with this problem adequately?

NATALIE:

That's another really good question. And I do think that some of it comes back to the tension between wanting to counter disinformation, or to help people make better sense of the information they interact with, and claims of censorship. That's one balancing act that we have to pay attention to in order to uphold the sort of values that we have as a country.

But I think that there is also some difficulty that liberal democracies have with developing a coherent strategy for competing in the information environment in general. There are military doctrines, like the Joint Concept for Operating in the Information Environment or the Strategy for Operations in the Information Environment, that outline this in a military context and talk about it in the context of conflict. But for how people on a day-to-day basis should be thinking about information strategically, I don't think that we necessarily have that kind of robust and coherent understanding, although our adversaries do. Adversary countries like Russia and China have a very coherent vision of how cyberspace and information work together, and they're able to weaponize that very thoroughly in a way that we don't really have a counter-strategy for.

And lots of people have written about this. There's a really good piece, I think it was in The Washington Quarterly, though I could be wrong about that, by Laura Rosenberger and Lindsey Gorman, both of the German Marshall Fund, about how democracies can win the information contest. They talk about how there's a difference in the way that our adversaries perceive the environment and the way that we do, which puts us on the defensive a lot of the time rather than on the offensive. And when you're constantly playing defense, it's no wonder that you feel very vulnerable, because I don't know that the U.S. has a very good strategy for competing offensively in this realm.

HUNTER:

It's super interesting that you mention that other countries have a different sense of how the information space works, because one of the things that I read in my undergrad, which has been part of the literature for years, is this new Russian doctrine of hybrid warfare that we talked about in our episode The Heist of Crimea. Is that what you're talking about, where Russia has a clearly defined methodology for using information as a weapon?

NATALIE:

Yeah, very much so. One of the things that I encounter a lot in my work is the difference between the terms that different states use. Russia and China, for instance, refer to the concept of information security, and it very much implicates the kind of human rights concerns that they have in mind. Well, I guess they wouldn't consider them human rights concerns; we would. But the concerns that they have in mind relate to how information can actually undermine the stability of their regimes. So they see information very much as a threat, as much as they do an opportunity to compete. That's why you have this sort of integrated understanding in places like Iran and Russia of the relationship between, say, offensive cyber operations and influence operations. They're all kind of one and the same, because there's a much more cohesive understanding of the relationship between the technical infrastructure and the realm of ideas. I've heard other people describe it this way: the cybersecurity realm is the pipes, and information is the water. So they have, I think, a more coordinated strategy for dealing with those two things, whereas in the West we like to keep them separate and to talk about cybersecurity, for example, as something that doesn't implicate content, because of all of the sensitivities surrounding what happens when, say, we're censoring people, or that sort of thing.

HUNTER:

So you mentioned this before, but I want to come back to it. The focus of the public right now has been on state-sponsored influence campaigns targeting the U.S. elections, specifically from Russia and Iran. Can you tell us a little bit about the methods they use and why, generally, they're doing it?

NATALIE:

Yeah, and I think it very much comes back to the aims that I described at the beginning: Russia is very much interested in sowing chaos and undermining the U.S.'s position, and that's largely Iran's goal as well. With Russia, the strategy that they have taken has been larger and more sophisticated than Iran's, in keeping with their capabilities. So they have executed social media campaigns, which have been ongoing since 2016 and well before then, to influence the views of a lot of people. But there have also been efforts to do things like obtain voter registration lists, which are publicly available, and then use them to scare people. That's what we saw Iran doing with the Proud Boys attack that was recently reported on, where Iranian actors took voter registration rolls and used them to send spoofed emails, appearing to come from a domain called officialproudboys.com, that threatened voters and said: we know that you're a registered Democrat, and if you don't change your registration and vote for Donald Trump, we will harm you. So one tactic has been actually taking publicly available information and using it to target voters. Russia has also been engaged in attempted cyber attacks that likely have not resulted in actual penetration of any systems, but are meant to influence people's attitudes by giving the perception that they are in the systems and that they could compromise the integrity of the election. So that's been another tactic that the Russians have engaged in.

HUNTER:

What role does the mainstream media play in disseminating this disinformation to the advantage of these election meddlers, like the Russians, for example?

NATALIE:

Very good question. There are a couple of different roles that the media has in a space like this. One is that, by virtue of reporting on it, it can cause the public to believe that the threat is greater than it is, particularly when people only read headlines and the lead is focused on the attempted interference rather than on the fact that it was unsuccessful. That can give people the perception that there has been a successful attack. So there's a balance there. Obviously the message needs to get out, and the public should be aware of these threats. But to the extent that we're inculcating fear in people, or actually causing them to doubt the integrity of the election, that is a way that journalists are becoming co-opted in this space. So that's one thing: the traditional reporting.

And then there is the co-opting of freelance journalists, like I talked about with the Peace Data incident, where the site was paying very legitimate freelance journalists to write stories on its behalf. The journalists were unaware of the actual origins of the organization, and the organization was also creating other content meant to spread false or misleading information or to affect people's perceptions in certain ways. That's a direct co-optation of journalists, who become unwitting participants in the legitimization of a site that would otherwise not appear legitimate.

And then there is the context in which journalists are reporting on hack-and-leak operations. That can be another sensitive way in which journalists unwittingly become the tools of influence operators, because although there may be compelling public-interest information in a hacked set of files, is it responsible to report on those files if the documents have been obtained illegally for the purpose of influencing an election? That's a tricky question. So how you report on hacked information in a responsible way is, I think, another contentious area where journalists can accidentally become the purveyors of false or misleading information or help spread a narrative that otherwise wouldn't have been the predominant narrative. Now, I don't think we've seen that necessarily in this election, but it was certainly something that happened in 2016, when a lot of the conversation was dominated by the WikiLeaks and DCLeaks material. So there is a real need to verify the source of documents and to determine what is going to be the focus of coverage. That's another area where journalists have the potential to accidentally spread this information.

HUNTER:

Let's move on to the subject of countermeasures. Your paper, The Challenges of Countering Influence Operations, made the point that tech companies like Google, Facebook, Twitter, etc. have a greater ability to control the spread of influence operations on their sites than governmental policies do. For those who haven't read it, could you explain why that is?

NATALIE:

Yeah, I would be happy to. The paper looks at a case study involving some commercially motivated influence operators, where a group of coordinated pages was spreading polarizing stories and then actually monetizing the clicks that were driven to these weird websites they had set up to host the false stories. One of the things that we were looking at in this case study, which was written by myself, the director of my project, Alicia Wanless, and Elise Thomas, who works with the Australian Strategic Policy Institute, was how social media companies and governments could respond to these sorts of things. And one of the things we realized, as we were doing this survey of the platform policies that could apply and the government responses that could apply, is that there just seemed to be a lot more avenues through which the platforms can tackle this, by virtue of the fact that the content exists on their sites.

So there are a bunch of policies relating to things like spam or harassment that the platforms can use to action this content and take it down. The range of government responses is very different. We looked at the laws that could apply, and there certainly are some; the United States in past cases has used things like the Foreign Agents Registration Act to deal with influence operators. But there's no real law that's meant to deal with influence operations. There's nothing that you can really point to and say, what you did was criminal. And even if there were, you run into two problems. One is that attributing this kind of behavior to an individual is difficult. And the second is that even if the law did exist and you could make that attribution, it's often difficult to get to those operators, because you need them to be extradited, or you're never going to touch them.

HUNTER:

And it’s like, so what are you gonna do? Go to Russia, pick this guy up, bring him back to the United States and try him for harassment?

NATALIE:

Right. So in this case, we didn't make the attribution ourselves. Another reporter had previously investigated this case study and had alleged that the individual was in Israel. But again, even in that case, are you really going to go after this one person and devote all of these legal resources to getting them into the United States? So you see that there are sort of few tools there. And the point that we were trying to make was that the social media companies can remove this content as private companies; they have the ability to actually identify this sort of thing. But governments don't seem to have a lot of tools for dealing with these kinds of cases, or at least the tools that they have are different. There are plenty of things that the government can do, but they're not the same as the tools that are within the realm of the social media companies, which can actually just take the content down.

HUNTER:

So to wrap things up, is there anything that you think individuals, our listeners, for example, can do to stop the spread of misinformation and to be aware of influence operations in general?

NATALIE:

So I'm a huge advocate for media literacy in this space. I think that one thing that we all need to be better about, and I will admit I'm guilty of this myself, is actually reading beyond headlines and reading a full article before sharing it on Twitter. I actually am a huge fan of the fact that Twitter has started to flag users when they're about to retweet an article that they haven't clicked on. It's, I think, a really unique way of nudging people. The other thing that I would say, and I'm not sure that any individual can take this on alone, in fact I know no one can, is that I think we need to have a more robust conversation as a society about what the acceptable bounds of influence are. The spread of false and misleading information is just one part of influence operations; an influence operation can be totally true and totally verifiable, and there would still be questions that we would need to ask ourselves about what are acceptable strategies for doing things like targeting people with content that you know is likely to appeal to them. What are the ethics of working with information that's totally true, but that comes from a source that maybe we would not otherwise have access to were it not for some kind of nefarious behavior?

And so I think there's a very real reckoning to be had socially, and I fear that our current political discourse is not equipped to have that reckoning because the issue has become so politicized. But I do think that there needs to be a whole-of-society effort dedicated to this, and it needs to involve not just individual voters like you and me, but the government, civil society organizations, and the tech companies. And there needs to be some kind of greater awareness, at least, of the nuances in this space, so that we can move the public conversation beyond the emphasis on fake news and into a more nuanced space where we can talk about how all of the various technologies that we interact with affect this space, and what the acceptable rules of the road should be.

HUNTER:

That's it for our show today. Thanks again to Natalie Thompson for coming on the show and sharing her insights with us. If you liked this content, don't forget to rate and subscribe on iTunes, Spotify, or wherever you get your podcasts. If you have questions about the show, comments, or just want to get a hold of us, visit us at our website: http://www.geopoliticsrundown.com. As always, thank you for listening. This has been Geopolitics…Rundown.
