Ben Sohl
Background
The world is currently undergoing a technological revolution. The advent of the internet, like other great innovations in human history such as the plow, the printing press, and the combustion engine, is changing the nature of nearly every human endeavor. Foreign policy is no different: a transformation in the conduct of foreign affairs is happening in front of our very eyes.
In the realm of foreign policy, the revolution in communications technologies has created new avenues of influence for competitor states. Operations that were once logistically difficult are now within easy reach: foreign states can use information warfare to shape public opinion in the states they target. These campaigns have come to be referred to as “influence operations.”
Public opinion is the lifeblood of democracies. Campaigns to influence public opinion using newly developed information technologies therefore represent a significant innovation in coercive power, one that has quickly become central to a state’s coercive toolkit. In 2016, the United Kingdom held a referendum on whether to leave the EU; “Leave” narrowly won with 51.9% of the vote.[1] Several states, Russia among them, had core security interests at stake in this vote, namely the stability or instability of the EU. Russia’s coercive tool of choice to influence the vote was not a traditional instrument of power such as military threats, but rather an influence campaign.[2]
Likewise, the outcome of the 2016 US presidential election would directly affect the core interests of many states, again including Russia, among them presidential policy on NATO, the EU, and US sanctions on Russia over the conflict in Ukraine. Once again, the Russian state turned to an influence campaign to shape the outcome of this event.
Research Questions
In 2017, I published an article titled “Influence Campaigns and the Future of International Competition.”[3] The article contained two hypotheses: first, that democratic states are uniquely vulnerable to influence campaigns, and second, that the success of Russia’s recent influence campaigns would lead to the proliferation of these activities throughout the international system. The goal of this project is to investigate the veracity of these two hypotheses and, in doing so, to use social network analysis to better understand the nature of influence campaigns.
Why Social Network Analysis?
Social network analysis provides a unique set of tools for understanding the nature of influence campaigns. Over time, these campaigns have become more complex, with a multitude of actors taking part. These actors hold various relational ties with one another as victims, perpetrators, and surrogates. Understanding not just the composition of influence campaigns but also the relationships between the actors is critically important to understanding how they function.
Social network analysis also supplies the analytical tools to dig deeper. Using attribute data, we will be able to interrogate the relationship between authoritarian states and democratic states within the context of influence campaigns. This analysis will yield insights into whom these campaigns target and why.
Questions
To build on our understanding of influence campaigns, this project will examine the following questions:
1. Are there clusters within influence campaign networks?
2. Can we identify perceived spheres of influence based on cluster analysis and ego networks?
3. Do betweenness measures identify states that use a high number of proxies or engage in more influence campaigns?
4. Are influence campaign networks homophilous or heterophilous with respect to whether their members are authoritarian or democratic states? What can this analysis tell us about who is being targeted and why? For example, do authoritarian states use influence campaigns against other authoritarian states, or only against democratic states?
Data and Methodology
To interrogate these questions, we will need to incorporate several datasets. The first is a dataset of all major influence campaigns since 2016, recording who conducted each campaign, who was targeted, and whether a proxy relationship to another state was involved. The data will be compiled both on an annual basis, to identify patterns over time, and in aggregate, with the total number of campaigns between each pair of states, to view overall trends. The network will also be directional.
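As a rough illustration of how this directed, weighted edge list might be encoded, the sketch below uses networkx; the country codes, years, counts, and proxy flags are placeholders for illustration, not entries from the actual dataset.

```python
import networkx as nx

# Each record is (aggressor, target, attributes); the network is directed and
# edge weights count the number of campaigns observed between a pair of states.
campaign_records = [
    ("RUS", "GBR", {"weight": 1, "year": 2016, "proxy": False}),  # placeholder rows
    ("RUS", "USA", {"weight": 2, "year": 2016, "proxy": True}),
]

G = nx.DiGraph()
G.add_edges_from(campaign_records)

print(G.number_of_nodes(), G.number_of_edges())  # 3 nodes, 2 directed edges
```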
Given the limited number of major influence campaigns conducted over the last four years, assembling this dataset is viable. We will begin with the data compiled in the “Authoritarian Interference Tracker”[4] built by the Alliance for Securing Democracy, housed at the German Marshall Fund. Next, we will incorporate the dataset from the report “The Global Disinformation Order,”[5] produced as part of the Computational Propaganda Research Project at the Oxford Internet Institute, University of Oxford. From there, we will conduct independent research to fill any holes in the data.
Second, we will use Freedom House’s “Freedom in the World 2019” report to create attribute data based on the degree of freedom in a given state. This attribute file will allow us to incorporate the democratic or authoritarian character of each state into our analysis.
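A minimal sketch of attaching these scores as node attributes is shown below, continuing from the graph built earlier; the freedom_scores dictionary and its values are illustrative stand-ins, not figures from the 2019 report.

```python
# Placeholder freedom scores keyed by the same country codes used in the edge
# list; real values would come from the "Freedom in the World 2019" data.
freedom_scores = {"RUS": 20, "GBR": 93, "USA": 86}

for country, score in freedom_scores.items():
    if country in G:  # only annotate countries that appear in the campaign network
        G.nodes[country]["freedom_score"] = score
```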
With our data in place, we will conduct a series of tests to better understand these campaigns and what they mean. First, we will run a Girvan-Newman analysis to identify clusters within the network. This cluster analysis should help draw out the tight-knit subnetworks within these campaigns, each comprising a primary aggressor, its proxies, and its victims. Using cluster analysis and ego networks, we will be able to identify the hegemons in the global system and where their perceived spheres of influence extend. We will also be able to see how they prioritize their influence campaign assets based on the number of campaigns they have waged against different states. Centrality measures such as betweenness will illustrate who lies at the center of these networks, which will be particularly helpful if proxies engage in campaigns on behalf of multiple actors. Once hegemons have been identified, separating them into ego networks will allow us to better understand their behavior on an individual basis.
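The sketch below shows how these steps might look in networkx, continuing from the graph G above; taking only the first Girvan-Newman split and focusing the ego network on one placeholder country are illustrative assumptions, not final analytical choices.

```python
import networkx as nx
from networkx.algorithms.community import girvan_newman

# Girvan-Newman community detection on an undirected view of the campaign graph;
# here we simply take the first split into communities.
communities = next(girvan_newman(G.to_undirected()))
print([sorted(c) for c in communities])

# Betweenness centrality flags states that sit on many paths between others,
# for example proxies acting on behalf of multiple aggressors.
betweenness = nx.betweenness_centrality(G)
print(sorted(betweenness.items(), key=lambda kv: kv[1], reverse=True)[:5])

# Ego network of a single (hypothetical) hegemon, to examine its perceived
# sphere of influence in isolation.
ego = nx.ego_graph(G, "RUS")
print(list(ego.nodes()))
```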
To analyze the data further, we will dichotomize the attribute file, collapsing the freedom index into a binary distinction between democratic and authoritarian states. We will then use an E-I index analysis to determine how heterophilous or homophilous the network is with respect to style of government. This will provide key information: if the network is highly homophilous, it tells us that states of a similar level of democracy or authoritarianism wage influence campaigns against each other; if it is heterophilous, it provides evidence for our hypothesis that authoritarian states are more likely to wage campaigns against democratic states. With this information, we should be able to draw important conclusions.
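One possible way to implement the dichotomization and the E-I index, continuing from the earlier sketches, is outlined below; the cutoff score of 60 is an arbitrary illustrative threshold, and the E-I index is computed directly from the edge list as (external - internal) / (external + internal) ties.

```python
# Dichotomize the freedom scores into "democratic" vs. "authoritarian" using an
# illustrative cutoff, then compute the E-I index over the campaign edges.
CUTOFF = 60  # assumption: scores at or above this count as "democratic"
for country, score in freedom_scores.items():
    if country in G:
        G.nodes[country]["regime"] = "democratic" if score >= CUTOFF else "authoritarian"

external = internal = 0
for u, v in G.edges():
    if G.nodes[u].get("regime") == G.nodes[v].get("regime"):
        internal += 1  # tie within the same regime type (homophilous)
    else:
        external += 1  # tie across regime types (heterophilous)

ei_index = (external - internal) / (external + internal)
print(f"E-I index: {ei_index:.2f}")  # +1 fully heterophilous, -1 fully homophilous
```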
By varying the size and color of the nodes, each representing a country, we will be able to present this analysis in a clear and digestible form. Taken together, the use of social network analysis will greatly increase our understanding of the nature of influence campaigns.
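A sketch of this kind of visualization, again assuming the graph and attributes from the earlier snippets, sizes nodes by out-degree (a rough proxy for campaigns waged) and colors them by regime type; the layout and color choices are purely illustrative.

```python
import matplotlib.pyplot as plt
import networkx as nx

# Size nodes by out-degree and color them by the regime attribute assigned
# earlier; styling choices here are illustrative, not final design decisions.
sizes = [300 + 600 * G.out_degree(n) for n in G.nodes()]
colors = ["firebrick" if G.nodes[n].get("regime") == "authoritarian" else "steelblue"
          for n in G.nodes()]

pos = nx.spring_layout(G, seed=42)
nx.draw_networkx(G, pos, node_size=sizes, node_color=colors, with_labels=True)
plt.axis("off")
plt.show()
```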
[1] BBC News, “EU Referendum Results.” Accessed October 19, 2019. https://www.bbc.com/news/politics/eu_referendum/results
[2] David Kirkpatrick, “Signs of Russian Meddling in Brexit Referendum.” New York Times, November 15, 2017. https://www.nytimes.com/2017/11/15/world/europe/russia-brexit-twitter-facebook.html
[3] Ben Sohl, “Influence Campaigns and the Future of International Competition.” The Strategy Bridge, September 12, 2017. https://thestrategybridge.org/the-bridge/2017/9/12/influence-campaigns-and-the-future-of-international-competition
[4] Alliance for Securing Democracy, “Authoritarian Interference Tracker.” Accessed October 15, 2019. https://securingdemocracy.gmfus.org/toolbox/authoritarian-interference-tracker/
[5] Samantha Bradshaw and Philip Howard, “The Global Disinformation Order.” Computational Propaganda Research Project, Oxford Internet Institute, University of Oxford. Accessed October 19, 2019. https://comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2019/09/CyberTroop-Report19.pdf
1 comment:
I see from your intro that you're a tech evangelist! It's a great topic, and you've done a good deal of thinking about how to apply SNA. Some questions: does your first dataset exist, or will you have to create it? If so, that sounds like a lot of work. I'm not sure what you mean when you say that "assembling this dataset will be viable." Whatever you mean, you're right in assuming that selecting and assembling the data will be the heart (and bulk) of your work, but your curated data alone will be a valuable deliverable for you and for other researchers.