SMPA Junior Publishes Groundbreaking Disinformation Research

An SMPA Q&A with Junior Political Communication Major Jack Nassetta.

November 4, 2018

This summer, junior Jack Nassetta was on the opposite side of the country working as a visiting fellow with the James Martin Center for Nonproliferation Studies at the Middlebury Institute of International Studies in Monterey, California.

Back in Foggy Bottom for the fall semester, his report “All the World is Staged: An Analysis of Social Media Influence Operations against US Counterproliferation Efforts in Syria” quickly caught the media's attention in September, leading to an article co-authored by Nassetta in The Washington Post and a live appearance on Sky News, Europe's largest broadcast network.

SMPA junior Samantha Cookinham asked Jack about how he first became interested in studying disinformation and the roller coaster ride of having a major research report published as an undergraduate and then picked up by the media.

Q: What first sparked your interest in studying disinformation?

A: I arrived at SMPA with my sights set on electoral or legislative politics, but after taking Professor Steven Livingston’s “Introduction to Political Communication” course as a sophomore, I became attracted to the more theoretical side of political communication.

Professor Livingston had a section of the course dedicated to new research emerging at the cross-section of the internet and communication, especially disinformation. After that, I was hooked and worked with him for an independent study course the following semester focusing on disinformation.

Q: How did you become interested in working with the James Martin Center for Nonproliferation Studies?

A: I became interested in working with the Center because of the splash they were making in the open source community for combining new technologies with traditional security theory.

I was originally accepted as a visiting fellow for a proposal on evaluating how media narratives shape arms treaty negotiations. It was only once I mentioned a side project on Twitter data over lunch to my future co-author, Ethan Fecht, that we decided to work together to make that the main project for our fellowships.

Fellows have the opportunity to present research in progress to senior officials at the Institute about halfway through the summer. When we presented our preliminary findings, the director was so thrilled by the potential that he offered to publish a full report, including a print run.

Q: What was the research process like for the report?

A: It was an amazing process and we were fortunate to have the backing of the Institute the entire way, allowing us to devote all of our time at the Center directly to the project.

We spent about a month just workshopping ideas and playing around with the raw data to see what correlations we could find. Eventually, we focused in on some key areas and developed those into the report.

We spent about three months fully focused on analyzing more than 850,000 tweets collected after a suspected chemical attack in Douma, Syria, in April. With a data set that rich, whenever we thought we had the report finalized, we would find another fascinating new statistic or account to include.

We started with some basic statistics to get a sense of what we had on our hands. When we found an indicator with potential, we visualized it and broke it down further if warranted.
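The report doesn't detail the tooling behind this first pass, but a rough sketch of that kind of exploratory step, written here in Python with pandas, might look like the following. The file name and column names (tweets.csv, user_created_at, user_screen_name) are placeholders for illustration, not the study's actual schema.

    # Hypothetical first-pass exploration of a tweet export (file and column
    # names are illustrative placeholders, not the report's actual schema).
    import pandas as pd

    # Load the collected tweets; assume one row per tweet with basic user metadata.
    tweets = pd.read_csv("tweets.csv",
                         parse_dates=["tweet_created_at", "user_created_at"])

    # Basic descriptive statistics: how many tweets, how many distinct accounts.
    print("tweets:", len(tweets))
    print("unique accounts:", tweets["user_screen_name"].nunique())

    # One simple indicator: how much of the traffic comes from very new accounts?
    event_date = pd.Timestamp("2018-04-07")  # date of the suspected Douma attack
    recent = tweets[tweets["user_created_at"] > event_date - pd.Timedelta(days=30)]
    print("tweets from accounts under 30 days old:",
          len(recent), f"({len(recent) / len(tweets):.1%})")

    # Visualize account-creation dates to spot suspicious spikes.
    ax = (tweets.drop_duplicates("user_screen_name")["user_created_at"]
                .dt.to_period("M").value_counts().sort_index()
                .plot(kind="bar", figsize=(10, 4), title="Accounts created per month"))
    ax.figure.savefig("account_creation_dates.png", bbox_inches="tight")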

Eventually, you reach the limit of what statistics can do in this type of analysis. So, for the latter portion of our work, we holistically reviewed accounts from the sample to determine their authenticity.

To determine the nature of each account, we looked at a number of factors and weighed them against one another. The most important factors were when the account was created and how the account was operating.

Most often, the fake accounts were newly created and tweeting only about the chemical attack. When they did have a history, they often tweeted only during events important to the Russian government.

We observed that the most obvious accounts tweeted only in praise of Putin, for instance posting memes about the “good” he does domestically. The most ridiculous account we found had stolen the profile picture from a Russian nature photographer’s Instagram and one of their final tweets before disappearing was “Putin tells the truth.”
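The report weighs these signals qualitatively, account by account, rather than with fixed rules. Still, a hypothetical sketch of the two factors described above, account age and single-topic behavior, might look like this in Python; the 30-day and 90 percent thresholds are illustrative assumptions, not values from the study.

    from datetime import datetime, timedelta

    # Illustrative heuristic only: the actual report reviewed accounts
    # holistically rather than applying fixed thresholds like these.
    def looks_suspicious(account_created_at: datetime,
                         event_date: datetime,
                         tweet_texts: list[str],
                         keywords: tuple[str, ...] = ("douma", "chemical attack")) -> bool:
        """Flag an account that is newly created AND tweets almost only about the event."""
        newly_created = account_created_at > event_date - timedelta(days=30)
        if not tweet_texts:
            return newly_created
        on_topic = sum(any(k in t.lower() for k in keywords) for t in tweet_texts)
        single_topic = on_topic / len(tweet_texts) > 0.9
        return newly_created and single_topic

    # Example: a two-day-old account tweeting exclusively about the attack is flagged.
    print(looks_suspicious(datetime(2018, 4, 5), datetime(2018, 4, 7),
                           ["The Douma chemical attack was staged!"] * 12))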

Q: What led to your Washington Post 'Monkey Cage' article on how to identify fake Twitter trolls?

A: After we finished the main report, we wanted to publish a popularized version to increase awareness.

I remembered that Professor Livingston had previously written for The Washington Post. I asked for his help and he connected me with Monkey Cage editor-in-chief John Sides, a professor in GW's Department of Political Science.

From there, we worked with the Monkey Cage blog editors and published the article on the Post website.

The article was framed around the possibility of a new chemical attack in Idlib, Syria.

We saw Russian propaganda claiming that the U.S. was preparing to fake a chemical attack there, so we wanted to make sure the public was ready for the propaganda campaign. We included a number of the factors we used in our research to find fake propaganda accounts so that people can remain vigilant and avoid spreading made-up accusations.

Q: What has it been like to receive so much media attention for your research?

A: Getting the confirmation from The Washington Post was incredible. We had a preliminary “okay” but then worked with their editors to revise the piece a number of times.

Around midnight on a Sunday night, I received an email informing me that it was going to be published on Monday morning.

I had the landing page open on my phone ready for when I woke up in the morning. When I woke up, I refreshed the page immediately — I couldn’t believe my eyes seeing the article up on the website.

Going on Sky News, the largest news channel in Europe, was also surreal.

The organization partners with Fox in the U.S., so I was at the Fox News D.C. bureau for the live broadcast. I thought it would just be a small office for the Sky News bureau, but I ended up in the Fox News green room getting my makeup done next to the chairwoman of the Republican National Committee, and I thought I was hallucinating.

The on-camera portion was even more unbelievable. When you hear the host say your name in the earpiece, it's exhilarating because you know you're now live in front of a very large audience.

I was also featured on the front page of my hometown newspaper in Connecticut, The Day, which I really appreciated, and I was invited to consult with the State Department’s Program on Preventing State Disinformation.

Q: What do you hope people take away from your research?

A: I hope people take away a sense of the scale of this problem: it goes far beyond elections.

The Russian government is only getting more deceptive, and soon identification like this may not be possible. We need to address the problem in a systematic way before that is the case.