The goal of the Dark Matter project and its related publications is to inform interested parties of the existing, urgent, comprehensive risk of disinformation and propaganda in American politics, and the larger, more generalized threat those elements pose to western democracy and the stability of the global post-World War II order.
Specifically, this project examines these risks in the context of digital communications and the use of weaponized social media by political campaigns for the purpose of engineering chaos through spreading propaganda and misinformation. The purpose of this paper is to review digital and political risk assessments published in the Dark Matter project in 2018.
In general, our original risk assessments were well-founded and, unfortunately, proven out.
The first Dark Matter paper published in February contained the following overall statement of risk:
In the 2018 cycle, many high-profile races nationwide will see interference directly from Russian state intelligence sources. Most campaigns will not see direct interference by foreign actors, but the activity at the top will have a destructive effect on campaigns down the ballot. Most Republican campaigns in competitive races will use tactics adopted from Russian intelligence operations, because they are effective, easy, cheap, and low-risk.
In mid-December, two reports commissioned by the Senate Intelligence Committee were released. These reports painted a clearer picture not only of what happened in 2016, but also of how Russian intelligence disinformation efforts supported Trump after he was elected. A short summation: Russia used every major social media platform to support the election of Donald Trump, and, separately, to spread disinformation among communities likely to align against him in a voter suppression effort. Chief among those communities were African Americans, targeted along with “Latinos, Muslims, Christians, gay men and women, liberals, Southerners, [and] veterans.”
Meddling Efforts in 2018 by Russia (with special guest Iran)
As the midterms approached, an ongoing narrative persisted about major digital media platforms working to combat election meddling and root out propagandist accounts from Russia and Iran, despite the relatively small scope of what each platform managed to find and stop. While that effort began in earnest in mid-to-late 2017, Facebook and Instagram accounts from Russian sources were being discovered and banned up through November 2018, with a majority of those originating on the platforms after mid-2017. Russia even fostered a sizable propaganda effort on Reddit using r/the_donald as a primary conduit, and a Russian national was indicted for work on attempts to influence both the 2016 and 2018 elections.
So, our stated risk of foreign adversaries interfering in our elections using digital propaganda was proven correct, and that activity persists and evolves in 2019.
The Calls Are Still Definitely Coming From Inside the House
The midterms saw intense disinformation campaigns from foreign and domestic sources alike, and much of that propaganda was interchangeably amplified. Some affected specific communities, like the dozens of Muslims running for public office; in a similar vein, some campaigns conflated two bigoted worldviews, like a (poorly) altered image of Georgia Democratic gubernatorial candidate Stacey Abrams which falsely aligned her with the Muslim Brotherhood.
Other efforts were more generalized, like a botnet geared towards voter suppression discovered by the DCCC and removed by Twitter. And, predictably, OG propagandists Project Veritas infiltrated several Democratic campaigns, producing videos aimed at damaging the campaigns of “Missouri Sen. Claire McCaskill, North Dakota Sen. Heidi Heitkamp, Tennessee Senate candidate Phil Bredesen, Arizona Rep. Kyrsten Sinema, Florida gubernatorial candidate Andrew Gillum, Oregon Gov. Kate Brown and House candidate Abigail Spanberger.”
Several publications offered crowd-sourced overviews of disinformation efforts running during the 2018 midterms. The New York Times produced a particularly good one, due in no small part to more than 4,000 examples submitted by readers.
Learning The Wrong Lesson, Again
The closing paragraph of the statement of risk was as follows:
If Democrats manage to ride an undisciplined wave to victory in 2018, that victory will likely be more narrow than needed. As a result, those campaign operatives are likely to learn the worst possible lesson, remaining completely unarmed for an entirely new kind of political information warfare heading into the 2020 presidential election cycle.
While Democratic performance exceeded expectations across the board, especially in the House, several races were extremely close and many of the most high-profile losses were narrow. In those losses, particularly the aforementioned statewide races in Georgia and Florida, disinformation was obvious and played a key role in the outcome.
Victorious Georgia Republican gubernatorial candidate Brian Kemp combined real-life voter suppression efforts with a very effective (and completely shameless) disinformation campaign, which ranged from robocalls claiming Stacey Abrams was actively trying to steal the election via undocumented immigrant voters to a baseless accusation that Georgia Democrats had tried and failed to hack the state election system.
In Florida, Democratic gubernatorial candidate and Tallahassee Mayor Andrew Gillum was the target of a disinformation campaign that took many forms, including unsourced text messages claiming he was under criminal investigation by the FBI. These claims were false, but they dominated more than one news cycle in the run-up to an extremely close election, which Gillum ultimately lost by less than half a point.
In the first Dark Matter paper, we identified computational propaganda as a systemic, environmental, critical risk to the 2018 election cycle in the United States, and cited three major risk dimensions:
Ongoing Russian computational propaganda and information warfare attacks (hereafter: hybrid war) against the United States;
The adoption of Russian tactics and strategies by domestic political organizations, primarily conservative in nature; and
The extant state of American political discourse and cognitive bias, enforced and amplified by a broad adoption of social media platforms within American voting populations.
In light of many factors, including but not limited to: reporting from ongoing investigations of Russian intelligence operations against western democracies; the acceleration of propaganda operations and misinformation campaigns from domestic sources, including the Trump administration; and the clear and present danger of human rights crises and the erosion of basic government functionality over the last 60 days, we feel compelled to add an additional risk dimension.
The manner in which propaganda and misinformation from national and state government sources is observed, analyzed, and reported in mass media channels, and how those communications methods increase the effectiveness of the propaganda campaign is a risk dimension equal in threat to those dimensions previously established.
Also in the June update, we went so far as to say that, under Trump’s leadership, the federal government was no longer a reliable source of information, citing several ongoing disinformation campaigns emanating directly from the government, including the propaganda and falsehoods being actively distributed about the family separation policy implemented at the border between the United States and Mexico.
Events since have proven that adding domestic disinformation and propaganda campaigns as a risk dimension was as accurate as it was pressing. The North Korea summit and the subsequent diplomatic embarrassments surrounding denuclearization, the Helsinki summit, the confirmation hearings of Brett Kavanaugh, the myth of the violent caravan, the shutdown, and numerous other foreign and domestic issues have put the propaganda skills of the Trump administration on full display.
What’s worse, the administration is getting more strategically skilled at disinformation even as it grows more brazen. A prime example is Trump’s early January 2019 demand for a synchronous primetime network television slot to describe a non-existent immigration crisis, in an effort to press the House for funding for a border wall which will likely never exist. Trump’s primary skill set is media manipulation, and he’s effectively weaponized it.
An Important Distinction: Hacking and Computational Propaganda
A troubling, completely incorrect narrative sprang up close to Election Day in 2018, and it made the rounds and became broadly accepted more quickly than I thought possible. The American public’s ability to recognize foreign disinformation campaigns against our country as a continuing condition was impaired by an extremely clumsy media narrative: that Russia didn’t meddle in the midterms.
This idea -- presumably reported because voting machines didn’t spring to life, Terminator-style, and murder people with hammers and sickles -- was enabled and propelled by a Trump administration announcement that there was no evidence of Russian hacking ahead of Election Day. Right-wing news outlets spread it far and wide, and it became a definitional problem for the rest of the media, as reporting actively conflated election-systems hacking with disinformation campaigns and computational propaganda, asserting that the absence of the former meant Russia wasn’t meddling in our election at all. This is clearly not true:
The NRCC email hack is a real example of actual hacking -- meaning the use of sophisticated computer-security breach methods to gain access to protected, sensitive information -- which seems to be what most people were looking for when they declared 2018 to be hack-free. Meanwhile, the massive, constantly evolving disinformation campaign -- meant to suppress voter turnout, sow bad information and discord, and influence the midterm elections -- was also underway.
Hacking efforts and propaganda campaigns are two different things. Both are attacks against our country, and both are extremely serious problems. Journalists should be able to communicate capably about both, individually and together, without conflating them.
While many reporters make a valiant effort to do so, headlines like NPR’s “The 2018 Midterms Weren't Hacked. What Does That Mean For 2020?” and the New York Times’ “Mystery of the Midterm Elections: Where Are the Russians?” are problematic, revealing a fundamental institutional misunderstanding of the current risk environment, an editorial disconnect from capable reporting, an intentional misreading of reportage in an effort to drive traffic, or some combination of the three.
The NPR headline is particularly unforgivable in that the story undoes itself within its first several paragraphs, with mismatched definitions similar to those we discuss above, granting parity between the following two passages:
“After a Russian effort leading up to 2016 to sow chaos and polarization, and to degrade confidence in American institutions, what sort of widespread cyberattack awaited the voting system in the first national election since?
None, it seems.”
“We didn't see any coordinated effort or targeting that interrupted the elections process," said Matt Masterson, a senior cybersecurity adviser at the Department of Homeland Security. "[Nothing] that prevented folks from voting or compromised election systems in any way ... certainly nowhere close to what we saw in 2016.”
The scenarios described are two very different things. Why are they equated? And why was the assertion or premise of the story even entertained, given the wealth of reporting and knowledge about the efforts of Russia (and other countries) to influence the 2018 election that was publicly available by late December 2018, when this story was published?
In terms of hacking, the United States is terrific on offense and terrible on defense, a common state for a major world power with what amounts to an infinite number of vulnerabilities and threat vectors, especially for democracies in which major technology companies are not considered state assets, as they are in China.
This institutional, omnipresent vulnerability is a critical threat, and every country should take it seriously. Beyond the NRCC hack in 2018 or the DNC hack in 2016, consider the recent hack in which sensitive information about the members and organizational operations of almost every major political party in Germany was exposed. The only party not affected by the hack was Alternative für Deutschland, the extreme far-right Neo-Nazi party. Israel is bracing for a similar onslaught ahead of its elections on April 9, as well it should.
But disinformation and computational propaganda are a threat equal to and distinct from hacking. Consider the wild theories dominating political discourse in this country prior to the 2018 elections, and how many of them either originated from or were amplified by domestic sources.
These range from Donald Trump’s fabricated warnings via tweet about how “Law Enforcement has been strongly notified to watch closely for any ILLEGAL VOTING which may take place in Tuesday’s Election (or Early Voting),” to how Trump turned a racist, xenophobic closing argument about immigration into a full-blown, unavoidable media narrative about a caravan of asylum-seekers being a possible invasion.
Trump achieved this in part by abusing state power and resources in deploying the military to the southern border. This in effect allowed Trump to reach beyond the standard Republican press approach of working the refs, all the way through to hacking the media itself. Disinformation and propaganda campaigns get significantly more powerful when you can deploy troops and issue official statements via government institutions in support of the story you want to tell, as autocrats like Vladimir Putin well know.
Helsinki, Russia, and the Future
The bell we have been ringing since February of this year sounds an awful lot like the conclusions we have now reached, so arriving here is no surprise. What I failed to properly weigh in our original analysis is the cumulative effect of all of these events, and the context in which they have occurred.
The very top of that context, the rudder which steers it and the engine which drives it, is Trump’s obviously compromised position with Russia. And while, after Helsinki, some Republican elected officials have started using the “everybody does it” argument, we say this: This is not "Russia interfering with our elections." This is an organized crime syndicate which has captured the state-security apparatus of the world's second-largest nuclear power, now engaged in an active cyberwar to capture the institutions of state of the world's largest nuclear power -- which just happens to also be the country we live in.
This is a Russia which not only dedicated considerable time and resources to hacking the DNC and working actively against the Clinton campaign in 2016, but which also built an advanced persistent threat on Twitter, spending years building audiences for dozens of local news look-alike accounts, distributed nationally, which had yet to post any false information. Think of the discipline and organization that kind of effort requires, and what the long game might be from a group like the Internet Research Agency.
This is a Russia which is running a propaganda campaign right now to convince Democrats to leave the Democratic Party, under the #Walkaway hashtag. These tweets have already shown up in the mentions of high profile campaigns like Beto O’Rourke for Senate, but have also started showing up in Texas congressional campaigns, especially those that recently raised their profiles through fundraising success, like Gina Ortiz Jones, Lizzie Fletcher, MJ Hegar, and Sri Kulkarni.
In the original Dark Matter paper, we explored the risk of Russian interference but focused more specifically on the adoption of their methods by domestic political campaigns and organizations. I now believe that to be a mistake. In dealing with a truly environmental set of risks – a threat matrix that is all-encompassing – we should not understate the severity of the core danger which drives the whole system.
Everything happening now flows from Russian activity. Domestic actors have learned from them. American political campaigns have benefitted from their cyber attacks on the United States by using the stolen information to gain advantage on their opponents. Propaganda and disinformation flows from everywhere – Russia, the Trump administration, lesser elected officials, political candidates, Fox News, Breitbart, InfoWars, Sinclair, 4chan – and is aggressively amplified by and between every platform imaginable.
In the interest of political advantage and expediency, American citizens and elected officials are participating in this campaign against their own country. The reach of Russia's effort – the multiplicative nature of it, the momentum – should terrify you. And the Helsinki Summit – both Trump’s actions and the refusal of House and Senate Republicans to stop him – should alert us all to the fact that we are now living in a global emergency, the kind that endangers every structure which has held mankind together since the end of World War II.
I don’t like quoting my own work so extensively, but recent news merits this look back:
Russia won’t be alone in 2019 or 2020, and many of our problems are domestic. However, Russia is first among critical risks in an environment in which our entire political communications infrastructure suffers from massive vulnerabilities, and I do not expect that to change for the foreseeable future.
The end of our Helsinki update included these passages:
When you hear people talk about how Russia has weaponized information, believe it. The United States is under attack, and we are at war. It doesn’t look like the war you’re used to, but it is war nonetheless.
There is no logical or credible reason to believe the Trump administration, or Congress, or any American institution can or will stop the Russian attack on the United States. Republican campaigns will happily seek amplification by their bots and trolls, and will use hacked campaign materials at every opportunity.
The Russian risk is critical and ever-present, and while I acknowledged it early on, I didn’t outline it severely enough. I talked myself out of believing some things that seem obvious now. Of course Russia has advanced persistent threats, malware and backdoors and access, buried in voting machines and state election systems. Of course they have advanced persistent threats lodged in the systems of candidates and campaigns and elected officials and government agencies. Of course they will use what they learned running a multi-state targeted disinformation campaign based on good targeting data in a general election as the foundation for running a similar campaign in dozens of races nationwide. Of course campaigns at every level are vectors for this threat.
Russian computational propaganda is not just coincidentally and gratefully amplified by domestic, Republican groups and campaigns adopting their methods. Rather, that synergy is how this attack was designed to work.
I have no reason to believe that this risk is any less critical in 2019 than it was in 2016, and current context makes me believe it is even more of a cause for concern. The primary for the 2020 Democratic presidential nomination will be the chief breeding ground for an evolved, supercharged version of every risk we examined in 2018, and every campaign is at risk, both from without and from within.
If you work in politics and you aren’t rethinking everything from how a press shop should operate to how to structure your media plans and budgets -- in short, if you aren’t rebuilding everything about your approach and philosophy to political communications to fit the reality in which we live in 2019 -- you are actively participating in worsening the risk conditions outlined in our work. To that end, I urge you to read our paper from May, “Cognitive Bias in Political Advertising and Communication.”
Personal operational security training at regular intervals for candidates and campaign staff is now a critical requirement. Spending money intelligently to invest in building both a digital communications infrastructure and a digital audience for your campaign is more important than ever. Mitigating digital risk must be at the forefront of every campaign plan. Understanding how media works in today’s environment is not possible without recognizing that it has changed significantly, even within the last year.
The major social media platforms will not fix their terrible issues fast enough, if at all, and do not necessarily have an incentive to do so in the first place. The Trump administration is clearly neither capable of nor inclined to stop the disinformation campaigns, foreign or domestic, destabilizing our democracy.
Understanding requires a shared frame of reference. Achieving a true and clear perspective of the forces at work here is necessary for us to have any chance of stopping them. Such a perspective demands an acceptance of the environment in which those forces operate.
The required work is difficult and demands consistent disciplined effort, as well as the destruction of many norms about campaigns, communication, media, and politics that have come to be taken as gospel. It’s an unfair game, and time is of the essence.