“US Government Support for Domestic Censorship and Disinformation Campaigns 2016-2022” Congressional Testimony
EP’s president and founder, Michael Shellenberger, appeared before the US House of Representatives’ Select Committee on the “Weaponization of the Federal Government” on March 9, 2023, where he testified on the Censorship Industrial Complex.
The Censorship Industrial Complex
U.S. Government Support For Domestic Censorship And Disinformation Campaigns, 2016 - 2022
Testimony by Michael Shellenberger to The House Select Committee on the Weaponization of the Federal Government
March 9, 2023
CONTENTS
The Censorship Industrial Complex Today
- Definition and Mission
- National Science Foundation Funding
- Defense Advanced Research Projects Agency (DARPA) Roots
- Key Organizations
- Key Individuals
The Complex’s Disinformation Campaigns
- The Trump-Russian Collusion Conspiracy Theory, 2016–2019
- Delegitimizing the COVID Lab Leak Theory, 2020–2021
- The Hunter Biden Laptop Conspiracy Theory, 2020–2021
Ideology, Strategy, And Origins
- Defund the Censorship Industrial Complex
- Mandate instant reporting of all communications between government officials and contractors with social media executives relating to content moderation
- Reduce Scope of Section 230
Executive Summary
In his 1961 farewell address, President Dwight Eisenhower warned of “the acquisition of unwarranted influence… by the military-industrial complex.” Eisenhower feared that the size and power of the “complex,” or cluster, of government contractors and the Department of Defense would “endanger our liberties or democratic processes.” How? Through “domination of the nation's scholars by Federal employment, project allocations, and the power of money.” He feared public policy would “become the captive of a scientific-technological elite.”[1]
Eisenhower’s fears were well-founded. Today, American taxpayers are unwittingly financing the growth and power of a censorship-industrial complex run by America’s scientific and technological elite, which endangers our liberties and democracy. I am grateful for the opportunity to offer this testimony and sound the alarm over the shocking and disturbing emergence of state-sponsored censorship in the United States of America.
The Twitter Files, state attorneys general lawsuits, and investigative reporters have revealed a large and growing network of government agencies, academic institutions, and nongovernmental organizations that are actively censoring American citizens, often without their knowledge, on a range of issues, including on the origins of COVID[2], COVID vaccines[3], emails relating to Hunter Biden’s business dealings[4], climate change[5], renewable energy[6], fossil fuels[7], and many other issues.
I offer some cautions. I do not know how much of the censorship is coordinated beyond what we have been able to document, and I will not speculate. I recognize that the law allows Facebook, Twitter, and other private companies to moderate content on their platforms. And I support the right of governments to communicate with the public, including to dispute inaccurate and misleading information.
But government officials have been caught repeatedly pushing social media platforms to censor disfavored users and content. Often, these demands are backed by threats to Section 230, the legal protection social media companies need to exist.
“If government officials are directing or facilitating such censorship,” notes George Washington University law professor Jonathan Turley, “it raises serious First Amendment questions. It is axiomatic that the government cannot do indirectly what it is prohibited from doing directly.”[8]
Moreover, we know that the U.S. government has funded organizations that pressure advertisers to boycott news media organizations and social media platforms that they accuse of a) refusing to censor and/or b) spreading disinformation, including alleged conspiracy theories.
The Stanford Internet Observatory, the University of Washington, the Atlantic Council’s Digital Forensic Research Lab, and Graphika all have inadequately disclosed ties to the Department of Defense, the C.I.A., and other intelligence agencies. They work with multiple U.S. government agencies to institutionalize censorship research and advocacy within dozens of other universities and think tanks.
It is important to understand how these groups function. They are not publicly engaging with their opponents in an open exchange of ideas. They aren’t asking for a national debate over the limits of the First Amendment. Rather, they are creating blacklists of disfavored people and then pressuring, cajoling, and demanding that social media platforms censor, deamplify, and even ban the people on these blacklists.
Who are the censors? They are a familiar type. Overly confident in their ability to discern truth from falsity, good intention from bad intention, the instinct of these hall monitor-types is to complain to the teacher — and, if the teacher doesn’t comply, to go above them, to the principal. Such an approach might work in middle school and many elite universities, but it is anathema to freedom and is an abuse of power.
These organizations and others are also running their own influence operations, often under the guise of “fact-checking.” The intellectual leaders of the censorship complex have convinced journalists and social media executives that accurate information is disinformation, that valid hypotheses are conspiracy theories, and that greater self-censorship results in more accurate reporting. In many instances, censorship, such as labeling social media posts, is part of the influence operation aimed at discrediting factual information.
The censorship industrial complex combines established methods of psychological manipulation, some developed by the U.S. Military during the Global War on Terror, with highly sophisticated tools from computer science, including artificial intelligence. The complex’s leaders are driven by the fear that the Internet and social media platforms empower populist, alternative, and fringe personalities and views, which they regard as destabilizing. Federal government officials, agencies, and contractors have gone from fighting ISIS recruiters and Russian bots to censoring and deplatforming ordinary Americans and disfavored public figures.
Importantly, the bar for bringing in military-grade government monitoring and speech-countering techniques has moved from “countering terrorism” to “countering extremism” to countering simple misinformation. The government no longer needs a predicate of calling you a terrorist or extremist to deploy government resources to counter your political activity. The only predicate it needs is simply the assertion that the opinion you expressed on social media is wrong.
These efforts extend to influencing and even directing conventional news media organizations. Since 1971, when the Washington Post and New York Times elected to publish classified Pentagon papers about the war in Vietnam, journalists have understood that they have a professional obligation to report on leaked documents whose contents are in the public interest, even when those documents were stolen. And yet, in 2020, the Aspen Institute and Stanford’s Cyber Policy Center urged journalists to “Break the Pentagon Papers principle” and not cover leaked information, in order to prevent the spread of “disinformation.”
Government-funded censors frequently invoke the prevention of real-world harm to justify their demands for censorship, but the censors define harm far more expansively than the Supreme Court does. The censors have defined harm so broadly, in fact, that they have justified Facebook censoring accurate information about COVID vaccines, for example, to prevent “vaccine hesitancy.” Their goal, clearly, is not protecting the truth but rather persuading the public. That is the purpose of open debate and the free exchange of ideas. Persuasion by covert means is censorship.
And, increasingly, the censors say their goal is to restrict information that “delegitimizes” governmental, industrial, and news media organizations.[9] That mandate is so sweeping that it could easily censor criticism of any part of the status quo from elected officials to institutions to laws. This extreme, reactionary attitude is, bluntly, un-American.
First, Congress should immediately cut off funding to the censors and investigate their activities. Second, it should mandate instant reporting of all conversations between social media executives, government employees, and contractors concerning content moderation. Third, Congress should limit the broad permission given to social media platforms to censor, deplatform, and spread propaganda.
Whatever Congress does, it is incumbent upon the American people to wake up to the threat of government censorship via behind-the-scenes pressure on media corporations. “Only an alert and knowledgeable citizenry,” Eisenhower noted, “can compel the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals so that security and liberty may prosper together.”
The Censorship Industrial Complex Today
Definition and Mission
The censorship industrial complex is a network of ideologically-aligned governmental, NGO, and academic institutions that discovered over the last few years the power of censorship to protect their own interests against the volatility and risks of the democratic process. They are not “defending democracy,” as they claim. Rather they are defending their own policy and pecuniary interests against democracy.
National Science Foundation Funding
Since January 2021, the National Science Foundation (NSF) has made at least 64 government grants totaling $31.8 million for the science of “countering” social media “mis/disinformation,” along with two additional government grants totaling $7 million. Forty-two colleges and universities received the 64 grants.[10] NSF created a new research track for disinformation and censorship research, “Track F,” called “Trust and Authenticity in Communication Systems.”[11]
NSF justifies its censorship program as a way to defend civilization. “Modern life is increasingly dependent on access to communications systems that offer trustworthy and accurate information,” writes NSF in its 2022 research overview. “Yet these systems face a common threat; communication systems can be manipulated or can have unanticipated negative effects. Introducing misinformation into communication flows can disrupt the performance of a wide range of activities and the functioning of civil society.”[12]
NSF repeats the central claim of the censorship industrial complex that the Internet requires censorship. “Although false claims and other inauthentic behaviors have existed throughout history,” writes NSF, “the problems that they cause have reached critical proportions resulting from the massive scale of targeting and personalization, the rapid speed of information exchange, and the ability to automate information dissemination.”[13] Here is a sample of the censorship/disinformation initiatives NSF funded in 2022:
- University of Michigan: WiseDex “harnesses the wisdom of crowds and AI techniques to help flag more posts.”
- Hacks/Hackers: Toolkit for “building trust around controversial topics such as vaccine efficacy.”
- Ohio State University: CO:CAST “helps decision-makers manage their information environment.”
- Meedan: Co·Insights “enables community, fact-checking, and academic organizations to collaborate and respond effectively to emerging misinformation narratives that stoke social conflict and distrust.”
- Temple University’s CommuniTies: “Using an AI network science tool, CommuniTies provides actionable insights for local newsrooms to help them build digital lines of communication with their communities, preventing the spread of misinformation and disinformation.”
- University of Wisconsin: Course Correct is “a dynamic misinformation identification dashboard to empower journalists to identify misinformation networks, and correct misinformation.”
Defense Advanced Research Projects Agency (DARPA) Roots
The censorship industrial complex today is using tools that the DoD originally developed to fight terrorists.
For example, in 2011 DARPA created the Social Media in Strategic Communication (SMISC) program “to help identify misinformation or deception campaigns and counter them with truthful information.”
DARPA said the goals were:
- “Detect… misinformation.”
- “Recognize persuasion campaign structures and influence operations across social media sites and communities.”
- “Identify participants and intent, and measure effects of persuasion campaigns”
- "Counter messaging of detected adversary influence operations.”[14]
The four goals of “Course Correct,” a project funded by NSF, target U.S. citizens today in a nearly identical way:
- “… detect misinformation…”
- “…continue developing A/B-tested correction strategies against misinformation…”
- “... evaluate the effectiveness of evidence-based corrections… by conducting small, randomized control trials…”
- “ongoing collaborations with journalists, as well as tech developers and software engineers.”
Key Organizations
CISA: The Cybersecurity and Infrastructure Security Agency, an agency within the Department of Homeland Security (DHS). On January 6, 2017, outgoing Obama Administration DHS Secretary Jeh Johnson designated “election infrastructure” as “critical infrastructure,” opening up CISA’s mission to censoring alleged “disinformation.”[15] Congress created CISA in November 2018 to defend the U.S. from cybersecurity threats from hostile foreign actors (e.g., Russian hackers).[16]
Digital Forensic Research (DFR) Lab at the Atlantic Council: The lab is one of the most established and influential full-time censorship institutions in the world.[17] The DFR Lab created the foreign-facing DisinfoPortal in June 2018, working directly with the National Endowment for Democracy (NED) and 23 organizations to censor election narratives leading up to the 2019 elections in Europe.[18] In 2018, Facebook named the Atlantic Council an official partner in “countering disinformation” worldwide.[19] US taxpayer funding to the Atlantic Council comes from the Defense Department, the US Marines, the US Air Force, the US Navy, the State Department, USAID, and the National Endowment for Democracy, as well as energy companies and weapons manufacturers.[20]
Graphika: A private network analysis firm. Graphika published a report for the Senate Intelligence Committee in December 2018, which claimed to have uncovered “in unusually rich detail the scope of Russia's interference not only in the 2016 U.S. presidential election but also in our day-to-day democratic dialogue.”[21] Graphika hired Ben Nimmo away from the DFR Lab as its director of investigations.[22] The Defense Department's Minerva Initiative, which focuses on psychological warfare, and DARPA both gave grants to Graphika.[23] In 2021, the Pentagon awarded nearly $5 million in grants and nearly $2 million in contracts to the organization.[24] Last fall, Graphika alleged that cartoons on a fringe website were the work of “suspected Russian actors” who were “engaged in a renewed effort” to interfere in the 2022 midterm elections.[25] The New York Times picked up the story.[26]
Moonshot CVE: A private firm whose stated mission is to redirect right-wing people online away from radicalism,[27] but which was found to have pushed right-wing people toward an anarchist leader. “They sent people who were already looking for violence to a convicted felon with anarchist and anti-Semitic views,” Rep. Morgan Griffith (R-Va.) said to Google’s CEO. “Who is vetting the vetters? We continue to need more transparency and accountability.”[28] Moonshot’s team includes Elizabeth Neumann, former DHS Assistant Secretary for Counterterrorism.
FITF: Foreign Influence Task Force, a cyber-regulatory body composed of members of the FBI, DHS, and ODNI.
GEC: Global Engagement Center, an analytical division of the U.S. State Department that systematically launders domestic censorship by working through “counter-disinformation” NGOs and foreign firms.
Hamilton 68: A dashboard created with U.S. government funding and the support of New Knowledge that claimed to reveal Russian bots on Twitter but was mocked by Twitter staff because all or almost all of the accounts belonged to American citizens.
HSIN: Homeland Security Information Network, a portal through which states and other official bodies can send “flagged” accounts.
EIP: Election Integrity Partnership, a partnership between four government-funded censorship organizations: the Stanford Internet Observatory, Graphika, the University of Washington Disinformation Lab, and the Atlantic Council’s Digital Forensic Research Lab. EIP has served as CISA’s deputized domestic disinformation flagger.
IRA: Internet Research Agency, the infamous Russian “troll farm” headed by “Putin’s chef,” Yevgeny Prigozhin.
MISP: Malware Information Sharing Platform, used by cybersecurity operatives to share information on malware, bot networks, and coordinated inauthentic operations. “When DFR wanted to apply cybersec tools to misinformation,” said a government disinformation specialist, “they used MISP.”[29]
NewsGuard and the Global Disinformation Index: Both taxpayer-funded, they urge advertisers to boycott disfavored publications and direct their funding to favored ones. The organizations have been caught spreading disinformation, including the claim that the COVID lab leak theory is a debunked conspiracy theory, and have sought to discredit publications that accurately reported on Hunter Biden’s laptop, such as the New York Post.
Cognitive Security Collaborative and Adversarial Misinformation and Influence Tactics and Techniques: These are online platforms for describing and coordinating disinformation attacks. “It works like other security operations focused on threat actors,” noted the specialist. “If they have a threat actor who has launched a coordinated inauthentic information attack,” said a source, “they would log the threat actor and start mapping the actor just as they would a cyber attack. They then coordinate social media takedowns” [removals].[30]
University of Washington (UW): One of two academic institutions with which DHS worked directly as a partner to censor information on social media platforms during the 2020 election.[31] In 2021 it received a $3 million government grant from the Biden Administration, shared with the Stanford Internet Observatory, to continue its “election misinformation” flagging.[32]
Stanford Internet Observatory (SIO): One of the four members of the Election Integrity Partnership (and later the Virality Project), along with UW, Graphika, and the DFR Lab. It was created in June 2019 by director Alex Stamos and research manager Renee DiResta. SIO monitors social media and promotes Internet censorship. For the 2020 election, as part of its partnership with CISA, SIO had 50 “misinformation” analysts assigned to monitor social media.[33] SIO was originally funded by Craig Newmark Philanthropies, the Omidyar Network, and the Charles Koch Foundation.[34]
Key Individuals
- Graham Brookie, leader of the Atlantic Council’s DFR Lab. Brookie served in the Obama White House on the National Security Council.[35]
- Renee DiResta, Stanford Internet Observatory. DiResta was the research director for the organization caught creating bot accounts and spreading disinformation about Alabama Republican Senate Candidate Roy Moore.[36] In her 2018 Senate testimony DiResta advocated “legislation that defines and criminalizes foreign propaganda” and allowing law enforcement to “prosecute foreign propaganda.”[37] According to recorded remarks by DiResta’s supervisor at Stanford, Alex Stamos, she had previously “worked for the CIA.”[38]
- Jen Easterly, CISA Director. A former military intelligence officer and the National Security Agency (NSA) deputy director for counterterrorism. “One could argue we’re in the business of critical infrastructure,” said Easterly in November 2021, “and the most critical infrastructure is our cognitive infrastructure, so building that resilience to misinformation and disinformation, I think, is incredibly important.”[39] The month before, Easterly said during a CISA summit that Chris Krebs's construction of a “counter-misinformation” complex with the private sector was a high priority for DHS.[40] A U.S. District Court ruled in October of last year that Easterly could be deposed because of her “first-hand knowledge” of the CISA “nerve center” around disinformation.[41]
- Chris Krebs, CISA Director (2018 to 2020) and chair of the Aspen Institute “Commission on Information Disorder,” helped organize DHS’s “whole-of-society” approach to censorship.[42] Krebs administered the federal side of the 2020 election after DHS effectively nationalized election security on January 6, 2017, via the declaration of elections as “critical infrastructure.” Krebs then declared that “misinformation” was an attack on election security. Krebs said in April 2022 that the Hunter Biden laptop still looked like Russian disinformation and that what mattered was that news media did not cover the laptop during the 2020 election cycle.[43] Krebs advocated for censoring critics of government COVID-19 protocols[44] and said “misinformation” is the largest threat to election security.[45]
- Ben Nimmo, Head of Global Threat Intelligence for Facebook, and thus one of America’s most important censors. Nimmo was the technical lead for censorship at the Atlantic Council's Digital Forensics Research Lab, was employed by Graphika in the fall of 2020 [46], and worked in NATO information operations.[47] In 2018, Nimmo publicly reported an anonymous Twitter account, “Ian56”, as a Russian disinformation bot because it expressed left-of-center populist anti-war views, when in reality Ian56 was a real person.[48] After Nimmo's report, “Ian56” was reported to the UK government.[49]
- Kate Starbird, who runs the University of Washington disinformation lab, has for years been funded primarily by U.S. government agencies to do social media narrative analytics of political groups, or insurgency movements, of interest or concern to U.S. military, intelligence, or diplomatic equities. Starbird acknowledged that the censorship focus of CISA and EIP moved from “foreign, inauthentic” social media users to “domestic, authentic” social media users between 2016 and 2020.[50] Starbird is now the head of CISA’s censorship advisory subcommittee.
- Alex Stamos was the senior leader at EIP and the Virality Project, which served as the deputized domestic “disinformation” flagger for DHS via Chris Krebs’ CISA. Stamos in 2020 proposed that DHS centralize government censorship.[51] Stamos was the Chief Security Officer of Facebook and led Facebook's response to alleged Russian disinformation after the 2016 election. Stamos left Facebook, now Meta, in 2018, after reportedly conflicting with other Facebook executives over how much to censor.[52] Stamos says he favors moving away from a free and open Internet toward a more controlled “cable news network” model. A huge part of the problem is “large influencers,” said Stamos.[53]
- Claire Wardle cofounded and directed First Draft News, a nonprofit coalition launched in June 2015 to build the censorship complex. “In September 2016, our original coalition expanded to become an international Partner Network of newsrooms, universities, platforms and civil society organizations.” In 2017, while at the Shorenstein Center on Media, Politics and Public Policy at Harvard’s Kennedy School, Wardle helped develop the “Information Disorder Lab,” a framing that the Aspen Institute would embrace. In June 2022, First Draft closed, but its work lives on at the Information Futures Lab at Brown University’s School of Public Health.
The Complex’s Disinformation Campaigns
Many of the leaders and participants in today’s censorship industrial complex have been involved in spreading disinformation, including conspiracy theories, while discrediting accurate information and alleging that valid theories were debunked conspiracy theories.
1. The Trump-Russian Collusion Conspiracy Theory, 2016–2019
The complex’s first major disinformation campaign was the conspiracy theory that Donald Trump colluded with Vladimir Putin and the Russian government to steal the 2016 election.
There is no evidence that Russia’s social media investments, or its hacking and leaking of emails, had any impact, much less a decisive one, on the outcome of the 2016 election.[54] Most neutral analysts, as well as many Democratic strategists, believe that they did not.[55]
Two of the four leading censorship organizations, New Knowledge and Graphika, provided the Senate Intelligence Committee with the academic foundation for the claim that the Russians had elected Trump. They pointed to evidence that ten million people in the U.S. had seen the ads.[56]
“The Russian disinformation operations that affected the 2016 United States presidential election are by no means over,” wrote DiResta in the New York Times in December 2018. “Russian interference through social media… is a chronic, widespread and identifiable condition that we must now aggressively manage.”[57]
Her findings were widely respected and publicized. Former director of national intelligence James Clapper called the evidence that Russia had influenced the election "staggering." University of Pennsylvania communication professor Kathleen Hall Jamieson pointed to the evidence to conclude that Trump would not have been president without the Russians.[58]
But there is no evidence that the Russians had any influence on the 2016 campaign, much less that they won it for Trump. Robert Faris, Hal Roberts, and Yochai Benkler of Harvard's Berkman Klein Center for Internet and Society analyzed millions of articles from 2015 to 2018, using network analysis to measure how audiences paid attention to media coverage and text analysis to see which sites wrote about what and when, and they wrote detailed case studies of the most salient issues in the election. “The Russians are there,” wrote Benkler. “They are trying. But in all these cases, American right-wing media did the heavy lifting to originate and propagate disinformation.”
Conservative voters did not consume very much social media compared to news media in 2016. While 40 percent of Trump voters said Fox was their primary source of news, only 7 percent said Facebook. “People promoting the idea that Russia swung the election will often cite the figure that Russian Facebook posts reached about 126 million Americans. But that refers to anyone whose news feed ever included such a piece of content, regardless of whether they saw it, or whether it may have been drowned out in their minds by hundreds of other posts.”[59] What’s more, 56% of the Russian troll farm’s pages appeared after the election and 25% were seen by no one.[60]
DiResta’s work is plagued by exaggeration. “The consolidation of the online social ecosystem into a few major platforms means,” she wrote in the Times, “that propagandists have ready audiences; they need only blanket a handful of services to reach hundreds of millions of people. And precision targeting, made possible by a decade of gathering detailed user behavior data (in the service of selling ads), means that it is easy and inexpensive to reach any targeted group” (my emphasis).[61]
But if it is so cheap and easy to reach hundreds of millions of people on-line, why don’t more people do it? Why must politicians and corporations alike spend tens of millions trying to reach audiences? Because it’s not cheap or easy, as anybody who has attempted to market a product or candidate online knows. It is for that reason that the Russians reached so few people.
What about the DNC and Podesta email hacks? DiResta and Twitter’s Yoel Roth, following Jamieson, claimed that the publicity around the hacks interrupted the “Access Hollywood” tape release where Trump boasts of grabbing the genitals of women. “The Podesta emails, for their part, were released by WikiLeaks on Oct. 7, 2016, less than an hour after the "Access Hollywood" tape came out, in a clear effort to divert attention from that embarrassing story about Trump's lewd comments apparently acknowledging sexual misconduct with women,” note the Harvard scholars.
But it had little impact. “While they [the hacked emails] certainly drew attention, generating between 150 and 400 stories per day in the 10 days after their release, the emails failed to divert attention from the ‘Access Hollywood’ tape, which generated two to three thousand stories per day…. Given the volume and tenor of mainstream and right-wing domestic coverage of Clinton, it seems unlikely that Russian propaganda made much of a difference.”[62]
Scholars and Facebook warn that such gross exaggerations of Russian influence are a form of disinformation that helps Putin, or at least the people trying to win his favor, while only making Americans more confused.
“When we propagate the idea that Russian propaganda is the all-powerful source of disinformation in American politics,” writes Benkler, “we reinforce precisely this primary goal: We sow confusion.”[63]
Wrote Facebook in 2022, “These actors… have an interest in exaggerating their own effectiveness, engaging in client-facing perception hacking to burnish their credentials with those who might be paying them.”[64] As such, those who exaggerate the impact of foreign interference may create a financial and political incentive for more of it.
When challenged on these claims, DiResta and others emphasize that we should be alarmed simply by the fact that foreign interference is happening.
But governments have been interfering in each other’s elections for hundreds of years. France interfered in the 1796 U.S. presidential election. The French ambassador openly campaigned for the Republicans, attacked the Federalists, and urged President George Washington’s secretary of state to reject the Jay Treaty, the trade agreement between the U.S. and Great Britain, with which France had been at war since 1793.[65]
The U.S. has secretly sought to influence elections in South Vietnam and Japan, El Salvador, Haiti, Guatemala, Brazil, Israel, Lebanon, Panama, Iran, Greece, Italy, Malta, Slovakia, Romania, Bulgaria, Albania, Sri Lanka, and the Philippines.[66]
That historical context is not meant to justify interference in other nations’ elections. It is meant to show how overwrought such claims were and are. The bottom line is that it’s hard to change the minds of voters, in general. And foreign actors are usually far less able to do so than domestic actors, who have much more at stake and understand the nuances of local politics.
One final note: it’s not clear to me whether the promoters of the Russia-Trump conspiracy theory even believed what they were saying, or whether they were simply using it as a pretext for censorship.
2. Delegitimizing the COVID Lab Leak Theory, 2020–2021
The second major disinformation campaign aimed at the American people began in February 2020. It held that the hypothesis that the COVID virus originated in a Chinese laboratory was a “debunked conspiracy theory,” when in fact this idea was always just as reasonable as the theory that the virus crossed over from wild animals to humans. This disinformation campaign was advanced by National Institutes of Health head Francis Collins and NIAID’s Anthony Fauci, who oversaw the U.S. government’s response to COVID. Emails made public show that at least two leading researchers told Collins and Fauci in February 2020 that a lab leak was possible and likely. Collins and Fauci publicly dismissed the lab leak theory as a conspiracy theory even though they knew it wasn’t, perhaps for fear of harming cooperation between the U.S. and China, or of being implicated in the pandemic, since Fauci was instrumental in offshoring this research to Wuhan after the Obama administration banned it on U.S. soil.[67]
3. The Hunter Biden Laptop Conspiracy Theory, 2020–2021
The third major disinformation campaign also occurred in 2020 and was aimed at convincing journalists, social media executives, and the American people that the Hunter Biden laptop had been made public through a Russian “hack and leak” operation and not, as the New York Post reported on October 14, 2020, through a computer repair store owner.[68]
The Stanford Internet Observatory published a report urging news media to abandon the ethic held since the publishing of the “Pentagon Papers” in 1971 and to focus instead on the perpetrators of a hack and leak rather than on the contents of the leak.[69] In the summer of 2020, months before the October 14 publication, the Aspen Institute hosted a “tabletop exercise,” in what may have been a “pre-bunking” operation, for the top censors at Facebook and Twitter, as well as national security reporters at the New York Times, Washington Post, and CNN, to shape reporting around a potential “hack and leak” relating to Hunter Biden.
The greatest episode of the censorship industrial complex’s discrediting of factual information was the “prebunking” it did of the Hunter Biden laptop.[70] There is strong evidence of an organized effort by representatives of the intelligence community (IC), aimed at senior executives at news and social media companies, to discredit leaked information about Hunter Biden before and after it was published.
During all of 2020, the FBI and other law enforcement agencies repeatedly primed Twitter’s head of Site Integrity (and later Head of Trust and Safety) Yoel Roth to dismiss reports of Hunter Biden’s laptop as a Russian “hack and leak” operation. The following is from a sworn declaration given by Roth in December 2020:
During these weekly meetings, the federal law enforcement agencies communicated that they expected ‘hack-and-leak operations’ by state actors might occur in the period shortly before the 2020 presidential election, likely in October. I was told in these meetings that the intelligence community expected that individuals associated with political campaigns would be subject to hacking attacks and that material obtained through those hacking attacks would likely be disseminated over social media platforms, including Twitter. These expectations of hack-and-leak operations were discussed throughout 2020. I also learned in these meetings that there were rumors that a hack-and-leak operation would involve Hunter Biden.
FBI did the same to Facebook, according to CEO Mark Zuckerberg. “The FBI basically came to us [and] was like, ‘Hey... you should be on high alert. We thought that there was a lot of Russian propaganda in the 2016 election. There's about to be some kind of dump similar to that.’”
And yet the FBI warnings of a Russian hack-and-leak operation relating to Hunter Biden were not based on any new intelligence. “Through our investigations, we did not see any similar competing intrusions to what had happened in 2016,” admitted FBI agent Elvis Chan in November 2022.
Indeed, Twitter executives repeatedly reported very little Russian activity. For example, on September 24, 2020, Twitter told the FBI it had removed 345 “largely inactive” accounts “linked to previous coordinated Russian hacking attempts.” They “had little reach and low follower accounts."[71]
In fact, Twitter staff routinely debunked false claims made by mainstream journalists of foreign influence on its platform. In response to an article suggesting the #dcblackout campaign was driven by foreign bots, Yoel Roth wrote in an email to Elvis Chan, “We haven’t seen any evidence to support that claim.”[72] After the FBI asked about a Washington Post story on alleged foreign influence in a pro-Republican tweet, Roth said, "The article makes a lot of insinuations... but we saw no evidence that that was the case here (and in fact, a lot of strong evidence pointing in the other direction).”[73]
Pressure from the FBI on Twitter had been growing. “We have seen a sustained (If uncoordinated) effort by the IC [intelligence community] to push us to share more information and change our API policies,” complained a senior Twitter executive. “They are probing and pushing everywhere they can (including by whispering to congressional staff).”[74]
Despite Twitter’s pushback, the FBI repeatedly requested information from Twitter that Twitter had already made clear it would not share outside of normal legal channels.
Recently, Twitter’s Roth told tech journalist Kara Swisher that he had been primed to think about the Russian hacking group APT28 before news of the Hunter Biden laptop came out. When it did, Roth said, “It set off every single one of my finely tuned APT28 hack-and-leak campaign alarm bells.”[75]
Jim Baker is the former general counsel of the FBI (2014-18) and one of the most powerful men in the U.S. intelligence community. Baker has moved in and out of government for 30 years, serving stints at CNN, Bridgewater (a $140 billion asset management firm), and the Brookings Institution. As general counsel of the FBI, Baker played a central role in making the case internally for an investigation of Donald Trump.
Baker wasn't the only senior FBI executive involved in the Trump investigation to go to Twitter. Dawn Burton, the former deputy chief of staff to FBI head James Comey, who initiated the investigation of Trump, joined Twitter in 2019 as director of strategy.
As of 2020, there were so many former FBI employees — “Bu alumni” — working at Twitter that they had created their own private Slack channel and a crib sheet to onboard new FBI arrivals.[76]
On October 14, shortly after the New York Post published its Hunter Biden laptop story, Roth said, “it isn’t clearly violative of our Hacked Materials Policy, nor is it clearly in violation of anything else," but added, “this feels a lot like a somewhat subtle leak operation.”[77]
In response to Roth, Baker repeatedly insisted that the Hunter Biden materials were either faked, hacked, or both, and a violation of Twitter policy. Baker did so over email, and in a Google doc, on October 14 and 15. It is difficult to believe that Baker genuinely thought the Hunter Biden emails were either fake or hacked. The New York Post had included a picture of the receipt signed by Hunter Biden, and an FBI subpoena showed that the agency had taken possession of the laptop in December 2019.
Finally, by 10 am, Twitter executives had bought into a hack-and-dump story. “The suggestion from experts - which rings true - is there was a hack that happened separately, and they loaded the hacked materials on the laptop that magically appeared at a repair shop in Delaware,” wrote Roth.[78]
Ideology, Strategy, And Origins
Ideology
Leaders and members of the censorship complex share a common set of foundational beliefs and worldviews constituting an ideology. At the heart of the ideology is a highly simplistic view of truth. Something is true, or it isn’t, holds censorship ideology. Something is either true or false in the same way that something is either black or white. There are few if any grays. This truth/falsity binary underlies the information/misinformation distinction.
Censorship ideology also holds a highly simplistic view of human intentionality: people either intend to tell the truth or they intend to lie. This good intention/bad intention distinction underlies the distinction between information and misinformation on the one hand and “disinformation” on the other, since the only difference between misinformation and disinformation is the intention to mislead. Again, there is little room in the censorship ideology for gray areas.
The problem is that many of the issues the censorship industrial complex wants to censor aren’t obviously “true” or “false.” The censorship industry implicitly acknowledges this with its concept of “malinformation”: accurate facts used to “mislead” people through “false narratives.”
The justification the White House and Facebook used to censor accurate vaccine information was that it was leading to “vaccine hesitancy.” In that instance, censorship went from censoring falsity to censoring dangerous truths. That act is fundamentally undemocratic and anathema to America’s commitment to freedom of speech.
Censorship ideology holds that the censors are able, at least better than most people, to determine the truth or falsity of something and the intention of the person or organization behind it. As such, censorship ideology is fundamentally elitist. Holders of censorship ideology believe that “disinformation experts,” as many of them define themselves, are well-suited to define what counts as misleading information, including narratives, and to demand that social media platforms and others censor misinformation, disinformation, and malinformation.
Calling oneself a “disinformation expert” is like calling oneself a “truth expert.” It is naive, grandiose, and hubristic. Having read hundreds of pages of justifications for the censorship industrial complex in general and for specific censorship efforts, I can testify that the worldview is not significantly more complex than that.
It’s true that “disinformation experts” emphasize that they rely on other experts to determine what is true and false. But since experts in every field disagree, relying on some experts over others means either being an expert in every domain of human investigation or using irrational criteria for deciding between experts, e.g., credentialism.
Since Socrates and Plato, humans have grappled seriously with the universality of human irrationality, both how and why we get things wrong, and often in the same ways, over time. No human has god-like omniscience and wisdom. Everybody is wrong about something. Science evolves. Scientists thought volcanoes made dinosaurs extinct and then thought a large asteroid did and now many believe it was a combination.
The example is relevant because often what’s being labeled “disinformation” by the censorship industry aren’t facts but hypotheses, such as the idea that COVID came from a lab rather than from nature. Indeed, when Facebook was forced to justify its censorship of accurate information in court, Facebook said its censorship constituted “an opinion,” even though it had attached a “fact-check” label to the content.[79]
Adherents to censorship ideology dismiss these objections by making irrational emotional appeals and fear-mongering about the alleged dangers of misinformation, disinformation, and malinformation (MDM). They believe that the U.S. and other liberal democracies are in an “information war” on the Internet against actors who are causing harm with their MDM.
DiResta resists describing what she and her colleagues are doing as censorship. “Content moderation is not a binary ‘Take it down/Leave it up,’” explained DiResta. “I'll use Facebook's terminology here. They have a framework called ‘Remove, Reduce, Inform.’ Remove means it comes down. Reduce means its distribution is limited. And inform means a label is put up. There is some sort of interstitial. A popup comes up, or there is a fact check under it.”[80] When I interviewed her, DiResta described the fact check label as “not censorship in any way, shape, or form.”[81]
Fighting disinformation, DiResta argued in 2018, “is not about arbitrating truth, nor is it a question of free speech” but rather it is “a cybersecurity issue, it is an ongoing national security issue, and it must be addressed through a collaboration between governments responsible for the safety of their citizens and private industry responsible for the integrity of their products and platforms.”[82]
But fighting disinformation involves arbitrating truth and freedom of speech. How could it not? For something to be “disinformation,” one must already have determined that it is not only untrue but intentionally so. And labeling something “disinformation” is often, if not usually, a pretext for demanding censorship.
Censorship complex leaders do not argue that all MDM should be censored and often acknowledge their own limitations. Many openly recognize that doing so would be impossible or be a violation of the fundamental right to freedom of speech. “We are never going to live in a world free of mis- and disinformation,” said DiResta in 2021. “Such a world has never existed, and the government is not going to snap its fingers and regulate the problem away, because misinformation is ultimately speech.”
Rather, they argue that MDM that “causes harm” should be censored, and “repeat offenders” should be deamplified or de-platformed. DiResta describes her research into “how to respond to misinformation and disinformation in areas in which it can have significant harm.”[83]
The censorship industrial complex defines harm far more broadly than the Supreme Court. The U.S. Supreme Court defined “fighting words” in Chaplinsky v. New Hampshire (1942) as words which “by their very utterance, inflict injury or tend to incite an immediate breach of the peace.”[84] Speech that incites riots is also not protected.[85] The Supreme Court narrowed the scope of what counts as fighting words in Terminiello v. Chicago (1949),[86] arguing that for words to be constitutionally unprotected, they must produce a clear and present danger.[87]
And the Supreme Court has upheld very strong protections for speech that causes social conflict and unrest. In 1989 the Court found that burning the U.S. flag was not incitement.[88] And in 1992, the Court held that the “First Amendment prevents the government from punishing speech and expressive conduct because it disapproves of the ideas expressed.”[89] Notes one legal scholar, “Even if the words are considered to be fighting words, the First Amendment will still protect the speech if the speech restriction is based on viewpoint discrimination.”[90]
But the censorship complex is aggressive and expansionary, seeking to “map” the entire media sphere, with the aim of controlling the information environment. “When we can monitor the system as a whole and we understand the spread of information throughout the system, we can find opportunities to intercede,” explained DiResta in 2018 at Aspen.[91]
Another assumption of censorship ideologues is that federal propaganda and censorship are required. The argument made by EIP is that it was for some reason not good enough that state and local election officials communicate directly to the public through social media and news media. Rather, such officials required federally-funded experts at universities and think tanks to engage in propaganda and censorship efforts in their support.
DiResta and her colleagues have sought ways to deplatform people across multiple issues beyond COVID and elections. “Several platforms, for example, implemented a repeat spreader strike system after the election and then have since applied it to other areas of misinformation that causes significant harm.”[92]
Leaders of the censorship industrial complex claim to be nonpartisan, but their censorship is heavily focused on Republicans and Trump supporters. The leaders of all four EIP organizations share the same broadly anti-populist ideological orientation that might in the past have been accurately described as Cold War liberalism. Every “repeat misinformation spreader” account that EIP reported to social media companies for censorship through deamplification espoused right-populist views.[93]
Strategy
The censors’ goal is greater and greater control over social media platforms. “How do we decide what content people see?” pondered DiResta at an Aspen Institute gathering in 2018. “How do we decide what topics are recommended? Is there a ‘do-not-recommend’ list where we think more strategically?”[94]
The censorship complex moves from issue to issue without hesitation. Immediately after the 2020 election, all four members of EIP switched from policing and censoring election skepticism to policing and censoring vaccine skepticism. “Following the success of EIP and the certification of the 2020 election, SIO ramped down its monitoring and analysis capability because we thought that we were done with that,” explained DiResta in 2021. “However, almost immediately, we recognized the need to ramp back up this time to support government health officials’ efforts to combat misinformation.”[95]
The censorship industrial complex has large ambitions and a long-term vision of a public-private partnership to control the information environment. “The hard truth is that the problem of disinformation campaigns will never be fixed,” wrote DiResta in December 2018. “It’s a constantly evolving arms race. But it can — and must — be managed. This will require that social media platforms, independent researchers, and the government work together as partners in the fight.”[96]
Time and again DiResta and her colleagues emphasize the importance of government agencies outsourcing censorship to private entities, but working closely together. “We can establish the non-government capability,” she said in 2021. “And this will also help identify emerging issues for possible debunking and community or civil society coordination to deliver those messages to audiences that really trust what they have to say.”[97]
DiResta has repeatedly defended government demands of social media companies to censor as legitimate, First Amendment-protected “counter-speech,” and dismisses public alarm at the censorship by social media companies at the behest of U.S. government officials, revealed by the Twitter Files. In a recent podcast with me, DiResta claimed that the alarm was a result of a “lack of familiarity” with content moderation, rather than a clash of values.[98]
The new censors encountered resistance to the infringements upon the First Amendment that they felt they needed, and so they sought to subcontract a significant amount of the censorship to the private sector, while also creating a revolving door between government agencies, charitable philanthropies, NGOs, social media platforms, and academic research institutions.
“The tech companies don't want to be seen as doing the work of the government,” said Renee DiResta in 2018, reflecting on the attitudes in the year 2015. “You have the EFF [Electronic Frontier Foundation] arguing that moderation is censorship…. public-private partnerships I think, are absolutely key… I know that triggers some people who get worried about privacy and such, but I don't think that there's any way to do it other than to treat it as an information war.”[99]
Today, the censorship industrial complex’s bid for global media and communications domination consists of pushing social media platforms to become more traditional news media companies, whether newspapers or TV networks, which the national security state in the past has been much better able to control.
In fact, because social media platforms depend on Section 230 protection to avoid strict legal liability, the censorship industrial complex may exert significantly more control over them than over news organizations. Whatever the final outcome, the direction of travel is clear: the censorship industrial complex seeks to restrict freedom of speech in order to narrow the public debate and exclude views that it regards as “delegitimizing.”
The censorship complex employs various tactics. One is the relentless demand for censorship, as we saw with the Twitter Files and the Facebook Files, released by the attorneys general of Missouri and Louisiana. Another is directing government and philanthropic money toward research and advocacy for greater censorship of social media platforms, the specific task of at least two organizations, NewsGuard and the Global Disinformation Index.
These two tactics go hand in hand. The new censors seek legislation that would give increasing control over the content moderation of social media platforms to establishment experts and elites, while others in their network seek to direct advertiser revenue away from disfavored news media corporations, mostly conservative and libertarian but also some radical Left-wing ones, and toward favored news organizations, mostly liberal and establishment-friendly progressive ones.
What’s more, government and nongovernmental censors work together to threaten to revoke the Section 230 protections of social media, on the one hand, and demand censorship on the other. Sometimes they boast that the tech firms only caved thanks to the “huge regulatory stakes” for not censoring.[100]
Censors work together to deplatform disfavored individuals. This starts with labeling them “superspreaders” of disinformation. A disaffected disinformation warrior explained how it works. “When Cog Sec, DFR or MISP identify a threat, they get on their Slack channels and discussion groups. They’ll say, ‘We’ve identified this threat actor’ and that’s all that’s needed. Either a structured response will form, or somebody within the network will respond on their own. It isn't necessarily proven disinformation. Sometimes it could be in-depth commentary, a story someone likes, or a story that somebody took a loose interpretation of, but is generally factual. If it is disliked or falls afoul of an active narrative, they launch a counteroperation, an example of which could be mass reporting on social media.”[101]
Part of the censorship industrial complex is public-facing, and another part is secretive. Its public aim appears to be to increase public comfort with growing censorship. Its members publish videos, podcasts, reports, and op-eds in newspapers, raising the alarm about “disinformation” and “conspiracy theories,” even as the complex spreads them. DiResta plays a dual role. On the one hand, she is the most articulate public advocate for state-sponsored censorship. On the other, she kept her work for the Department of Defense, and, according to 2021 remarks by her supervisor at Stanford, Alex Stamos, for the CIA, hidden from public view.[102] In September 2021, DiResta recruited one of Twitter’s top censors, Yoel Roth, to attend a DARPA-funded workshop on “affective polarization on social media.” She asked him to keep his involvement quiet.[103]
Finally, the censorship industrial complex is expansionary and has missionary zeal. The categories of things it wants to censor have expanded, in just four years, from “foreign disinformation” to “domestic disinformation” to “misinformation” to “malinformation” and “malign narratives,” the latter two of which might contain a significant amount of accurate information, like the accurate COVID vaccine information that Facebook censored at the request of the Biden administration in 2021.[104]
These phrases evoke totalitarian kinds of social control that are anathema to the American tradition of radical speech rights. We should remember that “truth is the first casualty of war,” and the same holds doubly true for the “information war.”
Much of what the censorship industrial complex advocates is not what it seems. The censorship complex is advocating the “Platform Accountability and Transparency Act,” which would make platform data available only to “qualified research projects” and “qualified researchers,” as determined by the same NSF that is overseeing the distribution of government funding to “disinformation experts” and censorship advocates.[105] Neither ordinary citizens nor journalists nor policymakers would have direct access to the data under the proposal. As such, the Act would increase rather than reduce the power of the censorship industrial complex.
This is a radical departure from the Cold War, when the U.S. government not only didn’t criminalize foreign propaganda but rather translated it, including Soviet Communist propaganda, into English so that Americans could read it. “I worked on the global media side of the CIA,” said Martin Gurri, who wrote The Revolt of the Public, a book on the political impact of the Internet. “We used to translate reams of stuff from communist countries — Pravda, Izvestia, whatever — and put them out through something that was a kind of halfway house between the government and a public publication. As a result, American libraries all had Soviet propaganda given to them by the federal government! We didn’t think it would cause harm and convert people into communists. And the scholarly community loved it. And so the federal government used to translate and provide propaganda from the other side to the public without fear of what would happen!”
Origins
Elites in all societies seek to win and maintain the consent of the population they govern through communications. Machiavelli counseled leaders on deception and psychology to manipulate public opinion. Walter Lippmann in 1922 wrote of the need for government and industry leaders to use more advanced techniques through the mass media to “manufacture consent.” In a 1988 book of a similar title, Ed Herman and Noam Chomsky documented how U.S. news media uncritically cheered on the national security establishment, with few exceptions. For three additional decades, this arrangement worked, including with Internet 1.0, itself a product of the national security establishment, particularly DoD and DARPA.
The censorship complex has its roots in the war on terrorism, which began after September 11, 2001, and ran through to the 2015 disinformation war against ISIS recruitment by U.S. government agencies. In other words, its roots are fundamentally military and fundamentally about hierarchy, authority, and deception.
What is the motivation behind the ideology? Two seismic challenges to the postwar liberal order, both of which occurred in 2016: Brexit in June, and the election of Donald Trump as president in November. The two events shocked and frightened national security leaders on both sides of the Atlantic. Many openly said that the political threat to NATO and the Western Alliance was bigger than any security threat, a conclusion dramatically reinforced by Trump, who had repeatedly criticized NATO and hinted at withdrawing the U.S. from it.
“When Trump came onto the stage, the traditional center-Left got caught up in the ‘resistance,’” a professional from the defense establishment told me. “These are people who held strong roles and were strong commentators. When Trump showed up they turned into anti-government personalities. We saw people in government roles of authority openly name-calling and down-calling the president. Openly engaging in commentary and activity that could undermine the interests of the U.S. domestically and abroad.”[106]
Gurri agrees that an elite counter-revolutionary backlash against the Trump revolution, itself a product of the Internet revolution, is what’s driving the censorship industrial complex. “The flags and fact checks all assume people are stupid and will be misled. There is a Platonic guardian assumption elites make. The world to the elite mind breaks down between affluent, mobile, and articulate Platonic guardians and the rest of us would-be victims who they take care of at their mercy. That’s their vision of democracy. The radical move is to assume that the public is as smart as you are.”[107]
The U.S. national security establishment, along with other U.S. and Western elites, reacted in fear and disgust at the large amounts of grassroots, authentic, and nationalist media and messages from Brexit supporters and Trump, and at the very real possibility that they could destroy the liberal post-war global order upon which those elites depend. Elites spent the following six years reacting to this blow to their control over the media discourse and, thus, to their ability to manufacture consent.
I offer all of this as background, not as a statement of my own views of the liberal world order. I am a member of the educated elite and benefit from the liberal world order, for which I am grateful since it has kept the peace and done remarkably well in lifting people out of poverty and expanding human rights. I have, in recent years, defended Western civilization against those who are undermining its pillars of cheap energy, meritocracy, and law and order. I support NATO and Europe and the U.S. government’s support for the Ukrainian people’s defense against Russia’s invasion.
But I might be wrong, which is one reason among many that I believe the censorship industrial complex is so dangerous. Support for the liberal world order starts with defending freedom of speech from authoritarians. What today’s censors call “disinformation” is more often than not just “being wrong on the Internet.” I’ve been wrong about many things in my life, including energy and the environment, homelessness, and how to respond to the coronavirus. I don’t know whether I’m more wrong or more right than most people, but I’m glad we live in a free society that allows and indeed encourages us to be wrong, because that is how democratic consent is built: organically, messily, haphazardly, and over time.
Key Events
2017
Department of Homeland Security Expands Mission to Fight “Misinformation”
In January 2017, DHS quietly expanded its mission from cybersecurity to cybercensorship by arguing that “misinformation” is a “cyberattack” on US critical infrastructure. On January 6, 2017, in one of his final acts as Secretary of Homeland Security, Jeh Johnson declared elections “critical infrastructure.” The concept of critical infrastructure thus went from physical things like satellites, dams, and federal buildings to events like elections or public health campaigns. This allowed DHS to treat tweets about vaccine safety and mail-in ballots that it deemed “misinformation,” or simply “misleading,” as grounds for censorship, specifically asking social media companies to remove users, remove posts, or prevent their spread. DHS defined “misinformation” as an election security risk, a threat to national security, and an attack on democracy.
New Knowledge runs disinformation campaign against Republican Senate Candidate in Alabama
The disinformation operation set up fake Facebook pages for Roy Moore in Alabama that falsely portrayed Moore as saying he would ban alcohol, and created fake Russian troll accounts on Twitter to make it appear that Moore was getting support from the Russians, which journalists then reported as true.
DiResta was on the Board of Directors of the group running the disinformation operation, American Engagement Technologies (AET), and joined New Knowledge, which consulted on the operation, as research director one month later.
DiResta gave AET technical guidance and introduced its founders to potential financial supporters. DiResta told the Washington Post that “she became concerned with the opaqueness of the project, and severed ties with” AET.[108]
The disinformation operation run by DiResta’s colleagues at American Engagement Technologies and New Knowledge came to light when the Washington Post wrote about a 12-page report, produced three days after the December 12, 2017 election, in which the operators bragged about the effort, calling it “Project Birmingham.”[109] “We orchestrated an elaborate ‘false flag’ operation that planted the idea that the Moore campaign was amplified on social media by a Russian botnet,” the memo claimed. The goal was to “radicalize Democrats, suppress unpersuadable Republicans (‘hard Rs’) and faction moderate Republicans by advocating for write-in candidates.”
New Knowledge claimed that it won the race. The claim cannot be proven, but the vote was close: Moore lost by roughly 22,000 votes. New Knowledge said it had moved “enough votes to ensure a Doug Jones victory.”
The revolving door is apparent in the effort. “The money passed through American Engagement Technologies, run by Mikey Dickerson,” reported the Washington Post, “the founding director of the United States Digital Service, which was created during the Obama administration to try to upgrade the federal government’s use of technology. Sara K. Hudson, a former Justice Department fellow now with Investing in Us, a tech finance company partly funded by Mr. [Reid] Hoffman, worked on the project, along with Mr. [Jonathon] Morgan.”[110]
The memo says it “planted the idea that the Moore campaign was amplified on social media by a Russian botnet. We then tied that botnet to the Moore campaign digital director, making it appear as if he had purchased the accounts.”
There is much that is notable about this tactic. First, it evoked the same narrative that was being used against President Trump at the time, that the Russians were supporting him. As such, it was aimed at delegitimation, which DiResta had condemned in other contexts. Second, it actually used the tactic of bots that DiResta would describe as “information war” in her Senate testimony.
Journalists downplayed DiResta’s involvement, and even seemed to joke about it. Consider the exchange at an Aspen Institute event in 2018.[111]
DiResta: I have a bunch of accounts that pay attention to anti-vax content… My anti-vax accounts — accounts that were active in anti-vax groups, just listening, just sitting in those accounts….
Nicholas Thompson, The Atlantic: How many bot accounts do you run?
Renee DiResta: No comment.
Thompson: I would like a complete tally of Renee DiResta’s sock puppet accounts by the end of this panel!
[laughter]
Former FBI employee creates web site that falsely accuses ordinary Twitter users of being Russian bots
Former FBI employee Clint Watts received U.S. government funding to create the web site, which falsely accused conservatives of being Russian bots.[112] Watts received help from New Knowledge.[113]
Twitter’s Yoel Roth investigated and found the list full of “legitimate right-leaning accounts…. Virtually any conclusion drawn from [the dashboard] will take conversations in conservative circles on Twitter and accuse them of being Russian.” Roth recommended that Twitter “call this out on the bullshit it is.” But Roth’s supervisors feared the political consequences, and opted instead to play a “longer game.”[114]
2018
Senate Intelligence Report On Russian Interference
In 2018, DiResta was the lead researcher for the Senate Intelligence Committee in its investigation of Russian influence operations during the 2016 elections. In her 2018 Senate testimony, she argued that America is in a “high-stakes information war” and that the U.S. government and the “whole-of-society” must “go to war” against “malign narratives,” whether foreign or domestic.[115]
The dramatic rhetoric of DiResta’s 2018 Senate testimony was typical. Censorship advocates repeatedly claim, without evidence, that false information travels faster than true information as justification for rapid and expansive U.S. government and whole-of-society action to censor disfavored opinions and voices.
2020
“Election Integrity Partnership”
The Election Integrity Partnership (EIP), the seed of the censorship industrial complex, was founded by two universities, a think tank, and a social media analytics firm: the Stanford Internet Observatory, the University of Washington’s (UW) Center for an Informed Public, the Atlantic Council’s Digital Forensic Research Lab, and Graphika. EIP claims it classified 21,897,364 individual posts comprising unique “misinformation incidents” from August 15, 2020 to December 12, 2020, drawn from a larger set of 859 million tweets connected to “misinformation narratives.”[116]
On June 23, 2020, a formal meeting was held with CISA to set up the EIP initiative to stop misinformation in the name of election security. “The legal framework under which DHS – and CISA particularly – drew their jurisdiction was that whenever any US citizen posted what DHS considered ‘misinformation’ online it was now considered a ‘cyber attack’ against US critical infrastructure.”[117]
EIP leader Alex Stamos says EIP’s purpose was “to try to fill the gap of the things that the government could not do themselves” because the government “lacked both kinda the funding and the legal authorizations.”[118]
EIP flagged posts to social media companies for censorship while publicly advocating for policy change. Stamos told the New York Times on August 26, 2020, that the tech companies had agreed to join the EIP censorship arrangement.[119] His announcement came shortly after EIP and DHS planning sessions in which the arrangement was made for EIP to do what DHS could not legally do.[120]
“We have reached out and we have had two-way conversations with all of the major platforms,” said Stamos. “We've had really good conversations with all of the major platforms. Facebook, Twitter, Google, Reddit… our goal is that if we're able to find disinformation, we'll be able to report it quickly, and then collaborate with them on taking it down. There's a good precedent for this, which is that all four of these organizations have worked on research projects side by side with tech platforms.”[121]
The leaders of all EIP organizations made unsubstantiated claims between 2017 and 2020 that Russian interference in the form of inauthentic bots and troll accounts on social media helped elect Donald Trump president in 2016. By 2020, all four institutions had deep and longstanding relationships with top content moderation executives at all of the major social media platforms. They had worked together on censorship since 2017.[122]
Social media companies, DHS, and EIP organizations used a shared real-time ticketing system, Jira Service Desk, to coordinate censorship. EIP reports that it labeled 22 million tweets as “misinformation”; collected 859 million tweets in databases for analysis; deployed 120 analysts to monitor social media “misinformation” in shifts of up to 20 hours; monitored 15 tech platforms for “misinformation,” often in real time; achieved an average response time of under one hour between government partners and tech platforms; targeted dozens of “misinformation narratives” for platform-wide throttling; and saw hundreds of millions of individual Facebook posts, YouTube videos, TikToks, and tweets censored for “misinformation.”[123]
EIP representatives often misled their audiences by claiming their domestic censorship activities were “narrowly tailored” to the “time, place and manner” of voting. This deception, however, relied on the audience’s ignorance of EIP’s own censorship data. The overwhelming majority of EIP censorship related to “delegitimization,” a new censorship category EIP members pressured tech platforms to adopt, which would come to constitute 72% of EIP’s censorship tickets and what appears to be over 99% of the posts, measured by overall volume, of the 22 million labeled “misinformation incidents.”[124] EIP defined “delegitimization” broadly to mean any speech that “casts doubt” on any kind of election process, outcome, or integrity. The result was that a user merely posting “incidents” of election issues was still committing a Terms of Service violation, because “incidents” had the effect of “casting doubt,” and thus even factual reporting was effectively banned altogether.
By classifying entire political narratives as misinformation, and by automatically flagging individual US citizens’ posts supporting a banned narrative as de facto misinformation, EIP was able to classify hundreds of millions of social media posts (across 15 social media platforms) in a five-month span between June and November 2020 (and then again later, similarly, for COVID), because it had backend access to the Election Infrastructure Information Sharing and Analysis Center (EI-ISAC), the domestic disinformation switchboard that was created so that DHS would have instant access to censorship decision-makers.
Aspen Institute Workshop Trains Top Journalists To Pre-Bunk “Hack and Leak”
On March 31, 2020, Stanford University’s Cyber Policy Center, the same umbrella organization that houses the Stanford Internet Observatory, published a report by Obama political operative Andrew Grotto and ex-journalist Janine Zacharia urging editors and journalists to “Break the Pentagon Papers principle.” What did they mean? They meant reporters should not cover leaked information, even when true, because it could contribute to “disinformation.”[125]
“Since Daniel Ellsberg’s 1971 leak of the Pentagon Papers,” wrote the authors, “journalists have generally operated under a single rule: Once information is authenticated, if it is newsworthy, publish it…. In this new era, when foreign adversaries like Russia are hacking into political campaigns and leaking material to disrupt our democracy and favor one candidate, journalists must abandon this principle.”
Stanford’s goal was explicitly to change norms so journalists would not do what they did in 1971 with the Pentagon Papers. “The more news outlets that embrace a new set of norms, the more resilient American media will be against exploitation by malicious actors,” the authors write.
The authors, Grotto and Zacharia, proceed to celebrate news media not reporting on things the national security state doesn’t want them to report. “There is a long history of journalists refraining from publishing, particularly in the national security realm,” the authors write. “In 1958, when New York Times military affairs reporter Hanson Baldwin spotted an unusual plane on a German base and later determined it was a secret U.S. U-2 spy plane, The Times never published the story despite its obvious newsworthiness.”
The authors effectively describe how the news media would, in real life, go on to cover the Hunter Biden laptop in October 2020. “Focus on the why in addition to the what,” they say. “Make the disinformation campaign as much a part of the story as the email or hacked information dump. Change the sense of newsworthiness to accord with the current threat.”
Aspen Institute held training for reporters with an eerily similar message. On June 25, 2020, Aspen Institute convened a “tabletop exercise” to train journalists at the New York Times, Washington Post, and CNN, and censors at Twitter and Facebook, to treat leaked information, however accurate, as likely the result of Russian hacking, and to make the story about the hacking, not the contents of the hack.[126]
The organizer was Vivian Schiller, the former CEO of NPR, the former head of news at Twitter, the former General Manager of The New York Times, and the former Chief Digital Officer of NBC News. Two of the attendees were Andrew Grotto and Janine Zacharia, the authors of the Stanford report urging reporters to “break the Pentagon Papers principle.” Here is a complete list of attendees:
- Jessica Ashooh, Director of Policy, Reddit
- Olga Belogolova, Policy Manager – IO, Facebook
- John Bennett, Director of Security, Wikimedia Foundation
- Kevin Collier, Reporter, NBC News
- Rick Davis, EVP, News Standards and Practices, CNN
- Nathaniel Gleicher, Head of Cybersecurity Policy, Facebook
- Garrett Graff, Director, Cyber Initiatives, Aspen Institute
- Andy Grotto, Director, Stanford Cyber Policy Center
- Steve Hayes, Co-Founder and Editor, The Dispatch
- Susan Hennessey, Executive Editor, Lawfare
- Kelly McBride, Senior VP, Poynter Institute
- David McCraw, VP and Deputy General Counsel, The New York Times
- Ellen Nakashima, National Security Reporter, The Washington Post
- Evan Osnos, Staff Writer, The New Yorker
- Donie O’Sullivan, Reporter, CNN
- Dina Temple-Raston, Investigations Correspondent, NPR
- Yoel Roth, Head of Site Integrity, Twitter
- Alan Rusbridger, Former Editor in Chief, Guardian, Member of Facebook Oversight Board
- David Sanger, Chief Washington Correspondent, The New York Times
- Noah Shachtman, Editor in Chief, The Daily Beast
- Vivian Schiller, Executive Director, Aspen Institute
- Claire Wardle, Cofounder and Director, First Draft News
- Clement Wolf, Global Public Policy Lead for Information Integrity, Google
- Janine Zacharia, Visiting Lecturer, Stanford[127]
Covid Censorship
Lab Leak Theory
Through most of the pandemic, the idea that the spread of Covid-19 was caused by a leak from the Wuhan Institute of Virology’s laboratory in Wuhan, China, was dismissed. In February 2020, the Washington Post published an article headlined, “Tom Cotton repeats debunked conspiracy theory about coronavirus,” after the Republican senator floated the idea.[128] Two days later, the British medical journal Lancet published an article by 27 scientists “to strongly condemn conspiracy theories suggesting that COVID-19 does not have a natural origin.”[129]
In September 2020, Facebook censored a “Tucker Carlson Tonight” segment in which a Chinese doctor said that the COVID pandemic resulted from a virus escaping from a lab in China. Facebook labeled the clip as “false information,” and Instagram flagged it.[130]
Today, the mainstream media considers the possibility that a lab leak caused the pandemic to be as likely as the possibility that it was caused by a spillover of a virus from animals to humans.
The Wall Street Journal reported on February 26, 2023, that the U.S. Department of Energy (DOE) had joined the Federal Bureau of Investigation (FBI) in concluding that a laboratory leak was more likely than natural causes to have caused the coronavirus pandemic.[131] In November 2022, the top government official overseeing the U.S. response to the pandemic, Anthony Fauci, said about COVID’s origins, “I have a completely open mind.”[132]
In truth, there was abundant evidence by 2015 that a lab leak was a possible cause of a coronavirus pandemic.[133] Yet none of the organizations that censored or dismissed the lab leak theory have announced new systems or safeguards to avoid making similar mistakes in the future and to regain public trust.
Mask Skepticism
In 2020, Twitter removed a tweet by a member of the White House’s coronavirus task force who questioned the efficacy of masks.[134] In mid-2021, White House Press Secretary Jen Psaki said the Biden administration was identifying “problematic” COVID posts for Facebook to censor.[135] YouTube removed a video in which scientists from Harvard and Stanford expressed their opinion to Florida’s governor that children should not be required to wear masks.[136] And Facebook censored former New York Times journalist John Tierney for accurately reporting on evidence of the harm to children from wearing masks.[137]
2021
DHS Expands Its Censorship Powers
Demands from the government that social media companies censor content have increased under President Joe Biden. In January 2021, the Cybersecurity and Infrastructure Security Agency (CISA), which was created in 2018 to respond to election disinformation, broadened its scope “to promote more flexibility to focus on general” misinformation, disinformation, and malinformation. Where misinformation can be unintentional, disinformation is defined as deliberate, while malinformation can include accurate information that is “misleading.”
In January 2021, CISA replaced the “Countering Foreign Influence Task Force” with a “Misinformation, Disinformation and Malinformation” team “to promote more flexibility to focus on general MDM.”[138] The move included a further turn inward to focus on domestic sources of MDM. The MDM team, according to one CISA official quoted in the IG report, “counters all types of disinformation, to be responsive to current events.”[139]
Geoff Hale, the director of the Election Security Initiative at CISA, recommended the use of contractor nonprofits as a “clearing house for information to avoid the appearance of government propaganda.”[140]
Under Pressure From White House, Facebook and Twitter Censor Accurate Vaccine Information
Twitter and Facebook both censored accurate COVID information, in part to reduce vaccine hesitancy, “discrediting doctors and other experts who disagreed.” This work involved the four members of the EIP, now the “Virality Project.” “Over the spring and summer of 2021, VP partnered with federal, state, and local stakeholders, as well as civil society organizations and coalitions of medical professionals to support their efforts to understand vaccine hesitancy,” explained DiResta in 2021.[141]
Biden administration officials scolded Twitter and Facebook executives for not doing more censorship. There were many instances of Twitter banning or labeling “misleading” accounts that were true or merely controversial. Twitter suspended a physician for accurately describing the results of a peer-reviewed study on mRNA vaccines.
Facebook censored a claim by President Donald Trump in October 2020 that a COVID vaccine was imminent, which it was,[142] an example of how censorship can be used to discredit accurate information and increase distrust in authorities, two outcomes the censorship industrial complex claims to care about preventing.
Facebook, under pressure from the White House, censored “often-true content” that a company executive said in the spring of 2021 “does not contain actionable misinformation” but was “discouraging vaccines.”[143] The Attorney General of Missouri, who is suing the Biden Administration for violating the First Amendment, released the email.[144] “As you know,” wrote the Facebook executive whose name was redacted, “in addition to removing vaccine misinformation, we have been focused on reducing the virality of content discouraging vaccines that does not contain actionable misinformation.”[145]
The email shows Facebook responding defensively to the White House’s then-COVID advisor, Andy Slavitt. “This often-true content,” wrote the executive, “which we allow at the post level because experts have advised us that it is important for people to be able to discuss both their personal experiences and concerns about the vaccine, but it can be framed as sensation[al], alarmist, or shocking.”
“We’ll remove these Groups, Pages, and Accounts when they are disproportionately promoting this sensationalized content,” said the Facebook executive. “More on this front as we proceed to implement.”[146]
Another White House official scolded Facebook employees in an email: “We are gravely concerned that your service is one of the top drivers of vaccine hesitancy - period.” Within an aggressive email thread with the subject line, “You are hiding the ball,” the official said he believed Facebook was at risk of “doing the same” thing it did before the Jan 6, 2021 riot at the US Capitol when “an insurrection …was plotted, in large part, by your platform.”[147]
All of these censorship demands were occurring against a backdrop of the White House and Congress regularly threatening to revoke Section 230 of the Communications Decency Act, which indemnifies social media platforms from liability for content posted by users. The social media platforms consider the possible repeal of Section 230 an existential threat. Without the Act, they could not exist in their current form.
After 2020, the four founding organizations of EIP launched the Virality Project to demand censorship on COVID-related issues. They used the same Jira Service Desk ticketing software that they had used for EIP. The Virality Project (VP) carried out the same kinds of censorship, focused instead on COVID-19 information. VP says it censored, with its government partners, 66 social media “narratives” that were allegedly going viral during 2021.[148]
Aspen Information Disorder Report
The sweeping vision of the censorship industrial complex can be seen in a 2021 Aspen Institute report, which effectively claims that MDM constitutes the greatest crisis facing America because it “exacerbates all other crises.” The report builds upon the continually expanding framework, from “disinformation” to “misinformation” to “malinformation” — a category that allows for the censorship of accurate information in the name of preventing a “misleading narrative” — to “information disorder.”[149] The Aspen report calls for vastly expanded social media censorship of information and a propaganda effort led by the White House and social media platforms working together.
Climate Change and Energy
The censorship-industrial complex pressures social media platforms to censor content relating to climate change and energy.
I speak from experience: an ongoing censor-and-discredit campaign has been waged against me since I wrote a viral article in June 2020 to announce the publication of my book, Apocalypse Never. In response to my article, multiple think tanks quickly and falsely claimed to have “debunked” its contents. Those fake debunkings became the basis for Facebook to censor my posts, even ones that have nothing to do with climate change, to this day. Facebook allowed me no way to appeal. In response to a lawsuit brought by journalist John Stossel, Facebook conceded that its so-called “fact-checking” of him and me could not be considered defamation because it was merely an “opinion.”[150] And yet the censorship continues.[151]
In 2021, Facebook censored Bjorn Lomborg for accurately reporting that the British medical journal Lancet found that warmer temperatures save lives.[152]
Facebook and other social media companies give the people they have censored nothing in the way of an appeal process. After Stossel sued Facebook, its parent company, Meta, said in response to the lawsuit that Facebook’s “fact-checks” are just “opinion” and thus immune from defamation charges.[153]
The demand for ever more censorship continues. In a 2022 talk with Axios, Biden Administration Climate Advisor Gina McCarthy said, “The tech companies have to stop allowing specific individuals over and over again to spread disinformation.” After an Axios reporter asked, “Isn't misinformation and disinfo around climate a threat to public health itself?” McCarthy responded, “Oh, absolutely… We are talking, really, about risks that no longer need to be tolerated to our communities.”[154]
McCarthy pointed specifically to those who criticized the failure of weather-dependent renewables during the blackouts in Texas in February 2021. But many of those criticisms were factual. Over the last decade, investors in Texas sank more than $83 billion into weather-dependent energy sources, mostly wind turbines, which, alongside frozen fossil fuel plants, were largely unavailable during the cold snap in February.[155] Their unavailability was only partly because of the cold and mostly because of low wind speeds.
McCarthy claimed that the critics of renewables are funded by “dark money” fossil fuel companies, which she compared to Big Tobacco. She claimed the critics are being paid to “fool” the public about “the benefits of clean energy.” “We need the tech companies to really jump in,” she said, because criticizing renewables is “equally dangerous to denial because we have to move fast.”[156]
But the main critics of renewables, including those used in Texas, do not receive funding from the fossil fuel industry. Moreover, McCarthy’s own interview with Axios was sponsored by 3M, a major supplier to the solar industry that has lobbied directly for climate and energy legislation that would benefit 3M.[157]
As such, notes the Wall Street Journal, “Merely pointing out technical limitations of lithium-ion batteries could be ‘disinformation,’” under the expansive censorship framework being proposed by McCarthy, Center for American Progress, and social media companies.[158]
Now, an entity funded by the U.S. government has smeared me and others in a report aimed at pressuring social media platforms to censor our posts. A British think tank called the Institute for Strategic Dialogue (ISD) is demanding censorship of factual information using American taxpayer dollars. The State Department gave the Institute a grant in September 2021 to “advance the development of promising and innovative technologies against disinformation and propaganda.”[159] In a 2022 report on “climate disinformation,” the ISD slandered me and others as promoting “delay” on climate action.[160] That is a lie, as anyone familiar with my work to save nuclear power plants knows. I have never advocated a “go-slow” approach in my life.
The Institute for Strategic Dialogue was awarded its funding after participating in an event sponsored by the North Atlantic Treaty Organization (NATO), the U.S. Embassy in Paris, the Atlantic Council’s Digital Forensic Research Lab (DFRLab), and the Cybersecurity and Infrastructure Security Agency (CISA).
2022
U.S. government funds “Global Disinformation Index” and “NewsGuard” to drive advertisers away from disfavored news media
Government-funded censors point to the desires of advertisers as justification for censorship. “Moderation rules and content policy are also tied into business incentives,” said DiResta. “Platforms don't want to create a cesspool. Twitter doesn't want, or didn't want, to be 4Chan because most people don't enjoy being in that type of environment. So even if there are types of content that are in line with the First Amendment, some of the platforms choose to moderate more or less heavily in line with the kind of environment they want to create versus having a free-for-all experience.”[161]
Meanwhile, the U.S. government funds groups seeking to divert advertising dollars from disfavored to favored news organizations. The National Endowment for Democracy, which received $300 million in taxpayer dollars in 2021, granted $230,000 in 2020 to the Global Disinformation Index, an organization that urges advertisers not to advertise with leading conservative and libertarian media outlets including the Washington Examiner, Reason, and the New York Post.[162]
In September 2021, the Defense Department gave a government contract worth $750,000 to NewsGuard, another group advocating that advertisers cut off their money to disfavored publications.[163]
Creation of Department Of Homeland Security’s “Disinformation Governance Board”
In April of 2022, the Department of Homeland Security announced that it was creating a “Disinformation Governance Board” to fight disinformation on social media platforms. In a March 2022 meeting with social media executives and representatives of other government agencies, FBI official Laura Dehmlow, who headed up the Foreign Influence Task Force, said that “we need a media infrastructure that is held accountable.”[164]
The announcement of the Board triggered a strong and broad public backlash, and within a few weeks the Biden administration had abandoned the plan. But rather than abandoning the effort entirely, DHS agencies are monitoring social media on their own. According to a draft copy of DHS’s 2022 Quadrennial Homeland Security Review, the agency had planned to target “inaccurate information,” including “the origins of the COVID-19 pandemic and the efficacy of COVID-19 vaccines, racial justice, U.S. withdrawal from Afghanistan, and the nature of U.S. support to Ukraine.”[165]
2023
Twitter Files
The American people should be grateful to new Twitter owner Elon Musk for making the Twitter Files available. Predictably, the censorship industrial complex has spread significant malinformation and misinformation, and perhaps disinformation, about the Files and the journalists involved in reporting them. It has been widely reported that Musk hand-selected me to report on the Twitter Files; that is not true. Bari Weiss invited me to join her team of reporters. When we met for the first time, Musk told me he did not know who I was.
We were given broad access to internal emails and direct messages and found no evidence anything was kept from us. As for my journalistic independence, I will simply note that I am one of the few journalists in America to have criticized Musk’s statements on energy, both in Mother Jones magazine and in my 2020 book, Apocalypse Never. Whatever else one might think of Musk, his decision to make transparent the inner workings of one of the world’s most important social media platforms is unprecedented and allowed the public to understand the operations of the censorship industrial complex.
DiResta and Stamos hype “foreign disinformation” threat
In late February, after Meta (Facebook) released its fourth quarter “Adversarial Threat Report,” DiResta tweeted, “Interesting Facebook’s adversarial threat report today: 4 disinformation networks… Some were pretty big, state-linked but w/mercenary component (paid operators), $$$ ad spend.”[166] Stamos agreed. “Serious foreign influence campaigns continue online.”[167]
Here’s what Meta wrote: “While Russian-origin attempts at covert activity (CIB) related to Russia’s war in Ukraine have sharply increased, overt efforts by Russian state-controlled media have reportedly decreased over the last 12 months on our platform. We saw state-controlled media shifting to other platforms and using new domains to try to escape the additional transparency on (and demotions against) links to their websites. During the same period, covert influence operations have adopted a brute-force, ‘smash-and-grab’ approach of high-volume but very low-quality campaigns across the internet.”[168]
In other words, Russia has tried and largely failed to have an impact through covert activity and has been forced to shift to other platforms and “low-quality campaigns,” a very different picture from DiResta’s claim that the efforts were “pretty big” and Stamos’ assertion that they were “serious.”
Gurri, author of Revolt of the Public, pushed back. “Prove ‘influence,’” he tweeted. “Where's the data? What pure American minds are polluted? And if there's no data, how isn't this an even dumber version of ‘a Commie under every bed’?”[169]
In response, Stamos wrote, “I’m a big fan of your book. It's unfortunate to see you reduced to this, sir….I have stated multiple times, over years, that I thought the impact of these campaigns is often overstated. It certainly was in regards to the 2016 election. But letting authoritarians run free and buy checkmarks is still not a smart way to run a trusted platform.”[170]
Recommendations
1. Defund the Censorship Industrial Complex
Censorship is a subsidized industry. If taxpayer money is taken out, private-sector donors such as the Open Society Institute will provide replacement funding and fill some of the gap, but not all of it.[171]
2. Mandate instant reporting of all communications between government officials and contractors with social media executives relating to content moderation
Both parties should be legally required to report on their conversations, creating a prisoner’s dilemma that reduces secret censorship.
3. Reduce Scope of Section 230
Section 230 is a special, radical liability shield granted to social media platforms, not to news media organizations, in recognition of the difference between them. Citizens have a right to demand that Section 230 privileges come with certain responsibilities.
This is especially true since the platforms are legal monopolies. Much has changed since 1996. Back then, neither Google, Facebook, nor Twitter existed. Nobody imagined back then that government officials would ask social media companies to secretly censor factual information and remove individuals from their platforms, under threat of losing their ability to operate.
Section 230 in its current form undermines citizens’ right to free speech and their constitutional right to seek redress for harm or injury. In the 2022 Rogan O’Handley ruling, the Ninth Circuit court in San Francisco refused to recognize normal tort and contract theories relating to social media companies on the grounds that Section 230 exempts them from any liability. On Twitter, Meghan Murphy in 2018 called a trans activist by their birth gender and was bounced, or removed, from the platform.[172] What she had done wasn’t, according to Twitter’s own Terms of Service, a bannable offense. But Twitter then changed its Terms of Service and applied them retroactively. The courts have thus effectively said that Twitter does not need to follow its own contracts. No corporation in the world should have such extraordinary power both to deny American citizens their free speech rights and to deny them the ability to sue to redress the harm it causes.
As such, we need two key reforms: real transparency and a private right of action. Congress must clarify that Section 230 does not abrogate state tort law absent extremely specific criteria. Section 230 currently protects a social media platform from liability for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable.” The words “otherwise objectionable” should be removed; they give social media platforms far too broad a license to censor. Just as we must be free to express unpopular opinions, we must be free to insult each other, including in ways that some people claim cause “emotional harm.”
[1] Dwight D. Eisenhower, "Farewell Address," (Washington D.C., January 17, 1961), American Presidency Project, University of California Santa Barbara, https://www.presidency.ucsb.edu/documents/farewell-address-0.
[2] Cristiano Lima, “Facebook no longer treating, ‘man-made’ Covid as a crackpot idea,” Politico, May 26, 2021, https://www.politico.com/amp/news/2021/05/26/facebook-ban-covid-man-made-491053.
[3] Anonymous Facebook executive, email to Andrew M. Slavitt and Rob Flaherty, “[EXTERNAL] Follow up - Friday call w[redacted],” Mar 21, 2021, cited by Michael Shellenberger, Leighton Woodhouse, “Under White House Pressure, Facebook Censored Accurate Covid Vaccine Information,” Public, Jan 12, 2023, https://public.substack.com/p/under-white-house-pressure-facebook.
[4] Michael Shellenberger (@ShellenbergerMD), “1. TWITTER FILES: PART 7,” Twitter thread, Dec. 19, 2022, 11:09 am, https://twitter.com/ShellenbergerMD/status/1604871630613753856.
[5] Editorial Board, “Facebook admits the truth: ‘Fact checks’ are really just (lefty) opinion,” New York Post, Dec. 14, 2021, https://nypost.com/2021/12/14/facebook-admits-the-truth-fact-checks-are-really-just-lefty-opinion/.
[6] Michael Shellenberger, “Why The Biden Admin Wants Censorship Of Renewable Energy Critics,” Public, June 14, 2022, https://public.substack.com/p/why-the-biden-admin-wants-censorship.
[7] Michael Shellenberger, “Disinformation Behind Censorship Demands,” Public, Sep. 26, 2022, https://public.substack.com/p/disinformation-behind-censorship
[8] Ken Klippenstein and Lee Fang, “Truth Cops: Leaked Documents Outline DHS’s Plans to Police Disinformation,” The Intercept, Oct 31, 2022, https://theintercept.com/2022/10/31/social-media-disinformation-dhs/
[9] See for example NSF-funded “Course Correct” described below.
[10] Mike Benz, “Biden’s National Science Foundation Has Pumped Nearly $40 Million Into Social Media Censorship Grants and Contracts,” Foundation for Freedom Online, Nov 22, 2022, https://report.foundationforfreedomonline.com/11-22-22.html; “Colleges & Universities Getting NSF Grants for ‘Mis/Disinformation,’ FY2021-2022,” Imgur, Nov 2022, https://imgur.com/a/CJQFKHT.
[11] “Track F: Trust and Authenticity in Communication Platforms,” in U.S. National Science Foundation, Convergence Accelerator 2022 Portfolio Guide, National Science Foundation, accessed March 6, 2023, https://nsf-gov-resources.nsf.gov/2022-08/NSF%20Convergence%20Accelerator%202022%20Portfolio%20Guide_Final_lowres_508_0.pdf#page=92
[12] “Track F: Trust and Authenticity in Communication Platforms,” in U.S. National Science Foundation, Convergence Accelerator 2022 Portfolio Guide, National Science Foundation, accessed March 6, 2023, https://nsf-gov-resources.nsf.gov/2022-08/NSF%20Convergence%20Accelerator%202022%20Portfolio%20Guide_Final_lowres_508_0.pdf#page=92.
[13] “Track F: Trust and Authenticity in Communication Platforms,” in U.S. National Science Foundation, Convergence Accelerator 2022 Portfolio Guide, National Science Foundation, accessed March 6, 2023, https://nsf-gov-resources.nsf.gov/2022-08/NSF%20Convergence%20Accelerator%202022%20Portfolio%20Guide_Final_lowres_508_0.pdf#page=92.
[14] Defense Advanced Research Projects Agency, “Social Media in Strategic Communication (SMISC) (Archived),” DARPA, accessed Mar 6, 2023, https://www.darpa.mil/program/social-media-in-strategic-communication
[15] Jeh Johnson, “Statement by Secretary Jeh Johnson on the Designation of Election Infrastructure as a Critical Infrastructure Subsector” (press release), DHS.gov, Jan 6, 2017, https://www.dhs.gov/news/2017/01/06/statement-secretary-johnson-designation-election-infrastructure-critical.
[16] U.S. Department of Homeland Security, “Congress Passes Legislation Standing Up Cybersecurity Agency in DHS” (press release), DHS.gov, Nov 13, 2018, https://www.dhs.gov/news/2018/11/13/congress-passes-legislation-standing-cybersecurity-agency-dhs.
[17] “Atlantic Council,” Influence Watch, accessed March 1, 2023, https://www.influencewatch.org/non-profit/atlantic-council.
[18] “News and Notes,” Journal of Democracy 29, no. 3 (July 2018), https://www.journalofdemocracy.org/articles/news-and-notes-14.
[19] Kris Holt, “Facebook partners with think tank to fight global election meddling,” Engadget, May 17, 2018, https://www.engadget.com/2018-05-17-facebook-atlantic-council-political-ads-fake-news.html.
[20] “Honor Roll of Contributors,” Annual Report, Atlantic Council, Nov 9, 2021, https://www.atlanticcouncil.org/in-depth-research-reports/report/2020-annual-report-honor-roll-of-contributors.
[21] Joshua A. Geltzer, “New Senate Reports Are an Indictment of the White House’s Inaction on Disinformation,” Slate, Dec 18, 2018, https://slate.com/technology/2018/12/senate-reports-russian-disinformation-social-media-trump.html.
[22] “Graphika welcomes industry expert Ben Nimmo to the team,” Graphika, Aug 23, 2019, https://graphika.com/posts/graphika-welcomes-industry-expert-ben-nimmo-to-the-team.
[23] “Research Priorities: 2022 Minerva Topics of Interest,” Minerva Research Initiative, accessed Mar 6, 2023, https://minerva.defense.gov/Research/Research-Priorities; “Graphika,” Graphika, accessed Mar 6, 2023, https://www.graphika.com.
[24] “Federal Awards: Spending by Prime Award” (table), USAspending, accessed Nov 7, 2022, https://www.usaspending.gov/search/?hash=5caa43faf4a5ff7cd70185d0466731e1; “Federal Awards: Spending by Prime Award” (screenshot), Imgur, accessed Mar 6, 2023, https://imgur.com/a/nL1JWHx.
[25] Graphika (@Graphika_NYC), “Suspected #Russian actors are engaged in a renewed effort to target far-right audiences in the U.S. with politically divisive messaging ahead of the #MidtermElections2022,” Twitter post, Nov 3, 2022, 8:16 am, https://twitter.com/Graphika_NYC/status/1588158278382534656; Léa Ronzaud, Jack Stubbs, and Tyler Williams, “Same Schmitz, Different Day,” Graphika, Nov 3, 2022, https://graphika.com/posts/same-schmitz-different-day.
[26] Steven Lee Myers, “Russia Reactivates Its Trolls and Bots Ahead of Tuesday’s Midterms,” New York Times, Nov 6, 2022, accessed Mar 6, 2023 through Archive.org, https://archive.ph/cVcGz#selection-397.0-397.66.
[27] Naomi LaChance, “Google Program Used to Deradicalize Jihadis Will Be Used for Right-Wing American Extremists Next,” The Intercept, Sept 7, 2016, https://theintercept.com/2016/09/07/google-program-to-deradicalize-jihadis-will-be-used-for-right-wing-american-extremists-next.
[28] Anita Chabria and Evan Halper, “Effort to stem online extremism accidentally pushed people toward an anarchist,” Los Angeles Times, Mar 30, 2021, https://www.latimes.com/politics/story/2021-03-30/google-moonshot-redirect-far-right-online-extremism-anarchist.
[29] Anonymous disinformation specialist, telephone interview by Michael Shellenberger, March 3, 2023.
[30] Anonymous disinformation specialist, telephone interview by Michael Shellenberger, March 3, 2023.
[31] Center for an Informed Public, Digital Forensic Research Lab, Graphika, and Stanford Internet Observatory, The Long Fuse: Misinformation and the 2020 Election, 2021, Stanford Digital Repository: Election Integrity Partnership, accessed Mar 6, 2023, https://purl.stanford.edu/tr171zs0069; Mike Benz, “DHS Censorship Agency Had Strange First Mission: Banning Speech That Casts Doubt On ‘Red Mirage, Blue Shift’ Election Events,” Foundation for Freedom Online, Nov 9, 2022, https://report.foundationforfreedomonline.com/11-9-22.html.
[32] “$2.25 million in National Science Foundation funding will support Center for an Informed Public’s rapid-response research of mis- and disinformation,” Center for an Informed Public, University of Washington, Aug 15, 2021, https://www.cip.uw.edu/2021/08/15/national-science-foundation-uw-cip-misinformation-rapid-response-research; “#2120496: Collaborative Research: SaTC: CORE: Large: Rapid-Response Frameworks for Mitigating Online Disinformation” (award abstract), National Science Foundation, accessed Mar 7, 2023, https://www.nsf.gov/awardsearch/showAward?AWD_ID=2120496&HistoricalAwards=false;
[33] Center for an Informed Public, Digital Forensic Research Lab, Graphika, and Stanford Internet Observatory, The Long Fuse: Misinformation and the 2020 Election, 2021, Stanford Digital Repository: Election Integrity Partnership, accessed Mar 6, 2023, https://purl.stanford.edu/tr171zs0069.
[34] “Stanford Internet Observatory Seeks to Detect Internet Abuse in Real Time,” Freeman Spogli Institute for International Studies, Stanford University, July 25, 2019, https://fsi.stanford.edu/news/stanford-internet-observatory-seeks-detect-internet-abuse-real-time.
[35] “Graham Brookie,” Atlantic Council, accessed Mar 6, 2023, https://www.atlanticcouncil.org/expert/graham-brookie.
[36] Scott Shane and Alan Blinder, “Secret Experiment in Alabama Senate Race Imitated Russian Tactics,” New York Times, Dec 19, 2018, accessed Mar 6, 2023 through Archive.org, https://archive.ph/qoskp#selection-249.0-249.65.
[37] Hearing before the Select Committee on Intelligence of the United States Senate: Open Hearing on Foreign Influence Operations' Use of Social Media Platforms (Third Party Expert Witnesses), 115th Cong. 19 (2018) (statement of Renee DiResta, Director of Research, New Knowledge), https://www.intelligence.senate.gov/sites/default/files/documents/os-rdiresta-080118.pdf?utm_campaign=The%20Interface&utm_medium=email&utm_source=Revue%20newsletter.
[38] Alex Stamos, “Securing Our Cyber Future: Innovative Approaches to Digital Threats” (lecture, Stanford Internet Observatory, Stanford University, Palo Alto, CA, June 19, 2019), YouTube video, Oct 27, 2021, 18:00-18:20, https://www.youtube.com/watch?v=ESR9k0BtmXY.
[39] Ken Klippenstein and Lee Fang, “Truth Cops: Leaked Documents Outline DHS’s Plans to Police Disinformation,” The Intercept, Oct 31, 2022, https://theintercept.com/2022/10/31/social-media-disinformation-dhs/.
[40] Jen Easterly and Chris Krebs, “Continuity of Excellence” (interview, Cybersecurity Summit 2021, Oct 2021), YouTube video, Oct 27, 2021, 7:40-14:20, https://www.youtube.com/watch?v=c81G7egOr1Q.
[41] State of Missouri et al. v. Joseph R. Biden Jr. et al., 3:22-CV-01213, pg. 19, United States District Court, Western District of Louisiana, Monroe Division, 2022, https://ago.mo.gov/docs/default-source/press-releases/doc-90---order-regarding-witness-depositions.pdf?sfvrsn=24c99caa_2#page=19.
[42] “Aspen Institute Launches Commission on Information Disorder to Develop Actionable Public-Private Responses to the Disinformation Crisis” (press release), Aspen Institute, Jan 12, 2021, https://www.aspeninstitute.org/news/commission-on-information-disorder; FFOSourceClips, “Supercut - "Whole-Of-Society" Censorship Push,” Rumble video, Aug 22, 2022, 2:08, https://rumble.com/v1gwfan-supercut-whole-of-society-censorship-push.html.
[43] FFOSourceClips, “Chris Krebs - Hunter Biden Laptop - Looked Like Russian Disinfo - News Media Correct Not To Cover,” Rumble video, Sept 19, 2022, 0:24, https://rumble.com/v1kp4d9-chris-krebs-hunter-biden-laptop-looked-like-russian-disinfo-news-media-corr.html.
[44] “Krebs says foreign disinformation actors ‘don’t actually have to do a whole lot…’” Face The Nation, CBS, July 18, 2021, https://www.youtube.com/watch?v=i3eF99LKSd8&t=21s
[45] Dorey Scheimer and Meghna Chakrabarti, “Why misinformation is America’s greatest election security threat,” WBUR, Dec 3, 2021, https://www.wbur.org/onpoint/2021/12/03/why-domestic-misinformation-is-americas-greatest-election-security-threat
[46] Twitter Files.
[47] Alan MacLeod, “The Facebook Team that Tried to Swing Nicaragua's Election is Full of U.S. Spies,” Mint Press News, Nov 8, 2021, https://www.mintpressnews.com/nicaraguans-ignore-facebook-spooks-trick-treating-election/278870.
[48] Ben Nimmo (@benimmo), “Meanwhile, one of the most-retweeted accounts on the Skripal case on March 18-20 was @Ian56789, which shared RT and called the attack a false flag,” Twitter post, March 24, 2018, 6:02 am, https://twitter.com/benimmo/status/977500910829146112
[49] Ben Nimmo (@benimmo), Twitter accessed through Imgur.com, 3-24-18, 7:02 AM, https://imgur.com/a/kSRY62j
[50] Kate Starbird, “Kate Starbird - Censor Targeted "Everyday People" Discussing Election, Radical Bias,” https://rumble.com/v1npqq8-kate-starbird-censor-targeted-everyday-people-discussing-election-radical-b.html
[51] Center for an Informed Public, Digital Forensic Research Lab, Graphika, and Stanford Internet Observatory, The Long Fuse: Misinformation and the 2020 Election, 2021, Stanford Digital Repository: Election Integrity Partnership, accessed Mar 6, 2023, https://purl.stanford.edu/tr171zs0069.
[52] Nicole Perlroth, Sheera Frenkel, and Scott Shane, “Facebook Exit Hints at Dissent on Handling of Russian Trolls,” New York Times, March 19, 2018, https://www.nytimes.com/2018/03/19/technology/facebook-alex-stamos.html
[53] FFOSourceClips, “Alex Stamos - Goal Is To Turn Social Media Companies Into Cable News Gatekeepers,” Rumble video, Nov 10, 2020, https://rumble.com/v1lwvfe-alex-stamos-goal-is-to-turn-social-media-companies-into-cable-news-gatekeep.html .
[54] Jane Mayer, “How Russia Helped Swing The Election For Trump,” The New Yorker, Sept 24, 2018, https://www.newyorker.com/magazine/2018/10/01/how-russia-helped-to-swing-the-election-for-trump.
[55] See for instance, Yochai Benkler, “The Russians didn’t swing the 2016 election to Trump. But Fox News might have,” Washington Post, Oct 24, 2018, https://www.washingtonpost.com/outlook/2018/10/24/russians-didnt-swing-election-trump-fox-news-might-have/ .
[56] Elliott Schrage, “Hard Questions: Russian Ads Delivered to Congress,” Meta, Oct 2, 2017, https://newsroom.fb.com/news/2017/10/hard-questions-russian-ads-delivered-to-congress.
[57] Renee DiResta, “What We Now Know About Russian Disinformation,” New York Times, Dec 17, 2018, https://www.nytimes.com/2018/12/17/opinion/russia-report-disinformation.html.
[58] Yochai Benkler, “The Russians didn’t swing the 2016 election to Trump. But Fox News might have.” Washington Post, reprinted by Stamford Advocate, Oct. 24, 2018, https://www.stamfordadvocate.com/opinion/article/The-Russians-didn-t-swing-the-2016-election-to-13333223.php
[59] Yochai Benkler, “The Russians didn’t swing the 2016 election to Trump. But Fox News might have.” Washington Post, reprinted by Stamford Advocate, Oct. 24, 2018, https://www.stamfordadvocate.com/opinion/article/The-Russians-didn-t-swing-the-2016-election-to-13333223.php.
[60] Elliott Schrage, “Hard Questions: Russian Ads Delivered to Congress,” Meta, Oct 2, 2017, https://newsroom.fb.com/news/2017/10/hard-questions-russian-ads-delivered-to-congress.
[61] Renee DiResta, “What We Now Know About Russian Disinformation,” New York Times, Dec 17, 2018, https://www.nytimes.com/2018/12/17/opinion/russia-report-disinformation.html.
[62] Yochai Benkler, “The Russians didn’t swing the 2016 election to Trump. But Fox News might have.” Washington Post, reprinted by Stamford Advocate, Oct. 24, 2018, https://www.stamfordadvocate.com/opinion/article/The-Russians-didn-t-swing-the-2016-election-to-13333223.php.
[63] Yochai Benkler, “The Russians didn’t swing the 2016 election to Trump. But Fox News might have.” Washington Post, reprinted by Stamford Advocate, Oct. 24, 2018, https://www.stamfordadvocate.com/opinion/article/The-Russians-didn-t-swing-the-2016-election-to-13333223.php
[64] Ben Nimmo, “Meta’s Adversarial Threat Report, Fourth Quarter 2022,” Meta, Feb 23, 2023, https://about.fb.com/news/2023/02/metas-adversarial-threat-report-q4-2022.
[65] Paul Baines and Nigel Jones, “Influence and Interference in Foreign Elections: The Evolution of its Practice,” RUSI Journal 163, no. 1 (2018): 12-19, doi:10.1080/03071847.2018.1446723, https://files.core.ac.uk/pdf/23/188364950.pdf.
[66] Dov H. Levin, “Partisan Electoral Interventions by the Great Powers: Introducing the PEIG Dataset,” Conflict Management and Peace Science, 36 (1): 88-106 (2019), https://www.dovhlevin.com/datasets.
[67] Andrew Mark Miller, “Fox News Special Report outlines fresh questions on what Fauci, government knew about COVID origin,” Fox News, Jan 25, 2022, https://www.foxnews.com/politics/special-report-outlines-fresh-questions-on-what-fauci-government-knew-about-covid-origin
[68] Emma-Jo Morris and Gabrielle Fonrouge, “Smoking-gun email reveals how Hunter Biden introduced Ukrainian businessman to VP dad,” New York Post, Oct 14, 2020, https://nypost.com/2020/10/14/email-reveals-how-hunter-biden-introduced-ukrainian-biz-man-to-dad/.
[69] Janine Zacharia and Andrew Grotto, “How to Report Responsibly on Hacks and Disinformation,” Stanford Cyber Policy Center, accessed Mar 8, 2023, https://fsi-live.s3.us-west-1.amazonaws.com/s3fs-public/full_report_download_-_how_to_report_responsibly_on_hacks_and_disinformation.pdf.
[70] Michael Shellenberger (@ShellenbergerMD), “In Twitter Files #7, we present evidence pointing to an organized effort by representatives of the intelligence community,” Twitter post, Dec 19, 2022, 10:13 am, https://twitter.com/ShellenbergerMD/status/1604872517927153669.
[71] Anonymous Twitter employee, email to Elvis M. Chan, “Update on Russian Accounts,” Sept 24, 2020, cited by Michael Shellenberger (@ShellenbergerMD), “15. Indeed, Twitter executives repeatedly reported very little Russian activity,” Twitter post, Dec 19, 2022, 10:57 am, https://twitter.com/ShellenbergerMD/status/1604883686855299072?s=20&t=npbe_XSWYXEyzbx8WqI33g.
[72] Yoel Roth, email to Elvis M. Chan, “RE: [SOCIAL NETWORK] Twitter referral,” June 2, 2020, cited by Michael Shellenberger (@ShellenbergerMD), “16. In fact, Twitter debunked false claims by journalists of foreign influence on its platform,” Twitter post, Dec 19, 2022, 11:06 am, https://twitter.com/ShellenbergerMD/status/1604885848398254080.
[73] Yoel Roth, email to Elvis M. Chan, “Re: Twitter Account Inquiry: @WentDemtoRep,” Aug 31, 2020, cited by Michael Shellenberger (@ShellenbergerMD), “17. After FBI asks about a WaPo story on alleged foreign influence in a pro-Trump tweet,” Twitter post, Dec 19, 2022, 11:11 am, https://twitter.com/ShellenbergerMD/status/1604887121700929541.
[74] Carlos Monje, Jr., email to Yoel Roth, “OGA Query,” Jan 2, 2020, cited by Michael Shellenberger (@ShellenbergerMD), “19. Pressure had been growing,” Twitter post, Dec 19, 2022, 11:16 am, https://twitter.com/ShellenbergerMD/status/1604888429816209409.
[75] Yoel Roth, interview by Kara Swisher, On with Kara Swisher, podcast audio, Nov 29, 2022, accessed through Brian Fung, “Twitter is less safe due to Elon Musk’s management style, says former top official,” CNN Business, Nov 30, 2022, https://www.cnn.com/2022/11/29/tech/yoel-roth-twitter-elon-musk/index.html.
[76] Matthew Williams, email to Jim Baker and Dawn Burton, June 15, 2020, cited in Michael Shellenberger (@ShellenbergerMD), “29. As of 2020, there were so many former FBI employees,” Twitter post, Dec 19, 2022, 11:44 am, https://twitter.com/ShellenbergerMD/status/1604895371360374784.
[77] Yoel Roth, email to anonymous, “Re: [for your awareness] New York Post Article / Action from FB,” Oct 14, 2020, cited by Michael Shellenberger (@ShellenbergerMD), “34. On Oct 14, shortly after @NYPost publishes its Hunter Biden laptop story,” Twitter post, Dec 19, 2022, 12:04 pm, https://twitter.com/ShellenbergerMD/status/1604900581809614848.
[78] Yoel Roth, email to SCALE legal and others, “Re: [for your awareness] New York Post Article / Action from FB,” Oct 14, 2020, cited by Michael Shellenberger (@ShellenbergerMD), “38. By 10 am, Twitter execs had bought into a wild hack-and-dump story,” Twitter post, Dec 19, 2022, 12:18 pm, https://twitter.com/ShellenbergerMD/status/1604904052126404608.
[79] Editorial Board, “Facebook admits the truth: ‘Fact checks’ are really just (lefty) opinion,” New York Post, Dec 14, 2021, https://nypost.com/2021/12/14/facebook-admits-the-truth-fact-checks-are-really-just-lefty-opinion/; Stossel v. Meta Platforms, Inc., U.S. District Court for the Northern District of California, San Jose Division, No. 5:21-cv-07385, Reply filed by defendant Science Feedback in support of motion to dismiss pursuant to California's anti-SLAPP statute, Mar 14, 2022, http://climatecasechart.com/wp-content/uploads/sites/16/case-documents/2022/20220314_docket-521-cv-07385_reply-1.pdf.
[80] Sam Harris, “Social Media & Public Trust: A Conversation with Bari Weiss, Michael Shellenberger, and Renee DiResta,” YouTube video, Feb 1, 2023, 1:08:52, https://www.youtube.com/watch?v=tVeL5HX4uDY.
[81] Renee DiResta, interview by Michael Shellenberger, transcript: https://docs.google.com/document/d/1J8bvylZwT1zAa7iE1D3NeeudgByCHTXkeJg4bwIn8aA/edit.
[82] Hearing before the Select Committee on Intelligence of the United States Senate: Open Hearing on Foreign Influence Operations' Use of Social Media Platforms (Third Party Expert Witnesses), 115th Cong. 19 (2018) (statement of Renee DiResta, Director of Research, New Knowledge), https://www.intelligence.senate.gov/sites/default/files/documents/os-rdiresta-080118.pdf?utm_campaign=The%20Interface&utm_medium=email&utm_source=Revue%20newsletter.
[83] Renee DiResta, “Cybersecurity Summit 2021: Responding to Mis, Dis, and Malinformation” (lecture, Cybersecurity Summit 2021, Oct 2021), YouTube video, Oct 27, 2021, https://www.youtube.com/watch?v=yNe4MJ351wU.
[84] Chaplinsky v. State of New Hampshire, No. 255 (US Sup. Ct. 1942), https://www.law.cornell.edu/supremecourt/text/315/568.
[85] Feiner v. People of State of New York, No. 39 (US Sup. Ct. 1951), https://www.law.cornell.edu/supremecourt/text/340/315.
[86] Terminiello v. City of Chicago, No. 272 (US Sup. Ct. 1949), https://www.law.cornell.edu/supremecourt/text/337/1.
[87] “Clear and Present Danger,” Cornell Law School: Legal Information Institute, accessed Mar 6, 2023, https://www.law.cornell.edu/wex/clear_and_present_danger.
[88] Texas, Petitioner v. Gregory Lee Johnson, No. 88-155 (US Sup. Ct. 1989), https://www.law.cornell.edu/supremecourt/text/491/397.
[89] R.A.V., Petitioner, v. City of St. Paul, Minnesota, No. 90-7675 (US Sup. Ct. 1992), https://www.law.cornell.edu/supct/html/90-7675.ZO.html.
[90] “Fighting words,” Cornell Law School: Legal Information Institute, accessed Mar 6, 2023, https://www.law.cornell.edu/wex/fighting_words.
[91] Tom Fanning et al., “Deep Dive: Cybersecurity and the Broad Geopolitical Risk of Digital Life” (lecture, Aspen Ideas Festival, Aspen Institute, 2018), YouTube video, June 27, 2018, https://www.youtube.com/watch?v=6ZQbQ3UpO8Y&t=3s.
[92] Renee DiResta, “Cybersecurity Summit 2021: Responding to Mis, Dis, and Malinformation” (lecture, Cybersecurity Summit 2021, Oct 2021), YouTube video, Oct 27, 2021, https://www.youtube.com/watch?v=yNe4MJ351wU.
[93] Center for an Informed Public, Digital Forensic Research Lab, Graphika, and Stanford Internet Observatory, The Long Fuse: Misinformation and the 2020 Election, 2021, Stanford Digital Repository: Election Integrity Partnership, accessed Mar 6, 2023, https://stacks.stanford.edu/file/druid:tr171zs0069/EIP-Final-Report.pdf#page=206.
[94] Nicholas Thompson et al., “Anti-Social Media and The Menace of Disinformation” (panel, Aspen Institute, June 29, 2018), 24:30-24:57, https://www.youtube.com/watch?v=wpksY8w9JwI.
[95] Renee DiResta, “Cybersecurity Summit 2021: Responding to Mis, Dis, and Malinformation” (lecture, Cybersecurity Summit 2021, Oct 2021), YouTube video, Oct 27, 2021, https://www.youtube.com/watch?v=yNe4MJ351wU.
[96] Renee DiResta, “What We Now Know About Russian Disinformation,” New York Times, Dec 17, 2018, https://www.nytimes.com/2018/12/17/opinion/russia-report-disinformation.html.
[97] Renee DiResta, “Cybersecurity Summit 2021: Responding to Mis, Dis, and Malinformation” (lecture, Cybersecurity Summit 2021, Oct 2021), YouTube video, Oct 27, 2021, https://www.youtube.com/watch?v=yNe4MJ351wU.
[98] Sam Harris, “Social Media & Public Trust: A Conversation with Bari Weiss, Michael Shellenberger, and Renee DiResta,” YouTube video, Feb 1, 2023, 1:08:52, https://www.youtube.com/watch?v=tVeL5HX4uDY.
[99] Nicholas Thompson et al., “Anti-Social Media and The Menace of Disinformation” (panel, The Aspen Institute, June 29, 2018), 13:41-15:48, https://www.youtube.com/watch?v=wpksY8w9JwI.
[100] FFOSourceClips, “EIP - Bragging That They Pushed The Envelope On Censorship Policies; Threat Of Regulation,” Rumble, Oct 2022, video, 2:18, https://rumble.com/v1lzhvy-eip-bragging-that-they-pushed-the-envelope-on-censorship-policies-threat-of.html.
[101] Anonymous disinformation expert, telephone interview by Michael Shellenberger, March 3, 2023.
[102] Alex Stamos, “Securing Our Cyber Future: Innovative Approaches to Digital Threats” (lecture, Stanford Internet Observatory, Stanford University, Palo Alto, CA, June 19, 2019), YouTube video, Oct 27, 2021, 18:00-18:20, https://www.youtube.com/watch?v=ESR9k0BtmXY.
[103] Renee DiResta, email to Yoel Roth, “DARPA ISAT workshop pre-invite,” Sept 13, 2021.
[104] Michael Shellenberger and Leighton Woodhouse, “Under White House Pressure, Facebook Censored Accurate Covid Vaccine Information,” Public, Jan. 12, 2023, https://public.substack.com/p/under-white-house-pressure-facebook.
[105] Platform Accountability and Transparency Act, S. 5339, 117th Cong. (2021), https://www.coons.senate.gov/imo/media/doc/bill_text_pata_act.pdf.
[106] Anonymous, telephone interview by Michael Shellenberger, March 3, 2023.
[107] Martin Gurri, telephone interview by Michael Shellenberger, March 2023.
[108] Craig Timberg et al., “Secret campaign to use Russian-inspired tactics in 2017 Ala. election stirs anxiety for Democrats,” Washington Post, Jan 6, 2019, https://www.washingtonpost.com/business/technology/secret-campaign-to-use-russian-inspired-tactics-in-2017-alabama-election-stirs-anxiety-for-democrats/2019/01/06/58803f26-0400-11e9-8186-4ec26a485713_story.html.
[109] Craig Timberg et al., “Secret campaign to use Russian-inspired tactics in 2017 Ala. election stirs anxiety for Democrats,” Washington Post, Jan 6, 2019, https://www.washingtonpost.com/business/technology/secret-campaign-to-use-russian-inspired-tactics-in-2017-alabama-election-stirs-anxiety-for-democrats/2019/01/06/58803f26-0400-11e9-8186-4ec26a485713_story.html.
[110] Craig Timberg et al., “Secret campaign to use Russian-inspired tactics in 2017 Ala. election stirs anxiety for Democrats,” Washington Post, Jan 6, 2019, https://www.washingtonpost.com/business/technology/secret-campaign-to-use-russian-inspired-tactics-in-2017-alabama-election-stirs-anxiety-for-democrats/2019/01/06/58803f26-0400-11e9-8186-4ec26a485713_story.html.
[111] Nicholas Thompson et al., “Anti-Social Media and The Menace of Disinformation” (panel, Aspen Institute, June 29, 2018), 6:15-6:51, https://www.youtube.com/watch?v=wpksY8w9JwI.
[112] Matt Taibbi (@mtaibbi), “1.THREAD: Twitter Files #15, MOVE OVER, JAYSON BLAIR: TWITTER FILES EXPOSE NEXT GREAT MEDIA FRAUD,” Twitter post, Jan 27, 2023, 11:49 am, https://twitter.com/mtaibbi/status/1619029772977455105.
[113] Sebastian Herrera, “Austin researcher makes a name – and finds controversy – in cybersecurity world,” Austin American-Statesman, Feb 15, 2019, https://www.statesman.com/story/business/technology/2019/02/15/who-is-jonathon-morgan-austin-researcher-makes-name-and-finds-controversy-in-cybersecurity-world/5974403007.
[114] Matt Taibbi (@mtaibbi), “1.THREAD: Twitter Files #15, MOVE OVER, JAYSON BLAIR: TWITTER FILES EXPOSE NEXT GREAT MEDIA FRAUD,” Twitter thread, Jan 27, 2023, 11:49 am, https://twitter.com/mtaibbi/status/1619029772977455105.
[115] Hearing before the Select Committee on Intelligence of the United States Senate: Open Hearing on Foreign Influence Operations' Use of Social Media Platforms (Third Party Expert Witnesses), 115th Cong. 19 (2018) (statement of Renee DiResta, Director of Research, New Knowledge), https://www.intelligence.senate.gov/sites/default/files/documents/os-rdiresta-080118.pdf?utm_campaign=The%20Interface&utm_medium=email&utm_source=Revue%20newsletter.
[116] Center for an Informed Public, Digital Forensic Research Lab, Graphika, and Stanford Internet Observatory, The Long Fuse: Misinformation and the 2020 Election, 2021, Stanford Digital Repository: Election Integrity Partnership, accessed Mar 6, 2023, https://stacks.stanford.edu/file/druid:tr171zs0069/EIP-Final-Report.pdf#page=201; FFOSourceClips, “Alex Stamos - Goal Is To Turn Social Media Companies Into Cable News Gatekeepers,” Rumble video, Nov 10, 2020, https://rumble.com/v1lwvfe-alex-stamos-goal-is-to-turn-social-media-companies-into-cable-news-gatekeep.html.
[117] Mike Benz, “DHS Censorship Agency Had Strange First Mission: Banning Speech That Casts Doubt On ‘Red Mirage, Blue Shift’ Election Events,” Foundation for Freedom Online, Nov 9, 2022, https://report.foundationforfreedomonline.com/11-9-22.html.
[118] FFOSourceClips, “EIP and CISA - Unclear Legal Authorities,” Rumble video, accessed Mar 6, 2023, https://rumble.com/v1kp8r9-eip-and-cisa-unclear-legal-authorities.html.
[119] “Alex Stamos: Social Media and Digital Democracy,” Commonwealth Club of California, 2021, YouTube video, 1:02:58, https://www.youtube.com/watch?v=2kMYzqfkXaM&t=361s.
[120] Center for an Informed Public, Digital Forensic Research Lab, Graphika, and Stanford Internet Observatory, The Long Fuse: Misinformation and the 2020 Election, 2021, Stanford Digital Repository: Election Integrity Partnership, accessed Mar 6, 2023, https://stacks.stanford.edu/file/druid:tr171zs0069/EIP-Final-Report.pdf#page=21.
[121] “Alex Stamos: Social Media and Digital Democracy,” Commonwealth Club of California, 2021, YouTube video, 1:02:58, https://www.youtube.com/watch?v=2kMYzqfkXaM&t=361s.
[122] FFOSourceClips, “Alex Stamos - EIP Contacted Tech Companies After DHS Censorship Pitch,” Rumble video, Jan 2023, https://rumble.com/v1sc6zi-alex-stamos-eip-contacted-tech-companies-after-govt-censorship-pitch.html.
[123] Center for an Informed Public, Digital Forensic Research Lab, Graphika, and Stanford Internet Observatory, The Long Fuse: Misinformation and the 2020 Election, 2021, Stanford Digital Repository: Election Integrity Partnership, accessed Mar 6, 2023, https://stacks.stanford.edu/file/druid:tr171zs0069/EIP-Final-Report.pdf#page=201.
[124] Center for an Informed Public, Digital Forensic Research Lab, Graphika, and Stanford Internet Observatory, The Long Fuse: Misinformation and the 2020 Election, 2021, Stanford Digital Repository: Election Integrity Partnership, accessed Mar 6, 2023, https://purl.stanford.edu/tr171zs0069.
[125] Janine Zacharia and Andrew Grotto, “How to Report Responsibly on Hacks and Disinformation: 10 Guidelines and a Template for Every Newsroom,” Stanford Geopolitics, Technology, and Governance, Cyber Policy Center, accessed Mar 6, 2023, https://cyber.fsi.stanford.edu/content/how-responsibly-report-hacks-and-disinformation.
[126] “Aspen Digital Hack-and-Dump Working Group” (event agenda), Sept 2020, cited by Michael Shellenberger (@ShellenbergerMD), “30. Efforts continued to influence Twitter's Yoel Roth,” Twitter post, Dec 19, 2022, 11:47 am, https://twitter.com/ShellenbergerMD/status/1604896328453980160?s=20. Note: my original tweet misstated the date.
[127] “Hack and Leak Roundtable Participant List,” The Aspen Institute, June 25, 2020, cited by Michael Shellenberger, “Correction: This event occurred on June 25, 2020, not in September,” Twitter post, Mar 7, 2023, 10:03 pm, https://twitter.com/ShellenbergerMD/status/1633317442368815104?s=20.
[128] Pauline Firozi, “Tom Cotton keeps repeating a coronavirus fringe theory that scientists have disputed,” Washington Post, Feb 17, 2020, https://www.washingtonpost.com/politics/2020/02/16/tom-cotton-coronavirus-conspiracy.
[129] Charles Calisher et al., “Statement in support of the scientists, public health professionals, and medical professionals of China combatting COVID-19,” Lancet 395, no. 10226 (Feb 19, 2020), doi:10.1016/S0140-6736(20)30418-9.
[130] Tucker Carlson (@tuckercarlsontonight), screenshot of Instagram post, Archive.org, accessed Mar 6, 2023, https://archive.vn/S9DSo.
[131] Michael R. Gordon and Warren P. Strobel, “Lab Leak Most Likely Origin of Covid-19 Pandemic, Energy Department Now Says,” Wall Street Journal, Feb 26, 2023, https://www.wsj.com/articles/covid-origin-china-lab-leak-807b7b0a.
[132] Olivia Olander, “Fauci on Covid lab leak theory: ‘I have a completely open mind,’” Politico, Nov 27, 2022, https://www.politico.com/news/2022/11/27/fauci-china-covid-lab-leak-theory-00070867.
[133] Guy Rosen, “An Update on Our Work to Keep People Informed and Limit Misinformation About COVID-19,” Meta, Apr 16, 2020, https://about.fb.com/news/2020/04/covid-19-misinfo-update.
[134] John Tierney, “This Article Is ‘Partly False,’” City Journal, May 17, 2021, https://www.city-journal.org/facebook-and-its-fact-checkers-spread-misinformation.
[135] Steven Nelson, “White House ‘flagging’ posts for Facebook to censor over COVID ‘misinformation,’” New York Post, July 15, 2021, https://nypost.com/2021/07/15/white-house-flagging-posts-for-facebook-to-censor-due-to-covid-19-misinformation.
[136] Kirby Wilson and Allison Ross, “YouTube removes video of DeSantis pandemic roundtable with Atlas, other panelists,” Miami Herald, Apr 12, 2021, https://www.miamiherald.com/news/politics-government/state-politics/article250611599.html.
[137] John Tierney, “This Article Is ‘Partly False,’” City Journal, May 17, 2021, https://www.city-journal.org/facebook-and-its-fact-checkers-spread-misinformation.
[138] Office of Inspector General, Department of Homeland Security, DHS Needs a Unified Strategy to Counter Disinformation Campaigns, OIG-22-58, Aug 10, 2022, https://www.oig.dhs.gov/sites/default/files/assets/2022-08/OIG-22-58-Aug22.pdf.
[139] Ken Klippenstein and Lee Fang, “Truth Cops: Leaked Documents Outline DHS’s Plans to Police Disinformation,” The Intercept, Oct 31, 2022, https://theintercept.com/2022/10/31/social-media-disinformation-dhs.
[140] Ken Klippenstein and Lee Fang, “Truth Cops: Leaked Documents Outline DHS’s Plans to Police Disinformation,” The Intercept, Oct 31, 2022, https://theintercept.com/2022/10/31/social-media-disinformation-dhs; CISA Cybersecurity Advisory Committee, “DHS Cybersecurity Disinformation Meeting Minutes,” accessed Mar 6, 2023, https://www.documentcloud.org/documents/23175380-dhs-cybersecurity-disinformation-meeting-minutes.
[141] Renee DiResta, “Cybersecurity Summit 2021: Responding to Mis, Dis, and Malinformation” (lecture, Cybersecurity Summit 2021, Oct 2021), YouTube video, Oct 27, 2021, https://www.youtube.com/watch?v=yNe4MJ351wU.
[142] Michael Shellenberger and Leighton Woodhouse, “Under White House Pressure, Facebook Censored Accurate Covid Vaccine Information,” Public, Jan 12, 2023, https://public.substack.com/p/under-white-house-pressure-facebook?utm_source=twitter&utm_campaign=auto_share&r=1ccax; Ernie Mundell and Robin Foster, “New York City Nurse Is First Inoculated in Rollout of Pfizer COVID Vaccine,” U.S. News and World Report, Dec 14, 2020, https://www.usnews.com/news/health-news/articles/2020-12-14/us-rollout-of-pfizer-covid-vaccine-begins.
[143] Anonymous Facebook executive, email to Andrew M. Slavitt and Rob Flaherty, “[EXTERNAL] Follow up - Friday call w[redacted],” Mar 21, 2021, cited by Michael Shellenberger and Leighton Woodhouse, “Under White House Pressure, Facebook Censored Accurate Covid Vaccine Information,” Public, Jan 12, 2023, https://public.substack.com/p/under-white-house-pressure-facebook.
[144] Andrew Bailey, “Missouri Attorney General Releases More Documents Exposing White House's Social Media Censorship Scheme” (press release), Jan 9, 2023, https://ago.mo.gov/home/news/2023/01/09/missouri-attorney-general-releases-more-documents-exposing-white-house%27s-social-media-censorship-scheme.
[145] Anonymous Facebook executive, email to Andrew M. Slavitt and Rob Flaherty, “[EXTERNAL] Follow up - Friday call w[redacted],” Mar 21, 2021, cited by Michael Shellenberger and Leighton Woodhouse, “Under White House Pressure, Facebook Censored Accurate Covid Vaccine Information,” Public, Jan 12, 2023, https://public.substack.com/p/under-white-house-pressure-facebook.
[146] Anonymous Facebook executive, email to Andrew M. Slavitt and Rob Flaherty, “[EXTERNAL] Follow up - Friday call w[redacted],” Mar 21, 2021, cited by Michael Shellenberger and Leighton Woodhouse, “Under White House Pressure, Facebook Censored Accurate Covid Vaccine Information,” Public, Jan 12, 2023, https://public.substack.com/p/under-white-house-pressure-facebook.
[147] Rob Flaherty, email to anonymous Facebook executive, "RE: You are hiding the ball," Mar 15, 2021, cited by Michael Shellenberger and Leighton Woodhouse, “Under White House Pressure, Facebook Censored Accurate Covid Vaccine Information,” Public, Jan 12, 2023, https://public.substack.com/p/under-white-house-pressure-facebook?utm_source=twitter&utm_campaign=auto_share&r=1ccax.
[148] The Virality Project, Memes, Magnets and Microchips: Narrative Dynamics around COVID-19 Vaccines, Stanford Digital Repository, Apr 26, 2022, https://purl.stanford.edu/mx395xj8490.
[149] Katie Couric, Chris Krebs, and Rashad Robinson, Commission on Information Disorder: Final Report, Aspen Institute, Nov 2021, https://www.aspeninstitute.org/publications/commission-on-information-disorder-final-report.
[150] Stossel v. Meta Platforms, Inc., U.S. District Court for the Northern District of California, San Jose Division, No. 5:21-cv-07385, Reply filed by defendant Science Feedback in support of motion to dismiss pursuant to California's anti-SLAPP statute, Mar 14, 2022, http://climatecasechart.com/wp-content/uploads/sites/16/case-documents/2022/20220314_docket-521-cv-07385_reply-1.pdf.
[151] Michael Shellenberger, “I Have Been Censored By Facebook For Telling The Truth About Climate Change And Extinctions,” Environmental Progress, July 7, 2020, https://environmentalprogress.org/big-news/2020/7/7/i-have-been-censored-by-facebook-for-telling-the-truth-about-climate-change-and-extinctions; John Stossel, “Here’s where the ‘facts’ about me lie — Facebook bizarrely claims its ‘fact-checks’ are ‘opinion,’” New York Post, Dec 13, 2021, https://nypost.com/2021/12/13/facebook-bizarrely-claims-its-misquote-is-opinion; John Stossel, “Government Fueled Fires,” YouTube video, Sept 22, 2020, https://www.youtube.com/watch?v=N-xvc2o4ezk.
[152] Bjorn Lomborg, “The heresy of heat and cold deaths,” Bjorn Lomborg, accessed Mar 6, 2023, https://www.lomborg.com/the-heresy-of-heat-and-cold-deaths; Qi Zhao et al., “Global, regional, and national burden of mortality associated with non-optimal ambient temperatures from 2000 to 2019: a three-stage modelling study,” Lancet 5, no. 7 (July 2021), doi:10.1016/S2542-5196(21)00081-4.
[153] Ted Johnson, “Joy Reid again faces defamation lawsuit over social media posts,” New York Post, July 15, 2020, https://nypost.com/2020/07/15/joy-reid-again-faces-defamation-lawsuit-over-social-media-posts; Stossel v. Meta Platforms, Inc., U.S. District Court for the Northern District of California, San Jose Division, No. 5:21-cv-07385, Reply filed by defendant Science Feedback in support of motion to dismiss pursuant to California's anti-SLAPP statute, Mar 14, 2022, http://climatecasechart.com/wp-content/uploads/sites/16/case-documents/2022/20220314_docket-521-cv-07385_reply-1.pdf.
[154] “Watch: A conversation on battling misinformation,” Axios, Jun 9, 2022, https://www.axios.com/2022/05/31/axios-event-gina-mccarthy-nih-misinformation-online.
[155] “Clean Energy in Texas,” American Clean Power, accessed Mar 6, 2023, https://cleanpower.org/resources/clean-energy-in-texas.
[156] “Watch: A conversation on battling misinformation,” Axios, Jun 9, 2022, https://www.axios.com/2022/05/31/axios-event-gina-mccarthy-nih-misinformation-online.
[157] “About 3M Solar Energy,” 3M, accessed Mar 6, 2023, https://www.3m.com/3M/en_US/power-generation-us/solutions/solar-energy; “Client Profile: 3M Co,” OpenSecrets, accessed Mar 6, 2023, https://www.opensecrets.org/federal-lobbying/clients/bills?cycle=2021&id=D000021800.
[158] Editorial Board, “Climate-Change Censorship: Phase Two,” Wall Street Journal, June 13, 2022, https://www.wsj.com/articles/climate-censorship-phase-two-gina-mccarthy-social-media-biden-white-house-11655156191.
[159] Ailan Evans, “State Department Helped Fund ‘Disinformation’ Research Group That Reportedly Blacklists Conservative News Sites,” Daily Caller, Feb 9, 2023, https://dailycaller.com/2023/02/09/state-department-disinformation-conservative-news-site; “The U.S.-Paris Tech Challenge: Hear from the winners,” Safety Tech Innovation Network, Feb 16, 2022, https://www.safetytechnetwork.org.uk/the-u-s-paris-tech-challenge-hear-from-the-winners; “Partnerships & Funders,” Institute for Strategic Dialogue, accessed Mar 7, 2023, https://www.isdglobal.org/partnerships-and-funders.
[160] Jennie King et al., Deny, Deceive, Delay Vol. 2: Exposing New Trends in Climate Mis- and Disinformation at COP27, Institute for Strategic Dialogue, Jan 19, 2023, https://www.isdglobal.org/isd-publications/deny-deceive-delay-vol-2-exposing-new-trends-in-climate-mis-and-disinformation-at-cop27.
[161] Renee DiResta, interview by Michael Shellenberger, transcript: https://docs.google.com/document/d/1J8bvylZwT1zAa7iE1D3NeeudgByCHTXkeJg4bwIn8aA/edit.
[162] Lyssa White, “Global 2021” (list of grants), National Endowment for Democracy, Feb 11, 2022, https://www.ned.org/region/global-2021; Gabe Kaminsky, “Disinformation Inc: State Department bankrolls group secretly blacklisting conservative media,” Washington Examiner, Feb 9, 2023, https://www.washingtonexaminer.com/restoring-america/equality-not-elitism/disinformation-group-secretly-blacklisting-right-wing-outlets-bankrolled-state-department; Gabe Kaminsky, “Disinformation Inc: Meet the groups hauling in cash to secretly blacklist conservative news,” Washington Examiner, Feb 9, 2023, https://www.washingtonexaminer.com/restoring-america/equality-not-elitism/disinformation-conservative-media-censored-blacklists; National Defense Authorization Act for Fiscal Year 2022, H.R. 8282, 117th Cong. (2021), https://www.govinfo.gov/content/pkg/BILLS-117hr8282rh/html/BILLS-117hr8282rh.htm.
[163] “Contract Summary, Purchase Order, PIID FA864921P1569,” USAspending, accessed Mar 6, 2023, https://www.usaspending.gov/award/CONT_AWD_FA864921P1569_9700_-NONE-_-NONE-.
[164] Ken Klippenstein and Lee Fang, “Truth Cops: Leaked Documents Outline DHS’s Plans to Police Disinformation,” The Intercept, Oct 31, 2022, https://theintercept.com/2022/10/31/social-media-disinformation-dhs.
[165] Ken Klippenstein and Lee Fang, “Truth Cops: Leaked Documents Outline DHS’s Plans to Police Disinformation,” The Intercept, Oct 31, 2022, https://theintercept.com/2022/10/31/social-media-disinformation-dhs.
[166] Renee DiResta (@noUpside), “Interesting Facebook’s adversarial threat report today,” Twitter post, Feb 23, 2023, 8:22 am, https://twitter.com/noUpside/status/1628762155888648193?s=20.
[167] Alex Stamos (@alexstamos), “1) Serious foreign influence campaigns continue online,” Twitter post, Feb 25, 2023, 4:37 pm, https://twitter.com/alexstamos/status/1629611621005033472?s=20.
[168] Ben Nimmo, “Meta’s Adversarial Threat Report, Fourth Quarter 2022,” Meta, Feb 23, 2023, https://about.fb.com/news/2023/02/metas-adversarial-threat-report-q4-2022.
[169] Martin Gurri (@mgurri), “Prove "influence". Where's the data? What pure American minds are polluted?” Twitter post, Feb 26, 2023, 11:29 am, https://twitter.com/mgurri/status/1629896414083059712?s=20.
[170] Alex Stamos (@alexstamos), “I have stated multiple times, over years, that I thought the impact of these campaigns is often overstated,” Twitter post, Feb 26, 2023, 1:59 pm, https://twitter.com/alexstamos/status/1629934186089029632.
[171] “Information Program,” Open Society Foundations, accessed Mar 8, 2023, https://www.opensocietyfoundations.org/who-we-are/programs/information-program.
[172] Joseph Brean, “'Yeeeah it's him': Vancouver writer sues Twitter over its rule against misgendering trans people,” National Post, Feb 12, 2019, https://nationalpost.com/news/yeeeah-its-him-vancouver-writer-sues-twitter-over-its-rule-against-misgendering.
[173] TK
[174] Andrew Mark Miller, “Fox News Special Report outlines fresh questions on what Fauci, government knew about COVID origin,” Fox News, Jan 25, 2022, https://www.foxnews.com/politics/special-report-outlines-fresh-questions-on-what-fauci-government-knew-about-covid-origin.
[175] Janine Zacharia and Andrew Grotto, “How to Report Responsibly on Hacks and Disinformation,” Stanford Cyber Policy Center, accessed Mar 8, 2023, https://fsi-live.s3.us-west-1.amazonaws.com/s3fs-public/full_report_download_-_how_to_report_responsibly_on_hacks_and_disinformation.pdf.
[176] Jane Mayer, “How Russia Helped Swing The Election For Trump,” The New Yorker, Sept 24, 2018, https://www.newyorker.com/magazine/2018/10/01/how-russia-helped-to-swing-the-election-for-trump.
[177] See, for instance, Yochai Benkler, “The Russians didn’t swing the 2016 election to Trump. But Fox News might have.” Washington Post, reprinted by Stamford Advocate, Oct. 24, 2018, https://www.stamfordadvocate.com/opinion/article/The-Russians-didn-t-swing-the-2016-election-to-13333223.php; Nate Silver, “How Much Did Russian Interference Affect The 2016 Election?” FiveThirtyEight, Feb 16, 2018, https://fivethirtyeight.com/features/how-much-did-russian-interference-affect-the-2016-election/.
[178] Cristiano Lima, “Facebook no longer treating ‘man-made’ Covid as a crackpot idea,” Politico, May 26, 2021, https://www.politico.com/amp/news/2021/05/26/facebook-ban-covid-man-made-491053.
[179] Michael Shellenberger and Leighton Woodhouse, “Under White House Pressure, Facebook Censored Accurate Covid Vaccine Information,” Public, Jan 12, 2023, https://public.substack.com/p/under-white-house-pressure-facebook.
[180] Michael Shellenberger (@ShellenbergerMD), Twitter post, Dec 19, 2022, https://twitter.com/ShellenbergerMD/status/1604871630613753856.
[181] Editorial Board, “Facebook admits the truth: ‘Fact checks’ are really just (lefty) opinion,” New York Post, Dec 14, 2021, https://nypost.com/2021/12/14/facebook-admits-the-truth-fact-checks-are-really-just-lefty-opinion/.
[184] Sebastian Herrera, “Austin researcher makes a name – and finds controversy – in cybersecurity world,” Austin American-Statesman, Feb 15, 2019, https://www.statesman.com/story/business/technology/2019/02/15/who-is-jonathon-morgan-austin-researcher-makes-name-and-finds-controversy-in-cybersecurity-world/5974403007/.
[186] Yochai Benkler, “The Russians didn’t swing the 2016 election to Trump. But Fox News might have.” Washington Post, reprinted by Stamford Advocate, Oct. 24, 2018, https://www.stamfordadvocate.com/opinion/article/The-Russians-didn-t-swing-the-2016-election-to-13333223.php.
[187] Yochai Benkler, “The Russians didn’t swing the 2016 election to Trump. But Fox News might have.” Washington Post, reprinted by Stamford Advocate, Oct. 24, 2018, https://www.stamfordadvocate.com/opinion/article/The-Russians-didn-t-swing-the-2016-election-to-13333223.php.
[190] Ken Klippenstein and Lee Fang, “Truth Cops: Leaked Documents Outline DHS’s Plans to Police Disinformation,” The Intercept, Oct 31, 2022, https://theintercept.com/2022/10/31/social-media-disinformation-dhs/.
[194] Michael Shellenberger and Leighton Woodhouse, “Under White House Pressure, Facebook Censored Accurate Covid Vaccine Information,” Public, Jan 12, 2023, https://public.substack.com/p/under-white-house-pressure-facebook?utm_source=twitter&utm_campaign=auto_share&r=1ccax.
[195] Matt Taibbi, “1. Thread: THE TWITTER FILES,” Twitter thread, Dec 2, 2022, 5:34 pm, https://twitter.com/mtaibbi/status/1598822959866683394.
[196] Ken Klippenstein and Lee Fang, “Truth Cops: Leaked Documents Outline DHS’s Plans to Police Disinformation,” The Intercept, Oct 31, 2022, https://theintercept.com/2022/10/31/social-media-disinformation-dhs.
[197] Matt Taibbi, “Capsule Summaries of all Twitter Files Threads to Date, With Links and a Glossary,” Racket News, Jan 4, 2023, https://www.racket.news/p/capsule-summaries-of-all-twitter.
[198] Lyssa White, “Global 2021” (list of grants), National Endowment for Democracy, Feb 11, 2022, https://www.ned.org/region/global-2021; Gabe Kaminsky, “Disinformation Inc: State Department bankrolls group secretly blacklisting conservative media,” Washington Examiner, Feb 9, 2023, https://www.washingtonexaminer.com/restoring-america/equality-not-elitism/disinformation-group-secretly-blacklisting-right-wing-outlets-bankrolled-state-department; Gabe Kaminsky, “Disinformation Inc: Meet the groups hauling in cash to secretly blacklist conservative news,” Washington Examiner, Feb 9, 2023, https://www.washingtonexaminer.com/restoring-america/equality-not-elitism/disinformation-conservative-media-censored-blacklists.
[200] Editorial Board, “Climate-Change Censorship: Phase Two,” Wall Street Journal, June 13, 2022, https://www.wsj.com/articles/climate-censorship-phase-two-gina-mccarthy-social-media-biden-white-house-11655156191.