Trump’s Victory Has Opened the Disinformation Floodgates

The breadth of falsehoods circulating in the months and days before Election Day in the United States was breathtaking in both scale and creativity. It was, as Jen Easterly, the head of the Cybersecurity and Infrastructure Security Agency, said, an “unprecedented amount of disinformation.” Voters were treated to videos masquerading as FBI-generated or CBS reports that warned of security threats and voter fraud, while other videos falsely depicted the destruction of mail-in ballots favoring Republican presidential nominee Donald Trump, or an alleged Haitian immigrant voting in two counties. Fabrications about Democratic nominee and Vice President Kamala Harris drew from a seemingly bottomless well, ranging from false allegations that she was involved in a hit-and-run incident to claims that she was allied with convicted pedophile Jeffrey Epstein. This was on top of saturation levels of preelection disinformation campaigns involving Haitian immigrants, hurricane relief efforts, and much more.

Oiled and revved up in advance of the Nov. 5 election, though, the disinformation machine abruptly died that evening. Trump had urged voters to get him a win “too big to rig”—harkening back to his persistent lie that victory was stolen from him in the 2020 election—and voters delivered.

Now that there is a lull, we must ask some critical questions. First, does disinformation—while upsetting, annoying, or even amusing—matter in influencing outcomes? Second, with a second Trump term, what is the future of the disinformation machine? And if disinformation continues unabated and even flows across borders, what can be done about it on a national and transnational level?


Some researchers argue that disinformation has little effect in changing behavior. The argument often hinges on empirical studies demonstrating that such content typically gets relatively low exposure and is viewed and shared mostly by a fringe already motivated to seek it out.

There is, however, robust evidence to suggest that in the specific instance of the 2024 U.S. elections, disinformation did change behavior, in that an alternative reality took hold in voters’ minds and influenced their choices. Voters who were demonstrably misinformed about key issues, such as immigration, crime, and the economy, preferred Trump. Consider the example of immigration and crime: Looking at 2018 felony offending rates in Texas, native-born U.S. citizens committed around 1,100 crimes per 100,000 people, compared with 800 by documented immigrants and 400 by undocumented immigrants. Analysis of similar data across all 50 states suggests no statistically significant correlation between the immigrant share of the population and the total crime rate in any state. These and numerous other data sources consistently show that immigrants, both documented and undocumented, are less likely to commit crimes than native-born U.S. citizens across various crime categories and over extended periods. This contradicts a dominant narrative around a “migrant crime wave,” spread primarily by Trump and his surrogates. Approximately 45 percent of Trump supporters said immigration was one of their three biggest issues; most Americans, meanwhile, believed that illegal immigration was linked to higher crime rates.

With a second Trump term ahead, it is worth asking what we might expect from a disinformation machine that was so helpful in bringing such an administration to power. This was a machine designed to generate false narratives built to exploit fear and anxiety at scale, using fabrications that may build on a kernel of truth or resonate with some people’s beliefs or actual experiences. It identified malevolent actors to be defeated as part of its calls to action. To spread disinformation further and enhance its credibility, operations involved consistent repetition of narratives and their amplification at political rallies and on social media, along with alignment with the financial and political incentives of other influential voices. Where does the Trump reelection disinformation machine go from here, now that its primary job is done? Designed to increase confidence in the leader and the regime, disinformation systems have a distinguished tradition of flourishing under autocratic administrations, from Octavian’s Roman Empire to Vladimir Putin’s Russia.

There are five galvanizing issues to watch for in the next turn of the disinformation crank.

First, there will be a need to undermine the credibility of media outlets considered unfriendly to Trump. This objective will, of course, get plenty of support from “friendly” media like Fox News and the New York Post, but, most significantly, from Elon Musk—a close ally of the administration. Musk and his platform, X, are frightfully effective in creating and disseminating narratives.

Assuming the Musk-Trump alliance has a meaningful shelf life, consider the “Musk effect” itself. Analysis from the Center for Countering Digital Hate found that at least 87 of Musk’s posts on X in 2024 were false or misleading, and that they drew 2 billion views in total. None of those posts were accompanied by a Community Note, a user-generated fact-check. To add to his influence, Musk, a self-described “free speech absolutist” who is selective about which content gets moderated on X to serve his own purposes, is now charged with minimizing government bureaucracy; he will likely work to ensure that regulations intended to moderate content are held to a minimum, as long as they are not unfavorable to him. We should not expect any major legislative overhauls, such as a rollback of Section 230, originally part of the Communications Decency Act of 1996, which protects digital platforms from being held liable for content they host; such a rollback would severely hamper the freewheeling content environment Musk has created at X. This would be a change of position for Trump, who pushed for such a rollback in his first term.

Second, putting Musk aside, a scan of the remaining names put forward for Trump’s cabinet reads like a who’s who in the annals of disinformation. Consider just three. Tulsi Gabbard, nominated to be director of national intelligence, has a track record of being partial to propaganda from the likes of Russian President Vladimir Putin and Syrian President Bashar al-Assad, and has even declared a QAnon conspiracy theory about a U.S.-funded bioweapons lab in Ukraine to be an “undeniable fact.” Robert F. Kennedy Jr., nominated to be secretary of health and human services, has peddled ideas that are outright false, such as childhood immunizations causing autism, and questionable, such as excessive fluoride in drinking water lowering IQ. Pete Hegseth, nominated for secretary of defense, has already gone a step further by calling for the word “misinformation” itself to be stripped from the public lexicon as soon as possible. Each of these substantive federal agencies will need its disinformation machine humming and ready to go, given the individuals who may be in charge.

Third, there is their boss’s playbook of false narratives, repetition, amplification, and targeting opponents and critics as “enemies.” The Washington Post found that Trump made 30,573 false or misleading claims, or around 21 fabrications a day, during his first term. As noted earlier, repetition of falsehoods is a core operating principle for Trump, and it has been shown to work: There is a demonstrated correlation between the number of times Trump repeated falsehoods during his presidency and misperceptions among Republicans. Given this record, we ought to count on him escalating his reliance on these tactics, especially repetitive disinformation, as a strategy for governance.

Fourth, with several disinformation-centered narratives influential in getting Trump to the White House, their lives will have to be extended as the administration swings into action to follow up on campaign promises—for example, as deportation procedures against undocumented immigrants are launched.

Finally, it is essential to consider the intentions of foreign governments, which may use the Trump model to manipulate their own citizens. Russia was the most energetic in its disinformation campaigns during the United States’ 2024 election season, from the falsehood about Harris being involved in a hit-and-run incident, to hurricane-related falsehoods, to bomb threats on Election Day. But there are many others in the fray with elaborate disinformation machines ready to go, including networks of bogus social media accounts, websites to spread divisive content, third-party actors, and fringe groups. According to the U.S. Government Accountability Office, the three governments most active in creating and spreading disinformation in the United States are Russia, China, and Iran. All three steadily increased their disinformation campaigns in the months leading up to Election Day. One can expect the dynamic that follows to be an arms race: If the U.S. government itself invests in disinformation, foreign governments will attempt to keep pace, and may even view it as implicit permission to do so.


With this sobering outlook, it is natural to ask: What should be done? The regulatory and legislative establishment is likely to be compromised, so other actors will have to step up—these include major digital platforms, independent watchdogs, the media, civil society organizations, and regular citizens.

The most critical are the digital platforms, as they have the greatest leverage; they must reverse their recent trend of reducing content moderation teams and cutting resources for fact-checking, labeling, blocking, or demoting messages that run afoul of posted standards. The COVID-19 pandemic created an “infodemic” emergency and a sense of urgency for the platforms to be proactive and ramp up content moderation to stem the tide of misinformation. Even though many of the attempts were found wanting and flawed, most of the major platforms did take specific actions: defining policies, being transparent about their criteria, taking steps to remove or moderate violators, and nudging users toward trusted sources. There is evidence to suggest that messaging from trusted sources had a positive effect on the quality of users’ knowledge and on how they acted on that information. A second Trump term needs to be viewed as an emergency of parallel proportions.

For their part, watchdog groups, the media, and civil society must amplify their voices when they see false narratives and counter them not just with boring statistics but with engaging, fact-based narratives that reeducate and inform. Local media, in particular, have a role to play in bringing credible, fact-based news to ordinary citizens. Watchdog groups across different countries should collaborate with digital platforms to identify sources of false or malicious content and develop early warning signals of international interference by state and non-state actors and proxy groups. Once again, the lessons from the pandemic might come in handy in considering the role of multilateral bodies: just as the World Health Organization (WHO) implemented strategies to combat COVID-19 misinformation, with mixed but generally positive results, similar bodies can be set up to coordinate across multiple actors and geographies.

Finally, ordinary citizens will need to take the time to become responsible consumers of media—for their own good. And, as their lived realities diverge from false narratives in circulation, they might become more discerning and wary in seeking out information sources. As a recent study found, citizens do become more discerning consumers of digitally transmitted information when there are sustained mitigation and education efforts.

It will, no doubt, be a long road ahead. While we can expect a deluge of disinformation along the way, we cannot let it become the new normal.


