
Combating Misinformation: What Works and What Doesn’t

In a 1967 episode of “Star Trek” titled “Return of the Archons,” Captain Kirk and Mr. Spock once again save the USS Enterprise by confronting Landru, a seemingly omnipotent being who controls the society on the planet Beta III.

After using their phasers to blast their way past Landru’s false image along with the wall it was projected on, they find in the next chamber that Landru is not a person, but a 6,000-year-old computer used to control the long-dead human Landru’s utopian world. Instead of simply unplugging or destroying the machine, the duo argue that freedom of choice and creativity are necessities incompatible with Landru’s programming. Their logic ultimately causes Landru to short-circuit, freeing both the people of Beta III and the Enterprise in orbit above.

Over 50 years later, this episode reads as an almost prophetic warning in the context of our current struggle with misinformation. Various tech providers, pundits, politicians, and academics have taken it upon themselves to define misinformation as a scourge or virus that must be defeated for the good of society. Many have also proposed or enacted different solutions to combat misinformation. Unfortunately, their solutions cause more problems than they correct.

Misinformed Methods to Combat Misinformation

As I covered in my previous article, misinformation can be classified and mapped in terms of its accuracy as well as the intent of its source. This mapping can also be used to understand how the various methods to combat misinformation are supposed to work, but often fail:

Censorship – The Carpet Bombing of Rational Discourse

One of the most common methods to combat misinformation – especially within the large tech platforms such as Twitter and Facebook – is through censorship. Censorship is simply a heavy-handed attempt to eliminate all forms of disinformation, not through discourse and debate, but by blocking the message and messenger altogether.

It comes with the hubristic assumption that the potential audience is not intelligent or rational enough to process news that contradicts the given narrative. More sinister is when the censor simply doesn’t care about the audience; they just don’t want the audience knowing anything that counters the official message or narrative, whether it’s the truth or not.

Whether the censor is a tech provider that only wants to eliminate disinformation or a dictatorship that wants to eliminate all opposition, the use of censorship has the same effect of blocking everything: the lies, the truth, opinion, and even simple errors.

(Graphic: Kleinsmith 2021)

While censorship can yield some short-term success for the censor, in the long run it usually backfires in the form of both resistance and persistence.

The most recent and embarrassing example of this has been the theory that the COVID-19 virus originated in a lab in China. Facebook adjudged this theory verifiably false and censored it earlier this year, yet it is now one of the leading theories in the public arena. Because of this and other episodes of censorship, Facebook is no longer viewed as a trusted tech platform, but as a publisher, and a biased one at that.

Prebunking – Another Double-Edged Sword

Prebunking is another method to combat misinformation currently touted within academic and media circles. It differs from debunking misinformation in that prebunking attempts to make the target audience aware of potential lies and fake news before they receive them, rather than trying to defeat a narrative after it has already spread.

Unfortunately, academic papers and articles often describe this process in medical terms, such as “inoculating” people so that when the virus of disinformation reaches them, they are more resistant to it. There are many parallels, but the analogy is misleading.

Prebunking has already shown some promise with audiences that have been prepared for the lies and disinformation they are exposed to through their news or social media feeds. The problem with prebunking is that it can also be used to program an audience with falsehoods and propaganda, thereby making them more resistant to the truth when it disagrees with their preconceptions. In this more malicious form, prebunking resembles mental conditioning and brainwashing. After all, almost any method used to mold an audience’s perception, from marketing to political spin, uses some form of prebunking to create an image or idea that is resistant to later challenges.

Combating Misinformation within the Marketplace of Ideas

The common element of failure in almost every method used to combat disinformation is the human element. Whether they are censors, fact-checkers, prebunkers, or media analysts, as long as humans are involved in combating misinformation, there is a risk that our own biases and misperceptions will shape the effort. While many of these techniques look good on paper, in practice they just add another layer of bias into the process.

That is why the best place to combat misinformation has always been within the marketplace of ideas, our forums of discourse and public debate. Arguing your perspective, whether it’s in a company, as part of a command staff, or within a small analytic team, can be aggravating, confusing, uncomfortable, and even perceived as futile. Nevertheless, it’s an absolutely necessary part of our decision-making.

Instead of trying to condition ourselves to resist information, we should continue to emphasize the principles of critical thinking, reading, and writing at all levels of our education and professional training programs.

In an upcoming article, I will explore why misinformation and even disinformation are actually a necessary part of our analysis and decision-making. After all, if all of our efforts to combat misinformation were completely successful, how would we know?

Erik Kleinsmith is the AVP in Intelligence, National Security and Homeland Security for AMU. He is a former Army intelligence officer and a former portfolio manager for Intelligence & Security Training at Lockheed Martin. He is a subject of the book “The Watchers” about Able Danger and the author of “Intelligence Operations: Understanding Data, Tools, People, and Processes.” Erik is also a member of The Case Breakers, a private investigative group dedicated to breaking such cold cases as the Zodiac Killer, D.B. Cooper, and the fate of Jimmy Hoffa.
