We hear a lot of talk these days about how truth is threatened in the age of misinformation. I understand that misinformation is a terrible problem, and that it has downstream effects on science, trust, institutions, our society and just about everything else that we care about. But I think the real culprit is properly labeled not as misinformation but disinformation.
Let me explain.
Misinformation is a mistake. It is when we say that some falsehood is true when it is not, but it happens by accident. Say I am out walking my dogs and a passing driver stops to ask for directions to Boston. I say, “Go down three blocks, make a left and get on Beacon Street. That’ll take you straight to Boston.” Easy enough, and the driver happily pulls away. Then, to my mortification, I realize that I’m not where I thought I was and I’ve told him to go exactly the wrong way. I said to make a left on Beacon, when I should have said right. Now he’s going to be headed out to Wellesley and — if he keeps going — eventually the New York border. I made a mistake, but it wasn’t malicious. That’s misinformation.
Now imagine, by contrast, that I intentionally wanted to misdirect someone, because I had a gang of friends who were waiting to carjack a Mercedes, and I took this opportunity to give the motorist directions that would lead him directly to them, so that they could take his car, chop it up and sell it for parts. In this case, perhaps I might also have told him to go left rather than right, but now it is disinformation. I did it on purpose. I did it because it served my interests for the motorist to believe a falsehood. And this is where the whole thing is relevant for science.
If we cannot figure out what to do about disinformation, a number of our treasured institutions, like science and democracy, may be at risk.
I maintain that the reason we live in an age in which there is so much science denial — about the climate, COVID-19 and so much else — is not because it is some sort of accident or mistake that false information leaks out onto the internet. It is instead the result of a deliberate campaign of deception and falsehood cooked up to mislead us. There are forces at work who want us to believe false things about the climate and COVID-19 — about evolution and vaccines and GMOs — not because it serves our interests, but instead because it benefits the interests of the people who are creating and spreading these falsehoods.
Science denial is not an accident.
People don’t wake up one day wondering whether the California forest fires were due to a Jewish space laser, or whether Bill Gates put microchips into their COVID-19 vaccines. Those beliefs are the result of a deliberate propaganda campaign that is meant to create doubt and distrust. Of course, from the perspective of the person driving in the wrong direction, you might wonder why it is worse that the mistake was deliberate. As my original example shows, it is because the consequences can be quite terrible for the person who is being misled, depending on the stakes of the lie. The difference between driving to Wellesley and turning around versus driving to Wellesley and getting carjacked is not trivial. Imagine asking a scientist why fraud is worse than error. They’ll give you a funny look. Anyone who has ever done a little science understands that it is hard enough without having to deal with people who are cheating. Of course, even mistakes in science aren’t really welcome, but at least you might be able to learn from error. What is the point, though, of building on someone’s work when they themselves know it isn’t true? And I submit that the same holds for the distinction between misinformation and disinformation. It matters why you are driving the wrong way.
But let me make another qualification. Couldn’t you say that to the person who acts on bad information, it hardly matters what the original intent was? After all, the person who hears a piece of disinformation and then spreads it — believing it to be true — has done so without any malicious intent. So doesn’t it at that point become misinformation? Now I’m a philosopher. And we could spend a couple of weeks sorting that out. But let’s not. Because the important point I’m trying to make here is that if you want to fight bad information you have to know where it is coming from and why. And if you are trying to defend science from the science deniers and the charlatans, it is important to keep in mind not just that they believe false things, but that they believe false things because they are being lied to.
That is the nub of the problem.
As with most effective lies, a science denial campaign usually starts with a kernel of truth. And it employs a tactic we all think of as very good and in fact quite scientific. Doubt.
Scientists doubt things, and it is good that they do so. As Carl Sagan once said, the amazing thing about scientists is that they are both open-minded and skeptical at the same time. Now the trick is to be open-minded enough to want to learn new things that might change your mind about what is true, but then immediately suspend your judgment over any particular hypothesis until you hear the evidence for it. You always want to test things.
Doubt in and of itself is not necessarily a bad thing, because doubt about any particular scientific finding can be overcome with sufficient evidence. Indeed this is how scientists reason. Scientists are skeptical. This is why they turn to the evidence. But the real problem comes when mere doubt morphs into distrust. For doubt can be overcome with evidence, but distrust cannot.
The goal of disinformation is not merely to raise doubt but to weaponize it into distrust, and eventually, denial. This occurs when our fellow citizens are encouraged not just to disbelieve certain scientific facts but to distrust the scientists who have discovered them — to see them as biased or even liars — which undermines the process by which scientific knowledge is created in the first place.
In an earlier book called “The Scientific Attitude,” I argued that what is most special about science is that scientists care about evidence and are willing to change their minds in the face of new evidence. Although they may start with doubt, when the evidence is sufficient to overcome their reservations, a good scientist will give their assent. This is not to say that they necessarily believe that a hypothesis has been proven to be true, or that we are certain about it, because those are unreachable goals. No scientist can ever prove a theory. Science is not deductive logic or Euclidean geometry. But when there is sufficient evidence, it is rational to believe that a theory is true … that is, until new evidence mounts to change one’s mind. In the philosophical trade this is called “fallibilism,” and it means that you give your assent to a well-corroborated belief while always holding out the idea that in the long run it might not be true. That is a rational way to form one’s beliefs. We assert what we think is true, then revise as necessary.
Compare this to the way that a science denier reasons.
A few years back I went to the Flat Earth International Conference in Denver, where I spent two days mingling with folks who genuinely believe that the Earth is flat, that Antarctica isn’t a continent but an ice wall around the perimeter, that there is a dome over the top, and that we have never been to the moon. They believe this because they distrust the scientists who tell them otherwise. It is all part of a giant conspiracy theory, whereby the scientists are somehow benefiting from keeping this a secret. But this, of course, wreaked havoc with the flat Earthers’ claim that their beliefs were based not on faith but on good evidence, for who could they trust to provide that evidence in experiments they had not done themselves? And if they demanded proof as a standard for belief, why did they believe in flat Earth?
It is not just that their beliefs were wrong, it is that they were reasoning about them in the wrong way. Nothing could convince them. They would never say what — if any — evidence might compel them to change their minds. They did not have the scientific attitude. So it’s not just that their conclusions were mistaken, it’s that they were not willing to reason about an empirical matter like a scientist would. But the reason was that they distrusted the scientists.
And it works this way for all science denial.
Some years back, a few researchers found that all science deniers reason in the same way. This is not to say that the content of their beliefs is all the same — or that if you’re a science denier about one thing you are necessarily a science denier about everything. Rather it means that whether the topic is climate denial, flat Earth, anti-vax, anti-evolution or something else, all science deniers follow the same flawed reasoning strategy, which is:
1. Cherry-picking evidence.
2. Belief in conspiracy theories.
3. Engagement in illogical reasoning.
4. Reliance on fake experts.
5. Belief that science has to be perfect to be credible.
These five tropes sound pretty familiar, don’t they? We all understand that vaccine deniers cherry-pick unvetted claims from the VAERS (Vaccine Adverse Event Reporting System) website or rely on fake experts on the internet (who sometimes have alternative cures to sell) to support their spurious claims. But here’s the payoff. Understanding this flawed reasoning strategy allows you to push back against it: to address not just the content of deniers’ beliefs but the flawed logic behind them.
Consider for example the fifth trope: the idea that science has to be perfect. This is a common one among science deniers, who will often say “just give me proof” and then say “aha” when you can’t. “Let’s wait for more evidence on climate change,” they’ll say, or “just prove to me that the vaccines are safe.” They don’t understand how science works. They are reasoning about empirical beliefs in an illogical way, because when you are dealing with inductive reasoning strategies you cannot have proof. You have to rely on warrant and evidence, not certainty.
Science deniers, though, feel virtuous in their great “skepticism.” They often say that they are the true skeptics, that they are being more scientific than the scientists. But they don’t really understand what skepticism means either. Science deniers are what I call “cafeteria skeptics.” They are not usually “anti-science” but instead just skeptical about the piece of science that treads on their ideological beliefs or the part of their identity they want to protect. This is why they are so inconsistent in the way they reason about evidence. They insist on a standard of evidence tantamount to proof for the things they don’t want to believe, but ask for almost no evidence at all for the things they do want to believe. They embrace a double standard that would never be tolerated in science. Instead of being skeptics, science deniers are usually quite gullible.
It is worth reflecting on all this for a moment, because it will help us to see what we can do to fight back. Research reported in “Nature Human Behaviour” in 2019 shows that one can push back against science deniers, and sometimes even convince them to give up their irrational beliefs based on understanding the five tropes above. But it is hard. And it doesn’t always work. It works better than anything else, but it is a salvage strategy. It is what you do when the disease of denialism has already spread to a virgin population and there is nothing else you can do but try to treat it.
But what if you could get to the vulnerable people first, before they were radicalized? Because remember … science deniers are made. Their beliefs are created by someone whose interests are served by radicalizing them. And by understanding that, maybe we can address this problem before it gets any worse.
You can stop a disease by treating the sick but also by “removing the pump handle” that is getting everyone infected in the first place. One way to do this is to “prebunk” false beliefs, by publicizing the flawed reasoning strategy that all science deniers use to arrive at them. But another is to expose the fact that the science deniers are being duped in the first place. That they are victims. That they are doing someone else’s bidding without even realizing it.
Don’t forget where we started with this. Science denial is based on disinformation. It is the weaponization of doubt, which means that someone must have weaponized it. But who?
The manufacture of doubt into a modern science denial campaign began in the 1950s, when executives of six of the nation’s largest tobacco companies met at the Plaza Hotel in New York City and hired a public relations firm to advise them what to do about growing research showing a link between smoking and lung cancer. The advice? Fight the science.
They accomplished this through a public relations campaign that involved full-page ads in American newspapers, doing their “own research” in a precursor to The Tobacco Institute and creating an alternative narrative that they could feed to journalists encouraged to tell the “other side of the story” by admitting that the causal link between smoking and cancer had not yet been proven. (Which of course is true, though it would cause David Hume to spin in his grave, because all causal links based on inductive reasoning are less than certain.) The tobacco companies didn’t have to prove that smoking did not cause cancer. All they needed to do was create enough doubt about whether it did to continue to sell cigarettes.
As Naomi Oreskes and Erik Conway report in their wonderful book “Merchants of Doubt,” this “tobacco strategy” was then followed over the next six decades in denialist campaigns about acid rain, the ozone hole and later about climate change, so that special interests could profit. But a dangerous thing happened during those decades. On several topics, science denial morphed from profit to politics. The goal was no longer merely to make money but to protect one’s ideology; to create an army of deniers. And for that the manufacture of doubt was not enough. The goal was distrust that could be created only through the cynical manufacture of disinformation.
The point of a full-fledged denialist campaign is not just to create doubt about some particular scientific fact, but to foment alienation and polarization — to divide the world into “us versus them” — so that folks might begin to distrust scientists in general. After all, what use are facts if you don’t trust the people providing them?
Consider vaccine denial. If it were motivated merely by doubts about whether the vaccines caused infertility, say, it could be overcome by data showing that they do not. But if you look at how people come to have such suspicions it is because they have been fed propaganda that encourages them to think that Dr. Anthony Fauci, the Centers for Disease Control and Prevention and the Food and Drug Administration are lying to us. When you enter a world where the skeptics become “us” and the scientists become “them,” the battle is already half lost. Because at that point the facts don’t really matter, do they? Why would you believe any alleged facts if they were shared by a liar?
In April 2020, just a month into the pandemic, a story appeared in a publication called the Oriental Review that claimed that any future COVID-19 vaccines developed in the West would have biometric microchips in them, courtesy of Bill Gates, who had taken out a patent on this technology numbered 060606. At the bottom readers were encouraged to share this story on Facebook and Twitter.
What readers did not know, however, is that the story had been created by Russian intelligence, which was pumping out COVID-19 denial propaganda through four English-language news outlets that it controlled, including the Oriental Review.
This was not the first time Russia had been involved in creating and disseminating disinformation that undermined Western confidence in science, on topics such as climate change, GMOs, vaccines and a host of other health-related issues. An article in The New York Times two years ago titled “Putin’s Long War Against American Science” detailed how, for more than a decade, Putin’s Russia has been trying to undermine trust in Western scientific institutions. The story about microchips was just the latest in a long line of disinformation intended to destabilize American society. And it has been wildly effective.
By May 2020, CBS News reported that, according to a new poll, 44 percent of Republicans surveyed believed that any COVID-19 vaccines might have biometric microchips in them. That’s quite a payoff for one month’s propaganda. And you know the rest of the story.
Or do you?
I am not saying that all of the scientific disinformation on the internet has come from a Russian troll farm. But some of it did, and its purpose is to exploit the existing fault lines in Western society and keep us at each other’s throats, so that we are less likely to have the time and resolve to pay attention to what other countries are doing. Given the political climate, it is not hard to see how the politicization of science denial has become a weapon and — worse yet — an example that has blazed the trail from science denial to reality denial about topics that reach far beyond science.
It is a dangerous moment in history, and the stakes are high. If we cannot figure out what to do about the problem of disinformation in this country, and around the world, a number of our treasured institutions, like science and democracy, may be at risk.
I submit that part of the answer in fighting back is that we have to learn how to talk to one another again. I recommend a strategy of respectful conversation with those who are distrustful of science. But this is only part of the solution. We must also figure out how to stop disinformation from flowing so easily on the internet, where it is picked up and amplified by those with nefarious intent. What we really face are three problems: the creation, the amplification and the uptake of disinformation. We need all hands on deck.
The first thing to do is recognize the real problem we are up against: this is not all due to misinformation, but instead disinformation. Someone is doing this on purpose. We must face this and not hide behind euphemisms or a reluctance to name names.
It is time to stop asking merely, “Why do people believe such crazy stuff?” and instead ask “Who wants them to believe it?”
We are in an information war. And the first step to winning an information war is to admit that it’s well underway.