Communicating science: propaganda or education?
Let's take a step back in the disinformation debates and look at the basics
The foreboding implicit in Carl Sagan's famous quote about a democratic society deeply dependent on science and technology, yet inhabited by a populace not only ignorant of science but also passionately seduced by all kinds of irrationality, looms larger on the horizon with each passing day. The need for collective action and policy change to deal with the impending, highly predictable train wrecks of climate change and pandemic infectious disease makes it more urgent than ever to communicate the relevant science to the public effectively.
And, as if the task weren’t complex enough by itself, there’s that pesky stumbling block to deal with — disinformation. Tylenol and vaccines cause autism. COVID shots are unnecessary. Vitamin supplements cure measles. Climate change is no big deal. From Instagram influencers to the White House, and encompassing a big fat slice of the so-called “wellness” and “lifestyle” beats in mainstream media, the sources of noise are ubiquitous.
The impulse to analyze and understand the phenomenon, to see why disinformative content time and again appears to be more persuasive, popular, and effective than informative alternatives, was both understandable and necessary. The answers weren’t hard to come by.
Among them are emotional appeal, the use of compelling stories, the mobilization of primal feelings such as fear, indignation, and disgust, and the reinforcement and activation of latent prejudices. Disinformation also offers simple, easily digestible (even if blatantly false) explanations of complex or bewildering phenomena, and it provides a channel to vent frustration and anger through conspiracy ideation and the scapegoating of outsiders.
Put simply, science disinformation is usually crafted and packaged to elicit a predetermined, visceral reaction or, conversely, to sow confusion: if nobody really knows what’s going on (with vaccines, or the climate), why run the risk of doing or changing anything at all?
But information, at least in its “naïve” forms, is constructed to — what else? — inform. You can’t honestly say that a vaccine saved a particular woman’s life or health (even without the shot, she might never have gotten the disease, by sheer luck), but you can dishonestly say that the vaccine made her give birth to a girl with two heads and six eyes (that pesky mRNA!).
This diagnosis, however, places those tormented by Sagan’s foreboding in a conundrum: if the adversary advances because they play dirty, shouldn’t we go there, too? To quote the immortal words of police detective Marion “Cobra” Cobretti (played by Sylvester Stallone in that unforgettable 1986 blockbuster “Cobra”), “As long as we play by these bullshit rules and the killer doesn’t, we lose.”
Of course, “these bullshit rules” are all that keep civilization (barely) viable, so we should think hard before disposing of them. But the stakes are so high, and the risk of losing our health, our lives, our environment is so great, that the question lingers: so what?
When the conversation gets to this point, I usually feel like there’s something obvious that I am missing — like a kid struggling to work out the length of one side of a right triangle, oblivious to the existence of the Pythagorean theorem. And then, the other day, I stumbled on this distinction between “propaganda” and “education”, made decades ago by the psychologist and social scientist Alex Carey:
“By ‘propaganda’ I refer to communications where the form and content is selected with the single-minded purpose of bringing some target audience to adopt attitudes and beliefs chosen in advance by the sponsors of the communications. ‘Propaganda’ so defined is to be contrasted with ‘education’. Here, at least ideally, the purpose is to encourage critical enquiry and to open minds to arguments for and against any particular conclusion, rather than close them to the possibility of any conclusion but one.” (Alex Carey. Taking the Risk Out of Democracy: Corporate Propaganda versus Freedom and Liberty)
Disinformation is, by definition, a form of propaganda (specifically, propaganda based on lies/distortions, and whose sponsor, usually but not always, tries to hide their identity and agenda behind a cat's paw). Now, science communication is — or should be — a form of education. So it must be designed to open minds, not to induce behavior at any cost.
But wait. Don’t we want to use science communication to encourage people to get vaccinated? To pressure their governments to fight climate change? In other words, to induce them to behave in a certain way?
We should, perhaps, rephrase: our goal is to provide people with the tools and information they need to make the best decision for themselves, offering them a clear, compelling, and honest presentation of the relevant facts and evidence. This presentation may employ rhetorical tools that go beyond hard data and dry syllogisms, but it must not be manipulative — or, as in the case of propaganda, single-minded.
Philosophizing about science communication in a world where a proper understanding of science carries high stakes could benefit greatly from an examination of the history of political propaganda. We should learn from the mistakes made by political parties and candidates, the most significant being the replacement of substantive debate with propaganda: it may help win elections and public opinion campaigns in the short term, but in the long run it degrades the entire polity.
The first important academic theoretician in the field, Harold Lasswell, who analyzed the uses of propaganda during World War I, arrived at deeply elitist conclusions, affirming that democratic societies would only be viable if the people, whom he saw as unable to follow rational arguments, were seduced by illusions crafted by talented propagandists. His first major work, Propaganda Technique in the World War, couldn't be clearer:
“Familiarity with the ruling public has bred contempt. Modern reflections upon democracy boil down to the proposition, more or less contritely expressed, that the democrats were deceiving themselves. The public has not reigned with benignity and restraint. The good life is not in the mighty rushing wind of public sentiment. It is no organic secretion of the horde, but the tedious achievement of the few.”
“Preserve the majority convention but dictate to the majority!” Lasswell wrote this in 1927, and also: “If the mass will be free of chains of iron, it must accept its chains of silver. If it will not love, honor, and obey, it must not expect to escape seduction.”
This is a dark, pessimistic elitism that populist purveyors of disinformation easily denounce and exploit, and that, at least in the field of science communication, should be resisted. Scientific practice can be described as a series of procedures for revealing which hypotheses are persuasive for the right reasons — because they are consistent with well-established theories, fit the data from well-designed experiments or valid observations, and explain more things, and explain them better, than the alternatives.
To give up the demand for honesty and good reasons when talking to the public — to do propaganda instead — is not only a betrayal of the very spirit of the scientific endeavor. In the long run, it may also prove contagious, infecting science itself as it has infected politics, and further degrading the public sphere. And for an activity whose primary purpose is to discover truths, or at least to approach reality to the best of human capacity, such an infection would be poisonous, perhaps lethal.


