There are many reasons at individual, group and societal levels.
The spread of health-related misinformation and fake science news is not a trivial matter. When it comes to health, lives can be at stake. Indeed, a previous article discussed three ways that pseudoscientific therapies can be harmful.
But health-related misinformation and fake news related to a myriad of scientific topics persist and spread nonetheless. A simple Google search can reveal a disconnect between the scientific consensus and particular pockets of public opinion on topics ranging from vaccine safety to the treatment of mental health disorders and from evolution to climate change.
One might ask, what causes people to fall for false scientific claims? The answer to such a complex question is, of course, multifactorial—there are many reasons at the individual, group, and societal levels of analysis. Fortunately, a recent review in the journal PNAS has helped to shed some light on this issue by surveying the evidence-based social science research literature.
At the individual level of analysis, there are two main contributors to the persistence of health-related misinformation: an inability to recognize misinformation and a lack of motivation to recognize such misinformation.
With respect to ability, both a limited capacity to evaluate bias in media and a limited understanding of science can be part of the problem. For example, a recent large public survey revealed that only 64 percent of the public understood the concept of probability; only 51 percent correctly defined an experiment; and only 23 percent correctly described the components of a scientific study.
Further, in this survey, the public’s capacity to distinguish science from pseudoscience was gauged by views on whether astrology is scientific. While more people view astrology as unscientific today than in the past, only 60 percent endorsed the statement that astrology is “not at all scientific.”
With respect to motivation, decades of research have been devoted to the topic of motivated reasoning, which, in a nutshell, describes how the same facts can be interpreted in different ways by different audiences (think: Democrats “hate-reading” news about Republicans, and vice versa).
Indeed, the psychological discomfort known as “cognitive dissonance” can occur when one reads facts that are inconsistent with one’s worldview, which can contribute to the wearing of ideological blinders to the scientific consensus, so to speak. The philosophical underpinnings of this topic are also important, as beliefs about how one comes to know (i.e., epistemic beliefs) can influence a person’s perception of information. For example, people who put more faith in their ability to use intuition to assess factual claims than in their reasoning skills are more likely to support conspiracy theories.
Group and Societal Factors
While individual-level factors can help explain why people fall for fake science news, it is impossible to understand the complexity of this problem without situating it in its social context.
Indeed, scientific evidence does not exist in a vacuum; it frequently flows through social networks that can shape a person’s perception of information. More homogeneous social networks are particularly fertile ground for misinformation, as the risk of accepting false claims can be heightened when false information is more visible and more socially rewarded when shared—some have referred to this phenomenon as an “echo chamber.”
Also, the very structure of social networks can assist the spread of misinformation, as some false beliefs can be perceived as more prevalent within a network than they are in reality. This issue is further compounded by the fact that bad actors (both humans and bots) exist and can exploit the visibility of false claims. And at an even higher level of analysis, political and commercial mass media conglomerates can be motivated to shape what information is disseminated to the public.
What Can We Do about All of the Fake Science News?
Just as the causes of misinformation proliferation are complex, the solutions are complex too and need to be addressed at multiple levels of analysis. Timothy Caulfield—Canada Research Chair in Health Law and Policy at the University of Alberta—has done excellent work in this area, calling for the dissemination of facts (particularly via stories and anecdotes, to make them more appealing and more likely to be heard); the promotion of a culture that values critical thinking; and the correction of problematic systemic issues (think: problems with Big Pharma).
Fake science news is not a benign topic, particularly when it comes to the treatment of medical conditions—an area that should not be toyed with or exploited.
Indeed, the demarcation between science and pseudoscience is complex, with decades of literature devoted to the topic. But it is not complex to warn against quasi-medical procedures that lack any credible theory or research to support their use.
A problem with the promotion of health-related misinformation is that it can legitimize unscientific, magical thinking about health, as well as pseudoscientific therapies. This can have grave consequences. Your loved ones deserve evidence-based care whereby the influence of fake science news and bad actors is mitigated as much as possible.
Source: Psychology Today