Debunking Bad Science Journalism

Time to Get Rid of Bad Science Journalism


One of the largest obstacles to the public understanding of science is the presence of pseudoscientific crankery that replaces evidence with personal testimony and critical thinking with personal credulity. However, another obstacle has become increasingly apparent during the last few years: the menace of bad science journalism. These practices have even managed to infiltrate high-quality publications such as Nature. Causes may range from cognitive myopia to increasing demands for sensationalism to boost ad revenue, but the consequences could be dire: bad science journalism misleads people, promotes falsehoods about science and damages the credibility of both science and science journalism. In this post, a number of possible causes and potential solutions are discussed.

Recent examples of the problem

There are plenty of examples of bad science journalism out there, even from magazines such as Nature and Scientific American. Here are just a few recent examples:

  • In the news feature section of issue 7483 of the prestigious journal Nature, Jeff Tollefson promoted the false notion that global warming has taken a hiatus for the past 16 years, going so far as to call it the “biggest mystery in climate science today”. In reality, the notion of a global warming hiatus comes from cherry-picking 1998 as a starting point (a strong ENSO year); once you control for that and other factors, there is a clear trend toward increased temperatures (see the sketch below these examples). The “no warming for 16 years” claim is a common climate change denialist trope.
  • In the popular science section called Nature News and Comment, Zeeya Merali wrote a piece suggesting that Stephen Hawking now claims that black holes do not exist. She even makes it appear as if she is directly quoting Hawking. In reality, that is a quote taken out of context. The paper in question merely suggests revising the mainstream account of the event horizon into an “apparent horizon” to make black holes more consonant with quantum mechanics. This story was also reported incorrectly by a number of other news outlets, and her article presumably contributed to that.
  • Scientific American Mind editor Ingrid Wickelgren promoted the notion that diet, stressed parents and watching TV cause ADHD (and that supplements successfully treat symptoms) on her magazine-associated blog. Wickelgren would probably try to defend herself by stating that she only wrote down highlights of a talk and may or may not agree fully with it. However, the fact that she gave a platform to these kinds of anti-psychiatry, alternative medicine and arguably anti-scientific viewpoints is itself bad science journalism. There was also no attempt to skeptically investigate the claims made in the video to see if they hold up against published research.
  • Forbes contributor and Albert Einstein College of Medicine senior epidemiologist Geoffrey Kabat recently wrote a pseudoscientific and cherry-picked post denying the association between passive smoking and lung cancer. According to the WHO, about 600,000 people die each year from passive smoking. Granted, lung cancer is only part of the health dangers of passive smoking, but it cannot be dismissed in the way that Kabat does.

These are just a few recent examples of bad science journalism that contribute to the public misunderstanding of science and the spread of pseudoscientific crankery. There are countless more out there.
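To make the cherry-picking point about the alleged “hiatus” concrete, here is a minimal sketch using synthetic, made-up data (not real temperature records or anyone's actual analysis). It illustrates how starting a trend calculation at an artificially warm spike year, standing in for 1998, makes the computed trend look flatter than the underlying warming:

```python
# Minimal sketch with synthetic data: how the choice of start year
# changes a computed temperature trend. All numbers are made up.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1990, 2014)
# Assume a steady underlying warming of 0.015 C per year plus noise...
anomalies = 0.015 * (years - years[0]) + rng.normal(0.0, 0.05, years.size)
# ...and a one-off warm spike in 1998, standing in for a strong ENSO year.
anomalies[years == 1998] += 0.25

def trend_from(start_year):
    """Least-squares slope (C per year) of the anomalies from start_year onward."""
    mask = years >= start_year
    slope, _intercept = np.polyfit(years[mask], anomalies[mask], 1)
    return slope

print(f"Trend over the full record (from 1990): {trend_from(1990):+.4f} C/yr")
print(f"Trend starting at the 1998 spike:       {trend_from(1998):+.4f} C/yr")
```

Because the spike sits at the very start of the cherry-picked window, it drags that window's fitted slope down, while the full record still shows the underlying warming.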

Contributing causes

This section discusses some of the potential causes of bad science journalism. Most of these ideas are probably not original to me, and they are not completely fleshed out. Some of them are speculative and some might be less important than others. There may also be contributing causes that have been overlooked, and the relevant factors may differ from case to case. They are listed in no particular order and may be interconnected or overlap.

Deadline pressures: having tight deadlines for science journalism may compromise accuracy in several ways. The journalist may not have enough time to (1) find the relevant limitations of the current research project, and thus risks giving a misleading picture, or (2) contrast it against what is already known to put it into context. Because it is faster to write a he-said-she-said piece than to investigate a story thoroughly, deadline pressure may also contribute to false balance.

Unfamiliarity with the topic: scientific research has grown explosively in the last few decades. That means that even science journalists who specialize in, say, medicine will not have an intimate familiarity with most research areas in that field. Even if a given science journalist is very familiar with autoimmune diseases, he or she may not have any particular knowledge about ADHD or major depression. That, in combination with other factors, may contribute to less accurate reporting on those topics.

Moving from in-print to online editions: during the last couple of years, magazines in which science journalism features prominently have shifted focus from print to online. There are many reasons for this, including declining subscription numbers for the print edition, the expanding online world, decisions made by competitors, availability and so on. This creates a demand for content production that may skew reporting towards faster updates more suitable for exploiting the short attention span of Internet visitors.

Cognitive myopia: most new studies on an established scientific or medical research topic make only a modest contribution to the accumulated mass of knowledge. Although single papers can overturn a previously accepted notion, it is the combined knowledge of credible scientific research that determines the mainstream scientific position. Thus, overinterpreting the impact of new research facilitates popular misconceptions about what the mainstream scientific account is. Consider the areas of diet and harmful substances as two key examples.

Pressure for sensationalism: as print subscribers dwindle, magazines become increasingly committed to making money through advertisements. The more views a certain ad gets, the more money the magazine makes, so there is a vested interest in pushing out sensationalist articles in the hope of getting a lot of views. Also, because magazines, websites and blogs compete for views, whoever makes the more sensationalist claims will get more traffic. In creating sensationalist science reporting, quality and accuracy are often sacrificed.

Not thinking things through: science journalists are humans, and humans can sometimes act before thinking. This is especially problematic here, because incorrect or inappropriate content can spread misinformation about science widely.

The lure of selective skepticism: there is a tradition among journalists of standing up to the government, companies and other authorities they critically examine. However, this can sometimes lead to selective skepticism, where claims that run counter to the beliefs of some science journalists are met with extreme skepticism, whereas information that appeals to their existing beliefs is accepted with much less scrutiny. More rarely, some science journalists may misuse their position to carry out personal crusades against things they do not like, such as psychiatric medication or neuroscientific research into human cognition.

Non-rational risk assessment: another human characteristic is risk aversion, which means that some science journalists may interpret some research findings as more dangerous than they really are. This leads to inaccurate reporting.

Potential consequences

The consequences of bad science journalism are pretty straightforward: (1) it misleads people, (2) it spreads false and potentially pseudoscientific claims, (3) it reduces public confidence in the credibility of science journalism and (4) it may even reduce public confidence in science itself if people come to believe that scientific facts change every week (look at e.g. reporting on diets, medical treatments and the toxicity of various products and foods). There is really no coherent benefit to bad science journalism, except perhaps the fact that sensationalism boosts ad revenue.

Potential solutions?

Here are a few proposed solutions that may alleviate some of the problems associated with bad science journalism. They are not fully fleshed out by any means, and there may be unforeseen issues and problems that have not been given careful consideration. To a large degree, these suggestions are the result of personal brainstorming and may turn out to be unfeasible, ineffective or irrelevant. Their purpose is mostly to get the conversation going and make people think about possible solutions to the problem of bad science journalism.

External peer review: if we cannot trust a certain proportion of science journalists to carry a story correctly, then external peer review may be useful. Both the researchers responsible for the study and an external expert familiar with the topic could be consulted together. This would not take as much time as peer review of research papers, and a competent reviewer familiar with the topic would find it fairly easy to identify pseudoscience, cherry-picking, ideologically driven arguments and abuses of the field.

Internal review: as a variant of the above, allowing other science journalists to comment on a finalized draft could be useful and a way to weed out the worst pieces. An editor has a lot of other duties and may not have the required knowledge to judge specific topics.

Promote meta-analytical thinking: as an antidote to cognitive myopia, bad science journalists need to practice meta-analytical thinking: most new studies will at best provide only a slight contribution to science, and scientific truth arises from an analysis of the entire accumulated mass of knowledge. This will help keep bad science journalists from overplaying the impact of a single recently published study.
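As a toy illustration of why the accumulated evidence dominates any single new study, here is a minimal sketch of fixed-effect, inverse-variance pooling (the standard way meta-analyses combine studies). The effect sizes and standard errors below are entirely made up for illustration:

```python
# Minimal sketch of inverse-variance (fixed-effect) pooling, showing why
# one headline-grabbing new study barely moves the combined estimate.
# All effect sizes and standard errors are invented for illustration.
import numpy as np

effects = np.array([0.30, 0.25, 0.35, 0.28, 0.32])  # prior studies
ses     = np.array([0.05, 0.06, 0.04, 0.05, 0.06])  # their standard errors

def pooled(effects, ses):
    """Inverse-variance weighted mean effect and its standard error."""
    w = 1.0 / ses**2
    return np.sum(w * effects) / np.sum(w), np.sqrt(1.0 / np.sum(w))

old_mean, old_se = pooled(effects, ses)

# A striking new single study claims a much larger effect, but with a
# larger standard error (and hence a much smaller weight).
new_mean, new_se = pooled(np.append(effects, 0.90), np.append(ses, 0.10))

print(f"Pooled estimate before the new study: {old_mean:.3f} +/- {old_se:.3f}")
print(f"Pooled estimate after the new study:  {new_mean:.3f} +/- {new_se:.3f}")
```

Despite the new study's dramatic 0.90 claim, the pooled estimate shifts only modestly, because the study carries far less weight than the accumulated evidence. That is the intuition a science journalist should bring to any "groundbreaking new study" headline.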

Self-policing: good science journalists need to be better at publicly calling bad science journalists out. This should be encouraged by management and not frowned upon or punished. A lot of people consider such approaches inconvenient and confrontational, but keeping a person from continuing to publish nonsense is an act of kindness, not malevolence. Some may find it a little scary to confront a boss or a senior journalist. There may be intersectional issues here as well, as some good female science journalists may not be comfortable criticizing a male coworker if they risk being treated as a “hysterical nuisance” (or some other stereotype of a powerful woman who brings valid concerns to attention).

Reward accuracy: surround your organization with good science journalists. They are highly valuable, and good science journalism should be rewarded.

Punish crankery and unnecessary sensationalism: as a corollary to the above point, punish bad science journalism with sanctions, mandates to attend seminars, group discussions or classes, and so on.

Constructive and adversarial cooperation: most popular science articles are written by individual science journalists. What about increasing the level of collaboration, with multiple science journalists writing pieces together? That would mean more pairs of eyes catching any problems. Perhaps multiple science journalists from the same organization could simultaneously cover a given issue with different takes? That could drown out any crankery.

Better rules for magazine-associated “personal” blogs: a lot of bad science journalism comes from magazine-associated blogs rather than published science news articles. This is probably because there are less stringent rules or guidelines for “personal” blogs (i.e. blogs on the magazine website) than for the magazine itself. Improving and enforcing guidelines for good science journalism practices on blogs associated with the magazine could correct some of the problems. This does not mean that the magazine should be able to influence the content of truly personal blogs, only of those blogs connected to its network.

Change embargo rules: extend or otherwise alter news embargo rules for science journals. This may give science journalists more time to understand the background information and avoid making mistakes.

Improve science journalism education: include sections or modules on how to avoid being a bad science journalist, common denialist tactics, the properties of pseudoscience, cognitive biases and so on in education programs for science journalism.

Seminars, conferences and training programs: holding ongoing seminars, conferences and similar events may help keep good science journalism practices fresh and current. This would also be a way to train bad science journalists.

Establish grant programs: creating grant programs for good science journalism may prevent science magazines with falling in-print subscriptions from becoming too reliant on ad revenue, and thus mitigate some of the issues with sensationalism.

Improve recruitment: recruiting better journalists and scientists could be beneficial. This might be facilitated by having real-world tests instead of overly focusing on CVs, interviews or old writings. These real-world tests could include writing new popular articles on a couple of areas that are socially controversial in the mind of the public, but not in the scientific community, or where there are a lot of popular misunderstandings.

Think before publishing: it sounds so simple, but apparently this is very difficult for some bad science journalists. Asking questions such as “am I confident that I have carried the story correctly?”, “in what ways could I have misunderstood this study?”, “am I overselling it?”, “am I being too sensationalist?” and “am I promoting false balance?” could be beneficial. Simply asking the scientists responsible for the published study is not enough, since they have a conflict of interest in that they want attention for their research.

People should not have to assume at the outset that a piece of science journalism they read got the story wrong. The quality of science journalism should not drop to the level of tabloids. With this post, Debunking Denialism launches a new category where bad science journalism will be taken to task.

emilskeptic

Debunker of pseudoscience.

4 thoughts on “Time to Get Rid of Bad Science Journalism”

  • What was wrong with Geoffrey Kabat's article? I thought it was a calm, reasonable criticism of health regulation agencies' take on passive smoking?

  • He is a passive smoking apologist who cherry-picks the research to spread uncertainty and doubt regarding the relationship between passive smoking and lung cancer. Research published in The Lancet shows that passive smoking kills an estimated ~21,400 people per year from lung cancer.

    Contrast this with Kabat’s claim that “any risk from passive smoking is very small, and this makes it difficult to detect a significant effect” and “the association is weak and inconsistent”.

  • I recently saw a BBC article about malaria which said, and I quote, “Parasites infected with malaria” when referring to Plasmodium. How is such a mistake made? Amusing, yet concerning.

    With regard to improving scientific journalism, I have one point to add to your very comprehensive list. I have always thought that articles such as those in New Scientist (and even on news sites such as the BBC) should include a link to the paper the article is written about. All it would need is a ‘Read more here’ link at the bottom to allow those interested to read further (and interpret the paper for themselves), yet no one seems to do it. It’s incredibly annoying when trying to get to the bottom of a potentially interesting story.

    Thanks for an enjoyable read!

    • Indeed, news articles not linking to the paper is highly annoying.

Comments are closed.
