
Science & Society
3 November 2006

Bad science in the headlines: Who takes responsibility when science is distorted in the mass media?

EMBO reports (2006) 7: 1193–1196
This year has seen an intensification in the debate on how scientific stories with societal relevance are reported in the media. One cause was the Korean stem‐cell scandal, in which Hwang Woo‐Suk and colleagues claimed to have cloned human embryos. Lambasting the media for their part in distorting or sensationalizing scientific findings is not new, but recent scandals—and some not so recent—highlight the role played by elements of the scientific community itself. Ultimately, each side must take an appropriate share of responsibility for the impact that controversial or dubious research has on the public.
When it comes to distributing blame for misleading the public, most scientists are quick to point to the media. In an analysis of the public reporting of science in the UK, published earlier this year by the Social Market Foundation (SMF; London, UK), the press in particular come under fire for negative sensationalism that is often to the detriment of the public good (SMF, 2006). The report repeatedly cites the media frenzy that began in 1998 when Andrew Wakefield, a UK‐based doctor, suggested a link between the administration of the combined measles, mumps and rubella (MMR) vaccine and the development of autism in children. The news reached the media through press releases and a press conference. Relentless negative press coverage over the next five years eventually led to a significant fall in MMR vaccinations in the UK and elsewhere. Many newspapers conveniently ignored expert advice on the safety of the vaccine in favour of attacking the UK government, which defended the vaccine's safety. “This mistrust of the government and its motives coupled with a sensationalist tendency in some parts of the press … lie at the heart of the problem,” the SMF report states. “Particular difficulties arise when a scientific story becomes a political one and balanced scientific reporting is left behind in the face of a ‘good’ political story.” It seems too easy to blame the press, given that the scientific community might have had little chance to alter the course of this runaway train. But a closer look at the scientific beginnings of this and other media disasters, such as genetically modified (GM) organisms, cloning and stem‐cell research, tells a different story.
Uncritical observers might interpret the SMF report as a fair comment on the media treatment of the MMR vaccine and the claimed dangers of GM food and crops. But the problems that it highlights reveal failings in the scientific world as much as irresponsible media coverage. The media have always been unashamedly interested in making money, but under increasing competition for funding, scientists are faced with similar ‘vices’ when seeking recognition for their work. Furthermore, although it is easy to blame sensationalism and scandal‐mongering by the media, the scientific world, including scientific journals, is a willing accomplice in many cases.
Much has been written about The Lancet's decision to publish the original paper by Andrew Wakefield (Wakefield et al, 1998) and to stand by it in the face of growing criticism. In the meantime, it has emerged that Wakefield failed to declare funding from a group of parents wishing to take the vaccine manufacturers to court, and that the study itself had flaws (Fitzpatrick, 2006). Out of Wakefield's 12 co‐authors, 10 have since issued a ‘retraction of an interpretation’, to distance themselves from the suggestion that the MMR vaccine might be causally linked to autism (Murch et al, 2004). But it was the media—not the research community, peer reviewers or The Lancet—that brought the disturbing facts to light. While the world is finally getting to grips with the emerging details of the Wakefield case, two examples of seriously fraudulent research have made headlines in 2006: the Korean stem‐cell scandal—the investigation of which was actually precipitated by the media—and the case of a biomedical researcher from Norway.
It was a Korean investigative TV reporter, Han Hak Soo, who began the enquiry into the cloning of 11 human embryo lines, reported in Science (Hwang et al, 2004) and covered in newspapers worldwide. It duly emerged that Hwang not only violated ethical guidelines but also fabricated data; his paper was later retracted by Science (Kennedy, 2006). In January, the Norwegian government began investigating the conduct of researcher Jon Sudbø from the Rikshospitalet Health Trust in Oslo. He and his colleagues had claimed that anti‐inflammatory drugs could reduce the risk of oral cancer in smokers (Sudbø et al, 2005), but closer inspection revealed that 900 of the 908 subjects in the study had been invented. The paper was later retracted by The Lancet (Horton, 2006).
At least in the Sudbø case, it was another scientist who triggered the investigation: the sister of the Norwegian Prime Minister, who works at the Norwegian Institute of Public Health. But the picture is gloomy: seen from the point of view of the public, not only has the scientific community been rather slow in making and enforcing guidelines for codes of conduct, but its proudly defended peer‐review system is quite able to miss flagrant fraud. One might well ask who is scrutinizing research before it is published. In the USA alone, 100 cases of fraudulent research are reported annually to the National Institutes of Health (Bethesda, MD). But co‐workers and lowly postdoctoral researchers desperate for a good reference rarely blow the whistle, even when they know of malpractice. Scientists should perhaps take a closer look at their own behaviour and at how they communicate their findings to the outside world; the problems range from the surveillance of research within an institute to the motivations of scientific journals.
In an increasingly competitive environment, scientists look to media coverage, and through it public attention, as an additional way to attract funding. Some also use the media to spread their views against the consensus of the scientific community. Research institutions and universities also use the media to bring themselves to the attention of the public. Issuing press releases and holding press conferences on the publication of a ‘sexy’ paper are commonplace. The science that enters the public consciousness in this way does so without a thorough examination by the scientific community, and has earned the fitting nickname ‘science by press conference’. Famous examples are the claims by Stanley Pons and Martin Fleischmann to have discovered cold fusion in 1989, and Clonaid's claim to have cloned a human baby in 2002.
Science is increasingly ‘sold’ in a competitive market, which is why many high‐profile scientific journals want to attract ‘sexy’ research to ensure that their scoop will be covered by the media within hours of publication. Harvesting the pearls of research, journals exert tight control over what can be made public and when; “Journals control when the public learns about findings from taxpayer‐supported research by setting dates when the research can be published,” wrote New York Times staff writer Lawrence Altman (2006). “Increasingly, journals and authors’ institutions also send out news releases ahead of time … so that reports from news organizations coincide with a journal's date of issue.” According to Altman, it is no surprise that scientific findings are distorted by the press: “…often the news release is sent without the full paper, so reports may be based only on the spin created by a journal or an institution.”
Although journalists should—and generally do—conduct background research to present a balanced story, genuine balance is not guaranteed when the lead comes from a lone scientist with a strong opinion. “The way in which Andrew Wakefield's research was reported created the erroneous impression among the general public that there were competing bodies of evidence both for and against the vaccine's safety…” the SMF report states, continuing “…by covering both sides of the argument as though they had equal support amongst the scientific community, the effect was to give more weight to one side than was due on the basis of evidence” (SMF, 2006).
In a similar critique of contemporary journalism in the USA, Chris Mooney described how ‘balanced’ coverage helps the scientific fringe to capture headlines, citing climate research and Clonaid's media stunt (Mooney, 2004). But even if good reporters are aware that the claims of one side do not represent the views of the larger scientific community, “…for journalists raised on objectivity and tempered by accusations of bias, knowing that phony balance can create distortion is one thing and taking steps to fix the reporting is another,” Mooney wrote. Apart from anything else, deadlines, pressure from editors and competition between newspapers often conspire to make the task of unravelling the whole truth behind a story infeasible.
Against this background, a press release or press conference effectively issues a licence for the general media to run with the story. As scientists are seen as trusted authorities on public information, they should be prepared to fight disinformation proactively to establish the true balance of evidence. However, in a recent survey of factors affecting science communication by scientists and engineers, carried out by the Royal Society (London, UK), only 2 out of 1,485 respondents thought that the main reason to engage with the non‐specialist public was to “combat negative images” or “combat [a] bad job done by others” (Royal Society, 2006). Scientists generally do not like to do battle in public, despite the fact that, given the right support, most media will be interested in the other side of a story. Most countries have codes of journalistic conduct that explicitly demand balance, accuracy and professional conduct, and some have a press complaints commission. Ethical guidelines for scientists—where they exist—tend to be institutional rather than national.
Unfortunately, when other aspects of societal interest, such as politics or health, have an impact on a scientific story, it can quickly and easily get out of control. Its direction is then firmly in the hands of the chief editor, not journalists. In this way, coverage of GM food became an irresponsible mix of fact, fiction and ridicule, for which the UK tabloid press is largely to blame. Who else could have thought up the brilliantly wicked front‐page headline “The Prime Monster”—accompanied by a Frankenstein‐like image of Tony Blair—which appeared in the Daily Mirror on February 16, 1999, in response to the UK prime minister's support for GM crops? But it pays to remember that much of the original impetus for “the great GM debate” came from the opinions of Arpad Pusztai, who claimed to have found evidence of a possible health risk from eating transgenic plant material (Ewen & Pusztai, 1999). A hitherto highly respected researcher, he was arguably not given appropriate institutional counselling on his controversial research, leading him to go it alone with the media in the form of a press conference and TV appearance.
The MMR saga also shows how certain sectors of the media ignore important facts. Negative coverage of the story in the Daily Mail alone amounted to more than 700 articles in 1998, and continued unabated well into 2003. But whereas its editor, Paul Dacre, clearly supported the onslaught, other newspapers’ editors were more sceptical. The broadsheet newspapers reported Wakefield's claims about the MMR vaccine and the mainstream scientific response in a largely balanced way until the story became political in 2001.
It is media reporting that leaves the definitive negative mark, because the general public read newspapers, not scientific journals. Bad science has a devastating effect on scientific communities and, if it is reported in the media, it can have a devastating effect on the whole of society. Scientists who behave unprofessionally, or who use the media to push a premature minority view or fraudulent research, have generally found themselves ex institutio quite rapidly, as have Wakefield, Hwang, Pusztai and countless others: the scientific community has little mercy for its own kind. The same is not necessarily true of the world of journalism: Dacre has not resigned, nor—on the whole—have other editors or correspondents who have distorted a scientific story. Media resignations do sometimes occur, but mainly for legally punishable misconduct such as libel.
Many observers of the media–science wars still believe that the public reporting of scientific stories would improve if only more scientifically trained journalists went into the media (SMF, 2006). What this assertion misses is the fact that the media primarily want journalists who can write interesting stories of relevance to the general public. Many of the most respected science journalists have no scientific background. Tim Radford, former science editor of The Guardian and one of the most respected science writers in the business, started his career as a general reporter with the New Zealand Herald at the age of 16, and does not have a university degree. John Noble Wilford of the New York Times, a doyen of science writing in the USA, started as a general assignment journalist with The Wall Street Journal. It was another non‐scientifically trained journalist, Brian Deer, who methodically exposed the truth behind the Wakefield case for the Sunday Times and Channel 4 television in the UK, doing what the biomedical research community could not, or did not, do.
But more importantly, it is plain folly to think that a story of such interest to the general public as the MMR vaccine would be turned over to a scientifically trained writer in the interest of scientific accuracy and balance. ‘Irresponsible’ reporting is not usually due to ignorance or lack of appropriate experience, rather it happens when the original story takes on a broader significance. It then falls into the hands of political/current affairs correspondents and so‐called columnists—a prized species of generalist who can artfully spin a tale that ensnares readers with its relevance to their lives. And in the final analysis, reporters care about page space or air‐time, editors about boosting the profile of their publication, and owners of newspapers and TV stations about returns on investment. Demanding extra caution in the reporting of scientific stories of high relevance to the public would therefore be futile. Journalism will never be a cautious profession as long as its aim is to find and communicate events that are of interest to broad sectors of society.
Believing that one knows who is to blame or who behaves better can result in an embarrassing corrective lesson. The wrong‐doings of the media are not as common as is often claimed. Scientific research is also an overwhelmingly honest profession. But both the media and scientists have misled the public on some important matters. And when it comes to uncovering fraud, freedom of the press and an editor who backs an investigative reporter can be superior to the mechanisms of the scientific community, whose members sometimes do not have the freedom and protection that they need in order to speak out. Foundations, institutes and media observers might present the failings of journalism and ponder possible remedies, but there are three sure ways for the research community itself to contribute to better media coverage of science: deal with its own bad science before it gets into the news, rein back media‐hungry journals, and be more proactive in feeding good science into the news.

References

Altman LK (2006) For science's gatekeepers, a credibility gap. The New York Times, 2 May
Ewen SW, Pusztai A (1999) Effect of diets containing genetically modified potatoes expressing Galanthus nivalis lectin on rat small intestine. Lancet 354: 1353–1354
Fitzpatrick M (2006) Stop witch‐hunting Wakefield. Spiked Online, 19 June. www.spiked‐online.com
Horton R (2006) Retraction—Non‐steroidal anti‐inflammatory drugs and the risk of oral cancer: a nested case‐control study. Lancet 367: 382
Hwang WS et al (2004) Evidence of a pluripotent human embryonic stem cell line derived from a cloned blastocyst. Science 303: 1669–1674
Kennedy D (2006) Editorial retraction. Science 311: 335
Mooney C (2004) Blinded by science. Columbia Journalism Review Nov/Dec
Murch SH, Anthony A, Casson DH, Malik M, Berelowitz M, Dhillon AP, Thomson MA, Valentine A, Davies SE, Walker‐Smith JA (2004) Retraction of an interpretation. Lancet 363: 750
Royal Society (2006) Survey of Factors Affecting Science Communication by Scientists and Engineers. London, UK: Royal Society
SMF (2006) Science, Risk and the Media—Do the Front Pages Reflect Reality? London, UK: Social Market Foundation
Sudbø J et al (2005) Non‐steroidal anti‐inflammatory drugs and the risk of oral cancer: a nested case‐control study. Lancet 366: 1359–1366
Wakefield AJ et al (1998) Ileal‐lymphoid‐nodular hyperplasia, non‐specific colitis, and pervasive developmental disorder in children. Lancet 351: 637–641
