© Copyright Powerwatch 2017


11/12/2014 - Scientific press releases are often misleading

Dr Ben Goldacre has written an excellent critical editorial in this week's British Medical Journal (BMJ) regarding misleading media reports about scientific research results.

The BMJ recently published a very interesting paper by Petroc Sumner and colleagues, investigating the level and source of exaggeration in media coverage of new research. Their findings were unexpected, demonstrating that the majority of national news exaggerations were due to the press release that those journalists were provided with, normally by the academic institution itself, rather than the journalists putting their own newsworthy spin on the story.

As the determination of "exaggeration" is both vague and subjective, the authors defined three categories of exaggeration, used to classify each press release and news story they analysed:

  • Offering explicit behavioural advice to the public that was not supported by the data
  • Connecting correlation with causation too strongly
  • Extrapolating animal data to humans inappropriately

They found that sensationalist news stories were much more likely to be written when the press release accompanying the research was itself exaggerated (58%, 81% and 86% respectively on the three categories above). The proportion of exaggerated news stories was relatively low when the press release was more factual (17%, 18% and 10% respectively). This is relatively clear evidence that the majority of exaggerations in the mainstream press stem from the quality of the journal, university or Science Media Centre press release, rather than from journalists taking artistic licence with their claims.
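Using the proportions quoted above, a quick back-of-the-envelope calculation (a sketch based on the figures as reported here, not taken from the paper itself) shows how much more likely news exaggeration was when the press release was exaggerated:

```python
# Proportions of exaggerated news stories for the three categories,
# from the figures quoted above.
with_exaggerated_pr = [0.58, 0.81, 0.86]   # press release was exaggerated
with_factual_pr     = [0.17, 0.18, 0.10]   # press release was factual
categories = ["behavioural advice", "causal claims", "animal-to-human"]

for name, hi, lo in zip(categories, with_exaggerated_pr, with_factual_pr):
    # Simple ratio of proportions: how many times more likely exaggerated
    # coverage was when the press release itself exaggerated.
    print(f"{name}: exaggerated coverage is {hi / lo:.1f}x more likely "
          f"when the press release is exaggerated")
```

The ratios range from roughly threefold to nearly ninefold across the three categories, which underlines how strongly the press release sets the tone of the coverage.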

Exaggerations work both ways

Unfortunately, the paper only appears to have assessed exaggeration in the "positive" direction, where proof or causality is attributed that the data do not sufficiently support. It is equally disingenuous to present a study that failed to find an effect as proof of a lack of effect, despite the logical difficulties in proving a negative. Producing "real" statistically significant results normally requires a precise and rigorous methodology, large sample sizes, and a comprehensive treatment of confounders and potential bias in the analysis of the data. Identifying environmental causes of chronic health problems is like searching for a needle in one of a number of haystacks across a number of fields. Even with guidance as to where to look, and a well designed study, it is easy to end up with a null result simply because of a confounder that has not been considered or could not be controlled for.

The same is true for replications of existing studies, which ideally should be done independently by a different team in a different laboratory, with only as much communication with the original study's authors as is necessary to ensure the "replication" is as faithful as possible. Professor Henry Lai presented an interesting analysis of the surprising factors that can affect a rat's ability to remember a previous path through a maze, where even the material used to build the walls of the maze can substantially affect the rat's memory. A single study's failure to reproduce a positive finding does not prove an effect is false and, depending on the quality of the study, may not even provide much evidence of a lack of effect.

It would have been very interesting to analyse news stories with titles of "Mobile phones are safe" or "Electrosensitivity is all in the mind" to see if there were similarly strongly worded press releases, or whether it was the journalist's interpretation.

A new over-claimed example appeared this week in the Journal of the Royal Society Interface, spun, it seems, by the press office at the University of Manchester. "There is still some concern among the public about this potential link, which has been found in some studies into cases of childhood leukaemia, but without any clear mechanism for why", Alex Jones at the School of Chemistry at the University of Manchester and co-lead author of the paper, said. "Flavoproteins transfer electrons from one place to another. This research suggests that the correct conditions for biochemical effects of EMFs are likely to be rare in human biology."

There are now many reported mechanisms that could explain a link between EMF exposure and biological and health effects. This research examined only one postulated mechanism and certainly should not have been press released as an overarching dismissal of EMF bio-effects.

Links

» Full paper on the BMJ website
» An excellent editorial on the paper by Ben Goldacre
» Magnetic field effects as a result of the radical pair mechanism are unlikely in redox enzymes, in the Journal of the Royal Society Interface