
Bias and confounding in EMF science



This page stands on its own not because we claim that all papers with industry funding, or with industry personnel doing the research, are biased, but because it is always important to understand how conflicts of interest can affect scientific research.

Sharon Begley, writing for Newsweek, summarised an example of how statistical analyses can be used to hide associations, which is well worth a read.

A number of studies in our area of interest have had the integrity of their research questioned. We summarise some of the points made, with further reading, below:

Lloyd Morgan's Graphs

Lloyd Morgan has put together an excellent analysis of the individual data points in most of the recent Hardell studies (into increased risk of brain cancers from mobile phone usage) and in most of the recent Interphone studies (which received a large amount of their funding from the cellular telecommunications industry), and compared the results.

The results are very marked, with the independent research finding clear and statistically significant increases in a number of brain cancers, especially acoustic neuromas and certain types of glioma. The Interphone studies, however, showed a distinct trend towards a protective effect, with statistically significant protective results for acoustic neuromas (meaning that regular use of the phone appears to protect you, in one case lowering your risk to only 40% of that of someone not using the phone!). Click on the graph thumbnails below to see the full graphs (with interactive points explaining where the data came from):

[Graphs: Glioma | Meningioma | Acoustic Neuroma]

The fact that a study does not find an effect is not evidence that the study is biased, but a protective effect would be expected to come with some level of justification, even if only in the discussion. No such justification is present in any of the Interphone studies.

Having found such a clear difference between the two sets of studies, it becomes prudent to investigate what differences exist in the methodology of the research itself. The three categories analysed by Hardell et al were analogue cellphones, digital cellphones, and digital cordless phones (such as DECT). They found statistically significant increases in all three categories. Crucially, in the Interphone studies digital cordless phone users were not only not examined, they were not even recognised as a potential confounding factor. This means that the supposedly unexposed "control group" included digital cordless phone users, which, if Hardell et al are correct, will have the effect of diluting the results of the whole study, as the sketch below illustrates.
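To make that dilution mechanism concrete, here is a minimal sketch in Python. The numbers are entirely hypothetical and are not taken from Hardell or Interphone; they simply show how counting genuinely exposed people as "unexposed" in a case-control study pulls the odds ratio towards 1.0 (no effect).

```python
# Minimal sketch of exposure misclassification diluting a case-control odds
# ratio. All numbers are hypothetical and are NOT taken from Hardell or
# Interphone; they only illustrate the mechanism described above.

def odds_ratio(exposed_cases, exposed_controls, unexposed_cases, unexposed_controls):
    """Odds ratio for a simple 2x2 case-control table."""
    return (exposed_cases / exposed_controls) / (unexposed_cases / unexposed_controls)

# Suppose the true odds ratio for "any cordless or mobile phone use" is 2.0.
true_or = odds_ratio(exposed_cases=200, exposed_controls=100,
                     unexposed_cases=100, unexposed_controls=100)

# Now suppose half of the genuinely exposed people (the cordless phone users)
# are counted as "unexposed" because cordless use was never asked about.
diluted_or = odds_ratio(exposed_cases=100, exposed_controls=50,
                        unexposed_cases=200, unexposed_controls=150)

print(f"True odds ratio:           {true_or:.2f}")     # 2.00
print(f"Odds ratio after dilution: {diluted_or:.2f}")  # 1.50, closer to "no effect"
```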

The Danish Cohort Study

Aside from this glaring omission, the latest Danish cohort study has come under an enormous amount of criticism for some very important failings in the research data. To name but a few: firstly, there is the digital cordless phone issue again. Secondly, they could only obtain data for phone subscriptions between 1983 and 1995, so anyone who started using a phone after 1995 was not only excluded from the phone user category but actually included in the "general population" comparison group. Any pay-as-you-go user (a small proportion, admittedly, given the time period) also falls into the general population category. On top of this, they could only use two thirds of the data from that period, as the remaining third could not be obtained: those phones were contracted to a company rather than an individual and so could not be tracked down. The authors themselves admitted in the paper's discussion that this group may well contain the heaviest users in their sample, yet once again it ended up in the "general population" / unexposed category. We have given a more detailed analysis of these points in one of our recent news stories.
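The same dilution logic applies in a cohort design. The sketch below uses invented numbers (not the Danish cohort's actual figures) to show how leaving genuinely exposed subscribers in the "general population" comparison group drags the rate ratio towards 1.0.

```python
# Hypothetical sketch (invented numbers, not the Danish cohort's actual data)
# of how leaving genuinely exposed subscribers in the "general population"
# comparison group pulls a cohort rate ratio towards 1.0 ("no effect").

def rate_ratio(exposed_cases, exposed_person_years,
               comparison_cases, comparison_person_years):
    """Incidence rate ratio between an exposed group and a comparison group."""
    exposed_rate = exposed_cases / exposed_person_years
    comparison_rate = comparison_cases / comparison_person_years
    return exposed_rate / comparison_rate

# Clean comparison: the exposed rate is twice the truly unexposed rate.
clean = rate_ratio(exposed_cases=200, exposed_person_years=100_000,
                   comparison_cases=100, comparison_person_years=100_000)

# Contaminated comparison: post-1995 subscribers, pay-as-you-go users and
# corporate subscribers (all genuinely exposed) raise the comparison rate.
contaminated = rate_ratio(exposed_cases=200, exposed_person_years=100_000,
                          comparison_cases=170, comparison_person_years=100_000)

print(f"Rate ratio, clean comparison group:        {clean:.2f}")        # 2.00
print(f"Rate ratio, contaminated comparison group: {contaminated:.2f}") # ~1.18
```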

With this lack of definition between the groups it is hardly surprising that they found no association between the phones and cancer, but one has to ask why they published epidemiological research that could never have found anything other than "no effect". We can see only four possible candidates for an explanation:

a) Semi-incompetence, in failing to understand the implications of their inability to accurately separate users from non-users;
b) Apathy, in not particularly caring about the outcome provided they get a well-cited paper into the peer-reviewed literature;
c) Conflict of interest, in that the researchers appreciate the value of a steady flow of grant money and were happy to let the research come out backing the industry status quo, regardless of whether they felt it actually conveyed useful information; or
d) Any disclaiming comments the authors may have put into their paper regarding the difficulty of getting useful data may have been removed prior to publication.

Obviously these are harsh accusations, but the inescapable fact is that a paper with such obvious and glaring flaws should never have been published. Reason a) seems unlikely for post-graduate scientists with a number of published papers. b) seems possible, but the risk of having their reputations tarnished for publishing bad science makes it unlikely too. d) is possible (we have heard of it happening before, from study authors themselves), but the potential scandal if it were found out makes it similarly unlikely. The only explanation that seems to have reasonable merit is a conflict of interest due to industry funding, especially for a study funded by the cellphone industry.

Whilst this theory may seem somewhat conspiratorial, it is interesting to see that we are by no means alone in this point of view:

Industry Funding, the Air Force, and Radiation Research

Back in 2006, in conjunction with the eminent Professor Henry Lai, Louis Slesin produced a highly detailed and well-researched article documenting the apparent relationship between a paper's source of funding and its likely outcome. Whilst it was not so surprising to find that industry-funded papers more often reported a null or negative result, what was surprising was the proportion of papers published in a single journal, Radiation Research, that followed the same trend.

They put together a total of 85 genotoxicity studies on RF/microwave radiation, of which 43 had "positive" findings and 42 had "negative" or "null" findings. The statistical findings were striking: 32 of the 35 studies paid for by the mobile phone industry or the U.S. Air Force showed no effect, and these make up more than 75% of all the negative studies. One of the three industry studies that did find an effect nearly failed to make it into print. It was carried out by Jerry Phillips under a Motorola contract. Motorola opposed Phillips' decision to write up his positive findings and, according to Phillips, the company tried to stop him. Phillips resisted and succeeded, but it was the last piece of original EMF research he ever completed.

Of the 85 papers, 22 were published in the journal Radiation Research, and 21 of them produced "negative" results: only 1 of the 22 found a genotoxic effect from RF radiation, and again the lead author was denied money for follow-up studies and has since moved sideways into other research areas.
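As a quick sanity check, the proportions quoted above follow directly from the counts given. A trivial calculation, reproduced here only for transparency:

```python
# Quick arithmetic check of the proportions quoted above, using only the
# counts reported in the Slesin/Lai analysis as cited in this article.

negative_total = 42                # "negative" or "null" findings overall (of 85)
industry_airforce_total = 35       # studies funded by industry / U.S. Air Force
industry_airforce_negative = 32

radiation_research_total = 22      # papers published in Radiation Research
radiation_research_negative = 21

print(f"Industry/Air Force papers showing no effect: "
      f"{industry_airforce_negative / industry_airforce_total:.0%}")   # 91%
print(f"Share of all negative papers that were industry/Air Force funded: "
      f"{industry_airforce_negative / negative_total:.0%}")            # 76%, i.e. >75%
print(f"Radiation Research papers with negative results: "
      f"{radiation_research_negative / radiation_research_total:.0%}") # 95%
```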

The full article is well worth a read and covers the issue in far greater detail, including some of the people suspected of being behind the reasons why some papers garner more favour than others. It is available in full from Louis' excellent Microwave News site.

Other Evidence for Industry Bias

Dr. George Carlo, currently head of the Science and Public Policy Institute in Washington (USA), used to work for Motorola looking into the possible health effects of cellphones. When he said that he could not rule out health effects based on the results of his research to date, the funding for his work was simply stopped. He resigned and now campaigns to expose what he feels is unnecessary and unethical industry interference in science. He wrote an open letter outlining the ties between the Danish research group and the cellphone industry, and has also written a very in-depth analysis of the paper mentioned above, with damning accusations that the study was deliberately conducted in a way that would find nothing from the outset.

Lennart Hardell (the very same lead author of the independent mobile phone studies in Lloyd Morgan's graphs) has published a further criticism of the Danish study, in which he makes a similar claim that the results were only ever likely to be "no effect" because of the nature of the collected data. Such accusations of industry influence are by no means limited to this field, however. Hardell has also published a paper outlining a number of very high-profile "secret" associations between prominent researchers and high-finance industries, none of which were admitted to at the time. Examples of the relationships singled out include Dr. Ragnar Rylander and Philip Morris (tobacco), Hans-Olov Adami and Exponent Inc (a high-profile US consultancy firm), Hans-Olov Adami (again) and Monsanto (known mainly for GM-related issues), and the highly respected epidemiologist Sir Richard Doll and Monsanto (again, this time over chemical exposure). Published in the American Journal of Industrial Medicine in 2006, the paper thoroughly researches the backgrounds of some shady industrial science dealings.

Don Maisch, the owner of EMFacts.com, has also produced a number of documents for committees in Australia summarising some of his findings on alleged corruption between science and corporate entities, including between the industry and international organisations such as the WHO. He has written an interesting paper on the dealings between the industry and Mike Repacholi, and another showing how corruption has destroyed the process of setting precautionary radiation exposure limits in Australia.

Finally, a recent piece of research by Huss, Anke et al found a nine-fold increase in results showing "no effect" when comparing papers with purely industry funding against papers funded by public bodies or charities. The analysis was based on 59 epidemiological papers, drawn from EMBASE, Medline and a couple of other specialist databases, looking into health effects from mobile phone radiation. This nine-fold increase was found to be statistically significant.
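To show what a "nine-fold" comparison of this kind means in practice, here is a minimal sketch. The counts below are invented placeholders, not the actual figures from Huss et al.; they only illustrate how such an odds ratio can be read off a funding-by-outcome table.

```python
# A minimal sketch of the kind of funding-source comparison described above.
# The counts below are invented placeholders, NOT the actual figures from
# Huss et al.; they only show how a nine-fold difference in the odds of a
# "no effect" finding could be read off a funding-by-outcome table.

studies = {
    #                    (found an effect, found no effect)
    "industry_funded":   (3, 9),
    "public_or_charity": (30, 10),
}

def odds_of_no_effect(found_effect, found_no_effect):
    """Odds that a study in this funding category reports 'no effect'."""
    return found_no_effect / found_effect

odds_industry = odds_of_no_effect(*studies["industry_funded"])    # 3.00
odds_public = odds_of_no_effect(*studies["public_or_charity"])    # 0.33

print(f"Odds of a 'no effect' result, industry funded:       {odds_industry:.2f}")
print(f"Odds of a 'no effect' result, public/charity funded:  {odds_public:.2f}")
print(f"Odds ratio (industry vs public/charity): {odds_industry / odds_public:.1f}")  # 9.0
```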

In Conclusion

It is always going to be unreasonable to judge a paper with purely industry funding as biased for that reason alone, but with so many researchers and scientists showing how corruption and conflicts of interest have occurred in the real world, knowing the source of funding is clearly very important. It is also simple common sense that it is in the cellphone industry's best interests for a large health effect not to be found with mobile phones, so whilst the funding may not in itself affect the outcome, the funders have the power to pick and choose what is looked for, with prior knowledge of which questions are most or least likely to find an effect.


This page has links to content that requires a .pdf reader such as Adobe Acrobat Reader.