ON LINE OPINION - Australia's e-journal of social and political debate


Lies, damned lies, and radiation statistics

By Geoff Russell - posted Friday, 2 October 2009

A couple of weeks back a one-time hero of mine, Dr Helen Caldicott, had a piece on On Line Opinion called “The medical and economic costs of nuclear power”. She was a hero of mine back in my anti-nuclear days, and I've discussed elsewhere my change of position to anti-uranium mining, while supporting a type of nuclear fuel cycle involving Integral Fast Reactors which can chew up current nuclear waste, allow the closure of uranium mines, and give us a fighting chance of avoiding dangerous climate change.

Back in my anti-nuclear days, I would have latched on to Caldicott's medical citations with sufficient detail to recall and regurgitate at the slightest opportunity: for example, a German study that found increased childhood cancer rates near nuclear plants, and a meta-analysis that found the same.

But a lot has happened since I first became anti-nuclear as a young student. I've learned much more mathematics, scrutinised many research protocols, and also developed a bit of a nose for the inappropriate use of statistics. So now that I've changed my position on nuclear, my reaction to articles like Caldicott's is to trace claims back to their source.


Evaluating simple statistical claims can take a lot longer than making them, so you'll need a little patience.

We won't deal with all of Caldicott's evidence, but consider in detail the meta-analysis she cites: Baker PJ, Hoel DG. Meta-analysis of standardized incidence and mortality rates of childhood leukemia in proximity to nuclear facilities. Eur J Cancer Care. 2007;16:355-363. The lead author kindly sent me a copy of the full paper.

First, what is a meta-analysis? Everybody has seen arguments with people citing studies finding for or against some claim, or dismissing a study as small or poorly done. A meta-analysis collects as many studies as it can find on a topic and quantitatively combines them. The word “quantitatively” is crucial.

Here's a simple example which illustrates what a meta-analysis is and why they are so important. Imagine you have a bucket containing one hundred 20-cent pieces. You get each of 20 friends, one after the other, to take four coins, toss them, record the outcome, put the coins back in the bucket, and publish the result as a scientific study (which it is) under the title "Investigation into the bias or otherwise of Geoff's bucket of coins". I'll accept each article for the Journal of Trivial Research, which you can imagine exists, with me as the editor.

Now let's pretend everybody gets four heads. By itself, under the conventions used in most research, this isn't statistically significant evidence that the coins in the bucket are biased, because even with fair coins this could happen once in every 16 tests of this kind: a probability of 1/16 = 0.0625, which is above the usual 0.05 cut-off. So we end up with 20 research papers all claiming not to have found statistically significant evidence of bias in the coins.
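That 1-in-16 figure is easy to check; here's a quick Python sketch (0.05 is the conventional significance threshold, not a number from the coin experiment itself):

```python
# Probability that four fair coins all land heads in a single study.
p_four_heads = 0.5 ** 4          # = 1/16 = 0.0625

# Under the usual convention a result is "significant" when p < 0.05,
# so even this most extreme outcome falls short in a four-coin study.
is_significant = p_four_heads < 0.05

print(p_four_heads, is_significant)   # 0.0625 False
```

In other words, no single four-coin study can ever reach significance, no matter how the coins land.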

There are two basic ways of summarising the state of these bucket studies. First, we just count significant studies. There are none. End of story. We pronounce that the coins in the bucket must be fair since we did a whole heap of studies and none found any significant evidence of bias. You'd be surprised at the people who use this method when it suits them. The second approach is to combine all the results into a meta-analysis and it instantly becomes blindingly obvious that it is highly likely that the coins are biased.
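To make the contrast concrete, here is a small Python sketch of both summaries for the bucket experiment. The exact binomial tail probability stands in for whatever pooled test a real meta-analysis would use; the structure of the comparison, not the particular test, is the point.

```python
from math import comb

STUDIES, TOSSES = 20, 4
per_study_p = 0.5 ** TOSSES              # 0.0625 for four heads out of four

# Approach 1: vote counting. No individual study crosses p < 0.05,
# so this method counts zero significant studies and declares the coins fair.
significant_count = STUDIES if per_study_p < 0.05 else 0
print("significant studies:", significant_count)      # 0

# Approach 2: pool the data -- 80 tosses, 80 heads -- and test once.
def binom_tail(n, k):
    """Exact probability of k or more heads in n tosses of fair coins."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

pooled_p = binom_tail(STUDIES * TOSSES, STUDIES * TOSSES)
print("pooled p-value:", pooled_p)       # (1/2)**80, roughly 8e-25
```

The pooled p-value is (1/2)^80, astronomically small, which is the "blindingly obvious" evidence of bias that vote counting completely misses.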


In the real world, combining studies is much tougher and sometimes impossible, because researchers don't all use the same methods and the differences can be extreme. But when it is possible, it's clearly a pretty good way of assessing research.

So, that's a meta-analysis. Studies of the incidence of leukemia near nuclear facilities usually end up with a risk factor. If the study shows that leukemia is more likely near nuclear facilities then the risk factor is above 1. If it is less likely, then the risk factor is below 1. Thus a risk factor of 0.5 means leukemia was half as likely near the reactor as elsewhere. A risk factor of 2 indicates a doubling of risk near the reactor.
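As a sketch of how such a ratio is computed, here are the sums with counts that are purely hypothetical, invented for illustration and not drawn from any real study:

```python
# Hypothetical counts, invented purely for illustration.
cases_near, children_near = 12, 100_000    # near a nuclear facility
cases_far, children_far = 6, 100_000       # matched comparison area

incidence_near = cases_near / children_near
incidence_far = cases_far / children_far

risk_factor = incidence_near / incidence_far
print(risk_factor)    # 2.0 -- leukemia twice as likely near the facility
```

A meta-analysis of such studies combines many risk factors of this kind, weighting each by the size and precision of the study behind it.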

Here are two sets of numbers I just made up from a mythical meta-analysis of risks due to reactor proximity:

About the Author

Geoff is a mathematician and computer programmer and is a member of Animal Liberation SA. He has been published in The Monthly, The Age, Australasian Science, Independent Weekly and Dissent. His latest book GreenJacked has just been published.

Other articles by this Author

All articles by Geoff Russell

This work is licensed under a Creative Commons License.
