One study does not give the answer

By Anthony • Last updated: Tuesday, May 17, 2005

At the weekend I was running a course, which included critical appraisal of scientific papers. The point was made that if you were presented with a number of studies about drug A in condition X, published over a number of years, there would be differing results in different papers. The drug company representatives would, of course, cherry-pick the results that favoured their drugs. The important thing to remember was not to be swayed by any one paper, but to look at the accumulating evidence as a whole.

The same is true of the controversial Lancet study on the Iraq casualties. What some commentators seem to have a problem with is that a study can be 100% correct in its use of statistics, yet at the same time skepticism can be expressed about the results and criticism can be made of the sampling techniques. This is not the same as dismissing the results for partisan reasons.

The much-debated Lancet study [PDF] involved a cluster sample of 33 clusters of 30 households (990 households) to compare mortality in the 17.8 months after the invasion with the 14.6-month period preceding it. The authors estimated that 98,000 more deaths than expected had occurred, but with a wide confidence interval (95% CI 8,000–194,000). It should be noted that cluster sampling is a technique used when resources are tight or the environment is dangerous, and it is known to carry a higher sampling error than simple random sampling with the same sample size.
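
To see roughly how much that matters, here is a minimal Python sketch of the standard design-effect approximation, DEFF = 1 + (m − 1) × rho, where m is the number of households per cluster and rho is the intra-cluster correlation. The helper functions and the rho values are purely illustrative and are not taken from either study; the point is only that clustering shrinks the effective sample size well below the 990 households actually visited.

# Rough illustration of how clustering inflates sampling error relative to
# simple random sampling. The rho values below are hypothetical.

def design_effect(units_per_cluster, icc):
    """Approximate variance inflation factor: DEFF = 1 + (m - 1) * rho."""
    return 1.0 + (units_per_cluster - 1) * icc

def effective_sample_size(n_total, units_per_cluster, icc):
    """Sample size a simple random sample would need for the same precision."""
    return n_total / design_effect(units_per_cluster, icc)

n_households = 33 * 30  # the Lancet design: 33 clusters of 30 households
for rho in (0.01, 0.05, 0.10):  # illustrative intra-cluster correlations
    deff = design_effect(30, rho)
    n_eff = effective_sample_size(n_households, 30, rho)
    print(f"rho={rho:.2f}  DEFF={deff:.2f}  effective n ~ {n_eff:.0f}")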

The UN Development Programme has released a report which, according to The Times, “states that it was 95 per cent confident that the toll during the war and the first year of occupation was 24,000, but could have been between 18,000 and 29,000.” This isn’t exactly what the report says, but see later.

The UN study looked at all governorates in Iraq. In each governorate, 1,100 households were selected for interview, with the exception of Baghdad, where 3,300 households were selected. The total sample was 22,000 households, of which 21,668 were actually interviewed.

The section on war-related deaths is on page 55 of the analytical report [PDF]:

The number of deaths of civilians and military personnel in Iraq in the aftermath of the 2003 invasion is another set of figures that have raised controversy. The ILCS data indicates 24,000 deaths, with a 95 percent confidence interval from 18,000 to 29,000 deaths. The confidence interval was estimated using a linearisation technique (using SPSS Complex Samples, version 12).

Again, cluster sampling was used, but the far larger sample size is more likely to yield an accurate figure, as well as producing a narrower confidence interval.
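
For those curious about the “linearisation technique” mentioned in the quoted passage: for a simple total, linearisation in complex-survey software comes down to estimating the variance from the spread between cluster-level totals. The sketch below is emphatically not the ILCS calculation, which used the full stratified, weighted design in SPSS Complex Samples; it is a single-stratum, equal-weight toy version with made-up cluster counts, just to show where a confidence interval like theirs comes from.

import math

def estimate_total_with_ci(cluster_totals, weight=1.0, z=1.96):
    """Estimate a population total and a normal-approximation 95% CI
    from per-cluster totals, using the between-cluster variance."""
    n = len(cluster_totals)
    estimate = weight * sum(cluster_totals)
    mean = sum(cluster_totals) / n
    # Variance of the estimated total driven by variation between clusters.
    variance = weight ** 2 * n / (n - 1) * sum((t - mean) ** 2 for t in cluster_totals)
    se = math.sqrt(variance)
    return estimate, (estimate - z * se, estimate + z * se)

# Hypothetical per-cluster death counts and an illustrative weight
# (each sampled cluster standing in for 1,000 like it).
clusters = [2, 0, 1, 4, 0, 3, 1, 2, 5, 0, 1, 2]
total, (low, high) = estimate_total_with_ci(clusters, weight=1000.0)
print(f"estimated total ~ {total:,.0f}, 95% CI ~ ({low:,.0f}, {high:,.0f})")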

So we have:

Lancet: 98,000 (95% CI 8,000–194,000)

UN report: 24,000 (95% CI 18,000–29,000)

It is important to note that the confidence interval of the Lancet report completely encapsulates that of the UN report, which means the two studies are not necessarily contradictory. However, the UN report has a far larger sample size in its favour, making its estimate more likely to be close to the true number.
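
A quick back-of-the-envelope check of that, assuming both intervals are roughly symmetric normal-approximation CIs, so that the implied standard error is (upper − lower)/(2 × 1.96). The helper functions are just for illustration, and the comparison deliberately ignores the differences in time period and in what counts as a war-related death, which the update below comes back to.

def implied_se(lower, upper, z=1.96):
    """Standard error implied by a symmetric normal-approximation CI."""
    return (upper - lower) / (2 * z)

def contains(outer, inner):
    """True if the outer (lower, upper) interval fully contains the inner one."""
    return outer[0] <= inner[0] and inner[1] <= outer[1]

lancet = (8_000, 194_000)   # 95% CI around a point estimate of 98,000
ilcs = (18_000, 29_000)     # 95% CI around a point estimate of 24,000

print("Lancet CI contains ILCS CI:", contains(lancet, ilcs))   # True
print(f"implied SE, Lancet ~ {implied_se(*lancet):,.0f}")       # about 47,000
print(f"implied SE, ILCS   ~ {implied_se(*ilcs):,.0f}")         # about 2,800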

Of course neither study is the last word, and that is part of the problem with the debate on this issue. There is a section of opinion that wants to be able to say those in favour of the war were responsible for the deaths of 100,000 people. However, no one study will give that certainty, just as no one drug trial will give the full picture of a drug’s efficacy or adverse effects.

Someone at Crooked Timber said of the Lancet study on Iraq: “I think I ended every single Lancet post with the observation that you can tell a lot about people’s character by observing the way in which they protect themselves from hostile information.” One can only hope that those who seemed to accept that a single study was the final answer will be open-minded enough to accept that the Lancet study was only one part of the puzzle. Discounting people, because they were skeptical of accepting the results of one study as the final word, as deficient in character is partisan in the extreme.

Perhaps more importantly, neither study allows one to make a simplistic moral case for or against the Iraq war, but it has to be said that a round figure of 100,000 makes a nice slogan.

UPDATE: Tim Lambert has a post about the two studies worth reading. In it, he makes the point that, since the Lancet report also included excess deaths from increases in disease, accidents and murders, the corresponding figure from the Lancet study is in fact 33,000. There are also differences in the sampling periods. I wonder about his use of the word “vindicated”. Also see Jim Lindgren’s comments on how even the ILCS report has confused the issue by suggesting the figures are comparable. I still stand by the point that using the Lancet report to say the US has killed 100,000 people in the Iraq war is simplistic twaddle, as do supporters of the Lancet study: “As I’ve mentioned before, I don’t like this 100,000 number, and it is irksome that the Lancet’s lasting legacy has been that the ‘100,000 dead!’ factoid has become a commonly used stick for antiwar hacks to beat prowar hacks with.”

Filed in General

12 Responses to “One study does not give the answer”

Comment from Ian
Time 17/5/2005 at 12:15 pm

Seeing as the situation in Iraq has changed little since the Lancet publication, surely the death toll should continue to rise. The last time I worked it out, the detractors should now be claiming over 130,000 deaths.

Of course, the longer you repeat the 100,000 figure, the more accurate it becomes!

Comment from Friendly_Fire
Time 18/5/2005 at 3:36 pm

There is an interesting report in the NEJM here: Combat Duty in Iraq and Afghanistan, Mental Health Problems, and Barriers to Care

The report finds (Table 2) that in a survey of 894 US Army soldiers, 116 of them (out of 861 who responded to the survey question) regarded themselves as having been personally responsible for the death of a noncombatant.

Extrapolate that!

Comment from Phil Bailey
Time 18/5/2005 at 4:01 pm

I suspect it would have been a piss-poor course you were running if you completely overlooked the fact that the two studies discussed here cover different time periods and different subjects.

Comment from Anthony
Time 18/5/2005 at 4:19 pm

Phil,

Of course you have to take that into account, but I suspect the first thing on the UN’s mind wasn’t “how can we make sure that this study covers the same time period as the Lancet study that hasn’t even been published yet”. If someone does another study on war-related deaths in Iraq, it is unlikely to be exactly the same as either of these studies, so you have to make do with what you have. If you are going to compare studies, sometimes you have to accept that you are dealing with slightly differing designs and groups.

As for different subjects, what do you expect? The UN study has a sample size more than twenty times bigger. Not to mention the fact that deliberately re-sampling the same households as the Lancet study would not have been random.

Comment from dsquared
Time 18/5/2005 at 5:24 pm

I think that the only proposition I’ve ever staked material personal credibility on is that, with high confidence, the evidence suggests that the number of excess deaths is positive; ie that over a reasonable time period and in terms of death rates, the war has made things worse in Iraq rather than better. I think that the UNDP survey is an excellent piece of work and rather regret the drafting of the report, which erroneously suggests that (impossibly) their result is comparable to both the Lancet and the Iraq Body Count numbers.

Comment from dsquared
Time 18/5/2005 at 5:31 pm

By the way, assuming that you’re talking about me here (and in context, I think you are)

Discounting people, because they were skeptical of accepting the results of one study as the final word, as deficient in character is partisan in the extreme

is unfair. I was always polite to people who only doubted the conclusion (Alastair Mackay being a case in point). I was always rude to people who threw insults at the Lancet team and accused them of corruption either in ignorance or in bad faith. I was somewhat rude to you personally because you repeatedly made dark hints that there were methodological problems with the study and then refused to say what they were.

Comment from Josh
Time 18/5/2005 at 6:08 pm

Google is still the best tool for finding the correct figures. As has already been stated:

Dead in the Iran–Iraq War: 500,000
1979–2003: 24 years × 18,000 = 432,000
Failed ’91 uprising: 300,000
Very rough total: 1.664M dead due to the SH regime.

1.664M vs., say, 100,000 (using the Lancet’s median point)

Comment from Anthony
Time 18/5/2005 at 7:32 pm

Daniel,

Sorry about your problems posting; you tripped the spam filter with your second post. Just for the record, I wasn’t actually referring to you with that comment you quote, but to others (I suspect you can guess who they are; they are at it again today).

I don’t think the UN report did try to suggest their work was comparable to the Lancet study. Rather, they pointed out the controversy and cited the other figures from the Lancet and the Iraq Body Count. I do not think they were attempting to perform a meta-analysis.

I agree with you that the UN report is an excellent piece of work, and for that reason I don’t think it is correct to be too fixated on the Lancet report. The UN report, when considered as a whole, is hardly a glowing report about the state of Iraq. In fact, in the rest of the media (apart from The Times) the number of war-related deaths was sidelined in favour of other, generally negative, findings.

The UN are hardly a right-wing partisan group either.

If you knew nothing about the results of both of these studies, but were told the sample sizes and the methodology and asked which study you expected would give the best picture of the war-related deaths in Iraq, then you would choose the UN study.

It is a shame that the debate has to be about the 100,000 figure that Galloway has been shroud-waving in the Senate and that is accepted as a “fact” by the Guardian readers of this world. To some extent this is a false debate, as though there was some magical number of deaths that would provide post-hoc justification for opposition to the war, or a figure that would be low enough to justify the war. There are so many unknowns (both known and unknown) and value judgements that such an argument is doomed to the endless, ultimately pointless, debate that you see repeated week after week in the comment boxes at Harry’s Place.

Where I depart most strongly from Tim Lambert is that the UN study is not being looked at for its own results, but used to vindicate another study, as though the Lancet report were some sort of sacred truth that must be defended. I can understand that part of this will be defensiveness arising from some of the more hopeless attacks on the Lancet study.

Comment from Soldier’s Dad
Time 19/5/2005 at 12:14 am

Friendly Fire,

“extrapolating mental health numbers”

It would be reasonable to expect that the rate of soldiers seeking mental health counseling would be higher for those who had seen significant combat. Hence, the numbers can’t be extrapolated across the larger force.