Would you believe that "the largest, most definitive analysis of the mental health risks associated with abortion, synthesizing the results of 22 studies published between 1995 and 2009 involving 877,181 women, of whom 163,831 had abortions" has determined that "abortion harms women's mental health"? It concludes that "10% of all mental health problems and 34.9% of all suicides in women of reproductive age" are caused by abortion. Here's the author's own summary of the results:
Women who had undergone an abortion experienced an 81% increased risk of mental health problems, and nearly 10% of the incidence of mental health problems was shown to be attributable to abortion. The strongest subgroup estimates of increased risk occurred when abortion was compared with term pregnancy and when the outcomes pertained to substance use and suicidal behaviour.
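For context, an "attributable" percentage like that is normally just a re-expression of the relative risk through the standard population attributable fraction, so it inherits every weakness of the underlying risk estimate. Here's a minimal sketch of that arithmetic, using Levin's formula with the abstract's headline relative risk and an exposure prevalence taken from the pooled sample; I'm not claiming this is Coleman's exact calculation.

```python
# Population attributable fraction (Levin's formula):
#   PAF = p * (RR - 1) / (1 + p * (RR - 1))
# where p is the prevalence of the exposure and RR is the relative risk.
# Illustrative numbers only: RR from the abstract's "81% increased risk",
# p from the share of the pooled sample who had abortions.

rr = 1.81
p = 163_831 / 877_181   # roughly 0.19

paf = p * (rr - 1) / (1 + p * (rr - 1))
print(f"PAF = {paf:.1%}")   # about 13%, the same ballpark as the claimed "nearly 10%"
```

The takeaway is that the "nearly 10%" figure isn't independent evidence of anything; it stands or falls with the 81% relative risk estimate.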
Those numbers are so extravagantly extreme that there ought to be alarm bells going off in your head right now, and the research had better be darned thorough and unimpeachably clean.
As it turns out, it isn't. The author of the paper, Priscilla Coleman, is an anti-abortion advocate, and 11 of the 22 studies sampled for the meta-analysis are by…Priscilla Coleman. Methinks there might be a hint of publication bias there, something that has been confirmed statistically by Ben Goldacre.
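Publication bias of that sort is exactly what funnel-plot asymmetry checks are designed to catch. Here's a minimal sketch of Egger's regression test on simulated data; it's purely illustrative, the numbers aren't Goldacre's or Coleman's, and his actual analysis may have used a different method.

```python
import numpy as np

# Egger's regression test for funnel-plot asymmetry:
# regress the standardized effect (effect / SE) on precision (1 / SE);
# an intercept far from zero suggests small-study / publication bias.
# The data below are made up purely to show the mechanics.

rng = np.random.default_rng(0)
n_studies = 22
se = rng.uniform(0.05, 0.4, n_studies)      # hypothetical standard errors
true_effect = 0.1
# Simulate bias: smaller (noisier) studies report inflated effects.
effects = true_effect + 1.5 * se + rng.normal(0, se)

y = effects / se                            # standardized effects
x = 1.0 / se                                # precision
X = np.column_stack([np.ones_like(x), x])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept = coef[0]

# Standard error of the intercept, for a rough z-test.
dof = n_studies - 2
resid = y - X @ coef
sigma2 = resid @ resid / dof
cov = sigma2 * np.linalg.inv(X.T @ X)
z = intercept / np.sqrt(cov[0, 0])
print(f"Egger intercept = {intercept:.2f}, z = {z:.2f}")
# |z| well above ~2 is the classic red flag for asymmetry.
```

When the small, noisy studies systematically report the biggest effects, the intercept drifts away from zero and the literature starts looking less like evidence and more like selection.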
Jim Coyne has carried out a thorough dissection of the paper, exposing the statistical games she played with the data.
If you examine Figures 1 and 2 in Coleman's review, you can see that she counts each of her own studies multiple times in her calculation of the effects attributable to abortion. This practice was also roundly criticized in the E-letter responses to her article because each study should only be entered once, if the conditions are met for integrating results of studies in a meta-analysis and providing a test of the statistical significance of the resulting effect size. This may sound like a technical point, but it is something quite basic and taught in any Meta-Analysis 101.
Coleman's calculation of overall effect sizes for the negative mental health effects of abortion involves integrating multiple effects obtained from the same flawed studies into a single effect size that cannot accurately characterize any of the individual effects - anxiety, depression, substance abuse, and suicide - that went into it. Again we are encountering a nonsensical statistic.
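To see why the double-counting matters, here's a toy demonstration of what happens when the same study, or several correlated outcomes from the same sample, gets entered into an inverse-variance pooled estimate as if it were independent data. Illustrative numbers only; nothing here comes from Coleman's paper.

```python
import numpy as np

def pool_fixed_effect(effects, ses):
    """Fixed-effect inverse-variance pooling: weighted mean and its SE."""
    w = 1.0 / np.asarray(ses) ** 2
    est = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return est, se

# Hypothetical log-odds-ratio estimates from five independent studies.
effects = np.array([0.30, 0.10, 0.25, 0.05, 0.20])
ses = np.array([0.15, 0.12, 0.20, 0.10, 0.18])

est, se = pool_fixed_effect(effects, ses)
print(f"honest pooling:       {est:.3f} +/- {1.96 * se:.3f}")

# Now enter one study's results three times (e.g., three outcomes
# reported from the same sample, treated as if they were independent).
dup_effects = np.concatenate([effects, [effects[0]] * 2])
dup_ses = np.concatenate([ses, [ses[0]] * 2])

est_d, se_d = pool_fixed_effect(dup_effects, dup_ses)
print(f"with double-counting: {est_d:.3f} +/- {1.96 * se_d:.3f}")
# The point estimate drifts toward the duplicated study and the
# confidence interval narrows: spurious precision from recycled data.
```

The pooled estimate tilts toward whatever got duplicated, and the confidence interval tightens as though new independent data had arrived. That's manufactured precision, which is the problem Coyne is describing.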
And just how good were the papers that Coleman chose to include in her meta-analysis? She claims they were the best, and that others were excluded because of their poor quality, but it seems other investigators hold her work in low esteem.
…an APA task force report did find that Coleman studies--the ones she included in her meta analysis--had inadequate or inappropriate controls and did not adequately control for women's mental health prior to the pregnancy and abortion. A similar verdict about Coleman's work was contained in the draft Royal College of Psychiatrists report that also considered the bulk of her work too weak and biased to be entered into an evaluation of the effects of abortion on mental health.
I did find this comment by Jim Coyne bitterly amusing.
Readers should be able to assume that the conclusions of a meta-analysis published in a prestigious journal are valid. After all, the article survived rigorous peer review and probably was strengthened by revisions made in the authors' response to a likely "revise and resubmit" decision.
Obviously, you can't assume that. This is a case where the editors and reviewers failed to do their jobs, and that happens way too often…and now this study has been thoroughly politicized and is being touted by the anti-abortion wackaloons to argue that abortion must be banned…for the good of the women. Which is probably one of the few times they've given a damn about the women involved.
But if you want a good, straightforward summary of why Coleman's paper should have been rejected, that last link is it.
(Also on FtB)