Ivermectin: what does the science say?



Let's look at methodology - we hope you like graphs.


The covid-19 pandemic has often felt like one long parade of hope around miracle cures.

First we had hydroxychloroquine, which has been used to treat countless millions across the globe despite the most recent evidence indicating that it increases your risk of death if you have covid-19. Following that, we saw a seemingly endless cavalcade of supposed cures, from vitamin C to zinc to vitamin D to almost everything in between — there’s been an understandably desperate longing for a simple fix to the pandemic that has permeated conversations across the globe.

Which brings us to ivermectin. Depending on whether you listen to the World Health Organisation or sensationalist headlines, ivermectin is either an anti-parasitic drug that has no strong evidence for a benefit against covid-19 or a world-changing solution to the entire pandemic we’re suffering through.

It’s a fascinating dichotomy to watch: on one side, public health experts are mostly on the fence about whether there’s any use for ivermectin; on the other are passionately fierce adherents who have set up entire advocacy websites for the explicit purpose of getting more people to take the drug. This makes any discussion about ivermectin wildly contentious, because many people are convinced that it reduces your risk of death from covid-19 to almost nothing, which, if true, would make it almost as useful as vaccines when it comes to ending the pandemic.

Sadly, the evidence doesn’t really support that idea, despite the recent furor on social media. It turns out that we really don’t know if ivermectin helps with covid-19, because the evidence is mostly of such low quality that concluding anything at all is difficult.

Let’s take a look at the science.

Investigating Ivermectin

The two new studies that have sparked the intense debate online are a review paper looking at the potential pathways of action for ivermectin, and a systematic review and meta-analysis that looked at the benefits of the drug in people with covid-19.

Both of these papers were written by people who have been outspoken for the last 12 months about using ivermectin to treat coronavirus infections, and they both argued that the drug should be used as a treatment for covid-19. Dr. David Gorski covered these potential conflicts of interest in detail on ScienceBasedMedicine, but suffice it to say that it’s probably not that surprising that these two papers ended up recommending ivermectin given the authors’ public stance since mid-2020.

But conflicts of interest are not necessarily a reason to discount research, so let’s have a look at what the papers actually said.

The first article was essentially an academic opinion piece that reviewed some lab-bench data on ivermectin’s potential as a covid-19 treatment. While perhaps interesting, the conclusion gives you some idea of how meaningful this sort of narrative review really is:

Pictured: not really evidence of much

The piece also cited some very weird sources, including an anonymous website about ivermectin that is, to put it bluntly, not very scientific. Even ignoring the anonymity, the website (ivmmeta dot com) makes incorrect claims about p-values and misrepresents quite a few studies. Suffice it to say that it was quite surprising to see it cited as a useful source in an academic paper.

So the first article isn’t very useful in telling us about ivermectin. Even if we agree with the authors’ conclusions, all they really amount to is that we should be considering repurposed medications for the treatment of covid-19, which is something that most doctors have been doing since February 2020 anyway.

The second study, however, is much more interesting.

The second study is a reasonably good meta-analysis of ivermectin, conducted by a group of doctors who have been trying to get people to use the drug since the middle of last year. The authors basically used previous systematic reviews and other public archives to gather together estimates of the effect of ivermectin on mortality and some other endpoints from all randomized controlled trials on the subject. They found moderate-certainty evidence that ivermectin had a very large benefit on mortality, or in other words that it prevented people from dying of covid-19.

And in general the methodology of this review was decent, although it has received some criticism from methodological experts online — the authors followed reporting guidelines and the study itself looks reasonably good. However, before we all go off to take our deworming pills, there are a few things that make everything quite a bit less certain.

Embracing uncertainty

The first issue that the review showed was publication bias. This is essentially the phenomenon whereby people are more likely to submit positive than null findings to be published, and those results are more likely to be accepted by academic journals. To put it another way, everyone wants to publish a study that finds that ivermectin cures covid-19, but it’s not really news if your results show that ivermectin doesn’t do much to help with the disease.

To assess publication bias, the authors produced what’s known as a funnel plot. This is basically a graph that plots the effect size of each study in the review against a measure of its precision (usually the standard error), giving you an idea of whether there are studies that you would expect to see published which haven’t been. If you look at the funnel plot from this review, the issue is fairly clear:

If there were no publication bias, you’d expect the studies to be scattered roughly symmetrically on either side of the pooled estimate. Instead, we’ve got lots of smaller, less precise studies showing big benefits in the bottom left, and most of the bigger ones clustered around a risk ratio of 1 (no effect).
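If you want a feel for how these plots are put together, here’s a minimal sketch in Python (using numpy and matplotlib). It is illustrative only, not the review’s actual plot or data, and the function name is just something I made up for the example: each study supplies an effect estimate and a standard error, and the points should fan out symmetrically around the pooled estimate if nothing is missing.

```python
# A rough sketch of how a funnel plot is built; illustrative only, not the
# review's actual data. Each study contributes an effect estimate (here a
# log risk ratio) and a standard error; if nothing is missing, the points
# should scatter symmetrically around the pooled estimate, with the small,
# imprecise studies fanning out towards the bottom.
import numpy as np
import matplotlib.pyplot as plt

def funnel_plot(log_rr, se, pooled_log_rr=0.0):
    """Plot study effect sizes against their standard errors."""
    log_rr, se = np.asarray(log_rr, float), np.asarray(se, float)
    fig, ax = plt.subplots()
    ax.scatter(log_rr, se, zorder=3)
    # 95% pseudo-confidence funnel around the pooled estimate
    se_grid = np.linspace(0, se.max() * 1.05, 100)
    ax.plot(pooled_log_rr - 1.96 * se_grid, se_grid, "--", color="grey")
    ax.plot(pooled_log_rr + 1.96 * se_grid, se_grid, "--", color="grey")
    ax.axvline(pooled_log_rr, color="grey")
    ax.invert_yaxis()  # most precise studies at the top
    ax.set_xlabel("log risk ratio")
    ax.set_ylabel("standard error")
    return ax
```

Asymmetry, like the cluster of small, very positive studies in this review, is the visual hint that null results may be sitting unpublished in a file drawer somewhere.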

What this means is that there are probably some studies that have not been published that show that ivermectin has no benefit for covid-19, and correcting for this issue might change the results entirely.

But that’s not the only drawback to the review. This is getting a bit into the technical side of systematic reviews, so skip through the italics if you just want to know the short version.

The authors used Cochrane’s Risk of Bias tool to rate studies in the review, and produced the below table of the results. Some of these studies are definitely fine — for example, the Mahmud study was extremely well-conducted and it’s hard to find fault with the authors’ judgement. However, there were a few studies for which the ratings seem a bit off, although it’s worth bearing in mind that these ratings are somewhat subjective regardless of the context (i.e. I could be wrong too).

In particular, if we look at the Niaee 2020 and Elgazzar 2020 studies, the ratings seem a bit optimistic. Both of these are preprints, which is not disqualifying per se, but they are also quite scant on pertinent information and have some worrying inconsistencies. For example, Niaee is a randomized trial that recruited people who were PCR negative for covid-19 as well as those who tested positive, but somehow ended up with nearly 50% PCR negative in the control group and only 20% negative in the intervention group. This is an issue for both the randomization and allocation concealment elements of the study, yet both of these are rated as “low risk of bias”, which doesn’t seem to take the problem into account. The Elgazzar study simply has no information whatsoever on allocation concealment, and the two sentences on randomization procedures actually contradict each other, yet it is still rated as “low risk of bias” for both of these fields.

Now, all of this is somewhat subjective, but I personally would rate both of those studies as being at “high risk of bias”, because they simply do not provide much information on what the researchers did, even if you look at the pre-registration paperwork. This does not mean that the studies are “bad”, simply that there is a high risk that some element of bias crept in during the research, and that the results may not mean as much as we’d like.

And more interesting still, if you exclude these two low-quality pieces of research from the analysis, the results change entirely. The primary analysis that the authors present found that ivermectin reduces the risk of death in treated patients by 62% (95% CI 27–81%), but if you exclude these two studies from the model and re-run it (I used the same DerSimonian-Laird random-effects model, but ran it in Stata rather than RevMan), you get some very different results.

Pictured: an hour’s work in Stata. Sadly, Stata makes quite ugly graphs.

This basically shows that without those two studies, the analysis demonstrates no benefit for ivermectin at all compared to placebo, with a confidence interval that includes everything from a big benefit to a large harm from the drug. Interestingly, the between-study heterogeneity also drops from about 50% to 6.6% when you do this, which is lower than the value the authors give in their sensitivity analysis in the paper.
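For anyone who wants to poke at numbers like these themselves, here’s a minimal sketch of DerSimonian-Laird random-effects pooling on the log risk-ratio scale, which is the general model at work here. It’s written in Python rather than Stata or RevMan, it isn’t the authors’ code, and the inputs would be the log risk ratios and standard errors you extract from whichever trials you choose to include.

```python
# A minimal sketch of DerSimonian-Laird random-effects pooling on the log
# risk-ratio scale; the general approach used in meta-analyses like this
# one, not the authors' actual code or data. Inputs are per-study log risk
# ratios and their standard errors.
import numpy as np

def dersimonian_laird(log_rr, se):
    """Return the pooled risk ratio, its 95% CI, and the I-squared statistic."""
    y = np.asarray(log_rr, float)
    v = np.asarray(se, float) ** 2
    w = 1.0 / v                                   # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)            # Cochran's Q
    k = len(y)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)            # between-study variance
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    y_pooled = np.sum(w_star * y) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    i2 = 100 * max(0.0, (q - (k - 1)) / q) if q > 0 else 0.0
    rr = float(np.exp(y_pooled))
    ci = (float(np.exp(y_pooled - 1.96 * se_pooled)),
          float(np.exp(y_pooled + 1.96 * se_pooled)))
    return rr, ci, i2
```

Excluding a couple of studies and re-running the model is just a matter of feeding a smaller set of trials into something like this and watching how much the pooled risk ratio, the confidence interval and the heterogeneity figure move.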

What this means is that, if you exclude some of the low-quality research on ivermectin, the paper goes from showing a massive benefit to no benefit at all. On top of this, there’s an interesting point: even if you don’t agree with these assessments, if you take only the three studies that the authors of the meta-analysis themselves considered to be at a “low risk of bias” (i.e. high-quality), you find that these high-quality studies failed to find any benefit for ivermectin.

In other words, while the conclusions the authors came to are very positive, the results section of the paper seems to show that the evidence for ivermectin might not be strong after all.

The devil really is in the details with research like this.

Bottom Line

All of this uncertainty actually makes sense when you look at other studies into the topic of ivermectin for covid-19 — while there have been a few decent papers, other systematic reviews conducted recently have argued that the evidence is of extremely low quality, which is why the World Health Organization currently recommends that the drug only be given as part of a dedicated clinical trial.

That isn’t to say that the drug doesn’t work. What these guidelines are pointing out is that the evidence about ivermectin generally is of extremely low quality — remember, there are only a few studies with a low risk of bias! It’s not that we definitely know that ivermectin is worthless, it’s that the evidence we’ve got at hand simply doesn’t tell us whether the drug works or not.

That’s not necessarily a bad thing, and the research is ongoing. It may well be that in a few months’ time a massive trial is published that shows that ivermectin works extremely well. It’s also possible that the new studies rolling out will show that ivermectin is a total waste of time — it’s somewhat hard to say. While the best evidence currently seems to show no benefit from the drug, even the best evidence isn’t very good.

Ultimately, we’re left in a similar situation as we have been with so many things during this pandemic — uncertainty. The evidence for and against ivermectin is mixed, and it’s hard to say right now whether it works or not. That we are still so uncertain even after millions of people have been given the drug to treat covid-19 worldwide is an incredibly depressing fact, but it is still absolutely true to say that we are not sure if ivermectin works for covid-19.

We might one day figure out if ivermectin works for covid-19.

Sadly, that day has not yet come.

Note: because I know people will say silly things, I have never been paid by any pharmaceutical companies, hold no interests in drugs of any kind, and am funded entirely by the Australian state and federal governments, as well as a bit of money that I get from locking my stories on Medium for you all to read. I have no financial interests in any covid-19 drugs, and honestly would love it if ivermectin cured the disease because then the pandemic would be over — I could go back to writing about whether chili peppers can stop heart attacks and that’d be much more fun.

Gideon Meyerowitz-Katz is an epidemiologist who tweets @GidMK

This piece was originally published at Medium
