Data Integrity and Crisis Mentality
life is always incomplete, but we aim for accuracy
Buildings founded on mud will crumble. We tend not to build on mud. Insights extracted from faulty data are flawed. We absolutely love to take those insights and run with them, generally into walls or off cliffs.
In times of crisis, it’s important to remember that a panicked human isn’t that much more rational than a monkey, and rather less helpful.
Similar Methods, Varying Results
Hydroxychloroquine has seen some reported success in treating Covid-19. Trump mentioned it on a national broadcast, which may have helped create the current shortage of the drug as private actors buy it up. The idea is not without merit: if you’ve got a cure to this dangerous pandemic, share the knowledge with your citizens freely.
But reality looks different in the data. A recent study in Médecine et Maladies Infectieuses (Molina et al.) found that HCQ (specifically combined with azithromycin) does not seem to help the immune system fight off the virus. A prior study from Zhejiang University (Chen et al.) also found that HCQ does not treat the virus effectively. However, another from Renmin Hospital in Wuhan (Zhaowei Chen et al.) found that the randomized HCQ group (50% of the sample) showed signs of viral recovery 24 hours earlier than the control-group Covid patients, and that the overall recovery rate rose from 17 to 25 (of 31 patients per group).
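As a rough sanity check (not part of any of the cited studies), we can ask whether a 17-versus-25 split out of 31 patients per group is even distinguishable from chance at this sample size. A minimal sketch in Python, running Fisher’s exact test on the counts summarized above:

```python
# Rough sanity check on the Renmin Hospital counts summarized above:
# is the gap in recovery rates distinguishable from chance with only
# 31 patients per arm?
from scipy.stats import fisher_exact

hcq_recovered, hcq_total = 25, 31      # HCQ arm
ctrl_recovered, ctrl_total = 17, 31    # control arm

table = [
    [hcq_recovered, hcq_total - hcq_recovered],
    [ctrl_recovered, ctrl_total - ctrl_recovered],
]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio ~ {odds_ratio:.2f}, two-sided p ~ {p_value:.3f}")
```

A p-value near the conventional 0.05 threshold on a group this small means the result is suggestive rather than decisive, which is exactly why the other studies matter.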
The hype around hydroxychloroquine grew sharply after March 17th, when a study in Marseille (Gautret) yielded encouraging results. This added to the growing hope for the drug: when demand is extreme, people are desperate to turn anything into supply.
How can we explain these discrepancies? The Molina study employed the same doses of HCQ as the Marseille procedure, but found strikingly different efficacy. Two studies out of China tested the same combination on Covid patients and seem to disagree.
Lurking Variables
Repeating the exact same procedure should not yield different results; when the results differ, something in the setup differed. In this case, the patients themselves, as well as their infections, were the confounding factors.
Across all the above studies, the drug appears more effective the earlier it is administered. Studies that found hydroxychloroquine effective generally gave the drug to patients in earlier stages of the disease, while ineffective results came from patients further along the timeline.
Furthermore, the severity of infections in patients varies significantly: 85% of the Marseille group didn’t have a fever, a primary indicator of the virus. In the Molina study, many of the patients had underlying health conditions that could have exacerbated symptoms.
We’re also dealing with small populations, from 11 to 66 patients, owing to the difficulty of conducting rigorous medical studies while hospitals are reaching capacity. We’re still very much at the point where sample size severely hampers reliability.
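To make that concrete, here is a minimal sketch of how wide a 95% confidence interval remains at the sample sizes in question (the 80% improvement rate below is a hypothetical figure chosen purely for illustration):

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# Sample sizes in the range discussed above (11 to 66 patients), plus a
# large trial for contrast; the 80% "improvement rate" is hypothetical.
for n in (11, 31, 66, 1000):
    low, high = wilson_ci(round(0.8 * n), n)
    print(f"n={n:4d}: observed 80%, 95% CI roughly [{low:.0%}, {high:.0%}]")
```

At a few dozen patients, the plausible range spans tens of percentage points; it takes hundreds before the interval narrows enough to cleanly separate a good treatment from a mediocre one.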
Struggling for Balance
It is difficult to weigh the need for thorough procedure against the desire for quick results. Especially in times of crisis, the pressure to find any solution at all often leads us to rush to false conclusions. But when so much is on the line, we can hardly afford to miscommunicate.
People live and die by information, and in this quarantined world, we’re desperate for truth and hope. It takes great restraint to pause and ask critical questions about the information presented to us. Who’s presenting it, and why? How was the data gathered, and from whom? The more unknowns behind an answer, the less certain we can be.
I can hardly blame us for rushing to certainty in a terribly uncertain world. But precisely because lives are at stake, the integrity of the data must be weighed before conclusions are drawn.
—
If you’d like to read more on medical studies, Katherine Seley-Radtke of the University of Maryland has written a useful comparative analysis of several of these HCQ studies.