Taking on the Cum Hoc and Post Hoc Fallacies in Microbiome Research

Posted 14th September 2017 by Jane Williams
“We need a shift from correlation to causation to support further progress,” says Weizmann Institute of Science’s Eran Elinav, echoing a widely espoused sentiment in the microbiome space. “We spent the first decade finding associations of microbiota with different clinical indications, but now we are discovering that only some of these are phenotypes that are caused by changed microbiota,” adds Elinav, who is also a scientific co-founder of Israel-based BiomX and DayTwo.
Two events that occur simultaneously or in close sequence can provide important scientific evidence, but such correlations do not prove causality. Assuming that they do is known as the cum hoc fallacy and the post hoc fallacy, respectively, and any putative causal relationship must therefore be systematically tested before it can be concluded.
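To make the distinction concrete, here is a minimal sketch in Python of how a hidden confounder can produce a strong microbe–phenotype correlation with no causal link between the two variables. The variable names, effect sizes and the use of a simple partial correlation are illustrative assumptions, not data or methods drawn from the studies discussed here.

```python
# Minimal sketch: two variables driven by a shared confounder can correlate
# strongly without any causal link between them (the "cum hoc" trap).
# All names and numbers below are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 200

diet_fiber = rng.normal(size=n)                                   # hidden confounder (e.g. host diet)
microbe_abundance = 0.8 * diet_fiber + rng.normal(scale=0.5, size=n)
inflammation_score = -0.7 * diet_fiber + rng.normal(scale=0.5, size=n)

# The microbe and the phenotype show a strong correlation ...
r = np.corrcoef(microbe_abundance, inflammation_score)[0, 1]
print(f"observed correlation: {r:.2f}")

# ... yet neither causes the other; conditioning on the confounder
# (a simple partial correlation) makes the association largely vanish.
def partial_corr(x, y, z):
    rx = x - np.polyval(np.polyfit(z, x, 1), z)   # residualize x on z
    ry = y - np.polyval(np.polyfit(z, y, 1), z)   # residualize y on z
    return np.corrcoef(rx, ry)[0, 1]

print(f"partial correlation given diet: {partial_corr(microbe_abundance, inflammation_score, diet_fiber):.2f}")
```

The point of the sketch is simply that an association, however strong, is only a starting hypothesis until the candidate confounders have been measured or controlled.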
“The time has come to launch truly collaborative efforts that delve deeper into the biology of the microbiome – its functional modules and their interplay with the human genome,” posits Jun Wang, CEO of Shenzhen-based iCarbonX. “Vertical, multi-omic association analyses are the key to unlocking this deeper understanding and eventually developing targeted and efficient microbiome-based probiotics and antibiotics.” The analysis and vertical integration of microbiome functional data are key components of iCarbonX’s artificial intelligence-driven strategy to build a collaborative ecosystem for developing personalized medicine applications. Two microbiome plays – AOBiome and GALT – figure prominently among the eight companies currently involved in iCarbonX’s Digital Life Alliance.
In addition to a vertically integrated systems approach, many in the field also point to the need for horizontal data integration to move from correlation to causation.
According to Peter Christey, CEO of San Francisco-based GALT, “the research problem has changed from studying individual organisms to communities of microbes, and their relationship with their host. The lack of tools to understand how microbiome communities work has become the main roadblock to progress in the multi-faceted field of microbiome research.” Indeed, many key opinion leaders converge on the need to complement traditional microbiological approaches with new perspectives and different ways of looking at microbiome systems. One discipline many are looking to for synergies is ecobiology.
“It is clear that many of the diseases we work with are the result of combining genetic predisposition with microbial imbalances, as opposed to being caused by single pathogenic organisms. We need to give more importance to understanding ecological principles if we want these results to be translatable,” says José Clemente, Assistant Professor at the Icahn School of Medicine at Mount Sinai in New York. Specifically, adds GALT’s Christey, “priority should be given to the development of methods that model the whole microbiota, something that cannot be achieved in a petri dish. We need appropriately scaled tools.” GALT is developing digital microarrays that represent whole microbiomes and allow thousands of sampling points.
Neither the vertical nor the horizontal integration of data – or combinations thereof – will, however, solve a pervasive challenge in the microbiome space: reproducibility. While reproducibility is a general problem in the life sciences, it is a particular challenge in microbiome research because the underlying technology is constantly changing.
For example, 16S rRNA sequencing has long been the standard methodology for broadly characterizing microbiome communities, but its reliability has long been questioned. Shotgun metagenomics was introduced as an alternative that can provide species- and even strain-level resolution; however, its sensitivity and accuracy are still limited and, more importantly, its results have shown less-than-acceptable agreement with the already highly variable results of traditional 16S rRNA sequencing.
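To illustrate what “agreement” between two profiling methods can look like in practice, the following sketch compares two taxonomic profiles of the same hypothetical sample using Spearman correlation and Bray-Curtis dissimilarity, two measures commonly used for this kind of comparison. The genus names and abundance values are invented for illustration and are not data from any study mentioned here.

```python
# Minimal sketch: quantify agreement between two relative-abundance profiles
# of one sample, e.g. one from 16S amplicon sequencing and one from shotgun
# metagenomics. All values below are made up for illustration.
import numpy as np
from scipy.stats import spearmanr
from scipy.spatial.distance import braycurtis

genera = ["Bacteroides", "Faecalibacterium", "Prevotella", "Escherichia", "Akkermansia"]
amplicon_16s = np.array([0.42, 0.25, 0.18, 0.10, 0.05])   # relative abundances, method 1
shotgun      = np.array([0.30, 0.33, 0.22, 0.05, 0.10])   # relative abundances, method 2

for genus, a, s in zip(genera, amplicon_16s, shotgun):
    print(f"{genus:18s} 16S={a:.2f}  shotgun={s:.2f}")

rho, _ = spearmanr(amplicon_16s, shotgun)                  # rank agreement between methods
bc = braycurtis(amplicon_16s, shotgun)                     # 0 = identical, 1 = no overlap
print(f"Spearman rho: {rho:.2f}, Bray-Curtis dissimilarity: {bc:.2f}")
```

If two methods applied to the same sample diverge by more than repeated runs of a single method do, it becomes hard to tell whether a failed replication reflects biology or technology, which is exactly the problem raised below.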
“When we cannot reproduce a previous result it is difficult to determine if this is due to changes in technology or because the findings were not biologically robust in the first place. If the methods are fully reproducible we can at least remove one variable from this equation and improve the chances that findings in a single study can be reproduced and be translatable to humans,” says Clemente.
Correlations often provide the first scientific evidence hinting at potential mechanisms and can thus serve as an invaluable hypothesis-building tool. How robust these correlations are ultimately determines how useful they will be on the long road from correlation to causation.
The poster child of microbiome-based therapeutic approaches, fecal microbiota transplantation (FMT) for the treatment of Clostridium difficile infection, was the first real proof that microbiome modulation can be effective. However, cautions Elinav, “ultimately we will need to know how it works and why it doesn’t work in some people to devise specific, reproducible and scalable therapies for C. difficile.” And this principle applies to most other conditions we are trying to tackle with microbiome-based approaches.
Tune in again next month for my next snapshot of our ongoing conversations with leaders in the space in preparation for Microbiome Futures in New York next year.
Gaspar Taroncher-Oldenburg is an independent biopharma consultant. He was previously Founding and Managing Editor of Nature’s SciBX: Science-Business eXchange (now BioCentury Innovations) and scientific editor of Nature Biotechnology.