It is interesting to consider what would have happened if the COVID virus had emerged in 1921. Or 1821. Or 1521. There would have been no vaccine, for one thing. Treatment would mostly have been worse. In the 17th century we would have blamed the entire thing on Catholics.
But in a few respects, bizarrely, we might have done better. For instance, miasma theory, although technically wrong, might have protected us better against airborne transmission than the early scientific consensus that the disease was spread via droplets on surfaces. A believer in miasma theory might have practiced mask wearing (ideally with a large beak), indoor ventilation and outdoor gathering more assiduously than we did. You can see the benign influence of this theory in the design of many large-windowed, lofty Victorian buildings, and in the heroic success of 19th-century sewage projects. Sometimes we are right for the wrong reasons.
Scientific knowledge aims to progress via the survival of the truest. But there is another large body of knowledge which survives not because it is true but because it is empirically beneficial. Much of our instinct takes this form. Seemingly delusional beliefs survive not because they are rational but because the people or groups who adhere to them tend to survive better than those who don’t, or because the benefits of such beliefs become apparent before we can explain the reasoning behind them.
Consider A.A. Milne’s thesis that there is a high correlation between children stepping on the cracks in the pavement and their risk of being eaten by bears. It is unclear where Milne acquired the data to support this assertion, and subsequent studies have failed to replicate the finding. Nevertheless it is not irrational to instill this belief in your children. It encourages them to pay attention to where they put their feet and keeps them from straying into the road. You could give them a lecture on road-traffic fatalities, but it may not hold their attention like the threat of a bear.
The awkward consequence of this is that it is never sufficient to judge the rationality of a tradition by the quality of the reasoning underpinning it. You also have to consider the extent to which any seemingly unsupported belief — for instance the belief in an all-seeing deity — is effective at driving better behavior.
Let’s consider the reaction of a random 18th-century human approaching some weird bat-infested cave. Blessed with a few million years of evolved instinct, but no scientific knowledge of virology, their reaction is nevertheless highly sensible: ‘Screw this, it’s dark, it’s full of creepy bats and it smells awful. Let’s not go in there.’ A tribal religion would probably deem the cave haunted or evil, and hence off-limits. It takes a highly rational scientist to start collecting feces before transporting it to a large urban centre where it can be manipulated under BSL-2 levels of containment. (I asked a scientist what BSL-2 means: ‘It’s effectively the level of hygiene you get when you go to the dentist.’) We tend to assume that evidence of genetic manipulation will be needed to support the COVID lab-leak theory. This is sufficient but not necessary. The act of collection was irresponsible in itself. At least Bond villains have the decency to hatch their nefarious schemes on remote islands, not in the middle of cities.
Even now, scientists seem reluctant to consider the lab-leak theory. Perhaps it’s like Homer Simpson’s observation that ‘beer is the cause of, and solution to, all of life’s problems’. If the scientific work in developing vaccines and sequencing variants is ultimately shown to be a response to a problem other scientists created in the first place, science can no longer be awarded three points for a win: it is at best a score draw.
This article was originally published in The Spectator’s UK magazine.