Of all the offspring of Time, Error is the most ancient, and is so old and familiar an acquaintance, that Truth, when discovered, comes upon most of us like an intruder, and meets the intruder’s welcome.

Charles Mackay

Sigmund Freud, writing in 1928 in The Future of an Illusion, distinguished between errors, delusions and illusions [1]. Simply put, errors are mistakes not driven by wishes; delusions are errors which are wish-driven; and illusions are beliefs, some true, some false, but all wish-driven. Transplant experts, like all other humans, are subject to errors, delusions and illusions. When a group of people share a delusion, psychologists term it a cult.

Cults are as old as mankind. In 1841 Charles Mackay, a Scottish journalist, published a three-volume treatise entitled Extraordinary Popular Delusions and the Madness of Crowds, a study of crowd psychology [2]. Volume 1, National Delusions, dealt with economic bubbles and financial delusions such as the Dutch tulip mania of the early seventeenth century. Volume 2, Peculiar Follies, dealt with mass delusions such as the Crusades and witch trials in the Middle Ages. Volume 3, Philosophical Delusions, dealt with practices such as alchemy (typically efforts to turn base metals into gold). Mackay noted many alchemists and their sponsors were themselves deluded, convinced this was possible. Perhaps the most fateful follower of this delusion was King George III who, facing a huge debt from fighting the Seven Years’ War (French and Indian War), hired alchemists to work in the basement of Buckingham Palace producing gold. Unfortunately, the King is thought to have had porphyria, and arsenic, commonly used in alchemy, may have precipitated attacks resulting in the madness of King George. (He was said to have mistaken an oak tree for the King of Prussia.) (King George was said to have had blue rather than red urine and may have had familial benign hypercalcemia or have been treated with gentian violet. Alternatively, he may simply have been mad [3].)

During these psychotic episodes, the King imposed the Stamp Act, ultimately resulting in American independence and proving not all delusions end unhappily (at least for Americans).

The Oxford English Dictionary offers several definitions of a cult. The most relevant to this typescript is: something popular, fashionable or delusional, especially among a particular section of society. Most of us belong to groups psychologists and sociologists might reasonably describe as a cult. Often this involves believing something seemingly incredible and/or unprovable, such as God visiting ten plagues on the Egyptians or Jesus being the Son of God. Millions or billions of people believe these stories absent convincing proof. Important examples of cult-like delusions among scientists are beliefs that people act rationally or that everything happens for a reason. We recently discussed in the Journal the fallacy that people act rationally [4]. The delusion that everything happens for a reason denies the important role of chance in biology (reviewed in ref. [5]).

Why are delusions so powerful and persistent? William Bernstein, writing in The Delusions of Crowds, commented: Humans understand the world by narratives. However much we flatter ourselves about our individual rationality, a good story, no matter how analytically deficient, lingers in the mind, resonates emotionally, and persuades more than the most dispositive facts or data [6].

Opinions of crowds, when not delusional, can be useful. In another Journal article we discussed the wisdom of crowds, citing the famous experiment of Sir Francis Galton (cousin of Charles Darwin, inventor of the correlation coefficient and of regression to the mean but, sadly, father of modern eugenics). Over a century ago at an English county fair, people in a crowd were challenged to guess the weight of an ox, the winner taking home the butchered animal. Galton thought average people would be terrible at estimating the ox’s weight compared with butchers (experts). He collected the estimates from both cohorts, many ordinary people and a few butchers. The public’s averaged guess was much closer than the butchers’. The experiment, this time with Penelope the cow, was recently repeated online (https://www.npr.org/sections/money/2015/08/07/429720443/17-205-people-guessed-the-weight-of-a-cow-heres-how-they-did) with the same result, namely that the averaged opinion of many non-experts is often more accurate than the opinion of so-called experts. In another Journal article Barosi and RPG discuss limitations of consensus statements and clinical practice guidelines developed by a few experts [7].
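To make the arithmetic concrete, here is a minimal simulation sketch; the true weight, the numbers of fairgoers and butchers, and their error distributions are all invented for illustration and are not taken from Galton’s data. Averaging many noisy, independent guesses tends to cancel their errors, whereas a handful of experts who share a common bias enjoy no such cancellation.

```python
import random

random.seed(42)

TRUE_WEIGHT = 1200  # hypothetical dressed weight of the ox, in pounds

# Hypothetical assumptions: fairgoers guess with large but unbiased scatter;
# butchers are individually more precise but share a common bias.
crowd_guesses = [random.gauss(TRUE_WEIGHT, 150) for _ in range(800)]
butcher_guesses = [random.gauss(TRUE_WEIGHT + 60, 40) for _ in range(5)]

crowd_mean = sum(crowd_guesses) / len(crowd_guesses)
butcher_mean = sum(butcher_guesses) / len(butcher_guesses)

print(f"Crowd average:   {crowd_mean:7.1f} lb (error {abs(crowd_mean - TRUE_WEIGHT):5.1f} lb)")
print(f"Butcher average: {butcher_mean:7.1f} lb (error {abs(butcher_mean - TRUE_WEIGHT):5.1f} lb)")
```

Under these assumed numbers the crowd’s average lands within a few pounds of the true weight while the experts’ shared bias persists, which is the pattern Galton reported.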

Often people simultaneously hold obviously contradictory beliefs. For example, one may occasionally encounter a physician who believes sickness is an illusion curable by prayer while treating a sufferer with modern drugs which he or she believes will improve the condition. Processes by which people manage these discordances are termed compartmentalization and cognitive dissonance by psychologists [8]. The latter is an unpleasant feeling arising when one’s belief is confounded by clearly contradictory data. These remarkably adaptive processes allow people to maintain their delusions. Lest this seem too far afield, consider how often a physician helping a patient decide about a potentially dangerous intervention will predict an outcome while at the same time acknowledging substantial inaccuracy and imprecision (in the form of wide confidence intervals around point-estimates) in his or her publications on the same intervention.
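To see how wide such intervals can be, here is a minimal sketch using invented numbers (12 responders among 30 patients; neither figure comes from any study cited here). It computes a Wilson score 95% confidence interval, which spans roughly 25% to 58% even though the 40% point estimate sounds precise.

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a proportion (z = 1.96 gives ~95%)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half_width = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half_width, centre + half_width

# Invented example: 12 of 30 patients respond to an intervention.
low, high = wilson_ci(12, 30)
print(f"Point estimate: {12/30:.0%}; 95% CI: {low:.0%} to {high:.0%}")
```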

By now the reader must be wondering what delusions and cult behavior have to do with haematopoietic cell transplants. We hasten to explain.

Sometimes an idea takes hold in the minds of scientists which makes such sense that it is difficult to refute despite considerable data proving the idea incorrect. Recently, we discussed the cult of granulocyte transfusions where believers remain convinced these are safe and effective despite considerable contradictory data [9]. Although we admit the absence of evidence is not evidence of absence, data of no efficacy for this intervention are overwhelming. Moreover, when a medical intervention puts donors at risk and its safety is of concern, there is an unequal burden of proof: advocates must prove safety and efficacy whereas others are not obliged to prove the contrary. You may rightly ask why we term this cult behavior. Recently, a distinguished British colleague remarked there are believers and non-believers in granulocyte transfusions. You can ask yourself how, if at all, this differs from cult-like behavior.

Most of us have heard the same phrase of believers and non-believers in transplants. But on to issues more central to transplant experts. Take techniques to prevent graft-versus-host disease (GvHD). In one camp we have believers in graft manipulation, for example by depleting Tγδ cells (reviewed in ref. [10]; Fig. 1). In the other camp we have believers in posttransplant cyclophosphamide (reviewed in ref. [11]). Absent a randomized controlled comparison it is impossible to know which technique, if either, is better, so it might not matter; both might work. Relevant to this typescript, however, is how strongly believers in either technique hold that theirs is the right, the best technique and how difficult it is to get them to agree an alternative might be better. The ethos of each camp is very strong. Admittedly, the delusions here are different. They are not whether either technique works but rather that one technique is better than the other. In truth, we could know the answer were we to do a randomized controlled trial (RCT) but this seems unlikely to be done. This is, in part, because of the universal challenges in planning and executing RCTs. However, physician cults are the biggest obstacle. Many readers will recall the debacle over autotransplants in women with advanced breast cancer where it was argued RCTs were unethical because there was no need for a control cohort. It turns out there was.

Fig. 1: A typical T-cell-depletion cult.

A group of Montana transplant experts who believe in depleting Tγδ cells to prevent GvHD.

Another example of cult-like behavior among transplant mavens is the role of anti-thymocyte globulin (ATG) in pretransplant conditioning, which goes under the unlikely term in vivo T-cell-depletion (reviewed in ref. [12]). The discussion is reminiscent of St. Thomas Aquinas’ reductio ad absurdum question: Can several angels be in the same place? [13]. Regardless, there are strong believers and non-believers (blasphemers?) in an essential role for ATG in transplants.

A third example is what role, if any, results of measurable residual disease (MRD) testing should play in deciding whether or not a transplant is done for persons with acute myeloid leukemia (AML) in first histological complete remission. One camp argues people who are MRD-test-positive receiving an allotransplant do so poorly they should not be transplanted (reviewed in ref. [14]). The other camp argues it is precisely persons with an MRD-test-positive result who are most likely to benefit from a transplant (reviewed in ref. [15]). However, the question is not whether persons who are MRD-test-positive do worse than those who are MRD-test-negative, they do, but rather how outcomes of either cohort (MRD-test-positive or -negative) compare with non-transplant therapy(ies). Again, it’s difficult or impossible to get people to change their opinion or to admit the question remains to be settled. The delusions here are more complex than those above because the two positions are mutually exclusive; one must be wrong (a delusion) and the other not. An answer can be known by doing an RCT comparing transplant outcomes in people who are MRD-test-positive with a comparator cohort, but again this seems unlikely to be done.

There are many more examples of people, including transplant specialists, strongly believing in unprovable things. Most psychologists believe this cult-like behavior is hard-wired in the human brain. As Bernstein noted: …a good story, no matter how analytically deficient, lingers in the mind …persuades more than the most dispositive facts or data [6]. Get rid of those nasty T-cells; keep the good ones. Why throw the baby out with the bath water? Don’t transplant people who might do badly compared with those who do better even if their fate is improved compared with alternative therapy(ies). We’re reminded of a complaint from the distinguished late Prof. Alberto Marmont who was to be the first author of an International Bone Marrow Transplant Registry (IBMTR) typescript reporting no benefit of T-cell-depleted transplants for leukemia [16]. In developing the reference list we avoided citing articles where he and his colleagues reported the opposite. Remarkably, Prof. Marmont was offended we failed to cite him. So much for rationality and science. And there are some practical reasons to tenaciously hold onto unprovable beliefs and delusions, for example, if one’s academic advancement depends on it. Every meeting organizer wants pro and con speakers for a controversial topic. If you are the pro guy, like to travel, enjoy lecturing to large audiences and enjoy lavish speakers’ dinners sponsored by drug companies, you would be hard pressed, perhaps mad, to analyze your position critically. Delusions have their rewards.

So the next time you read about cults such as QAnon or NXIVM, don’t think these are only for nuts or uneducated people, conditions sometimes confounded. In fact, many scholars theorize the strength of cultic beliefs increases with uncertainty, of which we have a considerable amount in our field. For example, among fishermen in the Trobriand Islands the degree of magical thinking and rituals correlates with their distance from land [17]. And in our field we are often very far from land.

For those who want to read more on this topic we recommend Charles Mackay’s Extraordinary Popular Delusions and the Madness of Crowds [2], Losing Reality by Robert Lifton [18] and William Bernstein’s The Delusions of Crowds [6]. The Holy Bible [19] is always a good read. And remember Montaigne: Nothing is so firmly believed as that which a man knoweth least [20].