Nov 5, 2017

Change we can believe in

In a 2013 article for The Guardian, Sylvia McLain, a biophysicist, wrote:1

That most scientific studies are ultimately wrong is normal for science. There are more theories in the graveyard of science than theories that stand the test of time. Why? Because new data is always emerging and theories have to be adjusted. Theories are only as good as theories are, until new data comes along and ruins them. Theories give a best guess at what is going on based on things we observe (data), but they are not immutable. If you only have a few data points, then your working theory is more likely to turn out to be wrong. This is not news to science, this is science.

In 1956 Richard Feynman gave a talk in which he emphasized how important uncertainty is in science. He said, "when we know that we actually do live in uncertainty, then we ought to admit it... this attitude of mind - this attitude of uncertainty - is vital to the scientist."2 According to Karl Popper, scientific knowledge is not a search for certainty; science is a search for "objectively true, explanatory theories."3 Appreciating that "knowledge is fallible and therefore uncertain," science, he said, is about "discovering and eliminating" mistakes.3 Carl Sagan also believed in our human fallibility, noting that we "will always be mired in error."4 According to Sagan, improvement comes from acknowledging our limitations and using the scientific method to correct our errors. Science, said Popper, helps us make progress and learn from our mistakes because criticism is indispensable for error correction.5

The details of how science makes progress are still a hot topic.6 Reading the history of science, we can appreciate more than just how wrong we have been in the past; we can also appreciate the time it took to correct the errors, and what mechanisms impaired and enabled those corrections.7 We used to believe that the Earth was flat, that the Earth was the center of the universe, in the theory of phlogiston, in alchemy, and in other falsehoods. The history of medicine is also replete with errors. We used to believe in humorism, a theory of disease supported by Galen, and in bloodletting as its treatment; Rudolf Virchow later helped debunk humorism. Andreas Vesalius and William Harvey challenged and corrected many errors established by Galen's work in human anatomy and physiology. Before the germ theory of disease, physicians and scientists believed in the miasma theory of disease, also propagated by Galen. These are just a few examples of how we have established erroneous knowledge but also made progress under the practice of the scientific method. The list goes on, but the point is that at the time these false theories were current we did not know they were wrong. As noted above, this is how science makes progress, and science as a practice has itself also evolved. Historian of science David Wootton stated in his recent book that what we learn from the history of science is that "nothing endures... so too our most cherished theories will one day be supplanted."7 This echoes what Richard Feynman said in 1956: "It is impossible to find an answer which someday will not be found to be wrong."2 If we adopt the scientific attitude and accept our human fallibility and the uncertainty of knowledge, it should not be a surprise when we encounter evidence that conflicts with current medical practice. I am sure there is more evidence in conflict with current practices, but here are just a few examples: CK-MB for myocardial infarction,8 use of PPIs in upper GI bleeds,9 fluid management in sepsis,10 and PCI for stable angina.11 In developing expertise with a scientific attitude, therefore, how one learns about our human fallibility and the uncertainty of knowledge becomes important.

Our education, especially that from textbooks, does not expose us to the conflicts encountered between changing paradigms. Instead, we are presented with a linear picture of progress, which Thomas Kuhn said is essentially a "drastic distortion" of the past.12 He called it a "narrow and rigid education" that misrepresents not only how progress is made but also how science works.12 A major change in medical education occurred in 1910, when Abraham Flexner, an educator, wrote a report on the state of medical education in the US and Canada.13 At the time, much of medical education and medical practice was suboptimal and lacked a strong scientific foundation compared to other parts of the world. Among Flexner's recommendations was to develop physicians who did not act like parrots but instead thought like scientists, applying scientific principles to problem-solving. Misunderstanding and poor implementation of Flexner's ideas remain present to this day, as Kenneth Ludmerer reminds us.14 The idea of developing a scientific attitude and characterizing science as more than a "body of knowledge" has been highlighted by Carl Sagan and more recently by Andrew Shtulman in his book Scienceblind: Why Our Intuitive Theories About the World Are So Often Wrong.15 Among other things, Shtulman has studied how highly educated people who support Charles Darwin's theory of evolution can still hold mistaken explanations of it. This illusion of understanding has been demonstrated in a documentary in which bright students and college graduates were unaware of their own misconceptions.16

Science and scientific explanations are not perfect and are always incomplete, but science is the best method we have to correct our errors and learn how nature works. Medicine is also not perfect; like science, it relies on imperfect tools and inferences laden with all kinds of uncertainty. Unfortunately, these aspects of medicine continue to be neglected,17,18 making its practice less scientific, more dogmatic, and less realistic. Science and medicine both benefit from the same critical attitude, recognizing that knowledge is both fallible and under constant revision. So it seems to me a false dichotomy to talk about medical knowledge as if it were something completely separate from scientific knowledge. Unlike religion and cults, science does not demand belief in stories or authorities without considering the evidence. Instead, science asks us, as should medicine, to judge the evidence and proportion our beliefs and actions accordingly. Unlike medicine, science emphasizes the distinction between having a scientific attitude, which helps "call the bluff of those who only pretend to knowledge,"4 and merely memorizing its products. The discrepancy in medical practice arises when our attitude fails to reckon with our human nature and the uncertainty of knowledge.

Unlike dogmatists, scientists are skeptical even of their own theories, said Imre Lakatos, who added that a "blind commitment to a theory is not an intellectual virtue: it is an intellectual crime."19 Philip Tetlock found that poor predictions were associated with a dogmatic attitude,20 while better predictors treated their beliefs as "hypotheses to be tested, not treasures to be guarded."21 Most common forms of assessment these days do not take uncertainty into account, while providing the illusion that memorizing context-free bullet points is enough to understand the complexities of knowledge and decision making. These assessments can be answered with superficial knowledge, without asking for evidence of understanding. This creates an illusion of explanatory depth, in which explanations rely on intuition and are more likely to be wrong and severely incomplete.15,22 Not only does deep knowledge require assessment of the evidence, it demands an understanding of how uncertainty affects the evidence and its application. One of the major authors on uncertainty in clinical medicine, Benjamin Djulbegovic, quoting Kenneth Ludmerer, wrote that the failure to train doctors about clinical uncertainty has been called "the greatest deficiency of medical education throughout the twentieth century."23 Medicine is still not close to remediating this deficiency; in 2016, authors called tolerating uncertainty the "next medical revolution."17

The familiar phrase "all models are wrong," popularized by George E. P. Box, should provide great insight into the uncertainty of knowledge. Not only should medicine discourage blind commitment to theories, stories, and practices without considering the validity of the evidence; it should also acknowledge how current problems with medical and scientific practice affect our knowledge. John Ioannidis has published relevant articles in these areas, pointing out that most published research findings are false24 and that most clinical research is not useful for medical practice.25 A clinical practice closer to reality may be achieved by understanding how p-hacking,26 publication bias, statistical misinterpretations,27 the replication crisis, and other problems further exacerbate the uncertainty of knowledge and its application. Even when scientific knowledge is not influenced by financial, political, or social group interests, as it should not be, it is not set in stone.
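To make the p-hacking point concrete, here is a minimal sketch of one common form of it: peeking at the data and stopping as soon as a significance test crosses the threshold. This toy simulation is my own illustration, not drawn from any of the cited papers; the sample sizes, batch size, and function names are assumptions chosen only for demonstration, and it assumes the numpy and scipy libraries are available.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def peeking_trial(max_n=100, peek_every=10, alpha=0.05):
    # Two groups drawn from the SAME distribution (no true effect),
    # tested after every batch; stop as soon as p < alpha.
    a, b = [], []
    for _ in range(max_n // peek_every):
        a.extend(rng.normal(0, 1, peek_every))
        b.extend(rng.normal(0, 1, peek_every))
        _, p = stats.ttest_ind(a, b)
        if p < alpha:
            return True   # a "significant" finding despite no real effect
    return False

trials = 2000
false_positive_rate = sum(peeking_trial() for _ in range(trials)) / trials
print(f"False-positive rate with peeking: {false_positive_rate:.1%} (nominal 5%)")

Even though both groups are identical, repeatedly testing and stopping at the first "significant" result pushes the false-positive rate well above the nominal 5%, which is one way uncertainty gets understated in the published literature.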

In a recent article, Richard Wenzel wrote that knowledge is "provisional... our books are vastly incomplete and that current concepts represent only a temporary resting place for understanding, continually requiring testing and further analyses."28 This uncertainty makes people uncomfortable, and some may try to find refuge in the beliefs of others; this offers only a false sense of security. If knowledge were determined by the strength of what others believe, "we should have to rank some tales about demons, angels, devils... as knowledge," said Lakatos.19 It is known that practice changes can take a very long time, even after evidence disconfirms the status quo.29 Ignoring evidence that contradicts prior beliefs is not a hallmark of the scientific attitude. If confirmation bias is the modus operandi, cognitive dissonance will be avoided and rationalizing stories will be made up instead. Having good stories is not good enough; we should take more seriously how we evaluate the evidence and our own judgment. As Daniel Kahneman reminds us, we "tend to have great belief, great faith in stories that are based on very little evidence."30 This is where problem-based learning, as an iterative process, helps us update our prior beliefs when appropriate. It is a framework for independent or interdependent practice that cuts out the "knowledge-middleman";31 it connects the uncertainty of knowledge and personal practice directly with the literature.
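As a minimal sketch of what "updating our prior beliefs" can look like in practice, Bayes' rule combines a prior probability with new evidence into a revised belief. This is my own toy example, not from the post or its references; the prevalence and test characteristics are made-up numbers chosen only for illustration.

def posterior(prior, sensitivity, specificity):
    # P(disease | positive test) via Bayes' rule, with made-up inputs.
    true_pos = sensitivity * prior
    false_pos = (1 - specificity) * (1 - prior)
    return true_pos / (true_pos + false_pos)

# With a low prior (1% prevalence), even a fairly accurate test leaves the
# post-test probability at only about 15% -- a reminder to proportion belief
# to the evidence rather than to the strength of the story.
print(round(posterior(prior=0.01, sensitivity=0.90, specificity=0.95), 3))  # 0.154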



References:

  1. McLain, S., Not breaking news: many scientific studies are ultimately proved wrong!, The Guardian, September 17, 2013, Accessed November 3, 2013
  2. Feynman, R., Robbins, J., The pleasure of finding things out: the best short works of Richard P. Feynman., 1999
  3. Popper, K., In search of a better world: lectures and essays from thirty years., 1996
  4. Sagan, C., The demon-haunted world: science as a candle in the dark., 1997
  5. Popper, K., Conjectures and Refutations: The Growth of Scientific Knowledge., 1963
  6. Nickles, Thomas, "Scientific Revolutions", The Stanford Encyclopedia of Philosophy (Winter 2016 Edition), Edward N. Zalta (ed.)
  7. Wootton, D., The Invention of Science: A New History of the Scientific Revolution., 2016
  8. Alvin MD, Jaffe AS, Ziegelstein RC, Trost JC. Eliminating Creatine Kinase–Myocardial Band Testing in Suspected Acute Coronary Syndrome: A Value-Based Quality Improvement. JAMA Intern Med. 2017;177(10):1508–1512. doi:10.1001/jamainternmed.2017.3597
  9. Coronel E, Bassi N, Donahue-Rolfe S, Byrne E, Sokol S, Reddy G, Arora VM. Evaluation of a Trainee-Led Project to Reduce Inappropriate Proton Pump Inhibitor Infusion in Patients With Upper Gastrointestinal Bleeding: Skip the Drips. JAMA Intern Med. Published online October 9, 2017. doi:10.1001/jamainternmed.2017.4851
  10. Simpson, N., Lamontagne, F., Shankar-Hari, M., Septic shock resuscitation in the first hour., Curr Opin Crit Care. 2017 Dec;23(6):561-566. doi: 10.1097/MCC.0000000000000460.
  11. Al-Lamee, R., et al., Percutaneous coronary intervention in stable angina (ORBITA): a double-blind, randomised controlled trial., The Lancet, November 2, 2017, DOI: http://dx.doi.org/10.1016/S0140-6736(17)32714-9
  12. Kuhn, T., The structure of scientific revolutions., 1962
  13. Flexner, A., Medical education in the United States and Canada: A report to the Carnegie Foundation for advancement of teaching., 1910
  14. Ludmerer, K., Commentary: Understanding the Flexner Report., Academic Medicine, 2010 Feb;85(2):193-6. doi: 10.1097/ACM.0b013e3181c8f1e7.
  15. Shtulman, A., Scienceblind: why our intuitive theories about the world are so often wrong., 2017
  16. A private universe and Minds of our own., Annenberg Learner, Accessed November 3, 2017
  17. Simpkin A., Schwartzstein R., Tolerating Uncertainty - The Next Medical Revolution?, NEJM, 2016 Nov 3;375(18):1713-1715.
  18. Kassirer, J., Our stubborn quest for diagnostic certainty. A cause of excessive testing., NEJM, 1989 Jun 1;320(22):1489-91.
  19. Lakatos, I., Science and Pseudoscience., BBC Radio talk, 1973
  20. New Freakonomics Radio Podcast: The Folly of Prediction by Stephen Dubner., Freakonomics, September 4, 2011, Accessed November 3, 2017
  21. How to predict the future better than anyone else by Ana Swanson., Washington Post, January 4, 2016, Accessed November 3, 2017.
  22. Keil FC. Explanation and Understanding. Annual review of psychology. 2006;57:227-254. doi:10.1146/annurev.psych.57.102904.190100.
  23. Djulbegovic B. Lifting the fog of uncertainty from the practice of medicine: Strategy revolves around evidence, decision making, and leadership. BMJ : British Medical Journal. 2004;329(7480):1419-1420.
  24. Ioannidis JPA. Why Most Published Research Findings Are False. PLoS Medicine. 2005;2(8):e124. doi:10.1371/journal.pmed.0020124.
  25. Ioannidis JPA. Why Most Clinical Research Is Not Useful. PLoS Medicine. 2016;13(6):e1002049. doi:10.1371/journal.pmed.1002049.
  26. Kerr, N., HARKing: Hypothesizing After the Results are Known., Pers Soc Psychol Rev. 1998;2(3):196-217
  27. Greenland S, Senn SJ, Rothman KJ, et al. Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations. European Journal of Epidemiology. 2016;31:337-350. doi:10.1007/s10654-016-0149-3.
  28. Wenzel, R., Medical Education in the Era of Alternative Facts., NEJM, 2017; 377:607-609 August 17, 2017
  29. Balas E, Boren S. Managing clinical knowledge for health care improvement. In: Bemmel J, McCray AT (eds). Section 1: health and clinical management. In Yearbook of Medical Informatics: Patient Centered Systems. Stuttgart, Germany: Schattauer Verlagsgesellschaft; 2000:65-70.
  30. Why We Contradict Ourselves and Confound Each Other., On Being, October 5, 2017, Accessed November 3, 2017
  31. Stemwedel, J., Brief thoughts on uncertainty. Scientific American, March 30, 2014, Accessed November 3, 2017

