Dec 4, 2017

New reading and note taking practices


Teaching Machines and programmed instruction

Depending on what definition of technology one uses, technology has always been part of education. In the 1920s Sidney Pressey invented the teaching machine, a device used to teach and assess students' knowledge via multiple-choice questions.1 These devices, not necessarily electronic, presented information in audio and visual formats and asked students to choose the right answer by pressing a button. Fragments of information were presented and tested in a logical, sequential manner, a practice known as programmed learning or programmed instruction.2 This type of instruction was, and still is, present at many levels of education and training, including medical education.3,4

B. F. Skinner viewed the traditional use of lectures, textbooks, and audio/visual material in education as inefficient and passive. He wanted students' learning to be more interactive and efficient, so he recommended using teaching machines as tutors.1 Skinner wanted to apply teaching machines differently from Pressey. According to him, an important feature of his approach was that,

The student must compose his response rather than select it from a set of alternatives, as in a multiple-choice self-rater. One reason for this is that we want him to recall rather than recognize—to make a response as well as see that it is right. Another reason is that effective multiple-choice material must contain plausible wrong responses, which are out of place in the delicate process of "shaping" behavior because they strengthen unwanted forms. Although it is much easier to build a machine to score multiple-choice answers than to evaluate a composed response, the technical advantage is outweighed by these and other considerations.1

Skinner viewed teaching machines as programmed tutors that ask students to perform progressively more difficult tasks (behaviors) until the students can perform them effectively and independently. Ultimately, the goal was "more than the acquisition of facts and terms"1 via practice and self-instruction, not rote memorization. Skinner called the people who write the material for teaching machines programmers. In order to wean students from the teaching machine, programmers had to write progressively less helpful hints while increasing the difficulty of each task.


Computers and programming

Another educator interested in student learning was computer scientist Seymour Papert. Unlike Skinner, Papert envisioned computers not just as devices for presenting questions and answers to students, but as tools students could use to learn how to think and solve problems. He developed the learning theory constructionism, in which students draw on their prior knowledge and contextual factors and "construct mental models to understand the world around them."5

Papert recognized the distinction between teaching and learning. Like Skinner, he viewed the traditional lecture-driven classroom as an "artificial and inefficient learning environment."6 He envisioned computer programming as a way to facilitate learning how to think and solve problems. In his 1980 book Mindstorms, he wrote,

In many schools today, the phrase "computer-aided instruction" means making the computer teach the child. One might say the computer is being used to program the child. In my vision, the child programs the computer and, in doing so, both acquires a sense of mastery over a piece of the most modern and powerful technology and establishes an intimate contact with some of the deepest ideas from science, from mathematics, and from the art of intellectual model building.6

Papert's approach to learning how the mind thinks via the construction of mental models is not foreign to cognitive science and philosophy of mind, where the brain is viewed as an information-processing organ that creates representations. Our mental models have helped us use tools more effectively since before the discovery of fire, the wheel, and definitely sliced bread. The problem is that education is not keeping up; instead, it teaches only the findings of science without its critical method, and/or uses technology only to impart more lectures (now TED-talk lectures) and answer a greater number of multiple-choice questions.


Coding personal wikis

Getting an education where we mainly use paper notebooks and textbooks, pencils and pens, or computers to point and click is a thing of the early twentieth century. Education should start teaching students how to harness computers for their own lifelong learning. Computers can help with our cognitive limitations, just as any other tool in history has helped advance the human race when used correctly. Relying on technology only as a tool for information delivery and MCQ exams, à la Khan Academy, is to perpetuate the inefficient and ineffective practices B. F. Skinner warned us about decades ago. Education should reflect what science has been telling us since its invention: that knowledge changes all the time and we are deep in error.

Our habits of reading and writing with the help of computers, the Internet, and the WWW are different from those of the twentieth century. The WWW gives us more access to information thanks to the Open Access and Open Educational Resources movements. We can learn more from experts when they post their thoughts and links on social media websites. It is important to note that all information (re)sources, on the Internet or off, are prone to error; this is what science has taught us.

The free and open movement on the Internet also provides tools that are freely licensed and that anyone can copy and modify as they see fit. This aligns with Papert's vision of the person programming the computer instead of the computer programming the person. One of these tools is TiddlyWiki, which I have been exploring lately. (I don't have the expertise to recommend this product, so consult an expert if you are interested in using it.) Unlike paper notebooks, wikis make notetaking, linking, and editing information easy, which is among their most attractive features. Printed and online textbooks are sources of information with a linear and narrow format that does not reflect real-world practice. A personal wiki can integrate information not only from textbooks but also from other sources to give a better picture of what a personal practice looks like. Another great advantage of wikis is that they can be written collaboratively and fact-checked by their authors; see Wikipedia.

The concept of reading for notetaking on a wiki is quite different from that of notetaking on paper and in printed textbooks. It also requires knowledge of wikitext and, in some instances, other languages. Wikis can help us better organize and identify knowledge gaps and, with the help of hypertext and tagging, get a better picture of our semantic knowledge (see http://tiddlymap.org/). Information can also be presented in quiz format (http://tw5magick.tiddlyspot.com/). One of the TiddlyWikis that has impressed me the most is by Alberto Molina Perez. He has modified the code several times to fit his own notetaking needs (see image above) and has also used it to post his thesis.
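The linking-and-tagging idea above can be made concrete with a minimal sketch in Python. This is illustrative only, not how TiddlyWiki actually works, and the note titles are invented: it shows the two reverse lookups, a tag index and backlinks, that wikis make cheap and paper notebooks make expensive.

```python
# Hypothetical notes: each has tags and outgoing links (wiki-style).
notes = {
    "Atrial fibrillation": {"tags": ["cardiology"], "links": ["EKG", "Hypertension"]},
    "EKG": {"tags": ["diagnostics"], "links": []},
    "Hypertension": {"tags": ["cardiology"], "links": ["EKG"]},
}

def tag_index(notes):
    """Map each tag to the titles of the notes that carry it."""
    index = {}
    for title, note in notes.items():
        for tag in note["tags"]:
            index.setdefault(tag, []).append(title)
    return index

def backlinks(notes, target):
    """Return every note that links to `target` -- the reverse lookup
    that hypertext gives us for free."""
    return [title for title, note in notes.items() if target in note["links"]]

print(tag_index(notes))
# {'cardiology': ['Atrial fibrillation', 'Hypertension'], 'diagnostics': ['EKG']}
print(backlinks(notes, "EKG"))
# ['Atrial fibrillation', 'Hypertension']
```

Scaled up to thousands of notes, this kind of tag-and-link structure is what tools such as TiddlyMap render as a navigable graph of one's knowledge.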


No one can do the reading or note taking for us

In 1940 Mortimer Adler wrote How to Read a Book, which I think is still relevant today. He emphasizes the practice of reading comprehension, in which a person reads for deep understanding and connects concepts from various disciplines where possible. He added,

But, after all, we still have to read the periodicals which accomplish these extraordinary digests of current news and information. If we wish to be informed, we cannot avoid the task of reading, no matter how good the digests are. And the task of reading the digests is, in the last analysis, the same task as that which is performed by the editors of these magazines on the original materials they make available in more compact form. They have saved us labor, so far as the extent of our reading is concerned, but they have not and cannot entirely save us the trouble of reading. In a sense, the function they perform profits us only if we can read their digests of information as well as they have done the prior reading in order to give us the digests.7

No one can do the reading or notetaking for us, especially when science and knowledge are constantly undermined. There are practices that give an illusion of understanding or security when in reality what we are getting is, at best, superficial knowledge. It is important to keep track of our thoughts and their sources in a reliable place where we can go back and make modifications when necessary.




References:

  1. Skinner, B. F., Teaching Machines. Science, Vol. 128, No. 3330, October 24, 1958. DOI: 10.1126/science.128.3330.969
  2. "Programmed learning." Wikipedia, The Free Encyclopedia, 9 Nov. 2017. Accessed 2 Dec. 2017.
  3. Owen SG, Hall R, Anderson J, Smart GA. Programmed learning in medical education. An experimental comparison of programmed instruction by teaching machine with conventional lecturing in the teaching of electrocardiography to final year medical students. Postgraduate Medical Journal. 1965;41(474):201-205.
  4. Owen SG, Hall R, Waller IB. Use of a Teaching Machine in Medical Education; Preliminary Experience with a Programme in Electrocardiography. Postgraduate Medical Journal. 1964;40(460):59-65.
  5. "Constructionism (learning theory)." Wikipedia, The Free Encyclopedia, 20 Sep. 2017. Accessed 4 Dec. 2017.
  6. Papert, S., Mindstorms: Children, Computers, and Powerful Ideas. 1980.
  7. Adler, M., and Van Doren, C., How to Read a Book: The Art of Getting a Liberal Education. 1972.

Image 1 source: Skinner, B.F., Teaching Machines., Science, Vol. 128, Number 3330, October 24, 1958., DOI: 10.1126/science.128.3330.969

Image 2 source: https://pixabay.com/en/anatomy-biology-brain-thought-mind-1751201/

Image 3 source: http://bottomtabs.tiddlyspot.com/#William%20Shakespeare

Nov 23, 2017

There's none so blind as those who will not see

Contrary to popular belief, the movement to teach physicians to think like scientists, a goal unmet to this day, began before the 1910 Flexner Report. In the mid-to-late 1800s, American physicians went to Germany to "study the fundamental medical sciences and learn the techniques of experimental medical research," reports medical historian Kenneth Ludmerer.1 At that time physicians in the U.S. did not see the relevance of laboratory research to their practice and were mostly concerned with clinical medicine. After graduating from medical school, Henry P. Bowditch, nephew of Henry Ingersoll Bowditch, went to study in Germany and later returned to the U.S., serving as dean of Harvard Medical School from 1883 to 1893.2 During his tenure he advanced medical education by introducing laboratory research, creating one of the best programs in the country at that time. In 1911 one of his students, Charles S. Minot, wrote,

He stood for the highest ideals of progress and maintained always that the old-fashioned "practical" physicians must be replaced by men scientifically trained and animated by the scientific spirit.3

In the late 1800s medical knowledge grew exponentially as a result of new scientific discoveries and of changes in the understanding of diseases and treatments relevant to daily clinical practice. Medical information was also spreading faster than before with the help of new communication technologies. "Information overload from proliferating medical knowledge was noted by John Shaw Billings in 1881," Richard Lehman wrote in a recent article.4 According to Ludmerer, the physicians who returned from Germany were "versed in both bedside medicine and the application of the knowledge and techniques of the fundamental sciences to the study of human diseases."1 These novel practitioners, clinical scientists, were placed in the clinical departments of medical schools, where they shared the responsibility of teaching the new generations of physicians. Clinical scientists put less emphasis on didactic lectures, rote memorization, textbook knowledge, and authority; instead, medical education focused on the development of "problem solvers and critical thinkers who knew how to discover and evaluate information for themselves."5 These were also characteristics emphasized by the philosopher and educator John Dewey who, in 1910, wrote that science "has been taught too much as an accumulation of ready-made material, with which students are to be made familiar, not enough as a method of thinking, an attitude of mind, after the pattern of which mental habits are to be transformed."6

Science as a way of thinking was also emphasized by William Osler7 and by W. S. Thayer, a former student of Osler, who in 1932 wrote: "our main objects should be to impress on the student that the clinician must reason for himself in a scientific fashion."8 The misunderstanding of science as a body of facts rather than a method was also addressed by Carl Sagan, who said that "science as a method rather than a body of knowledge is not widely appreciated outside of science or indeed, I'm sorry to say, in some of the corridors inside of science."9 In his 1910 report Abraham Flexner proposed training physicians in scientific reasoning and discouraged the parrot-like memorization of facts. Flexner understood both fields and helped dispel the false dichotomy between the two when he wrote: "progress of science and the scientific or intelligent practice of medicine employ, therefore, exactly the same technique."10 Unfortunately, the misunderstanding of Flexner's conceptual reform continues to this day, a problem Kenneth Ludmerer has tried to clarify. One of Flexner's main goals, according to Ludmerer, was to develop "clinicians who could reason scientifically in the care of patients," with the understanding that a diagnosis was a hypothesis to be tested.1

The text above demonstrates that the current hype about information overload, changing medical knowledge, and new communication technologies is not new to medicine and medical education. Unfortunately, the medical curriculum continues to be stuffed with irrelevant information while not offering a strong background in how science works, current problems in publication, and questionable scientific practices. This creates an intellectually vulnerable environment where unfounded claims may be adopted without close scrutiny. This is something Carl Sagan warned us about when he wrote: "If you've never heard of science (to say nothing of how it works), you can hardly be aware you're embracing pseudoscience."11 Understanding science's critical method, instead of just accepting its products, is of the utmost priority, Sagan argued.11 Richard Feynman likewise wrote that scientific integrity is important if we don't want to fool ourselves or others.12

The undermining of science as an inquiry process and the inappropriate use of its technical terms were addressed by Alan Sokal and co-author Jean Bricmont in Fashionable Nonsense. What worries them most, they state, "in our opinion, is the adverse effect that abandoning clear thinking and clear writing has on teaching and culture. Students learn to repeat and to embellish discourses that they only barely understand. They can even, if they are lucky, make an academic career out of it by becoming expert in the manipulation of an erudite jargon."13 The misunderstanding and misuse of science are more common than we might imagine, especially if we are not trained in how it works.14,15,16 The friction between the sciences and the humanities continues to this day, as illustrated in a recent exchange between Steven Pinker and Leon Wieseltier. Pinker states that sources of error in our beliefs include "faith, revelation, dogma, authority, charisma, conventional wisdom, the invigorating glow of subjective certainty," and that what needs to be encouraged is "skepticism, open debate, formal precision, and empirical tests."17 In a different article, when prompted about adopting new technology, Pinker wrote,

Take the intellectual values that are timeless and indisputable: objectivity, truth, factual discovery, soundness of argument, insight, explanatory depth, openness to challenging ideas, scrutiny of received dogma, overturning of myth and superstition. Now ask, are new technologies enhancing or undermining those values? And as you answer, take care to judge the old and new eras objectively, rather than giving a free pass to whatever you got used to when you were in your 20s.18

Without understanding how science works, current fads, labels, and pseudo-debates about technology and information overload only serve as distractions from the real debate: Why isn't learning how science works a top priority in medical education? How can physicians claim expertise in taking care of patients without understanding how science works and its current problems? Doesn't expertise include mastering a domain's history and philosophy, two subjects severely absent in medical education? Other subjects absent from medical education include statistics, decision theory, and cognitive science. Sander Greenland has raised related issues. He states,

I believe other educational omissions besides causal models have been major contributors to the currently lamented research ‘‘crises’’. Two topics in dire need of early and continuing education are basic logic and cognitive psychology (Gilovich et al. 2002; Lash 2007). Especially important are the logical and statistical fallacies manifested in routine misinterpretations of basic statistics, and the biases built into current teaching and practice that encourage these fallacies (Box 1990; Greenland 2011, 2012b, c, 2016; Greenland et al. 2016). Their persistence attests to the fact that degrees in statistics and medicine do not require substantial training in or understanding of scientific research or reasoning, but nonetheless serve as credentials licensing expressions of scientific certainty (Greenland 2011, 2012c).19

There's none so blind as those who will not see



References:

  1. Ludmerer, K., Let me heal: The opportunity to preserve excellence in American medicine., 2015
  2. "Henry Pickering Bowditch.", Wikipedia: The Free Encyclopedia. Wikimedia Foundation, Inc. 22 July 2004, Accessed November 22, 2017
  3. Minot, C. S., Henry Pickering Bowditch., Science, 1911 Apr 21;33(851):598-601.
  4. Richard Lehman. Sharing as the Future of Medicine. JAMA Intern Med. 2017;177(9):1237–1238. doi:10.1001/jamainternmed.2017.2371
  5. Ludmerer, Kenneth M. "Commentary: Understanding the Flexner Report." Academic Medicine 85.2 (2010): 193-96.
  6. Dewey, J., Science as Subject-Matter and as Method., Science, vol. 31, no. 787, 1910, pp. 121–127
  7. Osler, W., The old humanities and the new science., Br Med Jrnl, July 5, 1919, 2(3053):1-7
  8. W. S. Thayer,. Thoughts on medical education in the United States., JAMA. 1932;99(1):3–9. doi:10.1001/jama.1932.02740530005002
  9. Kerry Klein and Sarah Crespi, Science 12 Oct 2012: Vol. 338, Issue 6104, pp. 274, DOI: 10.1126/science.338.6104.274-b
  10. Flexner, A., Medical education in the United States and Canada., 1910
  11. Carl Sagan, The Demon-Haunted World: Science As a Candle in the Dark, 1996
  12. Feynman, R., Cargo Cult Science., 1974
  13. Sokal, A., Bricmont, J., Fashionable Nonsense: Postmodern intellectual's abuse of science., 1998
  14. Ioannidis, J., Evidence-based medicine has been hijacked: a report to David Sackett., J Clin Epidemiol. 2016 May;73:82-6. doi: 10.1016/j.jclinepi.2016.02.012.
  15. Ioannidis, J., Hijacked evidence-based medicine: stay the course and throw the pirates overboard., J Clin Epidemiol. 2017 Apr;84:11-13. doi: 10.1016/j.jclinepi.2017.02.001.
  16. Ioannidis, J., How to survive the medical misinformation mess., Eur J Clin Invest. 2017 Nov;47(11):795-802. doi: 10.1111/eci.12834.
  17. Pinker, S., Science is not your enemy., New Republic, August 6, 2013, Accessed November 23, 2017
  18. Pinker, S., The age of the informavore., edge, October 25, 2009, Accessed November 23, 2017
  19. Sander Greenland, For and Against Methodologies: Some Perspectives on Recent Causal and Statistical Inference Debates. Eur J Epidemiol. 2017 Jan;32(1):3-20.

Nov 20, 2017

Noise hides in plain sight

Information is highly valued, so it is no surprise that it commands a high financial and emotional price. There are different ways of getting access to information, ranging from buying books and textbooks, going to conferences, getting an education, paying a private tutor, paying for Internet access, and being part of a particular culture, to paying to read research articles. Although information is ubiquitous, its quality and value are usually negotiated between the receiver and the sender. This, of course, does not mean that all information is available to everyone at all times. We can be ignorant of certain information, and we can also be ignorant of being ignorant of it. Possessing information does not necessarily make us well-informed, since information can also, unbeknownst to us, misinform us.

Cognitive science studies the brain as an information-processing organ whose unobservable functions can be scientifically studied via observable behaviors. This is in contrast to behaviorism, which attributes the causes of behavior to the environment but does not theorize about brain processes, except to say that neurons change over time.1 Cognitive scientists use concepts from computer science, with modifications, to explain, design, and test models of brain processes and human behavior. One of the most important concepts borrowed from that field is Claude Shannon's information theory,2

Information theory studies the transmission, processing, extraction, and utilization of information. Abstractly, information can be thought of as the resolution of uncertainty. In the case of communication of information over a noisy channel, [...] "information" is thought of as a set of possible messages, where the goal is to send these messages over a noisy channel, and then to have the receiver reconstruct the message with low probability of error, in spite of the channel noise.3
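Shannon's idea that information is the resolution of uncertainty can be made concrete in a few lines of Python; this is a generic sketch of the entropy formula, not part of any model cited here. Entropy measures, in bits, the average uncertainty a source's messages resolve.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)).
    The average uncertainty resolved per message from a source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin resolves 1 bit per toss; a heavily biased coin resolves
# less, because its outcome is already largely predictable.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # ~0.47
```

A noisy channel, in Shannon's framework, then limits how many of those bits the receiver can recover reliably.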

Gallistel and King have studied how the brain, in noisy environments, extracts information and creates unobservable mental representations for immediate or later use.4 If the aim of a system is to be accurate, understanding the complex relationship between signal, noise, and bias is of crucial importance. This relationship, difficult to study and long examined under decision theory, was illustrated by Daniel Kahneman et al. in a recent article.5
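The bias/noise distinction that Kahneman et al. draw can be sketched in a small simulation (my own illustrative numbers, assuming Gaussian errors; not an example from their article): bias shifts every judgment in the same direction, while noise scatters judgments around the truth.

```python
import random
import statistics

random.seed(0)
TRUE_VALUE = 100.0

def judgments(bias, noise_sd, n=10_000):
    """Simulate n judgments of a true value: a systematic offset (bias)
    plus random scatter (noise)."""
    return [TRUE_VALUE + bias + random.gauss(0, noise_sd) for _ in range(n)]

biased = judgments(bias=10, noise_sd=1)   # consistently wrong
noisy = judgments(bias=0, noise_sd=10)    # inconsistently wrong

# Bias shows up as a shifted mean; noise shows up as a large spread.
print(round(statistics.mean(biased)), round(statistics.stdev(biased), 1))
print(round(statistics.mean(noisy)), round(statistics.stdev(noisy), 1))
```

Averaging many judgments cancels the noise but leaves the bias untouched, which is one reason the two are treated as distinct problems.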



Kahneman et al. state that noise is "always undesirable—and sometimes disastrous." They add that it is beneficial "to know about bias and noise" in decision-making, but that this can be challenging due to difficulties in error measurement and uncertainty about decision outcomes.5 If decision making is a function of mental representations built from acquired information, then how our brains process information, and the sources of that information, need to be examined more closely as well. It is not uncommon to find medical information in different settings, including lectures and books, displayed without its evidence, as follows:

Signs and symptoms for atrial fibrillation:

  • Fatigue
  • Tachypnea, dyspnea
  • Palpitations
  • Lightheadedness

Differential diagnosis for atrial fibrillation:

  • MI
  • MAT
  • Atrial flutter
  • Tachycardia

Etiology of atrial fibrillation:

  • Hypertension
  • Anemia
  • Thyrotoxicosis
  • COPD
  • Idiopathic

And this may be followed by the memorization of a "typical" presentation of a patient with a particular disease and treatment:

A 57-year-old man with a history of hypertension presents to the emergency department with a 3-hour history of palpitations, shortness of breath, and lightheadedness. An EKG shows an irregularly irregular rhythm, and his blood pressure is 105/70.

What this creates is not expertise in medical practice but storytellers great at just-so stories, with no regard for accuracy, noise reduction, and uncertainty. This process of learning medical information neglects the fact that the probability of the data (signs, symptoms, test results, etc.) given a disease is not the same as the probability of the disease given the data. Presenting information as shown above does not fit a realistic framework in which medical knowledge is uncertain and continuously changing, signs and symptoms vary, diagnostic tests are imperfect, and treatments are not 100% effective. This decontextualized framework of learning and assessing information is flawed. Furthermore, it creates an illusion of validity and overconfidence through the use of vague verbiage. Kahneman et al. state that the problem with not being aware of noise is that "people do not go through life imagining plausible alternatives to every judgment they make."5 Imagining such alternatives, on the other hand, is an important feature of the scientific method and the purpose of the differential diagnosis in medical practice, if done well. But accuracy and noise reduction require understanding how science works, including mastering the language of probability and uncertainty.
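The gap between the probability of the data given a disease and the probability of the disease given the data is exactly what Bayes' theorem quantifies. A short sketch in Python with made-up numbers (the prevalence, sensitivity, and specificity below are illustrative, not clinical data):

```python
def posterior(prevalence, sensitivity, specificity):
    """P(disease | positive test) via Bayes' theorem.
    sensitivity is P(positive | disease); the point is that the two
    conditional probabilities are not interchangeable."""
    p_positive = (sensitivity * prevalence
                  + (1 - specificity) * (1 - prevalence))
    return sensitivity * prevalence / p_positive

# A test that is 90% sensitive and 90% specific, for a disease with
# 1% prevalence: most positive results are still false positives.
print(round(posterior(0.01, 0.90, 0.90), 3))  # 0.083
```

Even a fairly accurate test leaves the probability of disease under 10% here; conflating P(positive | disease) with P(disease | positive) would overestimate it roughly tenfold.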



Acquiring more facts doesn't necessarily improve the accuracy of an intelligent system; as Steven Pinker reminds us, such a system "must be equipped with a smaller list of core truths and a set of rules to deduce their implications."6 Information should not be viewed as a neutral commodity used to pass exams and fill books; it is part of the process of building our mental representations for accurate decision-making. Daniel Dennett claims that semantic "information is valuable—misinformation and disinformation are either pathologies or parasitic perversions of the default cases," adding that we "can't be misinformed by distinctions we are not equipped to make."7 If education is not helping us develop a system to reduce noise, what is the purpose of education?

If information is a distinction that makes a difference, we are invited to ask: A difference to whom? Cui bono? Who benefits?

Daniel Dennett


References:

  1. Epstein, R., The empty brain: Your brain does not process information, retrieve knowledge or store memories. In short: your brain is not a computer., Aeon, May 18, 2016, Accessed November 20, 2017
  2. Shannon, C. E. (1948), A Mathematical Theory of Communication. Bell System Technical Journal, 27: 379–423. doi:10.1002/j.1538-7305.1948.tb01338.x
  3. "Information theory." Wikipedia: The Free Encyclopedia. Wikimedia Foundation, Inc., November 13, 2017, Accessed November 20, 2017
  4. Gallistel, C. R., and Adam Philip. King. Memory and the Computational Brain: Why Cognitive Science will Transform Neuroscience. Hoboken, John Wiley & Sons, 2011
  5. Kahneman, D., et al., Noise: How to Overcome the High, Hidden Cost of Inconsistent Decision Making., HBR, October 2016, Accessed November 20, 2017
  6. Pinker, S., How the mind works., 1997
  7. Dennett, D., From Bacteria to Bach and Back: The Evolutions of Minds., 2017

Image 2 source: https://xkcd.com/795/

Nov 15, 2017

What is the purpose of education?

"Intuition is nothing more than recognition." Herbert Simon

The purpose of education means different things to different people. According to Steven Pinker, the purpose of education is "to make up for the shortcomings in our instinctive ways of thinking about the physical and social world."1 The most effective method for understanding how the world works and correcting our faulty intuitions is the scientific method.2 An education that does not teach how science works is not just a bad education; it is also dangerous to a functioning democracy.3 Susan Jacoby has written about the effects that an education without science has on our society and about how difficult it is to overcome our ignorance. She found that students and teachers misunderstand how science works and that teachers tend to avoid topics that challenge students' prior beliefs. She believes the purpose of teaching is to "replace ignorance with knowledge—a process that generally does involve a fair amount of argument."4 This is much in line with what John Dewey said in 1909: "teaching must not only transform natural tendencies into trained habits of thought, but must also fortify the mind against irrational tendencies current in the social environment, and help displace erroneous habits already produced."5



An education that appeals to intuition disregards the fact that our minds are predisposed to cognitive errors, and it does not acknowledge how challenging it is to obtain knowledge about the world. John Dewey wrote that education serves not only "to safeguard an individual against the besetting erroneous tendencies of his own mind—its rashness, presumption, and preference of what chimes with self-interest to objective evidence—but also to undermine and destroy the accumulated and self-perpetuating prejudices of long ages."5 According to Pinker, other sources of error include "faith, revelation, dogma, authority, charisma, conventional wisdom, the invigorating glow of subjective certainty." To pursue knowledge, he suggests, "we must cultivate work-arounds for our cognitive limitations, including skepticism, open debate, formal precision, and empirical tests, often requiring feats of ingenuity."6 Science is not only more accurate than our intuition at knowledge acquisition; it also does a better job of teaching us how the brain gives rise to the mind.

We can't directly observe our thought processes, which makes it difficult to understand how our brains function. Daniel Dennett wrote, "we don't see, or hear, or feel, the complicated neural machinery churning away in our brains but have to settle for an interpreted, digested version, a user-illusion that is so familiar to us that we take it not just for reality but also for the most indubitable and intimately known reality of all. [...] Our access to our own thinking, and especially to the causation and dynamics of its subpersonal parts, is really no better than our access to our digestive processes..."7 Since we can't directly observe our thought processes, or those of others for that matter, we must rely on other means to infer what happens in the brain. The brain is an information-processing organ with no direct access to external reality, and we have no direct access to its processes. The brain creates what Dennett calls the user-illusion, similar to the desktop interface of a computer: we manipulate icons on the screen while remaining ignorant of the hardware processes beneath. And just as we can download software onto electronic devices, the brain acquires "software" for the mind by processing information from the environment, Dennett states.


These thought processes have been called mental representations and computations, and they can be studied scientifically without having to "ignore the evidence of our own eyes and claim that human beings are bundles of conditioned associations, puppets of the genes, or followers of brutish instincts," Steven Pinker wrote.8 Although mental representations are not directly observable, it is important not to attribute them to a Cartesian theater in the brain (pictured above) or to some indescribable thing that floats above our heads in the ether. Some scientists might claim that science has no business theorizing about what is not directly observable, but scientific realists think otherwise.9 Another important feature of mental representations is that they are not definitions. Pinker writes,

"a definition (which admittedly is always incomplete) is not the same thing as a semantic representation. [...] A semantic representation is a person's knowledge of the meaning of an English word in conceptual structure (the language of thought), processed by a system of the brain that manipulates chunks of conceptual structure and relates them to the senses. Definitions can afford to be incomplete because they can leave a lot to the imagination of a speaker of the language. Semantic representations have to be more explicit because they are the imagination of the speaker of the language."1

This is an important distinction, because memorized definitions can easily be confused with mental representations. If we want accurate mental representations, then learning how our minds work, and how that knowledge applies to everyday life, is an important component of decision making. Another important distinction is that between intuition, judgment, and decision making. According to Daniel Kahneman, we all develop expert intuition in various aspects of our lives given enough exposure and frequent feedback. He also wrote that "the accurate intuitions of experts" under uncertainty "are better explained by the effects of prolonged practice than by heuristics."10 It is often thought that intuition dictates the whole process of decision making in medicine, but this, in my opinion, is wrong. It fails to take into account the effortful process of adding and subtracting diseases from the differential diagnosis, a process that treats diseases as hypotheses to be tested rather than commandments set in stone. It also neglects the work of reducing contextual uncertainty in communication among all parties involved in the decision, as well as the uncertainty in medical and scientific knowledge itself: diagnostic tests are not perfect, and medical treatments are not 100% effective. If exposure and frequent feedback are important for developing intuition, then I would expect expert intuition to vary as disease prevalence varies across regions and over time. That makes intuition a dynamic entity, not a static trait. Performance based on intuition alone, therefore, cannot explain expertise in clinical decision making.
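Treating a diagnosis as a hypothesis to be tested can be sketched with Bayes' rule applied to an imperfect diagnostic test. The sensitivity, specificity, and prevalence values below are illustrative assumptions, not real clinical figures:

```python
# A minimal sketch: the probability of disease given a positive result from
# an imperfect test, via Bayes' rule. All numbers are hypothetical.

def posterior_probability(prevalence, sensitivity, specificity):
    """P(disease | positive test)."""
    true_positives = sensitivity * prevalence
    false_positives = (1.0 - specificity) * (1.0 - prevalence)
    return true_positives / (true_positives + false_positives)

# The same positive result carries different weight where prevalence differs,
# one reason to expect expert intuition to vary across regions and time.
rare = posterior_probability(prevalence=0.01, sensitivity=0.90, specificity=0.95)
common = posterior_probability(prevalence=0.20, sensitivity=0.90, specificity=0.95)
```

With these assumed numbers, the posterior rises from roughly 15% in the low-prevalence region to roughly 82% in the high-prevalence one, even though the test itself is unchanged.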

Although expert intuition is more accurate than that of the non-expert, it is not 100% accurate. Daniel Kahneman reminds us that expert intuition "strikes us as magical, but it is not. [...] The psychology of accurate intuition involves no magic. [...] You can feel [Herbert] Simon’s impatience with the mythologizing of expert intuition when he writes: “The situation has provided a cue; this cue has given the expert access to information stored in memory, and the information provides the answer. Intuition is nothing more and nothing less than recognition." [...] Simon's point is that the miracles of expert intuition have the same character"10 as a child's recognition of familiar animals. This aligns well with Dennett's views: intuition, according to him, "is simply knowing something without knowing how you got there," and he adds, "It's not hard to program intuition into a computer. You simply write a computer program that solves any problem at all and when you ask it how did it get the answer it says I don't know it just came to me."11 It is difficult, then, to accept intuition alone as the justification for expertise training.
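Simon's claim that intuition "is nothing more and nothing less than recognition" can be sketched in a few lines: a cue retrieves information stored in memory, and the stored information is the answer, with no trace of the retrieval available. The cue/answer pairs below are made up for illustration:

```python
# A playful sketch of intuition as recognition. The "expert" reports the
# answer, but not the lookup process that produced it. Hypothetical data.

EXPERT_MEMORY = {
    "long neck, spots, four legs": "giraffe",
    "mane, hooves, neighs": "horse",
}

def intuit(cue):
    """Return an answer without any account of how it was produced."""
    return EXPERT_MEMORY.get(cue, "I don't know; nothing comes to me.")

answer = intuit("long neck, spots, four legs")  # the answer "just comes"
```

The point of the sketch is Dennett's: asked how it got the answer, the program can only say it just came to it, because the machinery (the lookup) is not part of its report.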



Using Plato's allegory of the cave, Steven Pinker writes that mental representations are the shadows and the skull is our cave.8 Some tips for improving our thinking have been discussed above, but a good education should also emphasize the implications of the scientific method and of Charles Darwin's findings for our understanding of ourselves and the world. Pinker writes that traditional education "is based in large part on the Blank Slate: children come to school empty and have knowledge deposited in them, to be reproduced later on tests," and later adds that education "is neither writing on a blank slate nor allowing the child's nobility to come into flower [a progressive view of education]. Rather, education is a technology that tries to make up for what the human mind is innately bad at."12 Here are a few subjects we are innately quite bad at: revising our prior beliefs, scientific reasoning, statistics, decision making, probability, and dealing with uncertainty. Studying these subjects can help us compensate for our cognitive limitations and improve our decision-making accuracy. A good education should help us become critical thinkers, able to learn independently from primary sources without depending on others to interpret the evidence. It should teach us how to evaluate evidence from all kinds of sources, even when it comes from experts or other authorities in our own domain. On retention, Pinker writes that "people retain more when they are called on to think about what they are learning than when they are asked to pluck fact after fact out of lectures and file them away in memory."1 How, then, can education help us step out of the cave and get a more accurate picture of reality?
Pinker adds that an "education is likely to succeed not by trying to implant abstract statements in empty minds but by taking the mental models that are our standard equipment, applying them to new subjects in selective analogies, and assembling them into new and more sophisticated combinations. The view from language shows us the cave we inhabit, and also the best way out of it."



References:

  1. Pinker, S., The stuff of thought: Language as a Window into Human Nature., 2008
  2. Shtulman, A., Scienceblind: Why Our Intuitive Theories About the World Are So Often Wrong., 2017
  3. Sagan, C., The demon-haunted world: science as a candle in the dark., 1997
  4. Jacoby, S., The age of American unreason., 2008
  5. Dewey, J., How we think., 1909
  6. Pinker, S., Science is not your enemy., NewRepublic, August 6, 2013, Accessed November 13, 2017
  7. Dennett, D., From Bacteria to Bach and Back: The Evolution of Minds., 2017
  8. Pinker, S., How the mind works., 1997
  9. Chakravartty, Anjan, "Scientific Realism", The Stanford Encyclopedia of Philosophy (Summer 2017 Edition), Edward N. Zalta (ed.), URL = . Accessed November 14, 2017
  10. Kahneman, D., Thinking, fast and slow., 2011
  11. Buckley, A., Is consciousness just an illusion?, BBC, April 4, 2017, Accessed November 14, 2017
  12. Pinker, S., The Blank Slate: The Modern Denial of Human Nature., 2002

Image 2 source: https://xkcd.com/876/

Nov 5, 2017

Change we can believe in

In a 2013 article for The Guardian, the biophysicist Sylvia McLain wrote,1

That most scientific studies are ultimately wrong is normal for science. There are more theories in the graveyard of science than theories that stand the test of time. Why? Because new data is always emerging and theories have to be adjusted. Theories are only as good as theories are, until new data comes along and ruins them. Theories give a best guess at what is going on based on things we observe (data), but they are not immutable. If you only have a few data points, then your working theory is more likely to turn out to be wrong. This is not news to science, this is science.

In 1956 Richard Feynman gave a talk in which he emphasized how important uncertainty is in science: "when we know that we actually do live in uncertainty, then we ought to admit it... this attitude of mind - this attitude of uncertainty - is vital to the scientist."2 For Karl Popper, science is not a search for certainty; it is a search for "objectively true, explanatory theories."3 Because "knowledge is fallible and therefore uncertain," he said, science is about "discovering and eliminating" mistakes.3 Carl Sagan likewise believed in our human fallibility and noted that we "will always be mired in error."4 According to Sagan, improvement comes from acknowledging our limitations and using the scientific method to correct our errors. Science, said Popper, helps us make progress and learn from our mistakes because criticism is indispensable for error correction.5

The details of how science makes progress are still a hot topic.6 Reading the history of science, we can appreciate more than just how wrong we have been in the past; we can also appreciate how long it took to correct each error, and what mechanisms impaired or enabled those corrections.7 We used to believe that the Earth was flat, that the Earth was the center of the universe, in the theory of phlogiston, in alchemy, and in other falsehoods. The history of medicine is also replete with errors. We used to believe in humorism, a theory of disease supported by Galen, and in bloodletting as its treatment; Rudolf Virchow later helped debunk humorism. Andreas Vesalius and William Harvey challenged and corrected many errors established by Galen's work in human anatomy and physiology. Before the germ theory of disease, physicians and scientists believed in the miasma theory of disease, also endorsed by Galen. These are just a few examples of how we have established erroneous knowledge, but also made progress, under the practice of the scientific method. The list goes on, but the point is that while these false theories were current we did not know they were wrong. As noted above, this is how science makes progress, and the practice of science itself has also evolved. The historian of science David Wootton stated in his recent book that what we learn from the history of science is that "nothing endures... so too our most cherished theories will one day be supplanted."7 This echoes what Richard Feynman said in 1956: "It is impossible to find an answer which someday will not be found to be wrong."2 If we adopt the scientific attitude and accept our human fallibility and the uncertainty of knowledge, it should not surprise us to encounter evidence that conflicts with current medical practice.
I am sure there is more evidence in conflict with current practices, but here are just a few examples: CK-MB testing for myocardial infarction,8 PPI infusions in upper GI bleeds,9 fluid management in sepsis,10 and PCI for stable angina.11 In developing expertise with a scientific attitude, therefore, how one learns about human fallibility and the uncertainty of knowledge matters.

Our education, especially that from textbooks, does not expose us to the conflicts encountered between changing paradigms. Instead, we are presented with a linear picture of progress, which Thomas Kuhn said is essentially a "drastic distortion" of the past.12 He called it a "narrow and rigid education" that misrepresents not only how progress is made but also how science works.12 A major change in medical education occurred in 1910, when the educator Abraham Flexner wrote a report on the state of medical education in the US and Canada.13 At the time, much of medical education and practice was suboptimal and lacked a strong scientific foundation compared with that in other parts of the world. Among Flexner's recommendations was to develop physicians who did not act like parrots but instead thought like scientists, applying scientific principles to problem-solving. Misunderstanding and poor implementation of Flexner's ideas remain present to this day, as Kenneth Ludmerer reminds us.14 The idea of developing a scientific attitude, and of characterizing science as more than a "body of knowledge," has been highlighted by Carl Sagan and more recently by Andrew Shtulman in his book Scienceblind: Why Our Intuitive Theories About the World Are So Often Wrong.15 Among other things, Shtulman has studied how highly educated people who support Charles Darwin's theory of evolution can still hold the wrong explanation of that theory. This illusion of understanding has been demonstrated in a documentary in which bright students and college graduates were unaware of their own misconceptions.16

Science and scientific explanations are not perfect and are always incomplete, but science is the best method we have to correct our errors and learn how nature works. Medicine is also not perfect; like science, it uses imperfect tools and inferences laden with all kinds of uncertainty. Unfortunately, these aspects of medicine continue to be neglected,17,18 making its practice less scientific, more dogmatic, and less realistic. Science and medicine both benefit from the same critical attitude: recognizing that knowledge is fallible and under constant revision. It therefore seems to me a false dichotomy to talk about medical knowledge as if it were completely separate from scientific knowledge. Unlike religions and cults, science does not demand belief in stories or authorities without considering evidence. Instead, science asks us, as medicine should, to judge the evidence and proportion our beliefs and actions accordingly. Unlike medicine, science emphasizes the distinction between having a scientific attitude, which helps "call the bluff of those who only pretend to knowledge,"4 and merely memorizing its products. The discrepancy in medical practice arises when its attitude fails to reckon with our human nature and the uncertainty of knowledge.

Unlike dogmatists, scientists are skeptical even of their own theories, Imre Lakatos said, adding that a "blind commitment to a theory is not an intellectual virtue: it is an intellectual crime."19 Philip Tetlock found that poor predictions were associated with a dogmatic attitude,20 while better predictors treated their beliefs as "hypotheses to be tested, not treasures to be guarded."21 Most common forms of assessment today do not take uncertainty into account, while providing the illusion that memorizing context-free bullet points is enough to understand the complexities of knowledge and decision making. These assessments can be answered with superficial knowledge and never ask for evidence of understanding. This creates an illusion of explanatory depth, in which explanations rely on intuitions and are more likely to be wrong and severely incomplete.15,22 Deep knowledge requires not only assessment of the evidence but also an understanding of how uncertainty affects that evidence and its application. Benjamin Djulbegovic, one of the major authors on uncertainty in clinical medicine, quoted Kenneth Ludmerer in writing that the "failure to train doctors about clinical uncertainty has been called "the greatest deficiency of medical education throughout the twentieth century.""23 Medicine is still not close to remediating this deficiency; in 2016, authors called tolerating uncertainty the "next medical revolution."17

The familiar phrase "all models are wrong," popularized by the statistician George E. P. Box, should provide great insight into the uncertainty of knowledge. Not only should medicine discourage blind commitment to theories, stories, and practices without considering the validity of the evidence; it should also acknowledge how current problems with medical and scientific practice affect our knowledge. John Ioannidis has published relevant articles in these areas, arguing that most published research findings are false24 and that most clinical research is not useful for medical practice.25 A clinical practice closer to reality may be achieved by understanding how p-hacking,26 publication bias, statistical misinterpretations,27 the replication crisis, and other problems further exacerbate the uncertainty of knowledge and its application. Even when scientific knowledge is free of financial, political, or social-group interests, as it should be, it is not set in stone.
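A toy simulation (my own illustration, not drawn from Ioannidis's papers) makes the multiple-testing side of this concrete: if enough hypotheses are tested on pure noise, some will cross the conventional p < 0.05 threshold by chance alone.

```python
# One hundred "studies" comparing two groups drawn from the SAME distribution,
# so every significant result is a false positive. P-values are approximated
# with a crude permutation test. Sample sizes and counts are arbitrary.
import random

random.seed(42)

def noise_study(n=30, trials=200):
    """Approximate p-value for a difference in means between two noise groups."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    observed = abs(sum(a) / n - sum(b) / n)
    pooled = a + b
    extreme = 0
    for _ in range(trials):
        random.shuffle(pooled)
        diff = abs(sum(pooled[:n]) / n - sum(pooled[n:]) / n)
        if diff >= observed:
            extreme += 1
    return extreme / trials

# With a 0.05 threshold, we expect on the order of 5 "discoveries" out of 100
# studies even though no real effect exists anywhere.
false_positives = sum(noise_study() < 0.05 for _ in range(100))
```

Selectively reporting only those "discoveries" is the essence of p-hacking and publication bias; the literature then overstates how much is actually known.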

In a recent article, Richard Wenzel wrote that knowledge is "provisional... our books are vastly incomplete and that current concepts represent only a temporary resting place for understanding, continually requiring testing and further analyses."28 This uncertainty makes people uncomfortable, and some may try to find refuge in the beliefs of others; that is only a false sense of security. If knowledge were determined by the strength of what others believe, "we should have to rank some tales about demons, angels, devils... as knowledge," said Lakatos.19 It is known that practice can take a very long time to change, even after evidence disconfirms the status quo.29 Ignoring evidence that contradicts prior beliefs is not a hallmark of the scientific attitude. If confirmation bias is our modus operandi, cognitive dissonance will be avoided and stories of rationalization will be invented. Having good stories is not good enough; we should take how we evaluate evidence, and our judgment, more seriously. As Daniel Kahneman reminds us, we "tend to have great belief, great faith in stories that are based on very little evidence."30 This is where problem-based learning, as an iterative process, helps us update our prior beliefs when appropriate. It is a framework for independent or interdependent practice that cuts out the "knowledge-middleman";31 it connects the uncertainty of knowledge and personal practice directly with the literature.



References:

  1. McLain, S., Not breaking news: many scientific studies are ultimately proved wrong!, The Guardian, September 17, 2013, Accessed November 3, 2017
  2. Feynman, R., Robbins, J., The pleasure of finding things out : the best short works of Richard P. Feynman., 1999
  3. Popper, K., In search of a better world: lectures and essays from thirty years., 1996
  4. Sagan, C., The demon-haunted world: science as a candle in the dark., 1997
  5. Popper, K., Conjectures and Refutations: The Growth of Scientific Knowledge., 1963
  6. Nickles, Thomas, "Scientific Revolutions", The Stanford Encyclopedia of Philosophy (Winter 2016 Edition), Edward N. Zalta (ed.), URL = .
  7. Wootton, D., The Invention of Science: A New History of the Scientific Revolution., 2016
  8. Alvin MD, Jaffe AS, Ziegelstein RC, Trost JC. Eliminating Creatine Kinase–Myocardial Band Testing in Suspected Acute Coronary SyndromeA Value-Based Quality Improvement. JAMA Intern Med. 2017;177(10):1508–1512. doi:10.1001/jamainternmed.2017.3597
  9. Coronel E, Bassi N, Donahue-Rolfe S, Byrne E, Sokol S, Reddy G, Arora VM. Evaluation of a Trainee-Led Project to Reduce Inappropriate Proton Pump Inhibitor Infusion in Patients With Upper Gastrointestinal BleedingSkip the Drips. JAMA Intern Med. Published online October 09, 2017. doi:10.1001/jamainternmed.2017.4851
  10. Simpson, N., Lamontagne, F., Shankar-Hari, M., Septic shock resuscitation in the first hour., Curr Opin Crit Care. 2017 Dec;23(6):561-566. doi: 10.1097/MCC.0000000000000460.
  11. Al-Lamee, R., et al., Percutaneous coronary intervention in stable angina (ORBITA): a double-blind, randomised controlled trial., The Lancet, November 2, 2017, DOI: http://dx.doi.org/10.1016/S0140-6736(17)32714-9
  12. Kuhn, T., The structure of scientific revolutions., 1962
  13. Flexner, A., Medical education in the United States and Canada: A report to the Carnegie Foundation for advancement of teaching., 1910
  14. Ludmerer, K., Commentary: Understanding the Flexner Report., Academic Medicine, 2010 Feb;85(2):193-6. doi: 10.1097/ACM.0b013e3181c8f1e7.
  15. Shtulman, A., Scienceblind: why our intuitive theories about the world are so often wrong., 2017
  16. A private universe and Minds of our own., Annenberg Learner, Accessed November 3, 2017
  17. Simpkin A., Schwartzstein R., Tolerating Uncertainty - The Next Medical Revolution?, NEJM, 2016 Nov 3;375(18):1713-1715.
  18. Kassirer, J., Our stubborn quest for diagnostic certainty. A cause of excessive testing., NEJM, 1989 Jun 1;320(22):1489-91.
  19. Lakatos, I., Science and Pseudoscience., BBC Radio talk, 1973
  20. New Freakonomics Radio Podcast: The Folly of Prediction by Stephen Dubner., Freakonomics, September 4, 2011, Accessed November 3, 2017
  21. How to predict the future better than anyone else by Ana Swanson., Washington Post, January 4, 2016, Accessed November 3, 2017.
  22. Keil FC. Explanation and Understanding. Annual review of psychology. 2006;57:227-254. doi:10.1146/annurev.psych.57.102904.190100.
  23. Djulbegovic B. Lifting the fog of uncertainty from the practice of medicine: Strategy revolves around evidence, decision making, and leadership. BMJ : British Medical Journal. 2004;329(7480):1419-1420.
  24. Ioannidis JPA. Why Most Published Research Findings Are False. PLoS Medicine. 2005;2(8):e124. doi:10.1371/journal.pmed.0020124.
  25. Ioannidis JPA. Why Most Clinical Research Is Not Useful. PLoS Medicine. 2016;13(6):e1002049. doi:10.1371/journal.pmed.1002049.
  26. Kerr, N., HARKing: Hypothesizing After the Results are Known., Pers Soc Psychol Rev. 1998;2(3):196-217
  27. Greenland S, Senn SJ, Rothman KJ, et al. Statistical tests, P values, confidence intervals, and power: a guide to misinterpretations. European Journal of Epidemiology. 2016;31:337-350. doi:10.1007/s10654-016-0149-3.
  28. Wenzel, R., Medical Education in the Era of Alternative Facts., NEJM, 2017; 377:607-609 August 17, 2017
  29. Balas E, Boren S. Managing clinical knowledge for health care improvement. In: Bemmel J, McCray AT (eds). Section 1: health and clinical management. In Yearbook of Medical Informatics: Patient Centered Systems. Stuttgart, Germany: Schattauer Verlagsgesellschaft; 2000:65-70.
  30. Why We Contradict Ourselves and Confound Each Other., Onbeing, October 5, 2017, Accessed November 3, 2017
  31. Stemwedel, J., Brief thoughts on uncertainty. Scientific American, March 30, 2014, Accessed November 3, 2017