Jun 15, 2017

The distinction between evidence and decision making

An important distinction that should be explicitly taught more often.

It is actually quite striking to witness the failure of many authors to understand the need to separate evidence (and the assessment of that evidence's validity) from decision making. [...] Knowledge of (the quality of) evidence alone, while helpful, is not sufficient for decision making. Assessment of the ‘truth’ will always remain incomplete, and will continue to be expressed in the language of probability and uncertainty (Djulbegovic et al. 2000a). Decision making, on the other hand, is a categorical exercise: we act or we do not act. Evidence, however, does play a crucial role in (rational) decision making. The greater the quality of the data, the better the decision making should be (Djulbegovic et al. 2000a), which brings us to the question: ‘What is, in fact, rational decision making?’

Normative theories of decision making hold that rational decision making is that which maximizes the value of consequences: (1) it is based on the decision maker’s current assets, (2) it is based on the possible consequences (i.e. benefits and harms) and the values associated with each consequence of a choice, and (3) when these consequences are uncertain, their likelihood is evaluated according to the rules of probability theory (e.g. by integrating evidence summary measures within a framework of decision analysis) (Djulbegovic & Hozo 1998; Hozo & Djulbegovic 1999; Djulbegovic et al. 2000b; Hastie & Dawes 2001). In medicine, rational decision making often means choosing the intervention whose benefits outweigh its harms (for the most important outcomes that we care about). As Hastie and Dawes forcefully argued, the rationality of a choice is a matter of the process of choosing, not of what is chosen. That is, a good decision can result in bad outcomes and a bad decision can result in good outcomes (Hastie & Dawes 2001).
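The normative recipe above — weigh each possible consequence by its probability and value, then choose the option with the best expected value — can be sketched in a few lines. This is only an illustration of the general framework; the function name and all the numbers below are hypothetical, not taken from the cited papers.

```python
def expected_utility(p_benefit, u_benefit, p_harm, u_harm):
    """Expected utility of an option: probability-weighted sum of its
    consequences' utilities (hypothetical two-outcome simplification)."""
    return p_benefit * u_benefit + p_harm * u_harm

# Hypothetical intervention: 60% chance of benefit (utility +1.0),
# 10% chance of serious harm (utility -3.0); doing nothing has utility 0.
treat = expected_utility(0.6, 1.0, 0.1, -3.0)
no_treat = 0.0

# The normatively rational choice maximizes expected utility.
decision = "treat" if treat > no_treat else "do not treat"
print(decision)  # treat
```

Note that the output says nothing about how the individual case turns out — exactly the point the excerpt makes: the decision is rational because of the process, even though a good decision can still yield a bad outcome.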

However, descriptive theories of decision making have repeatedly pointed out that people rarely make decisions in accord with normative theories (the difference between the ‘ought’ and the ‘is’) (Bell et al. 1988; Hastie & Dawes 2001). Because humans are fallible decision makers, prescriptive theories are increasingly applied to help optimize decision making. The goal is to help people make better decisions by exploiting some of the logical consequences of normative theories together with the empirical findings of descriptive studies, which address the reliability of evidence and processes such as attitudes, biases, and values (Bell et al. 1988).


Evidence, however, remains the backbone of various problem-solving and reasoning strategies, which are the essence of effective decision making. It can even be argued that the clinical process represents a special case of the scientific method, in which the application of formal rules of inference based on deduction, induction, or the probability calculus may optimize diagnostic and treatment decision making (Djulbegovic 1997). Studies of the cognitive aspects of the reasoning process over the last several decades have generally demonstrated that three reasoning strategies are commonly used in medicine: (1) probabilistic, (2) causal and (3) deterministic (Kassirer & Kopelman 1991). The first two techniques are known as knowledge-based reasoning (synthetic thinking), while deterministic reasoning is also known as algorithmic or rule-based reasoning (Djulbegovic 1997). In addition, as they season, physicians develop skill-based reasoning built on ad hoc rules of thumb, known as heuristics (Djulbegovic 1997). The difference between the novice and the expert lies in the capability of the latter to move from knowledge-based to skill-based reasoning: experts have a much larger repertoire of skill-based schemata and problem-solving rules than novices. However, exclusive use of heuristics leads to errors in diagnostic and treatment decision making. In fact - and this is the crux of the problem - errors occur with each of the problem-solving and reasoning strategies described above. Having reliable evidence is no guarantee that these errors will not be made. [...] In decision making, when should a practitioner rely on one type of knowledge versus the other? Unfortunately, research to date has failed to identify the best decision-making strategies for a given class of medical problems. In this sense, it is unlikely that any theoretical framework will prove useful. This remains an empirical problem, which can only be addressed by empirical research.
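Of the three strategies, probabilistic reasoning is the one with the most direct formal expression: updating a pre-test probability of disease by Bayes' theorem after a test result. The sketch below is a standard textbook illustration of that calculus, not something drawn from the cited commentary, and the sensitivity, specificity, and prevalence figures are hypothetical.

```python
def posterior(prior, sensitivity, specificity):
    """Post-test probability of disease after a POSITIVE test result,
    via Bayes' theorem on a hypothetical binary test."""
    p_true_pos = sensitivity * prior              # diseased and test positive
    p_false_pos = (1 - specificity) * (1 - prior)  # healthy but test positive
    return p_true_pos / (p_true_pos + p_false_pos)

# Hypothetical case: 10% pre-test probability, test with 90% sensitivity
# and 95% specificity.
p = posterior(prior=0.1, sensitivity=0.9, specificity=0.95)
print(round(p, 3))  # 0.667: a positive result raises 10% to about 67%
```

Even this tidy calculation only updates the evidence; it does not by itself decide whether to treat — which is the excerpt's central distinction between assessing evidence and making the categorical choice to act.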

Djulbegovic, B. (2006), Evidence and decision making. Commentary on M.R. Tonelli (2006), Integrating evidence into clinical practice..., Journal of Evaluation in Clinical Practice, 12: 257–259.

Evidence-based medicine is still an incomplete framework, but it's better than an unsystematic approach to the practice of medicine.


1. You should attempt to re-express your target’s position so clearly, vividly, and fairly that your target says, “Thanks, I wish I’d thought of putting it that way.”
2. You should list any points of agreement (especially if they are not matters of general or widespread agreement).
3. You should mention anything you have learned from your target.
4. Only then are you permitted to say so much as a word of rebuttal or criticism.
Daniel Dennett, Intuition Pumps and Other Tools for Thinking.

Valid criticism is doing you a favor. - Carl Sagan