In a 2013 editorial for the Journal of Evaluation in Clinical Practice, Loughlin et al. write:
It is commonly – although not universally – accepted that clinical reasoning should in some sense be ‘objective’, and that we lack a proper explanation of a problem, symptom or condition if we lack an objective account of the nature and causes of that problem, symptom or condition [1–4]. The pages of this journal over its 19-year history testify that we are still far from reaching a broad consensus on the precise nature of clinical reasoning, and that the reasons for this lack of consensus are rooted in deeper, philosophical disagreements about the nature of objective reasoning in science, the nature of evidence and the role of mechanistic reasoning, statistical reasoning and theoretical frameworks in clinical judgement [5–14].
It is also commonly – but again not universally – accepted that we lack a full understanding of the problems we seek to treat if we do not have a proper account of the human, subjective experiences that (arguably) prompt us to conceptualize a given condition as a ‘problem’ in the first place [15–20]. Some critics maintain that the modern emphasis on biomedical science, its undoubted initial benefits notwithstanding, has led in recent times to a neglect of the lived experience of health and illness, and to the rise of a theoretically motivated reductionism that threatens to impoverish practice [21–26].
What is more, we seem to inherit a set of underlying assumptions or conceptual framework suggesting some sort of opposition, or at least a tension, between these different notions of explaining a problem objectively and understanding its human dimensions. ‘Reintegrating’ the two approaches or perspectives thus becomes, in itself, an intellectual problem in need of resolution [27–31]. That so fundamental a problem remains unresolved – at the very least, no consensus is discernable among intelligent contributors to the debate about its resolution – reveals merely that we have not yet reached the end of intellectual history. Even a brief acquaintance with the history of ideas should fill any thoughtful student of that history with a sense of humility and hope, being confronted at once with at least two realizations: that even the greatest thinkers of earlier ages lacked insights now deemed commonplace, and that, via a determined effort to challenge entrenched ideas, human beings were able to identify assumptions and dogmas standing in the way of progress, to provide us with the moral and intellectual inheritance upon which our current civilization stands [33,34]. It will therefore come as no great surprise to the reflective reader that she did not happen to be born at that point in the evolution of human society when all the most important and fundamental problems were resolved, and that there remains a need to interrogate our underlying assumptions about our lives and practices, if we are to build upon the progress made by our ancestors.
It's important to test assumptions in order to check for the illusion of explanatory depth.
1. You should attempt to re-express your target’s position so clearly, vividly, and fairly that your target says, “Thanks, I wish I’d thought of putting it that way.”
2. You should list any points of agreement (especially if they are not matters of general or widespread agreement).
3. You should mention anything you have learned from your target.
4. Only then are you permitted to say so much as a word of rebuttal or criticism.
Daniel Dennett, Intuition Pumps and Other Tools for Thinking.
Valid criticism is doing you a favor. – Carl Sagan