
“Elementary, my dear Watson”
P McCrory


    It is clearly the silly season in journal publication when distinguished researchers are speculating about the ultimate destiny of missing teaspoons from their own departments.1 In spite of the dearth of published work on this topic, one has to wonder whether the overly scientific longitudinal cohort study was the best means of studying this phenomenon.

    In 19th century medicine, the art of clinical reasoning was held to be as important as the scientific aspects of medical practice. I was reminded of this at the recent BASEM meeting in Edinburgh when Dr Donald Macleod gave the Roger Bannister Oration. He referred to one of the distinguished sons of Edinburgh, Joseph Bell, who was professor of surgery at Edinburgh Medical School and the basis for the character of Sherlock Holmes.

    Although the history of detective literature conventionally dates back to 1841, when The Murders in the Rue Morgue by Edgar Allan Poe was published, Holmes is widely considered the doyen of such consulting detectives.

    During their 19th century golden age, both disciplines thrived on faith in their methods: the interpretation of clues (by detectives) or of signs and symptoms (by physicians). A final “diagnosis” was then reached from clues that were often meaningless to the layman. The amazement that Sherlock Holmes excites when he deduces, from apparently insignificant details, that Watson has been to send a telegram from the Wigmore Street post office in The Sign of the Four is similar to the reaction Dr Trousseau gets when he diagnoses meningitis by scratching a patient’s skin.2,3

    In The Sign of the Four, Sherlock Holmes states that three qualities are necessary for the ideal detective: “observation, deduction and knowledge.”2 Here again, clear analogies can be drawn between the ideal profiles of doctors and detectives.4 Although these desirable qualities may make an ideal clinician, is that the same as what makes a good clinician?

    The first criterion must be logical reasoning. Good detectives and good clinicians share the same underlying approach as scientific researchers, as illustrated by Karl Popper’s hypothetico-deductive model.5 Clues are ascertained by their presence or absence; a hypothesis is generated and subsequently tested.
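    As a loose illustration (the diagnoses and findings below are invented for the example, not drawn from the article or any clinical source), the hypothetico-deductive loop might be sketched in a few lines of Python:

    # A minimal sketch of hypothetico-deductive reasoning: propose
    # hypotheses, deduce the findings each one predicts, and refute any
    # hypothesis whose predictions fail against what is observed.
    # All diagnoses and findings here are hypothetical placeholders.
    PREDICTIONS = {
        "meningitis": {"fever", "neck stiffness", "photophobia"},
        "migraine": {"photophobia", "unilateral headache"},
    }

    def surviving_hypotheses(observed):
        """Keep only hypotheses whose predicted findings are all present."""
        return [dx for dx, predicted in PREDICTIONS.items()
                if predicted <= observed]

    print(surviving_hypotheses({"fever", "neck stiffness", "photophobia"}))
    # -> ['meningitis']; 'migraine' is refuted by the missing headache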

    In the long history of medicine, the discovery and interpretation of signs of disease are relatively recent features of diagnosis. The popularity of the “pathognomonic signs” described by the famous clinicians of the 18th and 19th centuries (eg Trousseau, Austin Flint, Cheyne-Stokes) emanated from the flawed belief that the “internal site” of a disease can be diagnosed with absolute precision from its “specific external signs.” Even today, the search for pathognomonic signs often forms the first approach in the undergraduate training of future doctors. Only later does the medic overcome blind faith in the “science” of clinical and instrumental signs, recognising their limitations.4

    The second quality is knowledge. Sherlock Holmes has clear ideas on this: “I consider that a man’s brain originally is like a little empty attic, and you have to stock it with such furniture as you choose. He will have nothing but the tools which may help him in doing his work. It is a mistake to think that that little room has elastic walls and can distend to any extent. Depend upon it there comes a time when for every addition of knowledge you forget something that you knew before. It is of the highest importance, therefore, not to have useless facts elbowing out the useful ones.”6

    These days the internet is the tool of choice for accessing huge amounts of knowledge without the risk of the information transiting through one’s neurones and clogging up the brain. The recent emergence of a new type of doctor poses a major threat to medicine: a problem-based learning (PBL) medical course (ie one devoid of didactically taught basic science) producing an “armchair” specialist convinced that the solution to every clinical dilemma is to be found somewhere on the web.

    Until the rapid evolution of diagnostic methods in the last century, physicians relied heavily upon case histories that developed, in some cases over centuries, with ongoing annotation about the nature and course of specific diseases. One of the most famous texts of medical cases, written in 950 AD by the Persian physician Rhazes, remained in print for almost 1000 years.7 The experience of physicians was shared for the common good, and this was enhanced by the use of Latin as a common language, enabling rapid and easy communication of such knowledge between doctors from different countries. The other, more obvious aspect of traditional medical training is that a thorough understanding of anatomy, physiology, pathology and the other basic medical sciences provides a solid foundation that enables the “traditionalist” to resort to first principles when confronted by a new problem. This luxury is not an option for the PBL armchair “googlers”.

    The third criterion is experience and the ability to interpret a disease profile from an interview. The interview is a vital investigative tool for the vast majority of detectives; in some cases, the interview can be the sole means of detection, and many TV detectives utilise the long, overnight interview that ends at dawn with the murderer’s confession.

    Historically, doctors based their diagnoses mainly on their patients’ spontaneous verbal communications. As diseases were primarily categorised by symptoms (eg the palsy), patients could communicate their symptoms verbally and doctors could effectively “visit” a patient (and make a diagnosis) by post. During the 19th century, the patient’s history progressively began to be articulated into a standard protocol in the form of an interview, with less time being dedicated to free verbal communication and the patient’s own interpretations.

    This structured approach has helped generations of students to become professional clinicians. All of us have met colleagues who have a special talent for interviewing patients and extracting the key elements needed for a correct diagnosis. This art is probably an individual talent that can only partially be transmitted to students, although beginners should remember that diagnostic information is gained not through a freewheeling dialogue with the patient but through active probing of precise diagnostic hypotheses.4

    It is hard to characterise the next quality: it is a combination of knowledge, initiative, lateral thinking and a form of stubbornness. Just as our fictional detectives have become increasingly dependent upon DNA and other modern techniques, so we can see similar changes in hospital medicine. It is easy for an inexperienced (read: unsupervised by an experienced colleague) or a lazy clinician to resort to ordering endless tests and procedures, without any precise hypothesis, in the vague hope of stumbling on a plausible diagnosis. In some cases, the fear of making mistakes, attracting rebukes from superiors or incurring official sanctions plays a part. I am sure we can all remember, as interns, ordering every type of blood work on all patients ahead of a consultant ward round, just in case it was asked for. The astute clinician interprets the relevant diagnostic findings in light of the hypothesis generated from the history and examination, and is prepared to revise that hypothesis depending upon the findings. In this context, simple examinations really do have the same value as the more complex and expensive ones.

    An ideal clinician could be said to present a harmonious fusion of almost all the criteria outlined above.

    So where did all the teaspoons go? I know you were wondering. The authors followed a cohort of numbered teaspoons residing in the kitchens of an internationally recognised research institute; approximately 80% of the teaspoons disappeared during the study, giving a teaspoon half-life of 81 days. The rate of loss was not influenced by the teaspoons’ value. The incidence of teaspoon loss over the period of observation was 360.62 per 100 teaspoon-years.1 The authors estimated that 250 teaspoons would need to be purchased annually to maintain a practical institute-wide population of 70 teaspoons.
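    For the curious, the quoted figures hang together on the back of an envelope. The short sketch below simply re-derives the purchase estimate from the reported incidence; the exponential-decay survival model is an assumption of mine, not one stated by the authors:

    # Back-of-the-envelope check of the teaspoon figures quoted above.
    POPULATION = 70           # desired institute-wide teaspoon population
    INCIDENCE = 360.62 / 100  # reported losses per teaspoon-year
    HALF_LIFE_DAYS = 81       # reported teaspoon half-life

    # Annual purchases needed to hold the population steady:
    # roughly 70 * 3.6062, in line with the ~250 quoted.
    print(f"{POPULATION * INCIDENCE:.0f} teaspoons/year")  # -> 252

    def surviving_fraction(t_days):
        """Fraction of a cohort remaining after t days, assuming
        exponential decay with the reported half-life."""
        return 0.5 ** (t_days / HALF_LIFE_DAYS)

    # Under that assumed model, losing ~80% of a cohort takes ~188 days.
    print(f"{1 - surviving_fraction(188):.0%} lost after 188 days")  # -> 80%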

    In speculating about the ultimate destination of the spoons, the authors suggest that researchers simply steal them for personal use, although there are a number of other intriguing possibilities, such as counterphenomenological resistentialism (where the spoons have a natural antipathy towards humans and migrate) or alien involvement, with the spoons being removed to an extraterrestrial destination.

    In the words of Sherlock Holmes, “When you have eliminated all which is impossible, then whatever remains, however improbable, must be the truth”.2

    REFERENCES
