Abstract
Background
Errors in medicine are common and often tied to diagnosis. Educating physicians about the science of cognitive decision-making, especially during medical school and residency when trainees are still forming clinical habits, may enhance awareness of individual cognitive biases and has the potential to reduce diagnostic errors and improve patient safety.
Methods
The authors developed, implemented and evaluated a clinical reasoning curriculum for 47 PGY-2 residents in an Internal Medicine Residency Program at a large urban hospital. The curriculum consisted of six to seven sessions with the specific aims of: (1) educating residents on the cognitive steps and reasoning strategies used in clinical reasoning; (2) acknowledging the pitfalls of clinical reasoning and learning how cognitive biases can lead to clinical errors; (3) expanding differential diagnostic ability and developing illness scripts that incorporate discrete clinical prediction rules; and (4) providing opportunities for residents to reflect on their own clinical reasoning (metacognition).
Results
Forty-seven PGY-2 residents participated in the curriculum (2013–2016). Self-assessed comfort in recognizing and applying clinical reasoning skills increased in all 15 domains (p < 0.05 for each). Resident mean scores on the knowledge assessment improved from 58% pre-curriculum to 81% post-curriculum (p = 0.002).
Conclusions
A case vignette-based clinical reasoning curriculum can effectively increase residents’ knowledge of clinical reasoning concepts and improve residents’ self-assessed comfort in recognizing and applying clinical reasoning skills.
Introduction
Diagnostic errors in medicine are estimated to account for 40,000–80,000 deaths per year. Of all medical errors, 10–20% are tied to errors in diagnosis [1]. Prior studies have shown that cognitive errors such as faulty information gathering and faulty synthesis of information are as likely to lead to misdiagnosis as system-related errors [2], [3]. Interventions to identify and prevent cognitive errors are still being studied.
Common cognitive errors are referred to as biases [4]. Though over 100 types of biases have been described, discussion of the impact of cognitive bias is not routine in medical education or practice, due in part to difficulty measuring or monitoring cognitive errors [1]. In addition, physicians are often unaware of their own cognitive processes, and may not appreciate the effect of cognitive bias on their own clinical decision-making [5]. Educating physicians about the science of cognitive decision-making, especially during medical school and residency when trainees are still forming clinical habits, may enhance awareness of individual cognitive biases and has the potential to reduce diagnostic errors and improve patient safety [6], [7]. However, few studies have evaluated the impact of a cognitive decision-making curriculum on trainees’ understanding of clinical reasoning concepts.
We sought to fill this gap by developing and implementing a clinical reasoning curriculum for second year Internal Medicine residents. In this paper, we describe the development and implementation of the curriculum and effect of the curriculum on residents’ knowledge and self-assessed understanding of clinical reasoning.
Methods
Settings and participants
We implemented the curriculum in December 2013 for PGY-2 residents in the Internal Medicine Residency Program at Montefiore Medical Center, Bronx, New York. From December 2013 through May 2016, 47 of 150 residents enrolled in the elective curriculum. The majority (29/47) of residents who enrolled were from the primary care track of our program, with the remainder (18/47) from the categorical track. The curriculum was initially delivered in eight 4-h sessions over the course of 1 month to groups of five to eight residents, but after the first year, in response to feedback, it was condensed to six to seven sessions over a 2-week ambulatory block. Faculty facilitators included two members of the group that developed the curriculum.
Program description
Our clinical reasoning curriculum was developed by a group of five academic general internists (two senior faculty members with more than 20 years’ experience in medical education and formal roles in curriculum development at our institution, and three junior faculty members with less than 5 years’ experience in clinical education) using a theory-guided, literature-based framework [8]. Over the course of 1 year, we reviewed literature from cognitive psychology and medical education, developed curriculum aims and chose teaching strategies. Our curriculum was created with the following specific aims: (1) educate residents on cognitive steps and strategies used in clinical reasoning; (2) learn how cognitive biases can lead to clinical errors; (3) expand differential diagnostic ability and learn to develop illness scripts that incorporate discrete clinical prediction rules; and (4) provide opportunities for residents to reflect on their own clinical reasoning (metacognition).
We devised six to eight clinical reasoning sessions (Table 1), each highlighting a cognitive step (hypothesis generation, refinement and verification) or reasoning strategy (e.g. probabilistic reasoning, causal reasoning). For each session, we developed two to three clinical case vignettes that illustrated a cognitive step or reasoning strategy (Aim 1), demonstrated a cognitive error (Aim 2) and expanded knowledge of illness scripts to facilitate differential diagnostic ability (Aim 3). This case-based approach is well described as an effective method for teaching clinical reasoning [9].
Table 1: Components of the curriculum matched to each curricular session and aim.

| Curricular session | AIM 1: Clinical reasoning steps/reasoning strategies | AIM 2: Cognitive errors highlighted | AIM 3: Strategies to strengthen medical knowledge | AIM 4: Reflection on clinical reasoning |
|---|---|---|---|---|
| Session 1 | Hypothesis generation; use of heuristics | Premature closure; anchoring; faulty causal model; retrospective bias | Clinical vignettes | |
| Session 2 | Hypothesis refinement; analytical and non-analytical reasoning strategies | Anchoring; unpacking principle | | |
| Session 3 | Interpretation of diagnostic testing; probabilistic reasoning; introduction to clinical prediction rules | Inaccurate interpretation of test results; overestimating pretest probability; anchoring | Clinical vignettes + clinical prediction rules | |
| Session 4 | Diagnostic verification | Premature closure; base rate neglect; order bias; confirmation bias | Clinical vignettes + clinical prediction rules + ambulatory morning report | |
| Session 5 | Causal reasoning; therapeutic decision-making | Errors in estimating diagnostic/therapeutic treatment thresholds | | |
| Sessions 6–7 | Review of cognitive errors | All of the above | Clinical vignettes + clinical prediction rules + ambulatory morning report + assessing clinical reasoning in the novice learner | |
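To make the probabilistic reasoning content of Session 3 concrete, the sketch below works through the standard likelihood-ratio form of Bayesian test interpretation. It is an illustration only, written in Python for this article: the function name, the pretest probability and the test characteristics are hypothetical choices of ours and are not taken from the curriculum's vignettes or from any specific clinical prediction rule.

```python
# Illustrative sketch (not from the curriculum): Bayesian test interpretation
# using likelihood ratios, as taught in probabilistic-reasoning sessions.

def post_test_probability(pretest_prob: float, sensitivity: float,
                          specificity: float, positive_result: bool) -> float:
    """Update a pretest probability after a test result via likelihood ratios."""
    lr_positive = sensitivity / (1 - specificity)      # LR+ = sens / (1 - spec)
    lr_negative = (1 - sensitivity) / specificity      # LR- = (1 - sens) / spec
    lr = lr_positive if positive_result else lr_negative

    pretest_odds = pretest_prob / (1 - pretest_prob)   # probability -> odds
    posttest_odds = pretest_odds * lr                  # Bayes' rule in odds form
    return posttest_odds / (1 + posttest_odds)         # odds -> probability

# Hypothetical example: 15% pretest probability of disease and a test with
# 85% sensitivity and 90% specificity.
print(post_test_probability(0.15, 0.85, 0.90, positive_result=True))   # ~0.60
print(post_test_probability(0.15, 0.85, 0.90, positive_result=False))  # ~0.03
```

In practice, a clinical prediction rule can supply the pretest probability estimate that feeds this calculation, which is how Sessions 3 and 4 link probabilistic reasoning to discrete prediction rules.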
Clinical case vignettes were selected from published problem-solving cases and from a textbook on clinical reasoning [8]. The cases were modified to include reflective questions after each new history or data element, to prompt problem representation after the initial presentation, and to elicit reflection on possible cognitive errors at the end of the case. Clinical case vignettes were finalized for accuracy under the guidance of senior faculty, vetted with other members of the group, and piloted with third-year residents (Example Clinical Vignette in Supplementary Index-S1).
Each 4-h clinical reasoning session utilized a modified form of the flipped classroom, in which residents are first exposed to new material (clinical vignettes and readings on clinical reasoning processes) outside of class and then use class time to assimilate new knowledge through problem-solving and discussion [10]. Residents were asked to prepare for sessions by reading articles on cognitive psychology, medical education and clinical prediction rules, and by completing the first half of the clinical case vignettes before the session (List of articles for sessions in Supplementary Index-S2). This pre-session work took approximately 1–2 h and was built into the residents’ schedule. During the sessions, residents worked through the clinical case vignettes in small groups, identified key clinical features of each case and proposed a unifying diagnosis. The faculty facilitator reinforced the curricular aims through guided discussion and question prompts that highlighted the cognitive steps taken and reasoning strategies employed (Aim 1), the cognitive errors the case might elicit (Aim 2) and evidence-based approaches to the clinical scenarios (Aim 3).
As a final exercise, residents utilized a worksheet that we adapted from ambulatory morning reports to reflect on their clinical reasoning on a case from their own ambulatory clinical experience (Aim 4) (Self reflection worksheet in Supplementary Index-S3). Facilitators reviewed these worksheets the night prior to the next clinical reasoning session and provided feedback to the residents on their self-reflections during the ambulatory morning report at the beginning of the clinical reasoning session. Time was also allotted for residents to reflect on any cognitive errors that occurred on inpatient wards by critiquing intern or medical student presentations that were created by the facilitator. This allowed residents to discuss how to incorporate clinical reasoning concepts into teaching medical students and interns.
Program evaluation
We developed survey instruments to measure (1) residents’ self-assessed ability to recognize and apply clinical reasoning concepts and (2) residents’ clinical reasoning knowledge, assessed using clinical reasoning scenarios. The two instruments were developed 1 and 2 years, respectively, after the curriculum was first delivered. Both were completed by residents on the first day of the curriculum (pre) and on the final day of the curriculum (post).
Residents’ self-assessed ability to recognize and apply clinical reasoning concepts was measured with a 15-item questionnaire with responses entered on a four-point Likert scale, with 1=not at all capable, 2=slightly capable, 3=somewhat capable and 4=very capable. We then compared the percentage of residents who felt “somewhat capable” or “very capable” prior to the curriculum to the percentage of residents who felt “somewhat capable” or “very capable” after the curriculum, using chi-square (χ²) tests for statistical significance. Residents’ knowledge of clinical reasoning steps, cognitive errors and application of Bayesian analysis was assessed with a 10-item multiple-choice questionnaire, and mean scores pre- and post-curriculum were compared using the Wilcoxon test. The purpose of the surveys was described to the residents prior to the start of the curriculum. This study was approved on expedited review by the Albert Einstein College of Medicine/Montefiore Medical Center Institutional Review Board (IRB # 2018-9863).
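For readers who wish to reproduce this style of pre/post analysis, a minimal sketch of the two comparisons described above is shown below using SciPy. The counts, scores and variable names are hypothetical placeholders, and we assume a paired (signed-rank) form of the Wilcoxon test; the authors' actual analysis code and dataset are not published with this article.

```python
# Minimal sketch of the two pre/post comparisons described above.
# All numbers are hypothetical placeholders, not the study data.
import numpy as np
from scipy import stats

# (1) Self-assessed ability: for one of the 15 items, compare the number of
# residents answering "somewhat" or "very capable" before vs. after the
# curriculum with a chi-square test on a 2x2 table.
pre_capable, pre_total = 7, 30
post_capable, post_total = 28, 30
table = np.array([[pre_capable, pre_total - pre_capable],
                  [post_capable, post_total - post_capable]])
chi2, p_value, dof, expected = stats.chi2_contingency(table)
print(f"Chi-square for one self-assessment item: p = {p_value:.3f}")

# (2) Knowledge test: compare paired pre- and post-curriculum percent-correct
# scores on the 10-item test with the Wilcoxon signed-rank test.
pre_scores = np.array([50, 60, 40, 70, 60, 50, 60, 70, 50, 60])
post_scores = np.array([80, 90, 70, 90, 80, 70, 80, 100, 80, 70])
statistic, p_value = stats.wilcoxon(pre_scores, post_scores)
print(f"Wilcoxon signed-rank test for knowledge scores: p = {p_value:.3f}")
```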
Results
Between 2013 and 2016, we offered the curriculum three times and 47 residents completed it. Because the survey instruments were developed at different times after the first year of the curriculum, 30 residents were offered and completed the resident skills self-assessment, and 25 residents were offered the clinical reasoning knowledge test. Twenty-one residents completed both the pre- and post-curriculum knowledge surveys (84% of those who were offered the knowledge survey).
Resident self-assessed skills following completion of the curriculum increased in all 15 domains (p<0.05 for each). Domains with the greatest increase in self-reported comfort included: comparing analytic and nonanalytic reasoning strategies, describing working memory, understanding heuristics, applying Bayesian analysis to clinical scenarios, identifying cognitive errors, and identifying errors in diagnostic verification. In each of these six domains, fewer than 30% of residents reported feeling “somewhat” or “very capable” before the course compared to more than 75% afterward (Table 2). Resident mean scores on the knowledge assessment improved from 58% pre-curriculum to 81% post-curriculum (p=0.002). Knowledge questions with the greatest change reflected some of the same areas in which residents reported an increase in comfort (i.e. identifying a cognitive error and employing a clinical reasoning strategy to answer a case-based question).
Table 2: Resident skills self-assessment.

| Rate your level of comfort in performing the following tasks | Pretest (% “somewhat” or “very capable”, n=30) | Posttest (% “somewhat” or “very capable”, n=30) |
|---|---|---|
| 1. Explain how clinicians reason through a clinical case – from chief complaint to generating a working diagnosis | 77% | 100%^a |
| 2. Compare the pros and cons of analytical (Bayesian) and non-analytical (intuitive) reasoning strategies clinicians use in clinical diagnosis | 23% | 93%^a |
| 3. Describe working memory and how clinical knowledge is stored | 20% | 77%^a |
| 4. Give examples of how heuristics (rules of thumb) are used in clinical diagnosis | 33% | 93%^a |
| 5. Recognize the effects of epidemiology of disease on clinical reasoning | 67% | 100%^a |
| 6. Reflect on the shortcomings of using the outcome of similar clinical cases to drive future patient management decisions (i.e. representativeness heuristic) | 53% | 97%^a |
| 7. Apply sensitivity and specificity to history and physical exam findings | 50% | 97%^a |
| 8. Apply Bayesian analysis to clinical scenarios | 23% | 83%^a |
| 9. Identify errors in diagnostic verification | 20% | 100%^a |
| 10. Give examples of how pathophysiology can be used to reason through clinical scenarios (causal reasoning) | 67% | 97% |
| 11. Discuss the concept of diagnostic and treatment thresholds | 30% | 83%^a |
| 12. Identify cognitive errors that clinicians commonly make that can lead to errors in diagnosis | 27% | 97%^a |
| 13. Diagnose shortcomings in a learner’s (student’s or intern’s) clinical reasoning | 57% | 97%^a |
| 14. Provide feedback to learners about their clinical presentations and clinical reasoning | 63% | 93%^a |
| 15. Elicit feedback about diagnostic uncertainties from clinic preceptors | 50% | 93%^a |

^a p<0.05.
Discussion
Our clinical reasoning curriculum, using a combination of case vignettes, clinical prediction rules and reflection exercises, improved residents’ self-assessed ability to recognize and apply clinical reasoning concepts. Residents also demonstrated improvement in their ability to identify and apply knowledge of clinical reasoning concepts on a knowledge test after completing the curriculum. Our curriculum followed several best-practice approaches for teaching clinical reasoning: it was case-based, provided a foundation in clinical reasoning theory with a vocabulary and methods of clinical problem solving, and provided immediate feedback on residents’ cognitive errors [9]. We suspect that our curriculum’s impact was maximized by incorporating exercises in which residents practiced their new skills by reflecting on their own ambulatory clinical experiences.
Residents reported the least comfort on the pre-curriculum assessment in comparing analytical and non-analytical reasoning strategies, understanding the cognitive psychology of how memory and clinical knowledge are stored, and identifying cognitive errors. Even in domains that are more commonly incorporated into formal and informal inpatient teaching (such as utilizing epidemiology and pathophysiology in clinical reasoning), we saw improvements in self-reported comfort and knowledge. Repeated exercises in self-reflection and a session focused on assessing the learner likely helped residents improve their comfort and skill in diagnosing shortcomings in learners’ clinical reasoning and in providing feedback on learners’ clinical presentations.
In informal discussion with the residents weeks to months after the curriculum, we noted that residents who received the curriculum reported that they were able to identify and name cognitive errors generated by medical students and interns, as well as to identify their own potential sources of error. This indicates that residents were able to engage in metacognition, a recommended strategy to reduce cognitive error [4], [11]. Our findings are consistent with other studies [12], [13] that have found that clinical reasoning curricular interventions can improve physician awareness of their own cognitive biases.
There are limitations to our approach. There is currently no standardized tool to measure the effects of a clinical reasoning curriculum on resident skills, and only one other published study has assessed resident knowledge of clinical reasoning concepts [12]. While we could consider additional assessment methods, such as script concordance testing, objective structured clinical examinations, or completion of cases from the Human Diagnosis Project, results from studies using these methods have been mixed and none have shown a reduction in diagnostic errors [14], [15], [16]. One prior study describing a curriculum centered on illness scripts for medical students used a case-based assessment that showed an increase in diagnostic performance [17], which may be an area for future expansion of our curriculum. Finally, we were not able to measure diagnostic error rates in the residents’ clinical practice. To strengthen the evaluation of our curriculum, in future iterations we plan to evaluate the sustainability of residents’ knowledge and application of clinical reasoning concepts by conducting additional surveys 1 and 6 months after completion of the curriculum. We also plan to assess intermediate outcomes, such as the impact on residents’ use of critical thinking and metacognition.
Conclusions
A case-based clinical reasoning curriculum can effectively increase residents’ knowledge of clinical reasoning concepts and their self-assessed ability to recognize and apply clinical reasoning concepts.
Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.
Research funding: None declared.
Employment or leadership: None declared.
Honorarium: None declared.
Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.
References
1. Graber ML, Wachter RM, Cassel CK. Bringing diagnosis into the quality and safety equations. J Am Med Assoc 2012;308:1211–2. doi:10.1001/2012.jama.11913.
2. Bordage G. Why did I miss the diagnosis? Some cognitive explanations and educational implications. Acad Med 1999;74(10 Suppl):S138–43. doi:10.1097/00001888-199910000-00065.
3. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005;165:1493–9. doi:10.1001/archinte.165.13.1493.
4. Croskerry P. Cognitive forcing strategies in clinical decisionmaking. Ann Emerg Med 2003;41:110–20. doi:10.1067/mem.2003.22.
5. Croskerry P. From mindless to mindful practice – cognitive bias and clinical decision making. N Engl J Med 2013;368:2445–8. doi:10.1056/NEJMp1303712.
6. Graber ML. Educational strategies to reduce diagnostic error: can you teach this stuff? Adv Health Sci Educ Theory Pract 2009;14(Suppl 1):63–9. doi:10.1007/s10459-009-9178-y.
7. Abrami PC, Bernard RM, Wade EB, Surkes MA, Tamim R, Zhang D. Instructional interventions affecting critical thinking skills and dispositions: a stage 1 meta-analysis. Rev Educ Res 2008;78:1102–34. doi:10.3102/0034654308326084.
8. Kassirer JP, Wong JB, Kopelman RI. Learning clinical reasoning, 2nd ed. Lippincott Williams & Wilkins, 2009.
9. Kassirer JP. Teaching clinical reasoning: case-based and coached. Acad Med 2010;85:1118–24. doi:10.1097/ACM.0b013e3181d5dd0d.
10. Brame C. Flipping the classroom. Vanderbilt University Center for Teaching [cited 2013]. Available from: http://cft.vanderbilt.edu/guides-sub-pages/flipping-the-classroom/.
11. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf 2013;22(Suppl 2):ii65–72. doi:10.1136/bmjqs-2012-001713.
12. Reilly JB, Ogdie AR, Von Feldt JM, Myers JS. Teaching about how doctors think: a longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf 2013;22:1044–50. doi:10.1136/bmjqs-2013-001987.
13. Gay S, Bartlett M, McKinley R. Teaching clinical reasoning to medical students. Clin Teach 2013;10:308–12. doi:10.1111/tct.12043.
14. Semigran HL, Levine DM, Nundy S, Mehrotra A. Comparison of physician and computer diagnostic accuracy. JAMA Intern Med 2016;176:1860–1. doi:10.1001/jamainternmed.2016.6001.
15. Humbert AJ, Besinger B, Miech EJ. Assessing clinical reasoning skills in scenarios of uncertainty: convergent validity for a Script Concordance Test in an emergency medicine clerkship and residency. Acad Emerg Med 2011;18:627–34. doi:10.1111/j.1553-2712.2011.01084.x.
16. Park WB, Kang SH, Lee YS, Myung SJ. Does objective structured clinical examinations score reflect the clinical reasoning ability of medical students? Am J Med Sci 2015;350:64–7. doi:10.1097/MAJ.0000000000000420.
17. Lee A, Joynt GM, Lee AK, Ho AM, Groves M, Vlantis AC, et al. Using illness scripts to teach clinical reasoning skills to medical students. Fam Med 2010;42:255–61.
Supplementary Material
The online version of this article offers supplementary material (https://doi.org/10.1515/dx-2018-0093).
©2019 Walter de Gruyter GmbH, Berlin/Boston