
Pediatric faculty knowledge and comfort discussing diagnostic errors: a pilot survey to understand barriers to an educational program

Joseph A. Grubenhoff, Sonja I. Ziniel, Lalit Bajaj and Daniel Hyman
Published/Copyright: February 13, 2019

Abstract

Background

Improving Diagnosis in Health Care calls for improved training in diagnostic reasoning and establishing non-judgmental forums to learn from diagnostic errors arising from heuristic-driven reasoning. Little is known about pediatric providers’ familiarity with heuristics or the culture surrounding forums where diagnostic errors are discussed. This study aimed to describe pediatric providers’ familiarity with common heuristics and perceptions surrounding public discussions of diagnostic errors.

Methods

We surveyed pediatric providers at a university-affiliated children’s hospital. The survey asked participants to identify common heuristics used during clinical reasoning (five definitions; four exemplar clinical vignettes). Participants answered questions regarding comfort publicly discussing their own diagnostic errors and barriers to sharing them.

Results

Seventy (30.6% response rate) faculty completed the survey. The mean number of correctly selected heuristics was 1.60/5 [standard deviation (SD)=1.13] and 1.01/4 (SD=1.06) for the definitions and vignettes, respectively. A low but significant correlation existed between correctly identifying a definition and selecting the correct heuristic in vignettes (Spearman’s ρ=0.27, p=0.02). Clinicians were significantly less likely to be “pretty” or “very” comfortable discussing diagnostic errors in public vs. private conversations (28.6% vs. 74.3%, p<0.01). The most frequently cited barriers to discussing errors were loss of reputation (62.9%) and fear of knowledge-base (58.6%) or decision-making (57.1%) being judged.

Conclusions

Pediatric providers demonstrated limited familiarity with common heuristics leading to diagnostic error. More years in practice are associated with greater comfort discussing diagnostic errors, but negative peer and personal perceptions of diagnostic performance are common barriers to discussing errors publicly.

Introduction

The 2015 National Academy of Medicine’s report, Improving Diagnosis in Health Care [1], called attention to diagnostic errors and their contribution to patient harm. Among the major themes highlighted in the report is the need to improve training in clinical reasoning among diagnosticians. Theoretically, diagnosticians who are aware of potential flaws in human decision-making might be better prepared to recognize and avoid pitfalls in their diagnostic reasoning, thereby reducing patient harm [2].

Diagnostic reasoning can be understood using the dual process theory of decision-making, a general model describing how humans make decisions [3]. Dual process theory suggests two modes of reasoning – System 1 and System 2 – are used to make decisions. Deliberate, conscious, time-consuming and effortful judgments characterize System 2 reasoning. System 1 reasoning, in contrast, is primarily intuitive, subconscious, fast and relatively effortless [4]. System 1 reasoning relies on heuristics (often called cognitive biases). These mental shortcuts allow experienced clinicians to arrive at decisions quickly with a reasonable degree of certainty in familiar situations. Using heuristics to make decisions is neither avoidable nor an inherently inappropriate method for clinical reasoning [5]. While experienced clinicians may recognize their reliance on System 1 reasoning in clinical practice, the theories underlying decision-making are not formally taught in most medical school or residency curricula [6].

In clinical vignettes, cognitive errors contribute to overall diagnostic errors in as many as 37–77% of cases [7] leading some authors to suggest that understanding the etiology of these errors is paramount to reducing them [2]. Others have argued that knowing about heuristics may not necessarily allow clinicians to avoid diagnostic errors [8]. Consequently, some authors have suggested novel interventions (e.g. diagnostic checklists, differential diagnosis generators) to mitigate the risks posed by heuristic-driven reasoning [9], [10]. However, the deliberate use of such tools requires awareness of situations at high risk for cognitive errors. It is not clear that diagnosticians currently possess this awareness, but some understanding of the role of heuristics in diagnostic reasoning may be helpful in reducing harm from diagnostic errors.

Discussion of diagnostic errors during morbidity and mortality (M&M) conferences represents an approach to reducing harm through structured analysis of the errors including identification of cognitive factors [11]. Katz and Detsky suggest that M&M is an ideal venue to introduce discussion of cognitive errors [12]. Fruitful discussions of diagnostic error require an environment that is open to honest discussion focused on patient safety rather than harmful criticism of individuals [13]. Yet, discussions of harm arising from flawed diagnostic reasoning are uncomfortable for clinicians [14], [15]. For example, loss of reputation and fear of incurring blame are two significant components that influence clinicians’ comfort discussing their own errors with colleagues [16]. Recently, the promotion of both psychological safety (including a non-blaming atmosphere) and transparency when reporting and reviewing medical errors in public venues has received significant attention [17]. While some institutions have deliberately revamped their M&M conferences to promote psychological safety [18], there remains an undercurrent of blame culture in medicine [19].

We aimed to better understand two obstacles to improving training in clinical reasoning for clinicians. First, little is known about how familiar clinicians are with heuristics operating during the diagnostic process. Second, it is not clear whether current culture promotes psychological safety when discussing diagnostic errors in public venues such as M&M. In anticipation of developing a curriculum on diagnostic errors at our institution, we wanted to better characterize these obstacles among a subset of pediatric faculty. We hypothesized that participants’ awareness of heuristic definitions would positively correlate with an ability to recognize heuristics operating in clinical vignettes. We also sought to describe participants’ comfort discussing their own diagnostic errors in public venues and the perceived barriers to such discussions.

Materials and methods

Study design and participants

An anonymous web-based survey was distributed by e-mail invitation to all faculty members of the Divisions of Critical Care, Emergency Medicine and Hospital Medicine within the Department of Pediatrics at our institution. These divisions were selected because they have mature tiered case review programs amenable to discussing cognitive errors alongside systems issues. Physicians and advanced-practice providers (APPs; i.e. nurse practitioners and physician assistants) were eligible. The survey was available from January 31 to March 31, 2017. E-mail invitations were distributed at the beginning of the enrollment period with e-mail reminders sent at 3 and 6 weeks after the survey opened. Full- and part-time clinical faculty were eligible to participate; we did not enroll trainees. The Local Institutional Review Board deemed the study exempt from review.

Survey instrument

The survey consisted of three sections. The first two sections asked participants to select definitions of common heuristics (base rate neglect, outcome bias, search satisfying, anchoring, gambler’s fallacy) and to name heuristics operating in four general pediatric clinical vignettes (premature closure, posterior probability error, visceral bias, framing effect). For each definition and vignette, the correct answer, three foils, and an “I don’t know” option were provided. No correct answer in the definition questions was used as a correct answer for the vignettes. We used definitions for heuristics as described by Croskerry [20]. We selected heuristics that commonly contribute to diagnostic errors discussed in M&M in our own clinical experience; data on the true frequency of diagnostic errors attributable to specific heuristics are lacking [7]. To ensure that the vignettes accurately demonstrated the heuristic of interest, two investigators (JAG, SIZ) held two focus groups with representatives from each division prior to finalizing the survey instrument; focus group members did not participate in the survey.

The third section of the survey assessed participants’ comfort discussing diagnostic errors and diagnostic uncertainty with trusted colleagues in private and in public venues using a five-point Likert scale. Additionally, we asked participants to identify the three most significant barriers to discussing their own diagnostic errors in public.

We pilot tested the survey with general pediatric residents, chief residents and fellows in the specialties to whom the survey was ultimately distributed. The survey was administered using REDCap [21] and took an average of less than 10 min to complete.

Analysis

Fisher’s exact and chi-squared (χ2) tests were used to compare survey completion by age, gender, years in practice and provider type. Descriptive statistics were used to evaluate demographic characteristics. Mean number of correct responses with standard deviation (SD) were calculated for the definition and clinical vignette items. “I don’t know” responses were categorized as incorrect. Tetrachoric and Spearman rank correlation coefficients were used to explore associations between the total number of correctly answered heuristic definitions relative to the total number of correctly identified heuristics in clinical vignettes. Fisher’s exact and χ2 tests were used to assess differences in the attitudinal items by years in practice (dichotomized to <6 years and ≥6 years), gender and provider type. Differences related to age were compared using the Kruskal-Wallis test. For items regarding comfort discussing diagnostic errors, we dichotomized the responses into “comfortable” (including ‘pretty’ and ‘very’ comfortable) and “uncomfortable” (‘not at all’, ‘a little’ and ‘somewhat’ comfortable) as efforts to improve comfort surrounding such discussions would likely target those who are less comfortable or at least ambivalent about such discussions.
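The two central analyses described above can be sketched in a few lines of scipy. This is an illustrative reconstruction, not the authors’ code, and the per-participant scores and contingency counts below are hypothetical placeholders:

```python
# Illustrative sketch of the analyses described above (hypothetical data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant totals: correct heuristic definitions (0-5)
# and correct vignette identifications (0-4) for 70 respondents.
definition_scores = rng.integers(0, 6, size=70)
vignette_scores = rng.integers(0, 5, size=70)

# Spearman rank correlation between the two score totals.
rho, p = stats.spearmanr(definition_scores, vignette_scores)

# Dichotomized comfort ("comfortable" vs. "uncomfortable") by years in
# practice (<6 vs. >=6), compared with Fisher's exact test on a 2x2 table;
# the counts here are made up for illustration.
table = np.array([[27, 7],    # <6 years: uncomfortable, comfortable
                  [23, 13]])  # >=6 years: uncomfortable, comfortable
odds_ratio, p_fisher = stats.fisher_exact(table)
print(f"rho={rho:.2f} (p={p:.2f}), Fisher p={p_fisher:.2f}")
```

Dichotomizing the five-point comfort scale trades granularity for interpretability; with n=70, the resulting 2×2 tables can have small expected cell counts, which is why Fisher’s exact test is used alongside χ².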

Results

There were 229 eligible faculty from the three divisions surveyed. Overall, 88 (38.4%) faculty members accessed the survey and provided demographic information. Eight surveys were excluded from all further analyses because participants only provided demographic information; an additional 10 were excluded because they were only partially complete. There were no significant differences with respect to age, gender, years in practice or provider type between those who provided complete information and those with truncated responses. The overall response rate was 30.6% [22]. The response rate was significantly lower for APPs (9.0%) than for physicians (44.3%) (p<0.001). Table 1 shows the demographics of participants. Only 15.9% endorsed receiving formal training in clinical reasoning beyond training received in medical school or, for APPs, during graduate training. All participants endorsing formal training were younger than 50 years. Most participants (84.3%) reported that they had been involved in a diagnostic error resulting in patient harm during their career.

Table 1:

Demographic characteristics of respondents with completed surveys.

Respondentsa (n=70 except where noted)                                  n     %
Female (n=69)                                                           42    60.9
Age, years
  <30                                                                   1     1.4
  30–34                                                                 18    25.7
  35–39                                                                 23    32.9
  40–44                                                                 13    18.6
  45–49                                                                 8     11.4
  50–54                                                                 4     5.7
  55–59                                                                 3     4.3
Provider type (APP)                                                     8     11.4
Years in practice after training completed
  <1                                                                    5     7.1
  1–2                                                                   8     11.4
  3–5                                                                   19    27.1
  6–10                                                                  20    28.6
  >11                                                                   18    25.7
Formal training in clinical reasoning (yes) (n=69)                      11    15.9
Personally involved in diagnostic error resulting in harm (yes)         59    84.3
Most recent personal experience with diagnostic error resulting in harm (n=59)
  <1 year ago                                                           18    30.5
  1–5 years ago                                                         29    49.2
  >5 years ago                                                          12    20.3

aEligible participants from each section included: Critical Care=16 APPs, 16 physicians; Hospital Medicine=8 APPs, 31 physicians; Emergency Medicine=65 APPs, 93 physicians.
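The APP vs. physician response-rate comparison reported above can be reconstructed from the eligibility counts in the Table 1 footnote. This is a hedged check, not the authors’ analysis code; the physician responder count (62) is derived as 70 total respondents minus the 8 APPs in Table 1:

```python
# Reconstruct the APP vs. physician response-rate comparison from the
# division-level eligibility counts in the Table 1 footnote.
from scipy.stats import chi2_contingency

eligible_apps = 16 + 8 + 65   # 89 APPs across the three divisions
eligible_mds = 16 + 31 + 93   # 140 physicians
responding_apps = 8           # Table 1: provider type (APP)
responding_mds = 70 - 8       # derived: 62 physicians among 70 respondents

# 2x2 contingency table: responders vs. non-responders by provider type.
table = [[responding_apps, eligible_apps - responding_apps],
         [responding_mds, eligible_mds - responding_mds]]
chi2, p, dof, expected = chi2_contingency(table)
print(f"APP rate {responding_apps / eligible_apps:.1%}, "
      f"physician rate {responding_mds / eligible_mds:.1%}, p={p:.2g}")
```

The reconstructed rates (8/89 ≈ 9.0% and 62/140 ≈ 44.3%) match those reported in the text, and the χ² p-value is well below the reported p<0.001 threshold.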

Knowledge of heuristic definitions

No participant answered all five heuristic definition questions correctly; the mean number of correctly selected heuristic definitions was 1.60/5 (SD=1.13) (Figure 1). Reporting formal training in clinical reasoning did not significantly increase the likelihood of correctly selecting the heuristic term for any definition (p-values range from p=0.05 to 1.00). Age, gender, years in practice and provider type were not significantly associated with answer accuracy.

Figure 1: Knowledge of heuristic definitions. Solid bars indicate the proportion of participants answering correctly. Stippled bars indicate the proportion of participants answering incorrectly. Hashed bars indicate the proportion of participants responding “I don’t know”.

Identifying heuristics in vignettes

Only two participants selected the correct heuristic for all four vignettes; the mean number of correctly selected heuristics among the vignettes was 1.01/4 (SD=1.06) (Figure 2). Reporting formal training in clinical reasoning did not significantly increase the likelihood of correctly identifying the heuristic for any vignette (p=0.72–1.00). Age, gender, years in practice and provider type were not significantly associated with answer accuracy for any of the vignettes.

Figure 2: Identifying heuristics in clinical vignettes. Solid bars indicate the proportion of participants answering correctly. Stippled bars indicate the proportion of participants answering incorrectly. Hashed bars indicate the proportion of participants responding “I don’t know”.

There was a low but significant correlation between scores on the definition and vignette portions of the survey (Spearman’s ρ=0.27, p=0.02).

Comfort surrounding discussion of diagnostic errors

The proportion of participants reporting being “pretty” or “very” comfortable discussing diagnostic errors was significantly lower for public compared to private discussions: 28.6% vs. 74.3%, respectively (p<0.01). The proportion of participants reporting being “pretty” or “very” comfortable acknowledging diagnostic uncertainty to themselves was not significantly different than acknowledging diagnostic uncertainty to colleagues: 82.9% vs. 75.7%, respectively (p=0.06). Age, gender, formal training and provider type were not significantly associated with comfort levels. Years in practice was significantly associated with comfort discussing errors in public and acknowledging uncertainty to colleagues (Table 2).

Table 2:

Provider comfort discussing diagnostic errors in private and public settings by years in practice.

                                          Uncomfortablea   Comfortableb   p-Valuec
Comfort discussing diagnostic errors in private
Years in practice (n=70)
  <6 years                                61.1%            40.4%          0.172
  ≥6 years                                38.9%            59.6%
Comfort discussing diagnostic errors in public
Years in practice (n=70)
  <6 years                                54.0%            25.0%          0.035
  ≥6 years                                46.0%            75.0%
Comfortable acknowledging your diagnostic uncertainty to yourself
Years in practice (n=70)
  <6 years                                50.0%            44.8%          0.761
  ≥6 years                                50.0%            55.2%
Comfortable acknowledging your diagnostic uncertainty to colleagues
Years in practice (n=71)
  <6 years                                76.5%            35.9%          0.005
  ≥6 years                                23.5%            64.1%

aIncludes not at all, a little or only somewhat comfortable; bincludes pretty or very comfortable; cFisher’s exact test. p-Values in bold typeface indicate statistically significant differences in proportion of respondents’ level of comfort discussing diagnostic errors.

Participants identified the three most significant barriers to discussing their own diagnostic errors in public venues (Table 3). Over 50% of participants indicated that the most significant barriers were their reputation as a provider being at stake (62.9%) and their knowledge-base (58.6%) or decision-making (57.1%) being judged.

Table 3:

Barriers to discussing one’s own diagnostic errors in public.

Barriera                                                                    Proportion citing barrier, % (n)
I feel like my reputation as a provider is at stake                         62.9% (44)
I feel like my knowledge-base is being judged                               58.6% (41)
I feel like my decision-making is being judged                              57.1% (40)
I feel like a bad clinician when I make an error                            48.6% (34)
I don’t want peers to know I made an error                                  11.4% (8)
I don’t have any barriers to discussing my own diagnostic errors in public  11.4% (8)
I feel like my job is at stake                                              10.0% (7)
I don’t want trainees to know I made an error                               4.3% (3)
I feel like my work ethic is being judged                                   2.9% (2)
I don’t want supervisors to know I made an error                            2.9% (2)

aEach respondent could select up to three options.

Discussion

Pediatric faculty in this study demonstrated limited ability to identify heuristic definitions. Similarly, they were frequently unable to correctly name heuristics employed in clinical vignettes. We did find evidence to support our hypothesis that knowledge of heuristic definitions correlates with being able to identify heuristics in clinical vignettes, although the correlation is low. Pediatric faculty expressed significant discomfort discussing their own errors publicly in venues such as M&M despite being generally comfortable discussing diagnostic errors in private and acknowledging diagnostic uncertainty to themselves and their peers. This discomfort appears to arise from concerns about negative peer perceptions regarding clinical reputation and the diagnosticians’ clinical reasoning skills and knowledge.

Clinicians are central to efforts to reduce patient harm resulting from diagnostic errors. Success in these efforts may be dependent on two related conditions: (1) the diagnostician’s ability to recognize how diagnostic reasoning might be subconsciously influenced through reliance on heuristics; and (2) the willingness to discuss cognitive errors in public in order to identify solutions to avoid them in the future.

Pediatricians can appreciate their own diagnostic errors in hindsight. Singh and colleagues found that 54% of surveyed pediatricians admitted making 1–2 diagnostic errors per month, and 45% reported making 1–2 diagnostic errors per year that resulted in patient harm [23]. However, reducing patient harm due to diagnostic error requires that diagnosticians recognize and avoid heuristic failures prior to committing them.

There are few studies investigating strategies to avoid, or mitigate the effects of, heuristics and results are mixed [24]. Most studies have included trainees and have focused on clinical vignettes rather than real patient encounters. While some studies have demonstrated small but significant improvements in diagnostic accuracy when residents were instructed to rely on slow deliberate System 2 reasoning (through structured reflection) [25], [26], one study did not demonstrate a benefit with this approach [27]. Sherbino and colleagues provided instruction on two common heuristics (search satisfying and availability) to senior medical students followed by training on cognitive forcing strategies to avoid heuristic errors. Despite immediate testing after the educational intervention, fewer than 50% of students successfully “debiased” themselves [28]. Little work has been done to evaluate practicing clinicians’ familiarity with and ability to recognize heuristics. We conducted this study to begin to address that gap.

Participants in this study were unable to correctly name several defined heuristics or identify heuristics operating in several clinical vignettes. We found that formal training in diagnostic reasoning did not improve participants’ ability to identify heuristics. We did not collect information regarding the timing or content of this training so it is difficult to know whether more robust or more proximate education would have yielded improved accuracy. However, our results align with another study in which experts in diagnostic error were asked to identify heuristics (cognitive biases) perceived to be operating in eight clinical vignettes [29]. Participants demonstrated no agreement when naming heuristics represented by the vignettes (κ=0.0–0.6).

The goal of introducing diagnosticians to concepts including heuristics and dual process theory is not to teach new vocabulary. Rather, improving diagnosticians’ awareness of these concepts aims to improve recognition of the risk of diagnostic errors arising from System 1 reasoning. Nemes-Walsh and Lee evaluated the effect of an educational intervention introducing topics including dual process theory and cognitive bias pertaining to diagnostic errors [30]. This study demonstrated a slight improvement in responses on the National Patient Safety Foundation Reducing Diagnostic Errors: Strategies for Solutions Quiz and increased willingness to take a “diagnostic timeout” following the intervention. There was no indication, however, that fewer diagnostic errors occurred after the training. In a more rigorous study, residents completing a yearlong curriculum on cognitive biases and diagnostic errors demonstrated improvement in correctly naming heuristics [31]. Additionally, they were frequently able to identify heuristics in standardized clinical vignettes and describe potential debiasing strategies. The modest yet significant association between correctly naming defined heuristics and identifying heuristics in clinical vignettes in the present study aligns with those findings. While mastery of vocabulary alone is unlikely to reduce diagnostic errors arising from use of heuristics, conceptual familiarity with this type of reasoning may improve recognition of high-risk clinical scenarios and warrants further study to determine whether these interventions reduce diagnostic errors in clinical encounters.

Diagnosticians in this study were generally comfortable speaking privately to peers about their diagnostic errors. That comfort did not persist when participants considered discussing diagnostic errors in public settings. This discomfort appears to stem from a general concern over how diagnosticians’ knowledge, decision-making and reputation are judged rather than from the error simply being made public. Diagnosticians also cited feeling like a bad clinician after an error as a barrier. These findings suggest significant cultural changes are necessary to advance patient safety. We did not inquire about fear of malpractice litigation, which may also be a substantial barrier to openly discussing diagnostic errors.

We were encouraged by the finding that more experienced clinicians were significantly more comfortable with public discussions of their errors than their less experienced peers. These data imply that senior clinicians may serve as influential role models in the cultural shift required to share experiences publicly in efforts to reduce harm from diagnostic errors.

Interpretation of our study must be considered in light of some limitations. We experienced a low response rate from APPs and, as a pilot study, sampled a relatively narrow segment of our hospital’s pediatric faculty. Our findings may not be representative of faculty at other hospitals or providers in non-academic and outpatient settings. We studied a small number of heuristics. Croskerry lists over 30 heuristics encountered in clinical reasoning but data describing which heuristics are most common or significant in diagnostic reasoning are limited [20]. Thus, we chose heuristics that our own clinical experience indicates are employed relatively frequently but acknowledge that they are not proven to be the most common culprits in diagnostic errors. We also acknowledge that our survey attempted to assess familiarity with heuristic concepts using definitions as a surrogate. It would be more useful to assess familiarity using more qualitative methods discussing vignettes with diagnosticians; this is an opportunity for future study.

Conclusions

Clinicians in our study were unable to match heuristic definitions or clinical vignettes exemplifying heuristics to the proper heuristic term. Attempts to foster a culture supportive of discussing diagnostic errors will require both improved awareness of the role heuristics play in the diagnostic process and better methods to promote psychological safety when sharing errors.


Corresponding author: Joseph A. Grubenhoff, MD, MSCS, Associate Professor of Pediatrics, Associate Medical Director of Clinical Effectiveness, University of Colorado Denver School of Medicine, 13123 E. 16th Avenue, B-251, Aurora, CO 80045, USA; and Children’s Hospital Colorado, Aurora, CO, USA, Phone: 3037242581

  1. Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.

  2. Research funding: None declared.

  3. Employment or leadership: None declared.

  4. Honorarium: None declared.

  5. Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.

References

1. Balogh E, Miller BT, Ball J, Institute of Medicine (U.S.) Committee on Diagnostic Error in Health Care. Improving diagnosis in health care. Washington, DC: The National Academies Press, 2015. doi:10.17226/21794

2. Bordini BJ, Stephany A, Kliegman R. Overcoming diagnostic errors in medical practice. J Pediatr 2017;185:19–25.e1. doi:10.1016/j.jpeds.2017.02.065

3. Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013;22:ii58–ii64. doi:10.1136/bmjqs-2012-001712

4. Evans JS. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol 2008;59:255–78. doi:10.1146/annurev.psych.59.103006.093629

5. McLaughlin K, Eva KW, Norman GR. Reexamining our bias against heuristics. Adv Health Sci Educ Theory Pract 2014;19:457–64. doi:10.1007/s10459-014-9518-4

6. Rencic J, Trowbridge RL Jr, Fagan M, Szauter K, Durning S. Clinical reasoning education at US medical schools: results from a national survey of internal medicine clerkship directors. J Gen Intern Med 2017;32:1242–6. doi:10.1007/s11606-017-4159-y

7. Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: a systematic review. BMC Med Inform Decis Mak 2016;16:138. doi:10.1186/s12911-016-0377-1

8. Gordon R, Franklin N. Cognitive underpinnings of diagnostic error. Acad Med 2003;78:782. doi:10.1097/00001888-200308000-00005

9. Graber ML, Sorensen AV, Biswas J, Modi V, Wackett A, Johnson S, et al. Developing checklists to prevent diagnostic error in Emergency Room settings. Diagnosis 2014;1:223–31. doi:10.1515/dx-2014-0019

10. Riches N, Panagioti M, Alam R, Cheraghi-Sohi S, Campbell S, Esmail A, et al. The effectiveness of electronic differential diagnoses (DDX) generators: a systematic review and meta-analysis. PLoS One 2016;11:e0148991. doi:10.1371/journal.pone.0148991

11. Cifra CL, Jones KL, Ascenzi JA, Bhalala US, Bembea MM, Newman-Toker DE, et al. Diagnostic errors in a PICU: insights from the morbidity and mortality conference. Pediatr Crit Care Med 2015;16:468–76. doi:10.1097/PCC.0000000000000398

12. Katz D, Detsky AS. Incorporating metacognition into morbidity and mortality rounds: the next frontier in quality improvement. J Hosp Med 2016;11:120–2. doi:10.1002/jhm.2505

13. Woodward HI, Lemer C, Wu AW. An end to the witch hunts: responding to the defenders of blame and shame. A commentary on Collins, Block, Arnold and Christakis. Soc Sci Med 2009;69:1291–3. doi:10.1016/j.socscimed.2009.08.008

14. Hilfiker D. Facing our mistakes. N Engl J Med 1984;310:118–22. doi:10.1056/NEJM198401123100211

15. Luu S, Leung SO, Moulton CA. When bad things happen to good surgeons: reactions to adverse events. Surg Clin North Am 2012;92:153–61. doi:10.1016/j.suc.2011.12.002

16. Kaldjian LC, Forman-Hoffman VL, Jones EW, Wu BJ, Levi BH, Rosenthal GE. Do faculty and resident physicians discuss their medical errors? J Med Ethics 2008;34:717–22. doi:10.1136/jme.2007.023713

17. Khatri N, Brown GD, Hicks LL. From a blame culture to a just culture in health care. Health Care Manage Rev 2009;34:312–22. doi:10.1097/HMR.0b013e3181a3b709

18. Tad YD, Pierce RG, Pell JM, Stephan L, Kneeland PP, Wald HL. Leveraging a redesigned morbidity and mortality conference that incorporates the clinical and educational missions of improving quality and patient safety. Acad Med 2016;91:1239–43. doi:10.1097/ACM.0000000000001150

19. Collins ME, Block SD, Arnold RM, Christakis NA. On the prospects for a blame-free medical culture. Soc Sci Med 2009;69:1287–90. doi:10.1016/j.socscimed.2009.08.033

20. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003;78:775–80. doi:10.1097/00001888-200308000-00003

21. Harris PA, Taylor R, Thielke R, Payne J, Gonzalez N, Conde JG. Research electronic data capture (REDCap) – a metadata-driven methodology and workflow process for providing translational research informatics support. J Biomed Inform 2009;42:377–81. doi:10.1016/j.jbi.2008.08.010

22. American Association for Public Opinion Research. Standard definitions: final dispositions of case codes and outcome rates for surveys, 9th ed., 2016.

23. Singh H, Thomas EJ, Wilson L, Kelly PA, Pietz K, Elkeeb D, et al. Errors of diagnosis in pediatric practice: a multisite survey. Pediatrics 2010;126:70–9. doi:10.1542/peds.2009-3218

24. McDonald KM, Matesic B, Contopoulos-Ioannidis DG, Lonhart J, Schmidt E, Pineda N, et al. Patient safety strategies targeted at diagnostic errors: a systematic review. Ann Intern Med 2013;158:381–9. doi:10.7326/0003-4819-158-5-201303051-00004

25. Monteiro SD, Sherbino J, Patel A, Mazzetti I, Norman GR, Howey E. Reflecting on diagnostic errors: taking a second look is not enough. J Gen Intern Med 2015;30:1270–4. doi:10.1007/s11606-015-3369-4

26. Mamede S, van Gog T, van den Berge K, Rikers RM, van Saase JL, van Guldener C, et al. Effect of availability bias and reflective reasoning on diagnostic accuracy among internal medicine residents. JAMA 2010;304:1198–203. doi:10.1001/jama.2010.1276

27. Norman G, Sherbino J, Dore K, Wood T, Young M, Gaissmaier W, et al. The etiology of diagnostic errors: a controlled trial of system 1 versus system 2 reasoning. Acad Med 2014;89:277–84. doi:10.1097/ACM.0000000000000105

28. Sherbino J, Yip S, Dore KL, Siu E, Norman GR. The effectiveness of cognitive forcing strategies to decrease diagnostic error: an exploratory study. Teach Learn Med 2011;23:78–84. doi:10.1080/10401334.2011.536897

29. Zwaan L, Monteiro S, Sherbino J, Ilgen J, Howey B, Norman G. Is bias in the eye of the beholder? A vignette study to assess recognition of cognitive biases in clinical case workups. BMJ Qual Saf 2017;26:104–10. doi:10.1136/bmjqs-2015-005014

30. Nemes-Walsh JK, Lee AJ. Diagnostic errors: impact of an educational intervention on pediatric primary care. J Pediatr Health Care 2017;32:10. doi:10.1016/j.pedhc.2017.07.004

31. Reilly JB, Ogdie AR, Von Feldt JM, Myers JS. Teaching about how doctors think: a longitudinal curriculum in cognitive bias and diagnostic error for residents. BMJ Qual Saf 2013;22:1044–50. doi:10.1136/bmjqs-2013-001987

Received: 2018-07-19
Accepted: 2019-01-21
Published Online: 2019-02-13
Published in Print: 2019-06-26

©2019 Walter de Gruyter GmbH, Berlin/Boston
