Article Open Access

Diagnosis education: a decade of progress, with more needed

  • Andrew P. J. Olson, Joseph J. Rencic, and Thilan P. Wijesekera
Published/Copyright: October 13, 2025
Diagnosis
From the journal Diagnosis, Volume 12, Issue 4

Abstract

In the decade since the National Academies of Sciences, Engineering, and Medicine (NASEM) report Improving Diagnosis in Health Care, substantial progress has been made in understanding and teaching diagnostic reasoning. This manuscript reviews key advancements in the science and theory of clinical reasoning, including the shift from an exclusive focus on individual cognitive models to models that embrace context and team-based approaches. Recent innovations in diagnosis education, such as the development of formal competencies, the use of structured reflection, and new approaches to assessment, are discussed. Despite these gains, challenges remain in translating theory into practice, particularly in curricular innovation and implementation, faculty development, and assessment. The emergence of generative artificial intelligence presents both opportunities and imperatives for reimagining diagnosis education. The authors call for sustained efforts to embed diagnostic excellence across health professions education, emphasizing interprofessional collaboration, patient engagement, and system-level reform to reduce diagnostic error and improve outcomes.

Introduction

One of the main recommendations in the seminal NASEM report, Improving Diagnosis in Health Care, was to improve training in diagnostic reasoning across health professions education [1]. The burden of diagnostic error continues to weigh heavily on all those involved in health care, and the pursuit of diagnostic excellence has become a shared goal across health care [2]. In the decade since this report’s publication and its call to improve education, substantial progress has been made in how diagnosis is understood, taught, and learned. Our understanding of the science of clinical (specifically diagnostic) reasoning has continued to expand, and the empirical research underlying this understanding has deepened. However, as in other areas of medicine and health care, there remains a gap between evidence and practice. We know much of what we should do with respect to education and training in diagnosis, but the translation of this knowledge into educational practice remains challenging and incomplete, and opportunities remain. Such opportunities were highlighted in a national study of Internal Medicine Clerkship Directors: most institutions lack specific content focused on clinical reasoning [3]. In this manuscript, we will highlight important advances related to education about diagnosis and diagnostic reasoning, identify continued opportunities to translate these advances into educational practice, and highlight emerging trends that will impact education and training about diagnosis and diagnostic reasoning.

Evolution of the science and theory of clinical reasoning

For as long as there has been medicine, there has been clinical reasoning. That is, clinicians have gathered, appraised, and synthesized information to make diagnostic and prognostic judgements and management decisions. Until recently, however, our understanding of diagnostic reasoning and how to teach it has been largely experiential rather than theory-informed. Fortunately, over the last decade, there has been substantial progress in defining diagnostic reasoning, in the theories that inform our understanding of it, and in how these definitions and theories inform education.

A very helpful definition by Young et al. defines clinical reasoning as “…a process that operates toward the purpose of arriving at a diagnosis, treatment, and/or management plan, as well as striving for improved patient outcomes and well-being.” [4] While broad, this definition helps make clear that clinical reasoning is a process (rather than simply a phenomenon) with a clear goal. This definition of clinical reasoning also highlights the importance and interrelatedness of diagnosis and management reasoning, and the impact that both have on the health and wellness of patients.

Our theoretical understanding of the underlying cognitive processes of diagnostic reasoning has progressed significantly in recent years, and important concepts are shown in Table 1. Classical theories, such as dual-process theory, still form the basis for our conceptualization of how clinicians make decisions. Clinicians, just like humans in every other situation they encounter, use both non-analytic and analytic decision-making processes to make diagnoses. It has become clear, however, that both systems are error-prone. Efforts simply to encourage analytic reasoning, and those focused on de-biasing, may be helpful but are not complete solutions [11], [12]. Instead, content-focused strategies that continually expand knowledge and enable its retrieval and application are necessary. Some emerging ideas conceive of clinical reasoning (both diagnosis and management) as a type of reasoning that activates and implements context-specific schema to identify the best way to proceed; that is, clinical expertise aims at inference to the best action in each clinical situation [13].

Table 1:

Key concepts informing diagnosis education.

Concept | Working definition | Example
Context specificity [5] | Patient factors, physician factors, and environmental factors that affect diagnostic reasoning | Given a patient with the same symptoms, a tired physician working overnight in a busy hospital may have more difficulty making the diagnosis than if that patient were the only one seen during the day
Content specificity [5] | A clinician’s diagnostic ability depends on their medical knowledge in a given domain | A generalist clinician may be excellent at diagnosing rotator cuff disease but poor at diagnosing dermatologic conditions
Cognitive load theory [6], [7] | How different cognitive demands (intrinsic, extraneous, and germane load) affect our ability to make diagnoses | A student is cognitively overwhelmed by a patient’s myriad symptoms and signs, while an experienced clinician easily chunks the data into the cognitive bucket of an “acute inflammatory respiratory syndrome”
Dual process theory [8] | How system 1 (fast, heuristic, intuitive) and system 2 (slow, systematic, analytical) thinking are used by a clinician to make a diagnosis | A patient’s mole immediately triggers the thought of melanoma for a dermatologist, who nonetheless performs a careful exam to confirm the diagnosis
Ecological psychology [9] | How providers use their effectivities (i.e., knowledge, skills, and abilities) to diagnose and treat patients in a clinical environment that includes affordances (i.e., opportunities to make a timely diagnosis) | A clinician hears a murmur despite the noise of a busy emergency department
Distributed cognition [10] | Bidirectional interactions between individuals and artifacts (e.g., medical charts) facilitate clinical reasoning | A physician assistant’s history-taking and a radiologist’s chest X-ray read combine to make a pneumonia diagnosis

As knowledge expands, it is continually refined and reorganized based on experience and feedback about performance [14]. Unfortunately, unlike most fields of human performance that have robust and routine systems to enable individuals to know the outcomes of their decisions and to refine their future practice based on those outcomes, clinical medicine is largely composed of open-loop systems without reliable means of delivering feedback about diagnostic reasoning [15]. There has been significant focus on building programs to ensure this feedback occurs, although most have been in research or pilot settings, and generating meaningful feedback that will improve clinician performance is difficult [16], [17], [18], [19].
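
To make the idea of closing this feedback loop concrete, the sketch below estimates how well a clinician’s stated diagnostic confidence tracks eventual accuracy, the calibration signal emphasized by Zwaan and Hautz [19]. It is a minimal illustration with invented data and hypothetical names, not a description of any deployed feedback system.

```python
# Minimal sketch: estimating diagnostic calibration from follow-up data.
# All names and data are hypothetical; a real feedback system would draw
# outcomes from chart review, registries, or structured follow-up [15-19].

from dataclasses import dataclass

@dataclass
class DiagnosticJudgement:
    confidence: float  # clinician's stated probability the diagnosis is right (0-1)
    correct: bool      # whether follow-up confirmed the diagnosis

def calibration_by_bin(judgements, n_bins=5):
    """Group judgements into confidence bins and compare stated confidence
    with observed accuracy; a gap signals over- or under-confidence."""
    bins = [[] for _ in range(n_bins)]
    for j in judgements:
        idx = min(int(j.confidence * n_bins), n_bins - 1)
        bins[idx].append(j)
    report = []
    for members in bins:
        if not members:
            continue
        mean_conf = sum(j.confidence for j in members) / len(members)
        accuracy = sum(j.correct for j in members) / len(members)
        report.append((mean_conf, accuracy, len(members)))
    return report

# Example: invented judgements showing an overconfident pattern
sample = [DiagnosticJudgement(0.9, False), DiagnosticJudgement(0.85, True),
          DiagnosticJudgement(0.6, True), DiagnosticJudgement(0.3, False)]
for conf, acc, n in calibration_by_bin(sample):
    print(f"stated {conf:.2f} vs observed {acc:.2f} (n={n})")
```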

Perhaps the most important progress in the theory of clinical reasoning has been the move from a singular focus on individuals and their cognitive processes to a broader focus on how interactions among clinicians, patients, and clinical environments in specific contexts impact diagnostic reasoning. Diagnosis is a team sport that includes patients and other health professionals [20]. Focusing only on how individuals reason can lead to an inaccurate or incomplete understanding of diagnostic judgements. As a result, researchers have used social cognitive theories to explore the impact of interactions between health professionals and the environment (e.g., clinical contexts, the electronic medical record, artificial intelligence) on diagnostic reasoning. In addition, the brain encodes knowledge within context, so the more learners learn in clinical environments, the more likely they will be to recall that knowledge in similar environments. Context is foundational rather than something to be managed or mitigated [21], [22], [23], [24], [25]. Thus, medicine and diagnostic reasoning must be taught and learned as much as possible in context, while seeking to manage cognitive load to maximize learning and promote accurate diagnosis [6].

Innovations in diagnosis education

As the science and theory of clinical reasoning, especially diagnostic reasoning, have continued to progress, diagnosis education has also made substantial progress. Even the coining of the term “diagnosis education” reflects a more nuanced understanding of the need for explicit (rather than tacit or purely experiential) teaching and learning about diagnosis [26]. Recent work has sought to define diagnosis education and determine how best to further diagnostic excellence through education [27]. Interprofessional groups of health professions education experts and patients have sought to codify the competencies necessary for diagnostic excellence at the individual learner level as well as at the team and system levels [28], [29], [30]. The overall goal of these projects was to outline the outcomes that diagnosis education programs should seek to achieve in their learners. The content and structure of these competencies reflect the theoretical progress previously described and thus have important implications for the refinement and design of diagnosis education programs. Recognizing the importance of context and team-based diagnosis, the Society to Improve Diagnosis in Medicine competencies are applicable to all health professions and explicitly divided into individual, team-based, and systems competencies [30]. This structure denotes that robust educational programs must address competence at each of these levels. Perhaps most importantly, these competencies are meant to be tailored to, and shared between, different health professionals. Diagnosis is not the purview of one profession but is sacredly shared among all those involved in a patient’s journey. Thus, all health professions education programs must focus to a degree on that profession’s role in the diagnostic process, with awareness of how it contributes to diagnostic excellence. Complementary projects by the DID-ACT consortium [28] and CReME [29] also delineated the structure and content of clinical reasoning curricula, highlighting the primacy of knowledge as well as the need for faculty development and for high-quality curricular resources that are available for use in all institutions.

With any complex ability or task (i.e., competency) like diagnostic reasoning, there are “sub-abilities” or “subtasks” that need to be developed in novices so that they can eventually perform the whole ability or task effectively. The key sub-abilities, or subtasks, of high-quality diagnostic reasoning include hypothesis-driven data collection, problem representation, differential diagnosis generation, prioritization, and justification, as well as metacognitive and reflective strategies to improve diagnostic performance. Identifying these sub-abilities and strategies provides a structure for curricular design and a roadmap for assessments across professions as well as across the continuum from education to practice. As learners develop competence in diagnostic reasoning over time, repeated focus on these sub-abilities as well as on the whole task of diagnostic reasoning allows more expert performance to develop [31]. Deliberate practice is often discussed and encouraged, although often more in aspirational or theoretical discussions than in educational reality. Deliberate practice requires repeated opportunities to practice a specific task and receive feedback about that performance over time; it is effortful, time-intensive, and longitudinal. Classroom clinical reasoning activities are not authentic enough, and it is nearly impossible to perform deliberate practice in clinical environments given the competing priorities of patient care, learning, and resource constraints. Because clinical reasoning is content- and context-specific, high- or medium-fidelity simulation with artificial intelligence-provided feedback may be a scalable model for increasing opportunities for deliberate practice in diagnostic reasoning education. That is, a learner may be able to have enough repeated opportunities for practice, and receive feedback about that practice, to approach the level of focused repetition and feedback required for deliberate practice. It is clear that focus on whole-task diagnosis practice, as well as sub-ability skill development, with immediate feedback and sustained, repeated exposures (repetitions), is necessary for expertise to develop [31], [32].
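
As a minimal illustration of the deliberate practice loop described above, the sketch below pairs repeated case exposure with immediate, specific feedback. The cases, the scoring rule, and the feedback wording are all hypothetical placeholders; a real simulation platform (with or without a large language model) would be far richer.

```python
# Minimal sketch of a deliberate-practice loop for diagnostic reasoning.
# Cases, scoring, and feedback text are invented placeholders; a real
# system might use simulated patients or an LLM to generate both.

CASES = [
    {"stem": "55-year-old with acute chest pain and diaphoresis",
     "answer": "acute myocardial infarction"},
    {"stem": "24-year-old with acute monoarticular knee pain and swelling",
     "answer": "septic arthritis"},
]

def feedback(learner_dx: str, correct_dx: str) -> str:
    """Immediate, specific feedback is the core of deliberate practice."""
    if learner_dx.strip().lower() == correct_dx:
        return "Correct. Compare this case with prior ones to refine your illness script."
    return f"Not quite: the intended diagnosis was '{correct_dx}'. Revisit the discriminating features."

def practice_session(get_learner_response, rounds=2):
    """Repeat cases over multiple rounds so feedback informs the next attempt."""
    for round_num in range(rounds):
        for case in CASES:
            dx = get_learner_response(case["stem"])
            print(f"Round {round_num + 1}: {feedback(dx, case['answer'])}")

# Example: a toy 'learner' that always gives the same answer
practice_session(lambda stem: "acute myocardial infarction")
```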

An additional sub-ability needed in diagnostic reasoning is metacognition, including reflection. Although reflection is clearly essential to developing any ability and has been shown to improve diagnostic learning, the impact of reflection on diagnostic accuracy in clinical contexts remains uncertain. For example, researchers have demonstrated that structured reflection (Table 2) improved diagnostic performance among clinicians on difficult clinical vignette cases [33]. Unfortunately, structured reflection did not improve diagnostic performance on easy clinical vignettes, and there is a lack of research demonstrating its benefit in authentic clinical settings outside learning environments. Given these limitations, additional research is needed to determine whether structured reflection improves diagnostic accuracy in clinical settings and, if so, what the best practices are for implementing it, given that physicians have difficulty discriminating easy from hard cases and need to use efficient reasoning processes to get through busy clinical days. Recent explorations pairing humans with artificial intelligence tools (specifically large language models) have intentionally used structured reflection as a diagnostic reasoning tool while also using performance in a structured reflection activity as a measure of diagnostic expertise [34].

Table 2:

Sample structured reflection grid with instructions.

Possible diagnoses (one per box) | Findings or risk factors supporting this hypothesis | Findings opposing this hypothesis | Findings that were expected for the diagnosis, but not present | Final ranking of likelihood

Note: An example of a structured reflection grid (as per Mamede et al. [33]) is shown. Users are instructed to first write down the most likely diagnosis based on initial impressions and then complete the associated row, identifying information that supports and opposes this diagnosis, as well as information that would be expected to be present if the diagnosis were accurate but is absent. The process is then repeated for additional differential diagnosis items in subsequent rows; at the end of the task, users rank-order the diagnoses by likelihood.
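
For readers who build teaching tools, the sketch below represents the grid in Table 2 as a simple data structure and implements the final rank-ordering step. The field names and example findings are illustrative assumptions, not part of the published instrument [33].

```python
# Minimal sketch of the structured reflection grid in Table 2 (after
# Mamede et al. [33]); field names, findings, and likelihood values are
# illustrative, not part of the published instrument.

from dataclasses import dataclass, field

@dataclass
class ReflectionRow:
    diagnosis: str
    supporting: list = field(default_factory=list)            # findings/risk factors for
    opposing: list = field(default_factory=list)              # findings against
    expected_but_absent: list = field(default_factory=list)   # expected if true, missing
    likelihood: float = 0.0  # the user's final judgement, not computed

def rank_order(rows):
    """Final step of the grid: rank-order the differential by likelihood."""
    return sorted(rows, key=lambda r: r.likelihood, reverse=True)

grid = [
    ReflectionRow("pulmonary embolism", ["pleuritic pain", "tachycardia"],
                  ["no hypoxia"], ["unilateral leg swelling"], likelihood=0.6),
    ReflectionRow("pneumonia", ["fever", "cough"], ["clear lung exam"],
                  ["infiltrate on chest X-ray"], likelihood=0.3),
]
for rank, row in enumerate(rank_order(grid), start=1):
    print(rank, row.diagnosis)
```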

The Competencies to Improve Diagnosis developed by the Society to Improve Diagnosis in Medicine also highlight the importance of equipping learners (and those in practice) to work as members of the diagnostic team, especially given the calls for more clinical reasoning education within the professions of nursing and pharmacy [35], [36]. This means not only deeply and intentionally collaborating with other members of the health care team, but also stating unequivocally that patients and their loved ones are members of that team. Thus, learners must be trained to collaborate effectively with patients and their loved ones in order to co-create diagnostic excellence. Diagnosis is not something done for patients, but instead is done with patients. Educational programs have substantial opportunity to design curricular tools and structures that engage patients as teachers, collaborators, and co-designers of diagnostic curricula, just as patients, ideally, are co-creators of the diagnostic process. If we encourage diagnosing together in practice, we should aim to learn how to diagnose together during training as well.

Finally, these competencies also highlight the influence that health systems have on diagnostic performance. The modern health care ecosystem is typified by discontinuity, scarcity mindsets, inequity, and misaligned motivations and incentives among patients, health care systems, insurers, and others; yet this same ecosystem can also achieve miraculous outcomes. Learners must be trained to work within these systems to achieve diagnostic excellence and must be equipped and empowered as change agents to ensure these systems work better in the future.

There are many challenges to furthering diagnosis education in health professions education programs. Foremost, there are many competing curricular priorities that programs are called to address and balance. These are all worthy, yet innovations must be incorporated into already over-packed curricula. At the same time, there are no shortcuts to diagnostic reasoning expertise. If we seek to have learners gain expertise through deliberate practice, they must have longitudinal, repeated, and focused opportunities to practice their diagnostic reasoning and receive feedback. Such opportunities remain generally aspirational in most training settings.

In addition, when designing curricula, it is important to recognize that diagnosis is deeply interconnected with all other aspects of health care; thus, curricular interventions focused on health equity, health systems science, communication, and technological innovation (to name but a few) can all incorporate diagnosis-focused content. High-quality diagnosis is the cornerstone of good medical care, and deliberate focus and sustained attention on diagnostic excellence will reap benefits in patient management, communication, patient outcomes, and even learner and clinician wellness. Another challenge is that diagnosis remains relatively tacit in program competencies, although there has been progress. Connor et al. suggest that clinical reasoning be codified as a core competency for graduate medical education by the Accreditation Council for Graduate Medical Education [37]. Such a change is as yet unrealized, but if made (possibly in the form of a dedicated “Diagnosis” course) it would drive necessary sustained curricular attention and assessment, as well as program-level accountability for ensuring that trainees are competent in clinical reasoning and its various aspects. Similar calls have been made for competencies in Canada and Europe [38].

Finally, another major challenge in diagnostic reasoning education is the limited progress in related faculty development across the health professions. The lack of diagnosis-specific curricula in these programs has created a feed-forward loop in which many institutions have relatively few faculty equipped to design and deliver high-quality clinical reasoning content and perform assessments. This faculty development challenge must be met head-on in an internationally collaborative fashion, seeking to develop communities of practice aimed at improving diagnosis education.

Progress in assessment of clinical reasoning

Clinical reasoning assessment has been called the “Holy Grail” of assessment in health professions education – pursued, but not yet found [39]. Yet it should be recognized that, if resources were unlimited, a credible and reliable clinical reasoning assessment could be created. Imagine a 2-day assessment in inpatient and outpatient settings in which learners were required to diagnose 50 real patients with a wide variety of acute and chronic diseases, in typical and atypical presentations, that can be definitively diagnosed, with outstanding clinician educators who are expert diagnosticians serving as observers. By the end of such an assessment, there is little doubt that one could determine the diagnostic reasoning ability of each student. The obvious problem with this type of assessment is its lack of feasibility. Diagnostic reasoning’s content specificity (e.g., one student may easily diagnose an acute myocardial infarction but not a gout flare) and context specificity (e.g., one student may easily diagnose myocardial infarction in the emergency department but misdiagnose it in a primary care clinic) mean that large samples of performance must be obtained for credible and reliable assessment. Some of the lack of progress in clinical reasoning and diagnosis education stems from the need to further the science and application of clinical reasoning assessment. This assessment can be divided into assessment for learning (i.e., formative) and assessment of learning (i.e., summative). Both modalities are necessary for programs of assessment and are not mutually exclusive (e.g., a high-stakes objective structured clinical examination [OSCE] for promotion to the clinical phase of medical school where students can review the videos of their performance with a faculty member).

The primary challenges in the assessment of diagnostic reasoning are content and context specificity and limited assessment resources. However, in the 10 years since the NASEM report, there have also been promising advancements and trends in the assessment of clinical reasoning. Foremost, educators have become increasingly aligned on the components of the clinical reasoning process, allowing them to map out which methods can help meet the needs of their learners. Commonly, assessment methods are divided into non-workplace-based (i.e., classroom) and workplace-based (i.e., clinical) modalities, each with their own advantages and disadvantages [40].

Some notable examples of non-workplace-based assessments include multiple choice questions, online simulated case-based exercises, concept maps, essay questions, script concordance testing, oral examinations, objective structured clinical examinations, and technology-enhanced simulation. Due to their relative ease of development and capacity to cover a wide range of topics, multiple choice questions (in which a clinical vignette is followed by up to five potential answer choices) remain the most prevalent form of clinical reasoning assessment in health professions education, despite the limitations of cueing effects and the lack of insight into how learners arrived at their answers. Of the non-workplace-based assessments, OSCEs, in which learners complete multiple “stations” of clinical tasks (e.g., history-taking and physical examination) with standardized patients, are generally seen as the highest-fidelity and most well-rounded clinical reasoning assessment method, but they are time- and resource-intensive. To that end, recent years have seen the growth of technology-enhanced simulation, particularly case-based products with virtual patients, which allow educators to evaluate learners’ clinical reasoning asynchronously in realistic scenarios. Unfortunately, the lower fidelity and the laborious nature of creating, administering, and scoring concept maps, essay questions, script concordance tests, and oral examinations have limited their recent use, though artificial intelligence may be helpful in the design and implementation of those assessment methods at scale.
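
As one example of how these methods are operationalized, the sketch below implements the commonly described aggregate scoring method for script concordance testing, in which credit for a response is proportional to the fraction of expert panelists who chose it, normalized to the modal panel response. The panel data are invented for illustration.

```python
# Minimal sketch of aggregate scoring for one script concordance test item.
# An examinee judges how new information changes a diagnostic hypothesis on
# a -2..+2 scale; credit is proportional to how many expert panelists chose
# the same response, normalized by the modal response. Panel data invented.

from collections import Counter

def sct_item_score(examinee_response: int, panel_responses: list[int]) -> float:
    counts = Counter(panel_responses)
    modal_count = max(counts.values())
    return counts.get(examinee_response, 0) / modal_count

# Example: 10 panelists rate the effect of a new finding on a hypothesis
panel = [1, 1, 1, 1, 1, 2, 2, 0, 0, -1]
print(sct_item_score(1, panel))   # modal answer -> full credit (1.0)
print(sct_item_score(2, panel))   # minority answer -> partial credit (0.4)
print(sct_item_score(-2, panel))  # answer no panelist chose -> 0.0
```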

Workplace-based assessments, in which the learner performs the tasks of interest in an actual clinical environment, are non-standardized, subject to the subjectivity and expertise of the observing supervisor, more resource-intensive, and prone to disruption. However, they are also appropriately aligned with the fundamental concept of context specificity; when a learner is observed performing a task in the environment in which that task usually takes place, the effect of that environment on the learner and the task is evident. Given its authenticity, workplace-based assessment is an essential component of any program of clinical reasoning assessment. Despite their limitations, institutions and programs should invest significant resources in improving these assessments’ credibility and reliability (e.g., developing a core group of educators trained in workplace-based clinical reasoning assessment).

Just as research and education about diagnostic reasoning have moved beyond a singular focus on individual cognition toward understanding and competence as a member of a diagnostic team in context, so must our assessments change to assess the diagnostic performance of teams. While there are existing competencies and attendant assessment tools that focus on interprofessional education and practice [41], these tools are not specific to diagnosis. Further, the dependent variable (outcome) of such team-based assessments is teamwork itself, not diagnostic quality or safety. The two major challenges to creating and executing such assessments are feasibility and the development of meaningful and credible process and outcome measures for team-based diagnostic reasoning quality. For non-workplace-based assessment, getting diverse health professional students, trainees, or staff together at the same time is difficult. For workplace-based assessment, team-based diagnostic reasoning is often episodic and asynchronous, with different team members involved in various episodes, so capturing it in real time is challenging [20]. The development of measures that specifically and reliably capture the quality of both the content and the processes of team-based diagnostic reasoning will require significant theoretical and practical research. In addition, context specificity has an enormous impact on team-based diagnosis, but finding credible measures of its impact is challenging. Hopefully, research can lead to future tools that determine the quality of team-based diagnostic reasoning and then help inform the optimization of diagnostic teams.

What does the next decade hold for diagnosis education?

Just as the decade since the NASEM report has been met with substantial progress and remaining opportunities, the next decade holds great promise – and the risk of unrealized promises. The most significant innovation affecting education and practice in the last few years that will dominate the decade to come is the advent, proliferation, and improvement of generative artificial intelligence tools for clinical reasoning [42]. While there is some overlap of these tools with previous decision support tools that have shown mixed results [43], these tools have capabilities that are remarkably different and more expansive than previous tools. Early simulated studies of diagnosis [34] and management reasoning [44] have shown that these tools demonstrate excellent diagnostic accuracy and management reasoning ability compared to humans. However, it is imperative that we move from studying these tools in constrained simulated settings and into the world, just as our conception of clinical reasoning has moved from “between our ears” to “into the wild.”

Similarly, educational programs must proactively train learners to be diagnosticians for the decades to come and thus to partner with these emerging tools to enable diagnostic excellence. These tools are not a panacea, however, and we must rigorously evaluate them, their attendant curricula, and their outcomes [45]. Further, it is likely even more important that learners and teachers understand the underlying building blocks and competencies for diagnosis in order to be effective users of these tools. That said, there are already promising interventions employing artificial intelligence in clinical reasoning competency assessment, including near-real-time evaluation of learner documentation in the electronic health record [46], [47], [48].
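
To make the documentation-assessment idea concrete, the sketch below shows a toy rubric-based screen of a learner’s note. The rubric elements and keyword cues are hypothetical simplifications; published systems [46], [47], [48] rely on validated assessment tools and trained language models rather than keyword matching.

```python
# Minimal sketch of automated rubric-based screening of a learner's
# assessment-and-plan note. The rubric items and keyword cues are
# hypothetical simplifications; real systems [46-48] use validated tools
# and trained language models, not this toy keyword pass.

RUBRIC = {
    "problem_representation": ["summary statement", "semantic qualifier"],
    "prioritized_differential": ["most likely", "less likely", "differential"],
    "justification": ["because", "supported by", "argues against"],
}

def screen_note(note_text: str) -> dict:
    """Flag which rubric elements appear present (crude keyword proxy)."""
    text = note_text.lower()
    return {element: any(cue in text for cue in cues)
            for element, cues in RUBRIC.items()}

note = ("Summary statement: elderly man with acute dyspnea. "
        "Most likely heart failure, supported by edema; PE less likely "
        "because no pleuritic pain.")
print(screen_note(note))
# A production pipeline might route flagged notes to a trained model
# (hypothetical here) for scoring and near-real-time learner feedback.
```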

Generative artificial intelligence will likely prove to be a significant disruptive innovation in modern medicine, and we must equip ourselves and our learners for this new world. Further, unlike other decision support systems that have been met with limited effectiveness [43], limited uptake [49], or both, artificial intelligence holds promise as an actively participating member of the diagnostic team that does not rely on clinicians seeking its help. While an in-depth discussion of artificial intelligence is beyond the scope of this paper, it is imperative to consider the changes that health professions education programs must make to prepare for the proliferation of these tools in health care. Just as it would be foolish to train new US drivers for careers as taxi drivers, it is foolish for health professions education programs to train clinicians who are not prepared to partner with artificial intelligence tools. Of course, this means that programs must critically evaluate all aspects of their curricula and proactively identify opportunities for innovation. The clinicians of the future will have different requisite skill sets than those of the past – or even today – and thus some degree of deskilling, reskilling, and upskilling will be necessary. While diagnosis in the future will retain the same key building blocks and underlying theoretical principles, the tools to achieve diagnostic excellence will continue to evolve, and our learners must be prepared to use them as they emerge.

Conclusions

In the decade since the NASEM report Improving Diagnosis in Health Care, our understanding of diagnostic reasoning has evolved, allowing the field of diagnosis education to continue to mature and expand. However, the work of translating these concepts into educational programs, curricula, and assessments must continue in order to improve diagnosis in practice; the authors’ priorities are summarized in Table 3. The burden of diagnostic error and the as yet unrealized promise of diagnostic excellence for all provide the motivation to continue this work over the next decade.

Table 3:

Top priorities for diagnosis education.

Teaching
  • Including diagnosis education more consistently across health professions curricula and/or creating diagnosis-dedicated courses
  • Deliberately using competencies to inform diagnosis education
  • Teaching learners how to utilize artificial intelligence to augment, and not negatively “deskill,” their own diagnostic process
  • Creating more opportunities for learners to understand and practice the diagnostic process with other health professions learners
Assessment
  • Increasing the number of formative assessments across different workplace (e.g., oral presentations) and non-workplace (e.g., multiple choice questions) formats
  • Utilizing artificial intelligence to analyze various types of clinical reasoning assessments rapidly and provide targeted feedback
  • Providing feedback to learners about how they communicate their diagnostic process to instructors, peers, and patients
Learning
  • Helping learners use artificial intelligence to independently create individualized clinical reasoning content (e.g., cases) and feedback
  • Fostering a culture of continuous learning and improvement, possibly through educational (and even specifically diagnostic) portfolios

Corresponding author: Andrew P.J. Olson, MD, Department of Medicine, Division of Hospital Medicine, University of Minnesota Medical School, Minneapolis, USA; and Department of Pediatrics, Division of Pediatric Hospital Medicine, University of Minnesota Medical School, Minneapolis, USA, E-mail:

  1. Research ethics: Not applicable.

  2. Informed consent: Not applicable.

  3. Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.

  4. Use of Large Language Models, AI and Machine Learning Tools: Microsoft Copilot was used to help draft the abstract.

  5. Conflict of interest: The authors state no conflict of interest.

  6. Research funding: None declared.

  7. Data availability: Not applicable.

References

1. Balogh, EP, Miller, BT, Ball, JR, editors. Improving diagnosis in health care. Committee on Diagnostic Error in Health Care, Board on Health Care Services, Institute of Medicine, The National Academies of Sciences, Engineering, and Medicine. Washington, DC: National Academies Press (US); 2015. http://www.ncbi.nlm.nih.gov/books/NBK338596/ [Accessed 17 March 2024]. https://doi.org/10.17226/21794.

2. Yang, D, Fineberg, HV, Cosby, K. Diagnostic excellence. JAMA 2021;326:1905–6. https://doi.org/10.1001/jama.2021.19493.

3. Rencic, J, Trowbridge, RL, Fagan, M, Szauter, K, Durning, S. Clinical reasoning education at US medical schools: results from a national survey of internal medicine clerkship directors. J Gen Intern Med 2017;32:1242–6. https://doi.org/10.1007/s11606-017-4159-y.

4. Young, M, Thomas, A, Lubarsky, S, Ballard, T, Gordon, D, Gruppen, LD, et al. Drawing boundaries: the difficulty in defining clinical reasoning. Acad Med 2018;93:990–5. https://doi.org/10.1097/ACM.0000000000002142.

5. Konopasky, A, Artino, AR, Battista, A, Ohmer, M, Hemmer, PA, Torre, D, et al. Understanding context specificity: the effect of contextual factors on clinical reasoning. Diagnosis (Berl) 2020;7:257–64. https://doi.org/10.1515/dx-2020-0016.

6. Levine, EM, Olson, APJ, Ratcliffe, T, McBee, E. Cognitive load in hospital medicine: implications for teachers, learners, and programs. J Hosp Med 2024. Published online 10 November 2024. https://doi.org/10.1002/jhm.13552.

7. Sewell, JL, Maggio, LA, Ten Cate, O, van Gog, T, Young, JQ, O’Sullivan, PS. Cognitive load theory for training health professionals in the workplace: a BEME review of studies among diverse professions: BEME Guide No. 53. Med Teach 2019;41:256–70. https://doi.org/10.1080/0142159X.2018.1505034.

8. Croskerry, P. A universal model of diagnostic reasoning. Acad Med 2009;84:1022–8. https://doi.org/10.1097/ACM.0b013e3181ace703.

9. Daniel, M, Torre, D, Durning, SJ, Wilson, E, Rencic, JJ. Ecological psychology: diagnosing and treating patients in complex environments. Diagnosis (Berl) 2020;7:339–40. https://doi.org/10.1515/dx-2020-0008.

10. Wilson, E, Seifert, C, Durning, SJ, Torre, D, Daniel, M. Distributed cognition: interactions between individuals and artifacts. Diagnosis (Berl) 2020;7:343–4. https://doi.org/10.1515/dx-2020-0012.

11. Norman, GR, Monteiro, SD, Sherbino, J, Ilgen, JS, Schmidt, HG, Mamede, S. The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking. Acad Med 2017;92:23–30. https://doi.org/10.1097/ACM.0000000000001421.

12. Staal, J, Alsma, J, Mamede, S, Olson, APJ, Prins-van Gilst, G, Geerlings, SE, et al. The relationship between time to diagnose and diagnostic accuracy among internal medicine residents: a randomized experiment. BMC Med Educ 2021;21:227. https://doi.org/10.1186/s12909-021-02671-2.

13. Fedyk, M, Draughon Moret, J, Sawyer, NT. Inference to the best action and its basis in clinical expertise. Front Psychol 2023;14:1032453. https://doi.org/10.3389/fpsyg.2023.1032453.

14. Croskerry, P. The feedback sanction. Acad Emerg Med 2000;7:1232–8. https://doi.org/10.1111/j.1553-2712.2000.tb00468.x.

15. Fernandez, BC, Williams, M, Chan, TM, Graber, ML, Lane, KP, Grieser, S, et al. Improving diagnostic performance through feedback: the diagnosis learning cycle. BMJ Qual Saf 2021;30:1002–9. https://doi.org/10.1136/bmjqs-2020-012456.

16. Kotwal, S, Udayappan, KM, Kutheala, N, Washburn, C, Morga, C, Grieb, SM, et al. “I had no idea this happened”: electronic feedback on clinical reasoning for hospitalists. J Gen Intern Med 2024;39:3271–7. https://doi.org/10.1007/s11606-024-09058-1.

17. Lane, KP, Chia, C, Lessing, JN, Limes, J, Mathews, B, Schaefer, J, et al. Improving resident feedback on diagnostic reasoning after handovers: the LOOP project. J Hosp Med 2019;14:622–5. https://doi.org/10.12788/jhm.3262.

18. Rosner, BI, Zwaan, L, Olson, APJ. Imagining the future of diagnostic performance feedback. Diagnosis (Berl) 2023;10:31–7. https://doi.org/10.1515/dx-2022-0055.

19. Zwaan, L, Hautz, WE. Bridging the gap between uncertainty, confidence and diagnostic accuracy: calibration is key. BMJ Qual Saf 2019;28:352–5. https://doi.org/10.1136/bmjqs-2018-009078.

20. Olson, APJ, Durning, SJ, Fernandez Branson, C, Sick, B, Lane, KP, Rencic, JJ. Teamwork in clinical reasoning - cooperative or parallel play? Diagnosis (Berl) 2020;7:307–12. https://doi.org/10.1515/dx-2020-0020.

21. Boyle, JG, Walters, MR, Jamieson, S, Durning, SJ. Reframing context specificity in team diagnosis using the theory of distributed cognition. Diagnosis (Berl) 2023;10:235–41. https://doi.org/10.1515/dx-2022-0100.

22. Choi, JJ, Durning, SJ. Context matters: toward a multilevel perspective on context in clinical reasoning and error. Diagnosis (Berl) 2023;10:89–95. https://doi.org/10.1515/dx-2022-0117.

23. Linzer, M, Sullivan, EE, Olson, APJ, Khazen, M, Mirica, M, Schiff, GD. Improving diagnosis: adding context to cognition. Diagnosis (Berl) 2023;10:4–8. https://doi.org/10.1515/dx-2022-0058.

24. Olson, A, Kämmer, JE, Taher, A, Johnston, R, Yang, Q, Mondoux, S, et al. The inseparability of context and clinical reasoning. J Eval Clin Pract 2024;30:533–8. https://doi.org/10.1111/jep.13969.

25. Penner, JC, Schuwirth, L, Durning, SJ. From noise to music: reframing the role of context in clinical reasoning. J Gen Intern Med 2024;39:851–7. https://doi.org/10.1007/s11606-024-08612-1.

26. Olson, APJ, Singhal, G, Dhaliwal, G. Diagnosis education - an emerging field. Diagnosis (Berl) 2019;6:75–7. https://doi.org/10.1515/dx-2019-0029.

27. Olson, APJ, Graber, ML. Improving diagnosis through education. Acad Med 2020;95:1162–5. https://doi.org/10.1097/ACM.0000000000003172.

28. Hege, I, Adler, M, Donath, D, Durning, SJ, Edelbring, S, Elvén, M, et al. Developing a European longitudinal and interprofessional curriculum for clinical reasoning. Diagnosis (Berl) 2023;10:218–24. https://doi.org/10.1515/dx-2022-0103.

29. Cooper, N, Bartlett, M, Gay, S, Hammond, A, Lillicrap, M, Matthan, J, et al. Consensus statement on the content of clinical reasoning curricula in undergraduate medical education. Med Teach 2021;43:152–9. https://doi.org/10.1080/0142159X.2020.1842343.

30. Olson, A, Rencic, J, Cosby, K, Rusz, D, Papa, F, Croskerry, P, et al. Competencies for improving diagnosis: an interprofessional framework for education and training in health care. Diagnosis (Berl) 2019;6:335–41. https://doi.org/10.1515/dx-2018-0107.

31. Ericsson, KA, Harwell, KW. Deliberate practice and proposed limits on the effects of practice on the acquisition of expert performance: why the original definition matters and recommendations for future research. Front Psychol 2019;10:2396. https://doi.org/10.3389/fpsyg.2019.02396.

32. Vandewaetere, M, Manhaeve, D, Aertgeerts, B, Clarebout, G, Van Merriënboer, JJG, Roex, A. 4C/ID in medical education: how to design an educational program based on whole-task learning: AMEE Guide No. 93. Med Teach 2015;37:4–20. https://doi.org/10.3109/0142159X.2014.928407.

33. Mamede, S, van Gog, T, Sampaio, AM, de Faria, RMD, Maria, JP, Schmidt, HG. How can students’ diagnostic competence benefit most from practice with clinical cases? The effects of structured reflection on future diagnosis of the same and novel diseases. Acad Med 2014;89:121–7. https://doi.org/10.1097/ACM.0000000000000076.

34. Goh, E, Gallo, R, Hom, J, Strong, E, Weng, Y, Kerman, H, et al. Large language model influence on diagnostic reasoning: a randomized clinical trial. JAMA Netw Open 2024;7:e2440969. https://doi.org/10.1001/jamanetworkopen.2024.40969.

35. Graber, ML, Grice, GR, Ling, LJ, Conway, JM, Olson, A. Pharmacy education needs to address diagnostic safety. Am J Pharm Educ 2019;83:7442. https://doi.org/10.5688/ajpe7442.

36. Gleason, K, Harkless, G, Stanley, J, Olson, APJ, Graber, ML. The critical need for nursing education to address the diagnostic process. Nurs Outlook 2021;69:362–9. https://doi.org/10.1016/j.outlook.2020.12.005.

37. Connor, DM, Durning, SJ, Rencic, JJ. Clinical reasoning as a core competency. Acad Med 2020;95:1166–71. https://doi.org/10.1097/ACM.0000000000003027.

38. Young, M, Szulewski, A, Anderson, R, Gomez-Garibello, C, Thoma, B, Monteiro, S. Clinical reasoning in CanMEDS 2025. Can Med Educ J 2023;14:58–62. https://doi.org/10.36834/cmej.75843.

39. Schuwirth, L. Is assessment of clinical reasoning still the Holy Grail? Med Educ 2009;43:298–300. https://doi.org/10.1111/j.1365-2923.2009.03290.x.

40. Daniel, M, Rencic, J, Durning, SJ, Holmboe, E, Santen, SA, Lang, V, et al. Clinical reasoning assessment methods: a scoping review and practical guidance. Acad Med 2019;94:902–12. https://doi.org/10.1097/ACM.0000000000002618.

41. Buring, SM, Bhushan, A, Broeseker, A, Conway, S, Duncan-Hewitt, W, Hansen, L, et al. Interprofessional education: definitions, student competencies, and guidelines for implementation. Am J Pharm Educ 2009;73:59. https://doi.org/10.5688/aj730459.

42. Flaubert, JL, Formentos, A, Forstag, EH, editors. Advancing health care professional education and training in diagnostic excellence: proceedings of a workshop – in brief. Washington, DC: National Academies Press; 2025. https://doi.org/10.17226/29203.

43. Hautz, WE, Marcin, T, Hautz, SC, Schauber, SK, Krummrey, G, Müller, M, et al. Diagnoses supported by a computerised diagnostic decision support system versus conventional diagnoses in emergency patients (DDX-BRO): a multicentre, multiple-period, double-blind, cluster-randomised, crossover superiority trial. Lancet Digit Health 2025;7:e136–44. https://doi.org/10.1016/s2589-7500(24)00250-4.

44. Goh, E, Gallo, RJ, Strong, E, Weng, Y, Kerman, H, Freed, JA, et al. GPT-4 assistance for improvement of physician performance on patient care tasks: a randomized controlled trial. Nat Med 2025;31:1233–8. https://doi.org/10.1038/s41591-024-03456-y.

45. Rodman, A, Zwaan, L, Olson, A, Manrai, AK. When it comes to benchmarks, humans are the only way. NEJM AI 2025;2:AIe2500143. https://doi.org/10.1056/AIe2500143.

46. Schaye, V, DiTullio, DJ, Sartori, DJ, Hauck, K, Haller, M, Reinstein, I, et al. Artificial intelligence based assessment of clinical reasoning documentation: an observational study of the impact of the clinical learning environment on resident documentation quality. BMC Med Educ 2025;25:591. https://doi.org/10.1186/s12909-025-07191-x.

47. Schaye, V, Miller, L, Kudlowitz, D, Chun, J, Burk-Rafel, J, Cocks, P, et al. Development of a clinical reasoning documentation assessment tool for resident and fellow admission notes: a shared mental model for feedback. J Gen Intern Med 2022;37:507–12. https://doi.org/10.1007/s11606-021-06805-6.

48. Schaye, V, DiTullio, D, Guzman, BV, Vennemeyer, S, Shih, H, Reinstein, I, et al. Large language model-based assessment of clinical reasoning documentation in the electronic health record across two institutions: development and validation study. J Med Internet Res 2025;27:e67967. https://doi.org/10.2196/67967.

49. Salwei, ME, Hoonakker, P, Carayon, P, Wiegmann, D, Pulia, M, Patterson, BW. Usability of a human factors-based clinical decision support in the emergency department: lessons learned for design and implementation. Hum Factors 2024;66:647–57. https://doi.org/10.1177/00187208221078625.

Received: 2025-09-08
Accepted: 2025-09-11
Published Online: 2025-10-13

© 2025 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
