
Challenges in mitigating context specificity in clinical reasoning: a report and reflection

  • Abigail Konopasky, Steven J. Durning, Alexis Battista, Anthony R. Artino, Divya Ramani, Zachary A. Haynes, Catherine Woodard and Dario Torre
Published/Copyright: July 11, 2020
From the journal Diagnosis, Volume 7, Issue 3

Abstract

Objectives

Diagnostic error is a growing concern in U.S. healthcare. There is mounting evidence that errors may not always be due to knowledge gaps, but also to context specificity: a physician seeing two identical patient presentations from a content perspective (e.g., history, labs) yet arriving at two distinct diagnoses. This study used the lens of situated cognition theory – which views clinical reasoning as interconnected with surrounding contextual factors – to design and test an instructional module to mitigate the negative effects of context specificity. We hypothesized that experimental participants would perform better on the outcome measure than those in the control group.

Methods

This study divided 39 resident and attending physicians into an experimental group, which received an interactive computer-based training and completed a “think-aloud” exercise, and a control group, and compared their clinical reasoning. Clinical reasoning performance in a simulated unstable angina case with contextual factors (i.e., diagnostic suggestion) was determined using performance on a post-encounter form (PEF) as the outcome measure. Participants who received the training and did the reflection were compared to those who did not using descriptive statistics and a multivariate analysis of covariance (MANCOVA).

Results

Descriptive statistics suggested slightly better performance for the experimental group, but MANCOVA results revealed no statistically significant differences (Pillai’s Trace=0.20, F=1.9, df=[4, 29], p=0.15).

Conclusions

While differences were not statistically significant, this study suggests the potential utility of strategies that provide education and awareness of contextual factors and space for reflective practice.

Introduction

Clinical reasoning is fundamental to every physician in practice [1]. Clinical reasoning involves a number of activities such as information gathering, formulating a differential diagnosis, providing diagnostic justification and making diagnostic and therapeutic plans [2], [3], [4]. Mistakes in clinical reasoning undoubtedly contribute significantly to diagnostic errors, which are hypothesized to account for approximately 15% of the errors in primary care [5]. There is mounting evidence that these errors cannot solely be attributed to gaps in physician knowledge or training [6], [7]. Indeed, recent research has identified the phenomenon of context specificity, defined as a physician seeing two identical patient presentations from a content perspective (e.g., identical histories, physical exams, labs and the same diagnosis) yet arriving at two different diagnostic decisions [6], [8]. In other words, in these situations, something other than case content is driving the physician’s decisions leading to unwanted variation in physician performance.

The primary theoretical framework for this article is situated cognition. From this theoretical perspective clinical reasoning is situated (or located) within the specifics of the encounter (e.g., the patient, the physician and the environment) and clinical reasoning processes and outcomes dynamically emerge from and are shaped by these specifics of the situation [9]. In situated cognition there are a host of interactions between individuals and their physical, social, and cultural systems. From this perspective, context specificity is associated with contextual factors, elements arising from patients, physicians, clinical environment and the interactions among all three [6]. When these contextual factors are distracting (e.g., a patient suggesting an incorrect diagnosis, an electronic health record not functioning optimally, compressed time for completing an encounter), this can negatively affect clinical reasoning performance and ultimately lead to diagnostic error [6], [8].

Our recent work investigating context specificity within a situated cognition framework has shown that contextual factors, particularly diagnostic suggestion, can have a negative effect on clinical reasoning performance and can significantly shift the process of clinical reasoning [10], [11], changing, for instance, what participants focus on in think-aloud reflections. That work, however, did not explore, within a situated cognition theoretical framework, strategies that could potentially mitigate the negative impacts of contextual factors on the clinical reasoning process [6], [8]. In this study, we draw from this work to test the mitigating effects of several strategies.

One proposed strategy for reducing diagnostic error is the use of reflective practice [12], [13], [14]. In reflective practice, there are a number of behaviors and reasoning processes that occur in response to complex clinical problems. These include (1) a search for alternative explanations of the problem, (2) exploration of the consequences of such alternative explanations that leads to predictions to be tested by the acquisition of new data, (3) a testing of these predictions against data and a reframing of the problem, and (4) a critical review of one’s own assumptions and conclusions about the problem (meta-reasoning) [12]. Part of reflective practice for the reduction of error can also be related to metacognition, stepping back from the immediate situation to reflect on the thinking process [15]. However, these educational strategies have not yet been employed in medical education settings to determine if they can mitigate the negative effects of context specificity in an authentic (e.g., simulated) environment. Such strategies may enhance physicians’ ability to recognize and possibly reduce the adverse effects of distracting contextual factors on clinical reasoning performance and reduce diagnostic errors. The development and exploration of effective educational strategies that address specific aspects of the clinical context are important for a future implementation of new instructional tools in the training of medical students, residents and physicians.

An optimal way to test these strategies is through a simulated scenario, which creates a complex, highly interactive context. Simulated environments can help us disentangle the potential effects of distracting contextual factors on clinical reasoning and error and assess the outcome of potential strategies designed to ameliorate context specificity [16], [17]. Scenario-based simulations provide an environment that is similar to an authentic clinical setting, allowing for different degrees of contextual and cognitive complexity in the clinical encounter while controlling for specific leading and differential diagnoses [17], [18], [19]. Further, there is evidence that scenario-based simulations provide effective environments for assessing clinicians’ performance in activities associated with clinical reasoning [18], [20].

According to situated cognition, thinking and reasoning are conceived as occurring within the specifics of the situation. Through this lens, the development of an environment of increasing complexity, where the various interactions between patient, physician and context can be captured, is essential to better understand the intricacies of context specificity in clinical reasoning and the potential effect of mitigating strategies. The purpose of this study was to assess clinical reasoning performance during a simulated encounter, comparing physicians who participated in a computer-based tutorial on contextual factors and their potential link to diagnostic errors and a think-aloud exercise, with those who did not. Our hypothesis was that the use of these dual strategies – tutorial and think-aloud – would enhance clinical reasoning performance measured by a post-encounter form (PEF), leading to better performance by those participants when compared to the control group.

Materials and methods

This comparative experimental study examined whether strategies (consisting of an interactive computer training and a think-aloud reflection) designed to support physicians’ clinical reasoning in the presence of distracting contextual factors improved clinical reasoning performance, as measured by a PEF, an open-ended series of diagnostic and management questions for which we have previously gathered validity evidence [6], [21], [22].

Population

This convenience sample comprised practicing military physicians in internal medicine, family medicine, and surgery from the Uniformed Services University of the Health Sciences, Walter Reed National Military Medical Center, and the University of Texas Health Science Center at San Antonio, who were assigned to either the experimental (n=20) or control (n=19) condition based on scheduling availability. Institutional Review Boards at all three sites approved this research (complying with the World Medical Association Declaration of Helsinki).

Study design

The study included two groups of participants. The control group began with the simulated encounter, followed by the PEF (the outcome measure for this study) which closely mirrors typical practice [21]. The experimental group participated in an interactive computer-based training module, completed a simulated encounter, did the think-aloud reflection, and then completed the PEF (Control participants also completed the think-aloud exercise, but only after completion of the outcome measure; see Figure 1).

Figure 1: Study design.

Data collection

Training module

Participants in the experimental condition completed a computer-based clinical reasoning and diagnostic error training module covering: the nature of clinical reasoning and diagnostic error, the role of context specificity and contextual factors in diagnostic error, and the presentation of a reflection strategy (thinking aloud) for countering the potentially harmful effects of distracting contextual factors. Each of the three sections was accompanied by open-ended written reflection questions connecting the topics to participants’ own practice and experience. This module took approximately 20 min. See Table 1 for training module details.

Table 1:

Computer-based training sequence.

Order | Length^a | Content | Reflection task(s)
1 | 56 s | Overview of training and interactive expectations | How do you define clinical reasoning? How do you define diagnostic error?
2 | 1 m, 21 s | Discussion of complex tasks involved in clinical reasoning (e.g., assessment and planning, patient communication). Definition of diagnostic error (from 2015 National Academies of Science report). | Describe an instance when you or a peer made a diagnostic error. Did any of the components of clinical reasoning play a part? If so, how?
3 | 3 m, 34 s | Overview of factors that can impair clinical reasoning (e.g., premature closure) with examples. Introduction of context specificity and contextual factors and how they may increase cognitive load, leading to error. Overview of situated approach to context specificity, associating factors with the patient, physician, and/or environment. | See #4 and #5 below
4 | 1 m, 13 s | Segment of videotaped patient encounter with contextual factors (non-native speaking patient, questioning physician credentials) | What contextual factors did you notice in this video? How do you think they could lead to error?
5 | 2 m, 7 s | Segment of videotaped patient encounter with contextual factor (non-linear presentation of symptoms) | What contextual factors did you notice in this video? How do you think they could lead to error? Reflecting on your own practice, describe an instance in which you encountered one or more of these contextual factors.
6 | 55 s | Introduction to strategy of thinking aloud as non-judgmental, non-analytical reflection on what one is/was thinking | Think-aloud practice: Name five animals that live in the zoo. What is the sixth letter after B? What is the fifth letter before M?

a This is the length of the video-recorded segment only; reflection questions took an additional several minutes for each section. Lengths are given in minutes (m) and seconds (s).

Simulated encounter

Participants in both groups engaged in the identical standardized patient encounter. This encounter was designed in three phases, using a participatory design procedure involving clinical and simulation stakeholders (see Battista et al. for a description of the design process and all supporting materials [23]). A male standardized patient (SP) was trained to portray someone with unstable angina and a distracting contextual factor (patient conveys diagnostic suggestion of GERD). Participants were given details about the patient’s initial complaint (chest pain) and then entered a room designed much like the outpatient clinic rooms where they practice, where they introduced themselves to the SP and conducted a history and physical. Encounters were stopped if they reached 18 min, but no time penalties were given. Encounters ranged in total time from 11 to 18 min.

Think aloud

Immediately after the encounter, experimental participants were reminded of the think-aloud practice they had done at the end of the computer-based training and were given additional practice opportunities if needed. When ready, they were asked to watch the videotape of their encounter while thinking aloud, without judgment or analysis, about what they were considering as they came to the diagnosis and treatment. Previous research has argued that think-aloud exercises provide a useful window into cognition and experience [24], [25], [26]. Additionally, recent work on how participants reconsider their thought processes during think-aloud exercises suggests that they may also be a valuable tool for reflection in the presence of contextual factors [10].

Outcome instrument (PEF)

Clinical reasoning performance was captured through the PEF, a measure developed and used in prior work for which we have gathered considerable validity evidence [6], [21], [22]. The form asks participants for: (1) additional information they would like to obtain by history, (2) additional physical exam actions they would take, (3) a problem list, (4) a differential diagnosis, (5) a leading diagnosis, (6) supporting evidence for the diagnosis and (7) management plans. For this analysis we focused on steps (4–7) because we were interested in exploring the cognitive processes of specific elements of clinical reasoning such as differential diagnosis, supportive evidence, and diagnostic and therapeutic plans. Participants were given up to 30 min (determined to be ample time in prior studies [6], [23]) to complete the PEF. We used a scoring key developed by a panel of board-certified internists in prior research [6] (with kappas between 0.82 and 0.93 in measure development) that assigns each free-text response a point value (participants gave multiple responses for most items) for correct (2 points), partially correct (1 point) or incorrect (0 points). All responses were scored by at least two raters, coming to complete consensus on disagreements and updating the scoring sheet with the decision. In order to compare across participants, who gave varying numbers of responses for each item, we created percentage scores for each item, dividing the total number of points by the total number of possible points (e.g., someone who offered five pieces of evidence for the leading diagnosis had a total possible score of 10 for that item).
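The percentage-score computation described above is simple arithmetic and can be sketched as follows. This is a minimal illustration only; the function name and input format are ours, not the authors' actual scoring tooling:

```python
def pef_item_percentage(response_points):
    """Percentage score for one PEF item.

    Each free-text response is rated 2 (correct), 1 (partially correct),
    or 0 (incorrect); the item score is the points earned divided by the
    maximum possible, i.e., 2 points per response the participant gave.
    """
    if not response_points:
        raise ValueError("item has no scored responses")
    max_possible = 2 * len(response_points)  # e.g., 5 responses -> 10 possible points
    return sum(response_points) / max_possible


# A participant offering five pieces of supporting evidence: three rated
# correct, one partially correct, and one incorrect.
print(pef_item_percentage([2, 2, 2, 1, 0]))  # 7/10 = 0.7
```

Because the denominator scales with the number of responses a participant chose to give, scores computed this way are comparable across participants who listed different numbers of items.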

Analysis

First, descriptive statistics were calculated separately for each group (experimental and control). Then, to determine if there were differences between groups in clinical reasoning performance across the four variables, while also controlling for increased risk of Type I error that occurs with multiple univariate tests, we conducted a multivariate analysis of covariance (MANCOVA), with percentage scores on the four PEF items as dependent variables and group (experimental versus control) as the independent variable. Age and number of years since graduation from medical school were used as covariates to control for years of experience. A power analysis indicated that a total sample size of 51 was needed to detect a medium-sized effect at a significance level of 0.05, so the study was underpowered with n=39. To mitigate the potential effect of specialty (due to our convenience sample, the experimental condition contained no family medicine or surgery physicians), we also ran an analysis of only the internal medicine physicians.
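For readers unfamiliar with the multivariate statistic reported below, Pillai's trace for a simple two-group comparison can be computed from the between-group (hypothesis) and within-group (error) sum-of-squares-and-cross-products (SSCP) matrices. The sketch below is ours and deliberately omits the covariates (age and years since graduation) that the actual MANCOVA included:

```python
import numpy as np


def pillais_trace(group_a, group_b):
    """Pillai's trace for a two-group, multi-outcome comparison.

    group_a, group_b: (n_i, p) arrays of outcome scores (here p=4 PEF items).
    Builds the between-group SSCP matrix H and the within-group SSCP matrix E,
    then returns trace(H @ inv(H + E)), where H + E is the total SSCP.
    """
    grand_mean = np.vstack([group_a, group_b]).mean(axis=0)
    p = group_a.shape[1]
    H = np.zeros((p, p))
    E = np.zeros((p, p))
    for g in (group_a, group_b):
        m = g.mean(axis=0)
        d = (m - grand_mean)[:, None]       # group-mean deviation, column vector
        H += len(g) * (d @ d.T)             # between-group SSCP contribution
        centered = g - m
        E += centered.T @ centered          # within-group SSCP contribution
    return np.trace(H @ np.linalg.inv(H + E))
```

With only two groups, the hypothesis SSCP matrix has rank one, so Pillai's trace lies between 0 and 1. The study's full model, with age and experience as covariates, could be fit with a package such as statsmodels (its MANOVA class accepts a formula with multiple dependent variables).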

Results

Participants in the control and experimental groups were equally distributed in terms of gender (10 females out of 19 for control; 10 out of 20 for experimental), but in the control group the range of ages was wider and they had more years of experience (i.e., control participants were older and had been practicing longer; see Table 2 for demographic details).

Table 2:

Demographic variables arranged by study condition.

Variable | Control: mean (SD) | Control: range | Experimental: mean (SD) | Experimental: range
Age in years | 37 (10) | 26–61 | 31 (4) | 25–39
Years of experience | 7 (11) | 0–35 | 2 (3) | 0–9

Control: 12 internal medicine, 2 family medicine, 5 surgery. Experimental: 20 internal medicine, 0 family medicine or surgery.

Means for the four outcome variables ranged from 65 to 80% for control group participants and 70 to 87% for experimental group participants, with higher mean scores on all four variables for the experimental group. See Table 3 for means and ranges of outcome variables by group.

Table 3:

Descriptive statistics for dependent variables by condition.

Variable | Control (n=19): mean (SD) | Percentage range | Raw score range | Intervention (n=20): mean (SD) | Percentage range | Raw score range
Differential diagnosis | 64.6% (12.8) | 33–83% | 3–8 | 69.7% (11.6) | 38–83% | 3–9
Leading diagnosis | 80.3% (17.8) | 50–100% | 1–2^a | 85% (15) | 50–100% | 1–2^a
Supporting evidence | 78.3% (14.4) | 44–100% | 4–14 | 86.7% (6.6) | 75–100% | 5–17
Management plans | 79.9% (14.5) | 50–100% | 2–24 | 81.5% (13.6) | 56–100% | 2–12

a Leading diagnosis had a maximum of two points.

MANCOVA results revealed no significant differences between experimental and control groups (Pillai’s Trace=0.20, F=1.9, df=[4, 29], p=0.15, ηp²=0.20; see Table 4 for details). Box’s M test did not indicate a significant violation of the assumption of homogeneity of covariance matrices (F=1.8, p=0.06). Levene’s test, however, indicated equal variances only for supporting evidence; the other three variables violated the assumption.

Table 4:

MANCOVA results.

Effect | df | Pillai | F-value | p-value
Condition | 4, 29 | 0.20 | 1.86 | 0.15
Age | 4, 29 | 0.21 | 1.92 | 0.13
Years of experience | 4, 29 | 0.17 | 1.50 | 0.23

When the analysis was run without the family medicine and surgery physicians, means remained higher for the experimental group and MANCOVA results again revealed no statistically significant differences (Pillai’s Trace=0.31, F=2.5, df=[4, 22], p=0.07, ηp²=0.17).

Discussion

We developed strategies to attempt to mitigate the negative effects of context specificity on clinical reasoning performance: a computer-based training module to raise awareness of these effects and a think-aloud exercise following participation in a simulated clinical encounter. We hypothesized that this computer-based training and think-aloud exercise would lessen the negative impact of distracting contextual factors on the diagnostic and therapeutic accuracy of participants in the experimental group compared to the control group. However, we found no statistically significant difference between the two groups. Nonetheless, we believe this work is an important first step in exploring how we might mitigate the effects of context specificity on clinical reasoning performance and patient care, and it should represent the first in a series of studies [27], [28]. We discuss below the importance of, and challenges with, continuing research on context specificity using a situated cognition framework.

Situated cognition anticipates the potentially negative effects of distracting contextual factors, since an individual’s clinical reasoning is interconnected with the social and environmental elements of a patient encounter [9], [29]. Yet the environment created by distracting contextual factors also creates opportunities; Ng and colleagues refer to “indeterminate zones of practice – uncertain, unstable, unique or value-conflicted practice situations” that provide opportunities for the development of clinical practice (p. 463) [30]. Contextual factors may move clinicians into these zones. Explicit education around, and reflective practice upon, distracting contextual factors in these zones may help clinicians develop better situation awareness to trigger reflection in future encounters with contextual factors [31]. Explicit discussion of biases and “diagnostic timeouts” to reflect on those biases have been suggested more broadly as error-reducing techniques [32], and reflection strategies (e.g., using a think-aloud) could help build on those insights.

Additionally, strategies like the one employed in this study could offer a somewhat different approach to reflective practice from that of other scholars [12], [13], [14], [28], asking participants to “think aloud” without explanation, judgment, or structure [24], [26]. While this approach may sacrifice some of the benefits of specific reasoning instructions [28], it may offer a more comfortable space for reflective practice than other, more directed approaches. Boud and Walker point to the importance of the “micro-contexts” instructors create for reflection (defined slightly differently by these authors, but still relevant to our context), particularly in terms of instructors’ potential power over the learner [33]. By removing the instructor from the room altogether and intervening minimally (beyond cuing learners to continue when they fall silent), think-aloud exercises offer an opportunity to reflect across a full encounter that (in concert with other reflective practices) “permit[s] the making of meaning” (p. 10) [33]. Even if this act of meaning-making does not change the decision made in a particular case, it may serve to “promote adaptive expertise and practical wisdom” (p. 1,048) as physicians develop over the course of a career [34].

There were several limitations to this study. First, the sample size did not give us enough power to adequately detect group differences. The experimental group was younger (in fact, the age difference between groups was statistically significant) and less experienced than the control group, which may have dampened the impact of the strategies, as experience would be expected to improve performance [35]. Future studies should include a greater number of participants with similar, or at least equivalent, baseline characteristics. Second, our approach comprised two parts, the computer-based training and the think-aloud exercise, separated by the encounter; the sequence of these two strategies may have shaped the overall effect while hampering our ability to disentangle their individual effects. Moreover, the experimental group had additional time to review the case (i.e., during the think-aloud), which could also explain some of the trends. Future studies with access to more participants might explore these additional conditions. Finally, while our PEF measure has a considerable amount of validity evidence [6], [21], [22], [23], it is one tool and as such provides a limited view of clinical reasoning performance.

In conclusion, this study highlights the difficulty of mitigating and measuring context specificity. Yet, guided by situated cognition, the strategies we employed should be explored in future research, as they require that physicians develop increased situation awareness, moving beyond one’s own actions to the broader range of interactions and systemic influences that comprise a clinical encounter. Future work could explore which cues trigger this kind of awareness in physicians and how sensitivity to those cues shifts over time, helping us more fully understand not only context specificity but also the adaptive expertise that characterizes truly excellent physicians.


Corresponding author: Abigail Konopasky, PhD, Assistant Professor of Medicine, Uniformed Services University of the Health Sciences and The Henry M Jackson Foundation for the Advancement of Military Medicine, Bethesda, MD, 20814, USA, Phone: +301 295 2904, E-mail:

Award Identifier / Grant number: JPC-1, #NH83382416

Funding source: Joint Pathology Center

  1. Research funding: This study was supported by a grant from the JPC - 1, CDMRP - Congressionally Directed Medical Research Program (#NH83382416).

  2. Author contributions: All authors collaborated together on the research design and data collection. Torre, Haynes, Woodard, and Durning did the coding, coming to consensus. Konopasky ran the analyses and wrote up the results section. Konopasky and the remaining authors co-wrote the remainder of the paper. All authors offered substantive revisions and approve of this final version of the paper. All authors have accepted responsibility for the entire content of this manuscript and approved its submission.

  3. Competing interests: Authors state no conflict of interest.

  4. Informed consent: Informed consent was obtained from all individuals included in this study.

  5. Disclaimers: The views expressed in this paper are those of the authors and do not necessarily reflect the official position or policy of the US Government, Department of Defense, Department of the Navy, or the Uniformed Services University of the Health Sciences.

References

1. Norman, G. Research in clinical reasoning: past history and current trends. Med Educ 2005;39:418–27. https://doi.org/10.1111/j.1365-2929.2005.02127.x.

2. Elstein, AS, Shulman, LS, Sprafka, SA. Medical problem solving: an analysis of clinical reasoning. Cambridge, MA: Harvard University Press; 1978. https://doi.org/10.4159/harvard.9780674189089.

3. Croskerry, P. A universal model of diagnostic reasoning. Acad Med 2009;84:1022–8. https://doi.org/10.1097/ACM.0b013e3181ace703.

4. Young, M, Thomas, A, Lubarsky, S, Ballard, T, Gordon, D, Gruppen, LD, et al. Drawing boundaries: the difficulty in defining clinical reasoning. Acad Med 2018;93:990–5. https://doi.org/10.1097/ACM.0000000000002142.

5. Graber, ML. The incidence of diagnostic error in medicine. BMJ Qual Saf 2013;22:ii21–7. https://doi.org/10.1136/bmjqs-2012-001615.

6. Durning, SJ, Artino, AR, Boulet, JR, Dorrance, K, van der Vleuten, C, Schuwirth, L. The impact of selected contextual factors on experts’ clinical reasoning performance (does context impact clinical reasoning performance in experts?). Adv Health Sci Educ 2012;17:65–79. https://doi.org/10.1007/s10459-011-9294-3.

7. Eva, KW. What every teacher needs to know about clinical reasoning. Med Educ 2005;39:98–106. https://doi.org/10.1111/j.1365-2929.2004.01972.x.

8. Ratcliffe, TA, McBee, E, Schuwirth, L, Picho, K, van der Vleuten, CPM, Artino, AR. Exploring implications of context specificity and cognitive load in residents. MedEdPublish 2017;6. https://doi.org/10.15694/mep.2017.000048.

9. Durning, SJ, Artino, AR. Situativity theory: a perspective on how participants and the environment can interact: AMEE Guide no. 52. Med Teach 2011;33:188–99. https://doi.org/10.3109/0142159X.2011.550965.

10. Konopasky, A, Durning, SJ, Artino, AR, Ramani, D, Battista, A. The linguistic effects of context specificity: exploring affect, cognitive processing, and agency in physicians’ think-aloud reflections. Diagnosis (Berl) 2020;7:273–80. https://doi.org/10.1515/dx-2019-0103.

11. Konopasky, AW, Artino, AR, Battista, A, Ohmer, M, Hemmer, PA, Torre, D, et al. Understanding context specificity: the effect of contextual factors on clinical reasoning. Diagnosis (Berl) 2020;7:257–64. https://doi.org/10.1515/dx-2020-0016.

12. Mamede, S, Schmidt, HG, Penaforte, JC. Effects of reflective practice on the accuracy of medical diagnoses. Med Educ 2008;42:468–75. https://doi.org/10.1111/j.1365-2923.2008.03030.x.

13. Mamede, S, Schmidt, HG. The structure of reflective practice in medicine. Med Educ 2004;38:1302–8. https://doi.org/10.1111/j.1365-2929.2004.01917.x.

14. Mamede, S, Schmidt, HG, Rikers, R. Diagnostic errors and reflective practice in medicine. J Eval Clin Pract 2007;13:138–45. https://doi.org/10.1111/j.1365-2753.2006.00638.x.

15. Croskerry, P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med 2003;78:775–80. https://doi.org/10.1097/00001888-200308000-00003.

16. Kneebone, RL, Scott, W, Darzi, A, Horrocks, M. Simulation and clinical practice: strengthening the relationship. Med Educ 2004;38:1095–102. https://doi.org/10.1111/j.1365-2929.2004.01959.x.

17. Battista, A. An activity theory perspective of how scenario-based simulations support learning: a descriptive analysis. Adv Simul 2017;2:23. https://doi.org/10.1186/s41077-017-0055-0.

18. Kneebone, RL, Kidd, J, Nestel, D, Barnet, A, Lo, B, King, R, et al. Blurring the boundaries: scenario-based simulation in a clinical setting. Med Educ 2005;39:580–7. https://doi.org/10.1111/j.1365-2929.2005.02110.x.

19. Dieckmann, P, Gaba, D, Rall, M. Deepening the theoretical foundations of patient simulation as social practice. Simul Healthc 2007;2:183–93. https://doi.org/10.1097/SIH.0b013e3180f637f5.

20. Daniel, M, Rencic, J, Durning, SJ, Holmboe, E, Santen, SA, Lang, V, et al. Clinical reasoning assessment methods: a scoping review and practical guidance. Acad Med 2019;94:902–12. https://doi.org/10.1097/ACM.0000000000002618.

21. Durning, SJ, Artino, A, Boulet, J, La Rochelle, J, Van der Vleuten, C, Arze, B, et al. The feasibility, reliability, and validity of a post-encounter form for evaluating clinical reasoning. Med Teach 2012;34:30–7. https://doi.org/10.3109/0142159X.2011.590557.

22. McBee, E, Ratcliffe, T, Picho, K, Schuwirth, L, Artino, AR, Yepes-Rios, AM, et al. Contextual factors and clinical reasoning: differences in diagnostic and therapeutic reasoning in board certified versus resident physicians. BMC Med Educ 2017;17:211. https://doi.org/10.1186/s12909-017-1041-x.

23. Battista, A, Konopasky, A, Ramani, D, Ohmer, M, Mikita, J, Howle, A, et al. Clinical reasoning in the primary care setting: two scenario-based simulations for residents and attendings. MedEdPORTAL 2018;14:10773. https://doi.org/10.15766/mep_2374-8265.10773.

24. Durning, SJ, Artino, AR, Beckman, TJ, Graner, J, van der Vleuten, C, Holmboe, E, et al. Does the think-aloud protocol reflect thinking? Exploring functional neuroimaging differences with thinking (answering multiple choice questions) versus thinking aloud. Med Teach 2013;35:720–6. https://doi.org/10.3109/0142159X.2013.801938.

25. Burbach, B, Barnason, S, Thompson, SA. Using “think aloud” to capture clinical reasoning during patient simulation. Int J Nurs Educ Scholarsh 2015;12:1–7. https://doi.org/10.1515/ijnes-2014-0044.

26. Ericsson, KA, Simon, HA. How to study thinking in everyday life: contrasting think-aloud protocols with descriptions and explanations of thinking. Mind Cult Act 1998;5:178–86. https://doi.org/10.1207/s15327884mca0503_3.

27. Mann, K, Gordon, J, MacLeod, A. Reflection and reflective practice in health professions education: a systematic review. Adv Health Sci Educ 2009;14:595–621. https://doi.org/10.1007/s10459-007-9090-2.

28. Mamede, S, Schmidt, HG. Reflection in medical diagnosis: a literature review. Health Prof Educ 2017;3:15–25. https://doi.org/10.1016/j.hpe.2017.01.003.

29. Konopasky, AW, Ramani, D, Ohmer, M, Battista, A, Artino, AR, McBee, E, et al. It totally possibly could be: how a group of military physicians reflect on their clinical reasoning in the presence of contextual factors. Mil Med 2020;185:575–82. https://doi.org/10.1093/milmed/usz250.

30. Ng, SL, Kinsella, EA, Friesen, F, Hodges, B. Reclaiming a theoretical orientation to reflection in medical education research: a critical narrative review. Med Educ 2015;49:461–75. https://doi.org/10.1111/medu.12680.

31. Moulton, CA, Regehr, G, Mylopoulos, M, MacRae, HM. Slowing down when you should: a new model of expert judgment. Acad Med 2007;82:S109–16. https://doi.org/10.1097/ACM.0b013e3181405a76.

32. Trowbridge, RL. Twelve tips for teaching avoidance of diagnostic errors. Med Teach 2008;30:496–500. https://doi.org/10.1080/01421590801965137.

33. Boud, D, Walker, D. Promoting reflection in professional courses: the challenge of context. Stud High Educ 1998;23:191–206. https://doi.org/10.1080/03075079812331380384.

34. Epstein, RM. Reflection, perception and the acquisition of wisdom. Med Educ 2008;42:1048–50. https://doi.org/10.1111/j.1365-2923.2008.03181.x.

35. Ericsson, KA. The influence of experience and deliberate practice on the development of superior expert performance. In: Ericsson KA, Charnes N, Feltovich PJ, Hoffman RR, editors. The Cambridge handbook of expertise and expert performance. New York: Cambridge University Press; 2006:685–705.10.1017/CBO9780511816796.038Suche in Google Scholar

Received: 2020-01-27
Accepted: 2020-05-04
Published Online: 2020-07-11
Published in Print: 2020-08-27

© 2020 Walter de Gruyter GmbH, Berlin/Boston
