Abstract
Objectives
Idiosyncratic approaches to reasoning among teachers and limited reliable workplace-based assessment and feedback methods make teaching diagnostic reasoning challenging. The Assessment of Reasoning Tool (ART) was developed to fill this gap, but its utility and feasibility in providing feedback to residents have not been studied. We evaluated how the ART was used to assess, teach, and guide feedback on diagnostic reasoning for pediatric interns.
Methods
We used an integrated mixed-methods approach to evaluate how the ART facilitates the feedback process between clinical teachers and learners. We collected data from surveys of pediatric interns and interviews with hospital medicine faculty at Baylor College of Medicine from 2019 to 2020. Interns completed the survey each time they received ART-guided feedback from their attending. The preliminary intern survey results informed the faculty interview questions. We integrated descriptive statistics from the survey with a thematic analysis of the transcribed interviews.
Results
We analyzed survey data (52 responses from 38 interns) and transcribed interviews (10 faculty). The ART framework provided a shared mental model that facilitated feedback conversations. ART-guided feedback was highly rated for its structure, content, and clarity in goal-setting, and it created new learning opportunities. Barriers to using the ART included limited time and variability in how faculty used it.
Conclusions
The ART facilitated effective and feasible faculty feedback to interns on their diagnostic reasoning skills.
Acknowledgments
The authors acknowledge the following members of the Society to Improve Diagnosis in Medicine Education Committee for their many contributions to the development of the Assessment of Reasoning Tool: William Follansbee, MD, Ethan Fried, MD, Andrew Olson, MD, Frank Papa, DO, PhD, Brent Smith, MD, and Robert Trowbridge, MD.
Research funding: This research was supported by an institutional Educational Research grant from the Center for Research, Innovation and Scholarship in Medical Education, Department of Pediatrics, Texas Children’s Hospital [grant number R060320-I].
Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.
Competing interests: Authors state no conflict of interest.
Informed consent: Informed consent was obtained from all individuals included in this study.
Ethical approval: The study was approved by the Baylor College of Medicine Institutional Review Board.
Supplementary Material
The online version of this article offers supplementary material (https://doi.org/10.1515/dx-2022-0020).
© 2022 Walter de Gruyter GmbH, Berlin/Boston