
Assessing physical examination skills using direct observation and volunteer patients

  • Bennett W. Clark, Yi Zhen Joan Lee, Timothy Niessen, Sanjay V. Desai and Brian T. Garibaldi
Published/Copyright: March 13, 2020

Abstract

Background

Feedback based on direct observation of the physical examination (PE) is associated with enhanced educational outcomes, yet attending physicians do not frequently observe graduate trainees performing the PE.

Methods

We recruited volunteer patients (VPs), each with an abnormality of the cardiovascular, respiratory, or neurological system. Interns examined each VP, then presented a differential diagnosis and management plan to two clinician educators who had independently examined the VPs. The clinician educators assessed interns across five domains and provided post-examination feedback and teaching. We collected data on intern performance, faculty inter-rater reliability, correlation with a simulation-based measure of clinical skill, and resident and VP perceptions of the assessment.

Results

A total of 72 PGY-1 interns from a large academic training program participated. Performance on the cardiovascular and respiratory examinations was superior to performance on the neurologic examination. There was no correlation between results of an online test and directly observed cardiovascular skill. Interns preferred feedback from the direct observation sessions. VPs and faculty also rated the experience highly. Inter-rater reliability was good for the respiratory exam but poor for the cardiovascular and neurologic exams.

Conclusions

Direct observation of trainees provides evidence about PE skill that cannot be obtained via simulation. Clinician educators’ ability to provide reliable PE assessment may depend on the portion of the PE being assessed. Our experience highlights the need for ongoing training of clinician educators in direct observation, standard setting, and assessment protocols. This assessment can inform summative or formative assessments of physical exam skill in graduate medical education.


Corresponding author: Bennett W. Clark, MD, Department of Internal Medicine, University of Minnesota School of Medicine, 420 Delaware St. SE, Minneapolis, MN 55455, USA; and Livio Health Group, 401, Harding St. NE, Minneapolis, MN 55413, USA

Acknowledgments

The authors would like to thank Dr. Andrew Elder from the Royal College of Physicians of Edinburgh and the University of Edinburgh for providing guidance related to establishing a summative assessment of clinical skills for graduate trainees in the US context.

  1. Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.

  2. Research funding: This work was funded by the Johns Hopkins Institute for Excellence in Education (IEE) Berkheimer Faculty Scholars Award, the New York Academy of Medicine Jeremiah A. Barondess Fellowship in the Clinical Transaction (in collaboration with the ACGME), Funder Id: http://dx.doi.org/10.13039/100007261, and the American Board of Medical Specialties Visiting Scholar Program (with support from the Gordon and Betty Moore Foundation).

  3. Employment or leadership: None declared.

  4. Honorarium: None declared.

  5. Competing interests: The funding organization(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.


Received: 2019-11-19
Accepted: 2020-02-15
Published Online: 2020-03-13
Published in Print: 2021-02-23

©2020 Walter de Gruyter GmbH, Berlin/Boston
