Article Open Access

Technology use and satisfaction among colleges/schools of osteopathic medical education

  • Machelle Linsenmeyer and Lance Ridpath
Published/Copyright: September 15, 2025

Abstract

Context

In March 2020, as the COVID-19 pandemic became a national concern, rapid assessment and change were needed in all areas of osteopathic medical education. The Technology in Medical Education (TIME) adaptive working group was formed by the American Association of Colleges of Osteopathic Medicine (AACOM) to analyze best practices in technology applications used in the schools and colleges of osteopathic medicine (COMs). In January 2023, post-pandemic data were collected to provide comparisons of technology use over time and determine the current landscape of technology.

Objectives

To determine the technological uses and satisfaction at COMs nationally and offer a comprehensive list of software being used in osteopathic medical education to help inform technology decisions.

Methods

Survey instruments were used to gather data from 34 COMs (main campuses, branch campuses, and additional locations) during COVID-19 and 49 COMs post-COVID-19. Data were collected during COVID-19 from April 2020 to December 2020, with data analysis through April 2021. Data were collected post-COVID-19 from January 2023 to April 2023, with data analysis through March 2024. Five questions were consistent across both surveys and used for comparisons. Descriptive statistics, Fisher’s exact tests, and thematic analysis were utilized in the data analysis for both surveys.

Results

Of the 57 COMs surveyed during COVID-19, 34 responses were received for an overall response rate of 59.6 % (34/57). Of the 62 COMs surveyed post-COVID-19, 49 responses were received for a response rate of 79.0 % (49/62). To capture changes across time, 34 institutions responded to the five identical questions on both surveys. While software selection was diverse across institutions, overall satisfaction remained high, with 75.0 % of COMs being extremely or moderately satisfied with their software selections. Popular software packages across the two survey periods included Canvas (36.4 %, 54.9 %), Panopto (29.0 %, 27.5 %), ExamSoft (81.8 %, 84.0 %), and eValue (44.8 %, 28.6 %). Software shifts saw increased usage of Canvas (36.4–54.9 %), Yuja (3.2–11.8 %), Zoom for remote proctoring (5.2–35.3 %), and internal/custom solutions for clinical education/scheduling (3.4–10.2 %), and decreased usage of Vimeo (9.7–0.0 %), Respondus (9.0–0.0 %), and ExamMonitor (63.2–20.5 %). Comments indicated an interest in national resources related to topics such as telehealth, public health, health system science, and health informatics. Key takeaways included the need for shared online training material in both preclinical and clinical education; the development of new virtual or gaming technologies; and training for faculty and staff to support technology integration.

Conclusions

Technology selection at medical schools is ongoing but relatively consistent across time. This was especially true through the COVID-19 pandemic and with the ongoing increase in new osteopathic medical schools. This research offers a comprehensive list of software being used in osteopathic medical education, including a snapshot of changes during and post-COVID-19, to help inform technology decisions across osteopathic medical education. In addition, these results are driving the planning process for AACOM initiatives into the future.

In March 2020, as the COVID-19 pandemic became a national concern, the world of medical education was greatly tested and institutions had to act quickly to account for current capacities and enact rapid changes to meet the needs of faculty, staff, and students. It is typical that trends in the use of technology primarily develop in response to challenges facing medical education [1]; however, during the pandemic, changes happened even more rapidly than normal. Regarding osteopathic medical education (OME), concerns had to be met quickly in admissions, clinical education experiences, and graduation [2]. Schools and colleges of osteopathic medicine (COMs) had to rapidly adjust to remote applications for learning and assessment. As a result of this stress, the profession had to come together to generate and share ideas and solutions, offer best practices, and share resources. The American Association of Colleges of Osteopathic Medicine (AACOM) responded quickly with the creation of four working groups (workgroups): Mobilizing the Future Healthcare Workforce; Clinical Education Alternatives; Pathways to Practice; and Transformational Technology in Medical Education (TIME).

The TIME adaptive working group was formed specifically to assist member schools and educators by learning, sharing, and investigating means by which to utilize technology to rapidly adapt and transform OME [3]. Specifically, TIME addressed: preclinical and clinical training assistance; the implementation of new technology; training of faculty and staff to support technology integration; and determining how technology could be better integrated in the future of OME. As part of the TIME charge, a survey was utilized to capture the landscape of technology at the onset of the COVID-19 pandemic.

Although the COVID-19 pandemic spurred change, it is valuable for medical education to continuously and constructively reevaluate current technologies to discover innovations that inform more effective educational practices [4], [5], [6]. A framework to better understand transformative change is Normalization Process Theory (NPT) [7]. This sociological theoretical framework has been increasingly utilized to understand how a new practice, such as the use of technology, becomes embedded within a social system (“normalization”) through an active process, both individually and collectively, that occurs over a period of time [8]. The new practice becomes embedded when it is routinely incorporated in the everyday work of individuals and groups.

To monitor this normalization of technology after COVID-19, AACOM created and distributed a second set of questions regarding technology use at COMs through its 2023 Annual Survey. This paper provides an essential, and previously lacking, comparison of the technology landscape at the COMs from the onset of the COVID-19 pandemic to the present. It also documents the satisfaction of COMs with technology applications for various purposes to determine how AACOM could support technology efforts, policy updates, and decision making into the future. Additionally, it follows the “normalization” effect as a way to show how these adaptations might influence the future and continuing use of technology in OME. The findings of this article will inform technological decisions not only across OME but also across all medical or health profession schools making similar decisions.

Methods

This study was reviewed by the West Virginia School of Osteopathic Medicine Institutional Review Board and approved as exempt from review (IRB 2022-18). This study received no funding, and informed consent was captured through study participants’ willingness to complete the surveys. Participants for this study were solicited through the AACOM database of schools and COMs, including additional locations and branch campuses, with permission. While deans were the point of contact for receipt of the survey, the dean could assign the most appropriate person(s)/role(s) at their institution to best answer the questions in each specific section. Participants of each survey received no compensation for participation.

An initial survey instrument distributed in April 2020, at the onset of the COVID-19 pandemic, was designed by a subset of the TIME adaptive working group members consisting of four COM faculty representing both preclinical and clinical aspects and three AACOM staff members. Survey questions were developed by a panel of experts following a review of the literature. The final survey was sent to the entire TIME adaptive working group for comprehensive review and feedback. After full review, the final survey consisted of 25 questions (Supplementary Material) including one identifying question (school or college) and 24 multiple-choice or open-comment questions. The follow-up survey sent in January 2023 included 23 questions and incorporated five identical questions to the 2020 survey. These questions were developed by a similar expert group that included members from the original TIME adaptive working group, and they were reviewed by AACOM’s Research and Survey Advisory group for inclusion on the AACOM annual survey (Supplementary Material). The question topics in both instruments ranged from software utilized in specific areas (eg, learning management system [LMS], recording of lectures, examination administration, examination proctoring, virtual anatomy, etc.) to critical components for teaching and assessment in both preclinical and clinical education. Descriptive statistics and Fisher’s exact tests were utilized to analyze the quantitative survey data. Inductive thematic analysis was utilized to analyze the qualitative survey data. Utilizing the six-step process developed by Braun and Clarke [9], the textual comments were placed into specific themes, reviewed, defined, and named, and then summarized to specify the types of responses supporting each theme.
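The product-by-product comparisons reported in the Results are two-sided Fisher’s exact tests on 2×2 tables of users versus non-users in each survey year. As a minimal sketch (the function is our own illustration, not the authors’ analysis code; the Canvas counts from Table 1 serve as example inputs), the test can be computed from first principles with the hypergeometric distribution:

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins that are no more likely than the observed table.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def hyper_p(x):
        # P(X = x) for a hypergeometric variable with these fixed margins
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)

    p_obs = hyper_p(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    # Small tolerance guards against float ties when comparing probabilities
    return sum(hyper_p(x) for x in range(lo, hi + 1)
               if hyper_p(x) <= p_obs * (1 + 1e-9))

# Canvas LMS counts from Table 1: 12 of 33 users in 2020 vs. 28 of 51 in 2023
p = fisher_exact_2x2(12, 33 - 12, 28, 51 - 28)
print(f"p = {p:.4f}")
```

Applied to the Canvas row of Table 1, this should reproduce the reported p-value of 0.1095; a perfectly balanced table yields p=1 by construction.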

Results

During COVID-19, responses were obtained from 34 COMs (main campuses, branch campuses, and additional locations). At the time of the 2020 survey, there were 36 main campuses, 6 branch campuses, and 15 additional locations for a total of 57 sites [10], yielding a response rate of 59.6 % (34/57). After COVID-19, responses were obtained from 49 COMs. At the time of the survey, there were 41 main campuses, 8 branch campuses, and 18 additional locations for a total of 67 sites [10]. Five sites were excluded because they were new campuses, defined as having their first class in 2023 or after, and therefore not required to participate in the annual survey, leaving 62 eligible sites and a post-COVID-19 response rate of 79.0 % (49/62). Responses for the five identical questions asked in both surveys were compared utilizing total numbers. Institutions that did not respond to a question were excluded from that question only. Institutions that responded with more than one answer to a question had each response counted for the software noted.

The most significant findings from this study included the recognition that overall satisfaction with technologies remained high, with 75.0 % of COMs being extremely or moderately satisfied with their software selections. Software shifts saw increased usage of Canvas (36.4–54.9 %), Yuja (3.2–11.8 %), Zoom for remote proctoring (5.2–35.3 %), and internal/custom solutions for clinical education/scheduling (3.4–10.2 %), and decreased usage of Vimeo (9.7–0.0 %), Respondus for examinations (9.0–0.0 %), and ExamMonitor for remote/virtual proctoring (63.2–20.5 %).

During COVID-19 (2020) and post-COVID-19 (2023) comparisons of five identical questions

Learning management systems

Regarding learning management system (LMS) software, Canvas (12/33, 36.4 %), Blackboard (6/33, 18.2 %), and DaVinci (5/33, 15.2 %) were the most popular software utilized in 2020. In 2023, usage demonstrated a shift, with Canvas increasing, DaVinci decreasing, and Blackboard remaining nearly the same (Canvas: 28/51, 54.9 %; Blackboard: 7/51, 13.7 %; DaVinci: 3/51, 5.9 %). Table 1 shows the overall LMS software usage between 2020 and 2023.

Table 1:

A comparison of software by category during COVID-19 (2020) and post-COVID-19 (2023).

Learning management
Product 2020 (n=33) 2023 (n=51) Fisher’s exact p-Value
Canvas 12 28 0.1095
Blackboard 6 7 0.7571
DaVinci (LCMS+) 5 3 0.2516
Moodle 3 1 0.2940
eMedley 3 2 0.3755
Jenzabar 2 1 0.5578
Brightspace 1 3 1.0000
Elentra 1 1 1.0000
Desire2Learn 0 2 0.5170
InteDashboard 0 1 1.0000
eValue 0 1 1.0000
Campus Management 0 1 1.0000
Audio/video recording/posting
Product 2020 (n=31) 2023 (n=51) Fisher’s exact p-Value
Panopto 9 14 1.0000
Mediasite 6 9 1.0000
Echo360 5 5 0.5023
Vimeo 3 0 0.0568
Kaltura 2 3 1.0000
CaptureSpace 1 0 0.3924
Collaborate 1 1 1.0000
Internal/Custom 1 0 0.3924
Haivision 1 0 0.3924
Screencast-O-Matic 1 0 0.3924
Yuja 1 6 0.2359
Microsoft Stream 0 2 0.5170
SharkMedia 0 1 1.0000
Zoom 0 10 0.0052a
Exam software/management
Product 2020 (n=33) 2023 (n=50) Fisher’s exact p-Value
ExamSoft 27 42 1.0000
Respondus 3 0 0.0568
eMedley 2 2 0.6431
Internal/custom 1 2 1.0000
ProgressIQ 0 1 1.0000
TurningPoint 0 1 1.0000
Turnitin 0 1 1.0000
Quizlet 0 1 1.0000
Remote/virtual proctoring
Product 2020 (n=19) 2023 (n=34) Fisher’s exact p-Value
ExamMonitor 12 7 0.0292a
Respondus Monitor 2 0 0.1509
eProctor 1 0 0.3924
ProctorU 1 0 0.3924
Examplify 1 0 0.3924
ExamSoft 1 9 0.0791
Zoom 1 12 0.0122a
NBOME portal 0 3 0.2756
eMedley (ExamN) 0 1 1.0000
MonitorEDU 0 1 1.0000
Microsoft Teams 0 1 1.0000
Clinical scheduling/evaluations
Product 2020 (n=29) 2023 (n=49) Fisher’s exact p-Value
eValue 13 14 0.3316
New Innovations 8 9 0.5766
eMedley 3 4 1.0000
DaVinci (LCMS+) 1 3 1.0000
Core ELMS 1 1 1.0000
EDURotations 1 1 1.0000
one45 1 1 1.0000
Internal and custom 1 5 0.3948
Banner 0 3 0.2756
Clinician Nexus 0 1 1.0000
Elentra 0 1 1.0000
EXXAT 0 2 0.5170
Medtrics 0 1 1.0000
Rotation Management System 0 1 1.0000
Visiting Student Application Service 0 1 1.0000
  aDenotes significance of 0.05 or below.

Audio/video recording and posting software

The software most utilized for audio/video recordings and postings in 2020 were Panopto (9/31, 29.0 %), Mediasite (6/31, 19.4 %) and Echo360 (5/31, 16.1 %). Again, there were similarities as well as shifts in usage in 2023, with Panopto still being the top software utilized in 2023 (14/51, 27.5 %) but Zoom jumping to second (10/51, 19.6 %) and Mediasite dropping to third (9/51, 17.6 %). Yuja (6/51, 11.8 %) passed Echo360 (5/51, 9.8 %) to round out the top five. Table 1 also illustrates the comparison of audio/video recording and posting software usage between 2020 and 2023.

Examination software

Regarding the examination software utilized in 2020, the results were strongly tilted toward the use of Examplify (ExamSoft; 27/33, 81.8 %), with others indicating the use of Respondus (3/33, 9.1 %) and eMedley/ExamN (2/33, 6.1 %). In 2023, results remained in favor of ExamSoft as the clear leader for examination administration software (42/50, 84.0 %), while Respondus dropped to no users and eMedley/ExamN remained at two users (2/50, 4.0 %). Internal or custom systems (2/50, 4.0 %) were also noted in 2023. Table 1 also shows the comparisons of examination administration software usage between 2020 and 2023. Note that all terms related to ExamSoft software (ie, ExamSoft, ExamMonitor, Examplify) were included in the ExamMonitor category, although ExamMonitor is the technical name for ExamSoft’s remote/virtual proctoring solution.

Remote/virtual proctoring software

For the use of proctoring software in 2020, a majority of COMs utilized ExamMonitor (14/19, 73.7 %), with the second largest being Respondus Monitor (2/19, 10.5 %). In 2023, ExamMonitor use rose slightly in absolute numbers (16/34, 47.1 %), although its share of responses fell, and there was a large increase in the use of Zoom for testing (12/34, 35.3 %). Table 1 also shows the comparison of remote/virtual proctoring software usage between 2020 and 2023.

Results regarding clinical scheduling/evaluation software utilized

The top software packages utilized for clinical scheduling (CS)/clinical evaluation (CE) in 2020 were eValue (13/29, 44.8 %), New Innovations (8/29, 27.6 %), and eMedley (3/29, 10.3 %). In 2023, the CS software and CE software were evaluated utilizing separate questions. There was more variety in the software reported in 2023, with the top software packages for both CS and CE remaining consistent with eValue (CS=14/49, 28.6 %; CE=18/49, 36.7 %), New Innovations (CS=9/49, 18.4 %; CE=9/49, 18.4 %), and eMedley (CS=5/49, 10.2 %; CE=5/49, 10.2 %) rounding out the top. However, internal and custom creation of software (5/49, 10.2 %) increased greatly in 2023, tying for the third slot. Table 1 also illustrates the comparison of CS/CE software usage between 2020 and 2023.

During COVID-19 (2020) and post-COVID-19 (2023) additional technology usage

In response to inquiries regarding additional software not analyzed in 2020, respondents reported a variety of software, including several internal/custom solutions. Table 2 shows the top choices for each software category, with the total number of responses for each software package.

Table 2:

The top software choices by software category.a

Software Top choice Additional top choices
Virtual anatomy software (2020 only) Complete Anatomy (n=3) VH Dissector and Anatomage (n=2 each); Acland’s Video Atlas, Histology Virtual Labs, BodyViz, CyberAnatomy, Microsoft HoloLens, 4D Anatomy, Sectra, B-Line/SimCapture, Human Anatomy Atlas by Visible Body (n=1 each)
Question banks (2020 only) COMBANK (n=15) UWorld (n=7); Kaplan (n=6)
Clinical cases (2020 only) Aquifer (n=18) No other software noted.
Curriculum management/mapping Excel (n=7) DaVinci (n=5); Altus/one45, manual creation, and Microsoft (n=4 each)
Preclinical grading/gradebook Canvas (n=21) Blackboard (n=7); ProgressIQ, Brightspace, Excel, and eMedley (n=3 each)
Portfolios/student reflections eValue (n=8) Manual creation and ProgressIQ (n=4 each); Blackboard, Canvas, and eMedley (n=3 each)
Course/faculty evaluations Anthology and eValue (n=5) Blue, Canvas, and Qualtrics/Watermark (n=4 each); SurveyMonkey and eMedley (n=3 each)
MSPE letters Internal/custom creation (n=13) Word (n=7); Excel (n=5)
Dashboard/student snapshot/performance tracking ProgressIQ (n=7) Excel (n=5); eValue (n=4)
Clinical grading/gradebook eValue (n=11) Canvas (n=7); Internal/custom creation (n=6)
Case logs eValue (n=11) New Innovations (n=7); Custom creation and Human Dx (n=4 each)
Entrustable professional activities eValue (n=9) Internal/custom creation (n=5); New Innovations (n=4)
Preceptor payment eValue (n=7) Internal/custom creation, Google Sheets, and manual verification and payment (n=4 each); UniMarket (n=3)
Immunization/certification tracking eValue and CastleBranch (n=11 each) Internal/custom creation (n=5); Medicat (n=3)
Data export management/reporting Excel (n=7) Canvas (n=5); Internal/custom creation, Snowflake, SQL, Tableau, and eMedley (n=3 each)
Secure data collection & repository software Banner (n=5) QuestionPro, Veeam, Nimble, and ExaGrid Campus (n=4 each); Internal/custom creation and Snowflake (n=3 each)
Medical education simulation software/system CAE LearningSpace (n=13) EMS (SimulationIQ) (n=11); Laerdal SimCapture (B-Line) (n=9)
  aThe number of responses provided for each choice varies due to multiple responses being allowed for software categories. If “2020 only” is not noted, the data were for post-COVID-19 (2023) only.

Results regarding question banks during COVID-19 (2020) and post-COVID-19 (2023) with preclinical and clinical education usage

During COVID-19, respondents indicated the use of question banks in both preclinical and clinical education (18/31, 58.1 %). Respondents indicated the use of Aquifer (18/31, 58.1 %) the most for clinical education only. Fewer institutions indicated purchasing question banks post-COVID-19, and those who did purchased Aquifer (9/31, 29.0 %) for both preclinical and clinical education. Question banks with the highest usage during COVID-19 were COMBANK (15/31, 48.4 %), UWorld (7/31, 22.6 %), and Kaplan (6/31, 19.4 %). Question banks identified as being purchased post-COVID-19 were Kaplan and TrueLearn (both 2/31, 6.5 %).

Post-COVID-19 (2023) overall satisfaction with software by category

In order to assess the number of institutions that noted being satisfied with software in each category, we combined “moderately satisfied” and “extremely satisfied” into a single value. Institutions reported satisfaction of 75 % or higher in 10/20 categories, 50–74 % in 9/20 categories, and below 50 % in one category. The category with the lowest satisfaction was curriculum management/mapping. The category with the highest satisfaction was secure data collection & repository software. Figure 1 shows the overall satisfaction by software category.
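The dichotomization above can be sketched in a few lines; the function name and example responses are illustrative, not drawn from the survey data:

```python
# Collapse a five-point satisfaction scale into a single "satisfied"
# share, mirroring the grouping of "moderately satisfied" and
# "extremely satisfied" used for Figure 1. (Illustrative sketch only.)
SATISFIED = {"extremely satisfied", "moderately satisfied"}

def satisfaction_share(responses):
    """Fraction of non-missing responses in the top-two satisfaction levels."""
    rated = [r.strip().lower() for r in responses if r]
    return sum(r in SATISFIED for r in rated) / len(rated)

example = ["Extremely satisfied", "Moderately satisfied",
           "Neither satisfied nor dissatisfied", "Moderately satisfied"]
print(f"{satisfaction_share(example):.0%}")  # prints 75%
```

Computing this share for each software category and comparing it against the 75 % and 50 % cutoffs reproduces the three satisfaction bands described above.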

Figure 1: 
The overall satisfaction by software category (2023 only).

Results regarding telehealth programs

Respondents in 2023 were queried on the current telehealth programs at their school, specifically whether there was a telehealth program at their school for: 1) teaching and training students; 2) physical healthcare services for students; and/or 3) mental healthcare services for students. Results indicated that most existing telehealth programs were oriented toward providing healthcare services to students rather than providing teaching and training to students. Among the respondents, 85.7 % indicated that they had a telehealth program for the purpose of providing mental healthcare services for students, 66.6 % indicated that they had a telehealth program for providing physical healthcare services to students, and 48.3 % indicated that they had a telehealth program for teaching and training students as indicated in Figure 2.

Figure 2: 
The number of telehealth programs by purpose (2023 only).

Discussion

These survey results show that technology usage in OME remains fairly consistent from COVID-19 until now, with the exception of alterations that stemmed from the pandemic, which is consistent with experiences from others in medical education and more globally [7]. These changes were not only influenced by the changing healthcare environment, movement in instructional techniques, and emergence of new technology but also by societal and global shifts. The pandemic pushed medical education around the world and across multiple broader sectors (including all health professions, commercial organizations, and professional bodies) to scale up technology use by incorporating more virtual and blended learning, telemedicine, mobile access to materials, virtual reality, lecture recording, and so on [7], [11], [12], [13], [14], [15]. This was consistent with the shifts in software and technology usage in OME, with the most significant shifts being in the areas of virtual learning, audio/video recordings, and heightened interest in telemedicine. The pandemic also pushed education with evolving assessment practices such as flexible examinations, online proctoring, artificial intelligence (AI)–assisted simulations, and skill building without patients through methods such as virtual anatomy, virtual surgeries, and diagnostic training [16], [17], [18]. This is also consistent with findings from these surveys showing changes in remote/virtual proctoring and moves to virtual skill building with movement into virtual anatomy.

Without the quick adaptations made by COMs during COVID-19, the education of osteopathic medical students certainly would have been a challenge. These adaptations have fed into the technology landscape that we see today, with new opportunities for collaboration, learning experiences, and instructional strategies that had not seemed possible prior to the COVID-19 pandemic. The pandemic pushed osteopathic medical schools to think outside the box. This has opened the door for even more discussions in areas such as real-time mobile video tools and applications, AI, virtual reality, virtual anatomy, and so on. Although some of the advances (AI, in particular) will require additional skills and training on the effective use of AI and the appropriate use of the vast knowledge it provides, there are still areas of medical education that would be difficult to replace with technology, especially areas related to the “art of care” [19, 20]. AI may help with memorization and analysis, but medical education will still need to focus on the “art of care” [21]. It is these changes that osteopathic medical schools will need to monitor as technology continues to grow. Staying abreast of these shifts in technology will be important not only for maintaining a high level of education but also for making decisions regarding the best use of technology in the educational environment.

Although the specific software selected by COMs for different functions varies widely, there is currently a general sense of satisfaction with the technology that COMs have selected. Some key ideas that emerged from these surveys centered around the sharing of resources and the development of resources. There was a strong sense that the profession could come together as a whole to develop shared resources for training in both preclinical and clinical education, to develop new virtual or gaming technologies, and to create training for faculty and staff to support technology integration. These ideas will certainly inform discussions across AACOM councils and committees in the coming years and can also serve as support for discussions in medical education and health professions education more broadly.

Limitations and future directions

While necessary for the purpose and aim of this study, the study may lack generalizability because it was limited to osteopathic medical schools, branches, and additional locations within the United States and because answers may have been tailored to what participants felt were important factors for the AACOM organization to hear. Surveys were sent during the height of the COVID-19 pandemic (2020) and post-pandemic (2023) and, thus, may not reflect the changing circumstances in OME between those years or since then. There may have been some response bias due to the 2023 survey being attached to a required AACOM survey, in which participants may have felt obligated to respond but were not necessarily thoughtful in their responses, or they may have withheld honest feedback due to concerns over anonymity.

As we progress and adapt post-COVID-19, these needs and results will continue to evolve. We are already seeing significant and rapid changes with the introduction of AI into the landscape, which is poised to significantly transform medical education, both in how students learn and how educators teach. AI will strengthen medical education in areas such as personalized learning paths, virtual patients and simulations, AI-powered tutoring and feedback, natural language processing for clinical note taking, AI curricular support for mapping and data analytics, faculty support for content creation, student engagement, and so on [22], [23], [24]. It is suggested that a future 10-year study be conducted to evaluate these trends, monitor the technology use and needs of COMs, and further explore the normalization process, the creation of shared resources, and shifts in training toward a focus on the “art of care.”

Conclusions

Technology selection at medical schools is ongoing but relatively consistent across time. This was especially true through the COVID-19 pandemic and with the ongoing increase in new osteopathic medical schools. This research offers a comprehensive list of software being utilized in OME including a snapshot of changes during and post-COVID-19 to help inform technology decisions across OME. These findings will help inform policies and decision making by supporting data-driven predictions and forecasting trends, and by continuing to monitor shifts to better inform change at the national level. Because the national changes are happening rapidly, trends will continue to evolve, and monitoring through these results and future initiatives will better inform standardization, equity, and the planning process for AACOM initiatives into the future.


Corresponding author: Machelle Linsenmeyer, EdD, Office of Assessment and Educational Development, West Virginia School of Osteopathic Medicine, 400 Lee Street North, Lewisburg, WV 24910, USA E-mail:

Acknowledgements

The authors would like to acknowledge the American Association of Colleges of Osteopathic Medicine’s (AACOM) employees Jeff Tjiputra, DSc, MLS, Aisha Ali, MHRM, and Erik Guercio, MA, who each assisted in the development, administration, and data collection of the survey data used herein. The authors also acknowledge the members of AACOM’s Transformational Technology in Medical Education group for their work on survey development and administration. Finally, the authors acknowledge and express their appreciation to Mary Norris, MS (Director of Assessment and Quality Improvement at West Virginia School of Osteopathic Medicine), for her contributions to the qualitative analysis of data during COVID-19.

  1. Research ethics: This study was reviewed by the West Virginia School of Osteopathic Medicine Institutional Review Board and approved as exempt from review (IRB 2022-18).

  2. Informed consent: Informed consent was captured through an introductory statement at the beginning of the survey explaining the study; study participants’ willingness to move forward and complete the survey constituted consent.

  3. Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.

  4. Use of Large Language Models, AI and Machine Learning Tools: None declared.

  5. Conflict of interest: None declared.

  6. Research funding: None declared.

  7. Data availability: Raw data may be obtained on request from the corresponding author.

References

1. Guze, PA. Using technology to meet the challenges of medical education. Trans Am Clin Climatol Assoc 2015;126:260–70.

2. Commission on Osteopathic College Accreditation (COCA). Guidance for colleges of osteopathic medicine related to coronavirus (COVID-19); 2020. https://osteopathic.org/wp-content/uploads/COCA-Guidance-regarding-Coronavirus.pdf [Accessed 22 March 2024].

3. AACOM. New workgroup focused on improving how colleges train the next generation of doctors. Inside OME. 2020;1. https://www.aacom.org/news-reports/news/2020/10/01/new-workgroup-focused-on-improving-how-colleges-train-the-next-generation-of-doctors [Accessed 22 March 2024].

4. Han, H, Resch, DS, Kovach, RA. Educational technology in medical education. Teach Learn Med 2013;25:S39–43. https://doi.org/10.1080/10401334.2013.842914.

5. Irby, DM, Wilkerson, L. Educational innovations in academic medicine and environmental trends. J Gen Intern Med 2003;18:370–6. https://doi.org/10.1046/j.1525-1497.2003.21049.x.

6. Irby, DM, Cooke, M, O’Brien, BC. Calls for reform of medical education by the Carnegie foundation for the advancement of teaching: 1910 and 2010. Acad Med 2010;85:220–7. https://doi.org/10.1097/ACM.0b013e3181c88449.

7. Goh, PS, Sandars, J. A vision of the use of technology in medical education after the COVID-19 pandemic. MedEdPublish 2020;9:49. https://doi.org/10.15694/mep.2020.000049.1.

8. Scantlebury, A, Sheard, L, Watt, I, Cairns, P, Wright, J, Adamson, J. Exploring the implementation of an electronic record into a maternity unit: a qualitative study using normalisation process theory. BMC Med Inform Decis Mak 2017;17. https://doi.org/10.1186/s12911-016-0406-0.

9. Braun, V, Clarke, V. Using thematic analysis in psychology. Qual Res Psychol 2006;3:77–101. https://doi.org/10.1191/1478088706qp063oa.

10. Guercio, E. AACOM. U.S. osteopathic medical schools dashboard; 2025. https://www.aacom.org/searches/reports/report/US-osteopathic-medical-schools-dashboard [Accessed 9 July 2025].

11. Alsoufi, A, Alsuyihili, A, Msherghi, A, Elhadi, A, Atiyah, H, Ashini, A, et al. Impact of the COVID-19 pandemic on medical education: medical students’ knowledge, attitudes, and practices regarding electronic learning. PLoS One 2020;15:e0242905. https://doi.org/10.1371/journal.pone.0242905.

12. Woolliscroft, JO. Innovation in response to the COVID-19 pandemic crisis. Acad Med 2020;95:1140–2. https://doi.org/10.1097/ACM.0000000000003402.

13. Muntz, M, Franco, J, Ferguson, C, Ark, T, Kalet, A. Telehealth and medical student education in the time of COVID-19 – and beyond. Acad Med 2021;96:1655–9. https://doi.org/10.1097/ACM.0000000000004014.

14. Theoret, C, Ming, X. Our education, our concerns. Med Educ 2020;54:591–2. https://doi.org/10.1111/medu.14181.

15. Rose, S. Medical student education in the time of COVID-19. JAMA 2020;323:2131–2. https://doi.org/10.1001/jama.2020.5227.

16. Zern, NK, Yale, LA, Whipple, ME, Allen, SM, Wood, DE, Tatum, RP, et al. The impact of the COVID-19 pandemic on medical student education: implementation and outcome of a virtual general surgery curriculum. Am J Surg 2022;224:612–16. https://doi.org/10.1016/j.amjsurg.2022.03.035.

17. Grady, ZJ, Gallo, LK, Lin, HK, Magod, BL, Coulthard, SL, Flink, BJ, et al. From the operating room to online: medical student surgery education in the time of COVID-19. J Surg Res 2022;270:145–50. https://doi.org/10.1016/j.jss.2021.08.020.

18. Prigoff, J, Hunter, M, Nowygrod, R. Medical student assessment in the time of COVID-19. J Surg Educ 2021;78:370–4. https://doi.org/10.1016/j.jsurg.2020.07.040.

19. Wartman, SA, Combs, CD. Reimagining medical education in the age of AI. AMA J Ethics 2019;21:146–52. https://doi.org/10.1001/amajethics.2019.146.Search in Google Scholar PubMed

20. Johnston, SC. Anticipating and training the physician of the future: the importance of caring in an age of artificial intelligence. Acad Med 2018;93:1105–6. https://doi.org/10.1097/ACM.0000000000002175.Search in Google Scholar PubMed

21. Tokuç, B, Varol, G. Medical education in the era of advancing technology. Balkan Med J 2023;40:395–9. https://doi.org/10.4274/balkanmedj.galenos.2023.2023-7-79.Search in Google Scholar PubMed PubMed Central

22. Preiksaitis, C, Rose, C. Opportunities, challenges, and future directions of generative artificial intelligence in medical education: scoping review. JMIR Med Educ 2023;9:e48785. https://doi.org/10.2196/48785.Search in Google Scholar PubMed PubMed Central

23. Pandurangam, G, Gurajala, S, Nagajyothi, D. Artificial intelligence in anatomy teaching and learning: a literature review. Natl J Clin Anat 2024;13:158–63. https://doi.org/10.4103/NJCA.NJCA_103_24.Search in Google Scholar

24. Powell, A. How AI is transforming medicine and health care. Harvard Gazette 2025. https://news.harvard.edu/gazette/story/2025/03/how-ai-is-transforming-medicine-healthcare/ [Accessed 15 April 2025].Search in Google Scholar


Supplementary Material

This article contains supplementary material (https://doi.org/10.1515/jom-2024-0217).


Received: 2024-10-02
Accepted: 2025-07-31
Published Online: 2025-09-15

© 2025 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
