Abstract
As artificial intelligence (AI) becomes increasingly integrated into healthcare, osteopathic medicine faces a critical inflection point. This commentary examines how AI integration can augment osteopathic practice while addressing potential challenges to its foundational tenets. The four tenets of osteopathic medicine – unity of body, mind, and spirit; the body's capacity for self-regulation and self-healing; the interrelationship of structure and function; and rational treatment based on these principles – must guide AI integration to preserve osteopathic medicine's holistic approach. By reaffirming osteopathy's mind-body-spirit philosophy and relational approach to care, osteopathic physicians can help shape a future in which AI serves – not supplants – the human connection that is essential to healing. Successful AI implementation requires intentional strategies to preserve therapeutic presence, maintain the centrality of osteopathic manipulative treatment (OMT), and ensure that technology enhances rather than replaces the physician–patient relationship. Medical education, research, and governance must evolve to prepare osteopathic physicians to work ethically alongside AI while maintaining fidelity to osteopathic principles.
The four fundamental tenets of osteopathic medicine guide holistic, person-centered care: (1) the body is a unit, and the person is a unity of body, mind, and spirit; (2) the body is capable of self-regulation and self-healing; (3) structure and function are interrelated; and (4) rational treatment is based upon these principles and the understanding of the first three [1], [2], [3]. These tenets emphasize therapeutic presence and human connection [1], [2], [3], [4], [5].
As healthcare transforms with artificial intelligence (AI) tools offering enhanced diagnostic capabilities, predictive analytics, and administrative efficiency, there is a risk that overreliance on empirical data may undermine the osteopathic approach to patient care [6]. This commentary examines how physicians can ethically integrate AI systems while preserving osteopathic medicine's humanistic core.
The osteopathic foundation
Osteopathic medicine is fundamentally relational – connecting bodily systems, patients and physicians, and communities. Core beliefs include the body's inherent self-healing capacity and the principle that structure and function are interrelated. Osteopathic practitioners serve individual patients and communities, particularly those most in need.
Osteopathic medicine views health as the dynamic equilibrium of body, mind, and spirit, not merely the absence of disease. This orientation – informed by cultural, indigenous, and holistic traditions – prioritizes both physical examination and attunement to patients’ psychological and spiritual states [2], [3], [4].
The physician–patient relationship remains central, cultivated through touch, empathy, and presence [3], [5]. Osteopathic manipulative treatment (OMT) exemplifies this relationship, integrating physical assessment with therapeutic intervention, and is associated with improved functional outcomes, pain reduction, and enhanced patient satisfaction [2], [3]. In our technology-saturated landscape, hands-on care becomes increasingly vital.
AI’s promise for osteopathic practice
Properly implemented, AI can enhance osteopathic practice rather than replace the physician's role. Ambient scribing exemplifies this potential, utilizing audio and text-based systems to reduce documentation time ("pajama time") [7] and minimize screen interaction. These tools return time to physicians for attentive listening, relational presence, and palpatory diagnosis – freeing hands for healing rather than typing.
AI excels in diagnostic support, processing diverse datasets – clinical histories, laboratory values, imaging, and patient-reported outcomes – to create comprehensive patient profiles that enhance care coordination [8]. This analytic capability enhances communication, supports care continuity, and incorporates social determinants of health [8].
Importantly, AI can identify subtle structure–function interdependencies that may be missed in time-constrained clinical encounters. Pattern recognition across complex physiological systems aligns with the osteopathic understanding of bodily interconnectedness [9].
By reducing the cognitive load of administrative tasks that add little clinical value, AI can help osteopathic physicians refocus on individual patients and their unique healing trajectories.
Personalized medicine
AI supports personalized medicine, complementing osteopathic individualized care and self-healing principles. Predictive analytics can guide interventions that resonate with each person's unique presentation and healing capacity. AI-enhanced mind–body tools – such as wearable-guided breathing platforms (e.g., Breathwrk, Apollo Neuro) [10], [11] and real-time biofeedback devices (e.g., Muse, HeartMath) [12], [13] – extend psychosomatic care beyond clinic visits. These tools adapt to individual physiological patterns (heart rate variability, electroencephalography [EEG] signals), supporting autonomic balance and stress resilience while reinforcing mind-body-spirit unity.
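To make this kind of adaptation concrete, the minimal Python sketch below computes RMSSD, a standard time-domain heart rate variability metric, from a short window of beat-to-beat (RR) intervals and compares it to an individual baseline before suggesting a paced-breathing session. The function names, the 80 % threshold, and the coaching messages are illustrative assumptions rather than features of any named product.

```python
import math

def rmssd(rr_intervals_ms: list[float]) -> float:
    """Root mean square of successive differences (RMSSD), a standard
    time-domain heart rate variability (HRV) metric, in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def breathing_prompt(window_rr_ms: list[float], baseline_rmssd_ms: float) -> str:
    """Suggest a paced-breathing session when HRV falls well below an
    individual baseline; the 80 % threshold is illustrative, not validated."""
    current = rmssd(window_rr_ms)
    if current < 0.8 * baseline_rmssd_ms:
        return "HRV below personal baseline: suggest a paced-breathing session."
    return "HRV near baseline: continue current routine."

# Example: RR intervals (ms) from a wearable over a short window
print(breathing_prompt([812, 790, 845, 803, 821, 798, 830], baseline_rmssd_ms=42.0))
```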
Remote patient monitoring (RPM) improves patient mobility and functional status, and reduces hospitalizations and healthcare costs [14]. Although quality-of-life outcomes data (physical and mental health symptoms) remain inconclusive [14], RPM offers evidence-based tools aligned with osteopathic priorities like functional improvement. Continuous glucose monitoring (CGM) exemplifies this potential, improving glycemic control through precise and real-time monitoring within patients’ daily environments, enabling data-driven personalized feedback [15], [16], [17].
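As an illustration of the data-driven feedback CGM enables, the sketch below computes time in range against the widely cited 70-180 mg/dL target and returns a simple message keyed to the common 70 % goal. The message wording and function names are hypothetical and are not drawn from any specific CGM platform.

```python
def time_in_range(glucose_mg_dl: list[float], low: float = 70.0, high: float = 180.0) -> float:
    """Fraction of CGM readings within the commonly cited 70-180 mg/dL target range."""
    return sum(low <= g <= high for g in glucose_mg_dl) / len(glucose_mg_dl)

def daily_feedback(glucose_mg_dl: list[float]) -> str:
    """Toy personalized message keyed to time in range; the 70 % goal reflects
    widely cited CGM consensus targets, but the wording here is hypothetical."""
    tir = time_in_range(glucose_mg_dl)
    if tir >= 0.70:
        return f"Time in range {tir:.0%}: meeting the usual 70 % goal."
    return f"Time in range {tir:.0%}: consider reviewing meals and activity with your physician."

# Example: one day of illustrative readings (mg/dL)
print(daily_feedback([95, 110, 150, 190, 210, 130, 105, 88, 160, 175]))
```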
Risks to osteopathic care
While AI offers transformative potential for healthcare delivery, these emerging technologies are not inherently neutral. Without ethical guardrails and adequate user training, AI systems can erode osteopathic medicine's core principles [8], [18]. Integration without deliberate attention to foundational principles risks undermining both the philosophy and the clinical effectiveness of osteopathic care [8], [18].
Erosion of clinical skills and diagnostic competence
Recent studies raise concerns about diagnostic performance when physicians lack adequate AI training [19], [20], [21], [22]. Meta-analyses reveal that while AI models like Generative Pre-trained Transformer 4 (GPT-4) can outperform conventional resources independently, they do not consistently improve diagnostic accuracy as decision-support tools [19], [22]. This paradox suggests that without proper training, physicians may over-rely on AI, potentially reducing rather than enhancing diagnostic accuracy.
Over-reliance on AI can lead to cognitive atrophy and deskilling – diminished clinical reasoning, pattern recognition, and physical diagnostic skills. For osteopathic physicians, this erosion threatens more than medical judgment; it impacts core competencies like diagnostic palpation and OMT. Increased technological dependence reduces verbal and physical patient engagement, compromising the astute palpatory perception required for OMT [5].
The cost of omission: missing language inputs and threats to osteopathic practice and sustainability
Current AI systems are fundamentally mismatched with osteopathic philosophy and practice. Key elements of osteopathic care – somatic dysfunction, OMT, and body-mind-spirit integration – are underrepresented in large language models and clinical algorithms due to the limited osteopathic literature in training datasets. While PubMed contains over 38 million citations [23], osteopathic publications comprise only a small fraction. AI models trained on clinical documentation, coding data, and medical guidelines may not capture osteopathic terminology or practices, compounding this issue.
This underrepresentation creates risks: missed diagnostic opportunities, incomplete assessments, and inadequate documentation that fails to capture the full context of the healing process [2], [4]. AI-enabled ambient systems may miss osteopathic terminology or diagnostic codes, resulting in undercoding and inaccurate records that threaten care quality and reimbursement. Feeding osteopathic-specific language into AI clinical decision-support tools is therefore fundamental to sustainable osteopathic practice.
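One practical mitigation is to screen ambient-scribe output for osteopathic terminology before a note is finalized. The sketch below uses a naive keyword match against a tiny illustrative term list; the suggested code mappings (e.g., CPT 98925-98929 for OMT, ICD-10 M99.0x for somatic dysfunction) should be verified against current coding guidance, and a production system would require far more robust language handling.

```python
# Minimal sketch: screen an ambient-scribe transcript for osteopathic terms so
# documentation and coding are not silently dropped. The term list is tiny and
# the matching is naive substring search; code hints should be verified against
# current coding guidance before any real-world use.
OSTEOPATHIC_TERMS = {
    "somatic dysfunction": "ICD-10 M99.0x (segmental and somatic dysfunction)",
    "osteopathic manipulative treatment": "CPT 98925-98929 (by body regions treated)",
    "omt": "CPT 98925-98929 (by body regions treated)",
    "myofascial release": "document as an OMT technique",
    "counterstrain": "document as an OMT technique",
}

def flag_osteopathic_content(transcript: str) -> list[str]:
    """Return coding reminders for osteopathic terms found in a transcript."""
    text = transcript.lower()
    return [f"'{term}' detected -> consider {hint}"
            for term, hint in OSTEOPATHIC_TERMS.items() if term in text]

note = ("Patient with low back pain; somatic dysfunction of the lumbar region "
        "treated with OMT using myofascial release.")
for reminder in flag_osteopathic_content(note):
    print(reminder)
```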
Without intentional design integrating osteopathic language, diagnostic frameworks, and treatment modalities, AI risks marginalizing osteopathic perspectives. Developers and medical institutions must proactively preserve human-centered care and osteopathic principles in AI development and deployment [2], [4], [8].
Impact on therapeutic relationships and patient trust
The physician–patient relationship serves as a cornerstone of osteopathic care, yet increased reliance on AI-driven systems risks creating barriers to this therapeutic connection if preserving patient trust is not a deliberate priority when clinical support tools are deployed. While ambient scribing and AI-assisted messaging reduce physician screen time, automation must enhance – not diminish – meaningful patient interaction.
Patients remain wary of AI's role in clinical encounters. Recent findings reveal a consistent bias against physicians who disclose any use of AI – whether administrative, diagnostic, or therapeutic [24]. Although modest, these effects carry potential clinical significance given the centrality of patient trust. Concerns include overreliance on algorithms, erosion of the personal touch, data privacy, and cost [24]. These fears underscore the importance of maintaining transparent, rational, and individualized care.
This matters given osteopathic medicine's historical commitment to underserved communities. Although osteopathic physicians represent just 4.9 % of clinically active physicians, they account for 10.4 % of rural primary care physicians and 15.3 % of physicians in small rural counties [25]. In these settings, osteopathic physicians play a critical role in maintaining trust, supporting technology literacy, and ensuring equitable access to care.
If AI tools exacerbate health inequities or erode patient autonomy, they may disproportionately harm the populations most dependent on osteopathic services [8], [18], [26]. Privacy concerns threaten the therapeutic alliance when sensitive information is processed by opaque systems, causing patients to hesitate to disclose emotional, social, or spiritual concerns [24] – insights central to osteopathic assessment. Designing AI platforms without mechanisms to monitor and mitigate bias, protect patient privacy, and include osteopathic terminology risks perpetuating these harms in vulnerable communities [4], [8], [18], [26].
AI-generated recommendations built from limited inputs may exert undue influence on clinical decisions, pressuring patients toward standardized pathways that ignore the social drivers of health critical to improving outcomes for the patient populations osteopathic physicians serve.
Preserving the physician’s role
As algorithms assume diagnostic and administrative functions, there is a risk of role drift, in which physicians shift from healers to technicians [8]. For osteopathic physicians grounded in direct patient relationships, protecting space for compassion, human judgment, and humanistic high-touch care remains essential [27]. Although AI augments clinical reasoning, it cannot replicate wisdom, empathy, intuition, or human connection. Physicians remain irreplaceable as interpreters of suffering, witnesses to vulnerability, and partners in healing.
Guiding principles for AI integration
To ethically integrate AI into osteopathic care, the profession must articulate principles grounded in its unique philosophy and clinical approach. These must go beyond generic AI ethics to reflect osteopathic-specific needs regarding representation, clinical reasoning, and professional identity preservation.
Osteopathic-specific principles for AI integration
We propose these guidelines, consistent with the American Osteopathic Association's Artificial Intelligence in Healthcare Report and Action Plan (Policy H429-A-24) [8], to ensure that AI integration aligns with the foundational tenets:
Representation requirement: AI systems must recognize osteopathic diagnostic terminology, OMT, and holistic clinical reasoning to prevent exclusion from AI-supported care pathways.
Skills preservation: Clinical decision support should enhance – not replace – osteopathic diagnostic and palpatory skills. AI systems should prompt relevant structural examinations and incorporate osteopathic-specific terminology and billing codes to accurately document and bill for services rendered.
Treatment parity: AI recommendation engines must include OMT as standard treatment when appropriate, rather than defaulting to pharmaceutical or surgical interventions (a minimal sketch of such a check follows this list).
Shared governance at institutional and national levels: Osteopathic physicians need voices in AI policy, development, and oversight to address bias and safeguard patient autonomy and privacy.
Defined scope of practice: AI platform deployment must include inputs allowing for the full scope of osteopathic care. Osteopathic-specific terminology and descriptors addressing somatic dysfunction and treatment modalities are critical for accurate coding, billing, and preservation of osteopathic medicine.
Shared decision-making: AI-supported care must prioritize shared decision-making between osteopathic physicians and patients, supporting precision medicine that is individually tailored to meet the patient’s needs within their environmental context.
Osteopathic-led innovation: Encourage osteopathic research on AI applications in OMT, psychosomatic medicine, and person-centered outcomes.
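As one illustration of how the skills-preservation and treatment-parity principles might be operationalized, the brief sketch below wraps a hypothetical recommendation engine so that OMT is surfaced for conditions where it is an appropriate option. The condition list, function names, and engine interface are assumptions for illustration only, not a real decision-support API.

```python
# Minimal sketch of a "treatment parity" check: make sure OMT appears among the
# options a recommendation engine surfaces for conditions where it is an
# appropriate choice. The condition set and option lists are illustrative.
OMT_APPROPRIATE = {"low back pain", "neck pain", "tension-type headache"}

def with_treatment_parity(condition: str, engine_options: list[str]) -> list[str]:
    """Append OMT to an engine's suggestions when the condition is in the
    illustrative OMT-appropriate set and OMT is not already listed."""
    options = list(engine_options)
    if condition.lower() in OMT_APPROPRIATE and not any("omt" in o.lower() for o in options):
        options.append("Osteopathic manipulative treatment (OMT)")
    return options

print(with_treatment_parity("Low back pain", ["NSAIDs", "physical therapy referral"]))
# ['NSAIDs', 'physical therapy referral', 'Osteopathic manipulative treatment (OMT)']
```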
Educational imperatives for the AI era
Medical education must prepare future Doctors of Osteopathic Medicine (DOs) for AI-shaped healthcare while emphasizing palpatory ability, clinical reasoning, and empathic communication [5], [8], [27].
Dual literacy requirement: Students need proficiency both in traditional osteopathic diagnostic skills and in interpreting AI-generated information.
Critical evaluation skills: Learners must critically appraise AI outputs, recognizing biases – such as prioritization of pharmacologic interventions over manual medicine – and identifying misalignment with osteopathic philosophy.
Professional identity formation: Curricula should emphasize how osteopathic principles remain essential in technology-enhanced environments.
Research priorities for osteopathic integration into AI
Osteopathic medicine must engage in targeted research to ensure that its values, methods, and outcomes are accurately reflected in AI systems.
Outcome studies: Demonstrating the effectiveness of osteopathic interventions using AI-measurable outcomes (functional improvement, patient satisfaction, and reduced healthcare utilization) supports their inclusion in algorithms.
Diagnostic algorithm development: Translating osteopathic diagnostic reasoning into AI-suitable data formats ensures inclusion in future clinical tools (a minimal schema sketch follows this list).
Implementation research: Studies assessing how AI integration affects osteopathic clinical skills, patient care, and professional identity in real-world practice settings.
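A first step toward AI-suitable formats is representing structural findings as structured data rather than free text. The sketch below encodes a somatic dysfunction finding using the familiar TART criteria and serializes it to JSON; the field names, severity scale, and overall schema are illustrative assumptions rather than an established standard.

```python
# Minimal sketch of an AI-ready schema for osteopathic structural findings,
# organized around the standard TART criteria (Tissue texture changes,
# Asymmetry, Restriction of motion, Tenderness). Field names and the severity
# scale are illustrative; a production schema would align with FHIR or local
# EHR data models.
from dataclasses import dataclass, asdict
import json

@dataclass
class SomaticDysfunctionFinding:
    body_region: str       # e.g., "lumbar", "cervical"
    tissue_texture: bool   # T: tissue texture changes present
    asymmetry: bool        # A: asymmetry present
    restriction: bool      # R: restriction of motion present
    tenderness: bool       # T: tenderness present
    severity: str          # illustrative scale: "mild" | "moderate" | "severe"
    omt_performed: bool

finding = SomaticDysfunctionFinding(
    body_region="lumbar", tissue_texture=True, asymmetry=True,
    restriction=True, tenderness=False, severity="moderate", omt_performed=True,
)
# Serialize to JSON so findings can feed analytics, registries, or model training
print(json.dumps(asdict(finding), indent=2))
```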
Conclusions
AI will inevitably influence medicine’s future. How it shapes osteopathic practice depends on today’s choices. By grounding innovation in osteopathic tenets – unity, self-healing, structure-function relationships, and holistic care – the community can ensure that AI enhances rather than erodes core values.
AI offers an opportunity to preserve and strengthen osteopathic identity in a profession born from innovation. A future integrating technology and humanity with intention is critical to preserving human-centered care in our increasingly digitized healthcare ecosystem.
The osteopathic community must act decisively. By establishing clear guidelines, investing in osteopathic-specific AI research, and securing osteopathic voices in healthcare technology governance, we can ensure that AI strengthens, rather than supplants, the human-centered care that defines osteopathic medicine.
Research ethics: Not applicable.
Informed consent: Not applicable.
Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.
Use of Large Language Models, AI and Machine Learning Tools: None declared.
Conflict of interest: None declared.
Research funding: None declared.
Data availability: Not applicable.
References
1. Tenets of Osteopathic Medicine. American Osteopathic Association; 2018. https://osteopathic.org/about/leadership/aoa-governance-documents/tenets-of-osteopathic-medicine/ [Accessed 23 July 2025].
2. Misra, S. Osteopathic principles and practice: essential training for the primary care physician of today and tomorrow. Fam Med 2021;53:544–7. https://doi.org/10.22454/FamMed.2021.123494.
3. Davis, GE, Hartwig, WC, Riemer, RB, Char, C, McTighe, A, Kremelberg, D. Assessing patient experience of the tenets of osteopathic medicine. J Osteopath Med 2023;123:371–8. https://doi.org/10.1515/jom-2023-0038.
4. Fahlgren, E, Nima, AA, Archer, T, Garcia, D. Person-centered osteopathic practice: patients' personality (body, mind, and soul) and health (ill-being and well-being). PeerJ 2015;3:e1349. https://doi.org/10.7717/peerj.1349.
5. Noll, DR, Ginsberg, T, Elahi, A, Cavalieri, TA. Effective patient-physician communication based on osteopathic philosophy in caring for elderly patients. J Am Osteopath Assoc 2016;116:42–7. https://doi.org/10.7556/jaoa.2016.005.
6. Akingbola, A, Adeleke, O, Idris, A, Adewole, O, Adegbesan, A. Artificial intelligence and the dehumanization of patient care. J Med Surg Public Health 2024;3:100138. https://doi.org/10.1016/j.glmedi.2024.100138.
7. Shin, HS, Braykov, NP, Jahan, A, Meller, J, Orenstein, EW. The influence of artificial intelligence scribes on clinician experience and efficiency among pediatric subspecialists: a rapid randomized quality improvement trial. Appl Clin Inf 2025;16:1041–52. https://doi.org/10.1055/a-2657-8087.
8. American Osteopathic Association. Artificial intelligence in healthcare report and action plan. Policy H429-A-24. https://osteopathic.org/wp-content/uploads/policies/Policy_H429-A-24-Artificial-Intelligence-in-Healthcare-Reprt-and-Action-Plan-H424-A23.pdf [Accessed 24 June 2025].
9. Cascella, M, Leoni, MLG, Shariff, MN, Varrassi, G. Artificial intelligence-driven diagnostic processes and comprehensive multimodal models in pain medicine. J Pers Med 2024;14:983. https://doi.org/10.3390/jpm14090983.
10. Number one health and performance app. Breathwrk. https://www.breathwrk.com [Accessed 22 July 2025].
11. Touch therapy tech for stress, sleep & performance. Apollo Neuro. https://apolloneuro.com [Accessed 22 July 2025].
12. Muse: the brain sensing headband. https://choosemuse.com [Accessed 22 July 2025].
13. Inner balance coherence plus. HeartMath Store. https://www.heartmath.com/inner-balance [Accessed 22 July 2025].
14. Tan, SY, Sumner, J, Wang, Y, Wenjun Yip, A. A systematic review of the impacts of remote patient monitoring (RPM) interventions on safety, adherence, quality-of-life and cost-related outcomes. NPJ Digit Med 2024;7:192. https://doi.org/10.1038/s41746-024-01182-w.
15. Scheinker, D, Gu, A, Grossman, J, Ward, A, Ayerdi, O, Miller, D, et al. Algorithm-enabled, personalized glucose management for type 1 diabetes at the population scale: prospective evaluation in clinical practice. JMIR Diabetes 2022;7:e27284. https://doi.org/10.2196/27284.
16. Ferstad, JO, Vallon, JJ, Jun, D, Gu, A, Vitko, A, Morales, DP, et al. Population-level management of type 1 diabetes via continuous glucose monitoring and algorithm-enabled patient prioritization: precision health meets population health. Pediatr Diabetes 2021;22:982–91. https://doi.org/10.1111/pedi.13256.
17. Kumar, S, Raymond, AM, Sequeira, A, Jeny, JJ, Goyal, MC. 966-P: real-world impact of AI-driven CGM platform on glycemic status in type 2 diabetes – a retrospective study. Diabetes 2025;74. https://doi.org/10.2337/db25-966-p.
18. Shumway, DO, Hartman, HJ. Medical malpractice liability in large language model artificial intelligence: legal review and policy recommendations. J Osteopath Med 2024;124:287–90. https://doi.org/10.1515/jom-2023-0229.
19. Ranji, SR. Large language models – misdiagnosing diagnostic excellence? JAMA Netw Open 2024;7:e2440901. https://doi.org/10.1001/jamanetworkopen.2024.40901.
20. Jabbour, S, Fouhey, D, Shepard, S, Valley, TS, Kazerooni, EA, Banovic, N, et al. Measuring the impact of AI in the diagnosis of hospitalized patients: a randomized clinical vignette survey study. JAMA 2023;330:2275–84. https://doi.org/10.1001/jama.2023.22295.
21. Goh, E, Gallo, R, Hom, J, Strong, E, Weng, Y, Kerman, H, et al. Large language model influence on diagnostic reasoning: a randomized clinical trial. JAMA Netw Open 2024;7:e2440969. https://doi.org/10.1001/jamanetworkopen.2024.40969.
22. D'Adderio, L, Bates, DW. Transforming diagnosis through artificial intelligence. NPJ Digit Med 2025;8:54. https://doi.org/10.1038/s41746-025-01460-1.
23. About. PubMed; 2025. https://pubmed.ncbi.nlm.nih.gov/about/ [Accessed 30 July 2025].
24. Reis, M, Reis, F, Kunde, W. Public perception of physicians who use artificial intelligence. JAMA Netw Open 2025;8:e2521643. https://doi.org/10.1001/jamanetworkopen.2025.21643.
25. Fordyce, MA, Doescher, MP, Chen, FM, Hart, LG. Osteopathic physicians and international medical graduates in the rural primary care physician workforce. Fam Med 2012;44:396–403.
26. Orom, H, Underwood, W III, Cheng, Z, Homish, DL, Scott, I. Relationships as medicine: quality of the physician–patient relationship determines physician influence on treatment recommendation adherence. Health Serv Res 2018;53:580–96. https://doi.org/10.1111/1475-6773.12629.
27. Patel, S, Pelletier-Bui, A, Smith, S, Roberts, MB, Kilgannon, H, Trzeciak, S, et al. Curricula for empathy and compassion training in medical education: a systematic review. PLoS One 2019;14:e0221412. https://doi.org/10.1371/journal.pone.0221412.
© 2025 the author(s), published by De Gruyter, Berlin/Boston
This work is licensed under the Creative Commons Attribution 4.0 International License.