Abstract
Background
Diagnostic imaging decision support (DI-DS) systems could be effective tools for reducing inappropriate diagnostic imaging examinations. Since effective design and evaluation of these systems requires in-depth understanding of their features and functions, the present study aims to map the existing literature on DI-DS systems to identify features and functions of these systems.
Methods
The search was performed using Scopus, Embase, PubMed, Web of Science, and the Cochrane Central Register of Controlled Trials (CENTRAL) and was limited to 2000 to 2021. Analytical studies, descriptive studies, reviews and book chapters that explicitly addressed the functions or features of DI-DS systems were included.
Results
A total of 6,046 studies were identified. Of these, 55 met the inclusion criteria, from which 22 functions and 22 features were identified. Some of the identified features were: visibility, content chunking/grouping, deployment as a multidisciplinary program, clinically valid and relevant feedback, embedding of current evidence, and targeted recommendations. Some of the identified functions were: displaying an appropriateness score, recommending alternative or more appropriate imaging examination(s), providing recommendations for next diagnostic steps, and providing safety alerts.
Conclusions
The set of features and functions obtained in the present study can provide a basis for developing well-designed DI-DS systems, which could help to improve adherence to diagnostic imaging guidelines, minimize unnecessary costs, and improve the outcome of care through appropriate diagnosis and on-time care delivery.
Introduction
Diagnostic imaging plays a prominent role in clinical diagnosis [1], but recent research findings have shown that a significant fraction of diagnostic imaging done worldwide is inappropriate [2], [3], [4], [5], [6], [7], [8], [9], [10]. Appropriate imaging means that the expected health advantages of imaging, such as improved diagnostic performance and patient outcomes, outweigh its negative consequences, such as anxiety, pain, cost and any complications related to pursuit of inconsequential findings [11]. Overusing, underusing and choosing the wrong modality are examples of inappropriate imaging [7, 12]. Inappropriate diagnostic imaging can lead to unjustified radiation exposure, overdiagnosis and overtreatment, and waste of financial resources [12, 13]. To reduce the adverse consequences and support the effective use of imaging resources, evidence-based solutions such as the use of diagnostic imaging decision support (DI-DS) systems have attracted attention [14], [15], [16]. A DI-DS system is a clinical decision support (CDS) system that helps physicians determine the necessity of a diagnostic imaging examination and choose the most appropriate imaging modality based on the patient’s symptoms and best evidence. These systems could reduce unnecessary imaging [17], [18], [19] and lead physicians towards more appropriate choices [17, 20, 21]. ESR iGuide, ACR Select, and iRefer developed by the European Society of Radiology (ESR), the American College of Radiology (ACR), and the Royal College of Radiologists (RCR), respectively, are some examples of these systems [14], [15], [16]. Furthermore, in January 2021, the Canadian Association of Radiologists (CAR) started a project to develop an internal CDS system for appropriate medical imaging [22].
To develop a well-designed DI-DS system, it is necessary to identify its functional and non-functional requirements. Functional requirements focus on system functions and specify what the system should do. Non-functional requirements, also referred to as quality attributes, are features considered in system development; acting as constraints on functional requirements, they determine how well system functions are performed [23]. In the context of DI-DS systems, an example of a functional requirement is “displaying an appropriateness score for each imaging examination”, and an example of a non-functional requirement is “being rapid”. A faster DI-DS system could display the appropriateness score/category of an imaging examination more quickly. The present study aimed to map the existing literature on DI-DS systems to identify the functional requirements and features important for both the design and evaluation of these systems.
Methods
This study was performed according to the Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR 2018) statement [24]. The search was performed using Scopus, Embase, PubMed, Web of Science, and the Cochrane Central Register of Controlled Trials (CENTRAL) and was limited to 2000 to 2021. The search terms were different synonyms of “appropriate”, “diagnostic imaging” and “clinical decision support system” extracted from MeSH, EMTREE, and similar articles. The search was performed by combining these three concepts using the Boolean operator “AND”. The detailed search strategy for each database is presented in the Supplemental Material, Appendix 1. Screening and selection of primary studies were based on the study inclusion and exclusion criteria, with the aim of answering the question: What are the desired features and functions of a DI-DS system? The following sections describe the steps of the study according to PRISMA-ScR.
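The search structure described above, synonyms of each concept combined with OR, and the three concept blocks combined with AND, can be sketched in a few lines. The synonym lists below are hypothetical examples, not the authors' actual search terms (which are in Appendix 1).

```python
# Illustrative sketch of the Boolean search structure described in the Methods:
# OR within each concept, AND between the three concepts. The synonym lists
# here are hypothetical examples, not the study's actual search strategy.
appropriate_terms = ["appropriate", "appropriateness", "overuse", "unnecessary"]
imaging_terms = ["diagnostic imaging", "radiology", "computed tomography"]
cdss_terms = ["clinical decision support", "decision support system", "CDSS"]

def or_block(terms):
    """Join the synonyms of one concept with OR, quoting multi-word phrases."""
    quoted = [f'"{t}"' if " " in t else t for t in terms]
    return "(" + " OR ".join(quoted) + ")"

# Combine the three concept blocks with the Boolean operator AND.
query = " AND ".join(or_block(t) for t in (appropriate_terms, imaging_terms, cdss_terms))
print(query)
```

Running this prints a single query string in which each parenthesized concept block must match, mirroring how the three concepts were intersected across the five databases.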
Eligibility criteria
Inclusion criteria
Studies published between 01-01-2000 and 08-31-2021 that addressed features or functions of DI-DS systems were eligible. Only studies whose full text was available in English were included, but no geographical restrictions were applied. In terms of study design, analytical studies (experimental, cohort, case-control and cross-sectional) were included. Descriptive studies, reviews and book chapters were also included if they specifically discussed the features or functions of such systems. In terms of system architecture, any DI-DS system (independent or linked to other systems, such as computerized provider order entry (CPOE)) was considered.
Exclusion criteria
Reports, letters to the editor, conference papers, and protocols were excluded. Studies of systems designed solely to prevent duplicate imaging, and studies that used or described systems designed in laboratory settings solely for training purposes, were excluded. Studies that did not focus on systems designed specifically for selecting the appropriate imaging examination, as well as studies that used a DI-DS system but did not describe its features or functions, were also excluded.
Screening and selection
In the first step, three authors screened the studies based on title and abstract. Selection based on the full text was then carried out by them independently, applying the inclusion and exclusion criteria. In cases of disagreement over selected studies, the three researchers first discussed the cases together; if the disagreement was not resolved, the opinion of a fourth researcher was sought.
Data charting
Data charting was conducted using an Excel spreadsheet with the following categories of data: author, year, title, country, study design, guideline, features, and functions. Four researchers independently charted the data from each source and recorded them in the spreadsheet. In cases of data discrepancies, the four researchers first discussed the cases together; if the disagreement was not resolved, the opinion of a fifth researcher was sought. The data charted from each source were then categorized and themed inductively according to the main concepts that emerged from the data.
Quality assessment
The selected studies were not assessed for quality, as the aim was to provide a comprehensive overview of the available literature.
Ethical approval
The research conducted did not involve human or animal subjects.
Results
Search results
A total of 6,142 studies (6,046 studies after removing duplicates) were identified. We manually screened the 6,046 records based on their titles and abstracts and excluded 5,796 records that did not meet our inclusion criteria. We sought to retrieve the remaining 250 records; the full texts of 223 studies were available, and we read them in full. We excluded 168 records that did not meet our inclusion criteria based on the full text. Finally, 55 studies (including articles and book chapters) met the inclusion criteria and were included in the study (Figure 1).

Figure 1: Search flow.
Characteristics of the included studies
Most of the included studies (45 studies, 82 %) were conducted in the United States of America. Out of 55 studies, 39 reported using diagnostic imaging guidelines as the embedded evidence in their DI-DS systems, and 11 reported using diagnostic decision rules. One study did not mention the details of the embedded evidence, one study used public literature, and three did not implement any DI-DS system. The most frequently reported diagnostic imaging guideline was the American College of Radiology Appropriateness Criteria (ACR-AC), which was applied in 25 studies (45.45 %).
In terms of study design, one study was a randomized controlled trial (1.8 %), 28 were before-after studies (50.9 %), four were time series studies (7.3 %), four were cohort studies (7.3 %), nine were cross-sectional studies (16.4 %), two were case-control studies (3.6 %), three were descriptive studies (5.5 %), two were reviews (3.6 %), and two were book chapters (3.6 %). In designing or evaluating the DI-DS systems identified in these studies, imaging examinations related to different anatomical locations were considered, including brain/head and neck, thyroid, chest, cardiovascular, breast, abdomen, pelvis, genitourinary system, musculoskeletal, pediatrics, lower back, non-malignant diseases, liver, bile ducts, prostate, shoulder, sinus, wrist, appendix, and cervical and lumbosacral spine. Most of these systems were integrated into the electronic ordering process (Table 1).
Characteristics of the included studies.
# | Author, publication year | Country | Study design | Guideline/decision rule | Imaging modality/clinical problem | Implementation
---|---|---|---|---|---|---
1 | Lee [25] | USA | Before-after | ACR-ACa | Low back pain | A CDS system integrated into ordering process of EHRb |
2 | Hayatghaibi [26] | USA | Cross-sectional | ACR-AC | Non-contrast head CTc exam (pediatrics) | A CDS system integrated into ordering process of EHR |
3 | Gaskin [27] | USA | Cross-sectional | ACR-AC | Advanced imaging (i.e., MRId, CT, PETe, and nuclear medicine) | A CDS system integrated into ordering process of EHR |
4 | Fried [28] | USA | Cross-sectional | ACR-AC | Advanced imaging (including enhanced CT abdomen/pelvis and unenhanced CT head) | A CDS system integrated into ordering process of EHR |
5 | Chepelev [29] | USA | Cross-sectional | ACR-AC | Advanced imaging (CT, MRI, ultrasound, and nuclear medicine) | A CDS system integrated into the ordering process of EMRf |
6 | Chen [30] | USA | Cross-sectional | ACCFg and ASEh AUCi | Echocardiography | A CDS system integrated into the ordering process of EMR |
7 | Wu [31] | USA | Before-after | Not mentioned the details | Chest imaging | A CDS system integrated into the ordering process of EMR |
8 | Rehani [32] | USA | Cohort | ACR-AC, ACCj and NCCNk | CT exams for patients with non-malignant diseases | A CDS system integrated into the ordering process of EMR |
9 | Hynes [33] | Ireland | Before-after | NEXUSl criteria and CCSRm | Cervical spine trauma imaging | A CDS system within the electronic image ordering system |
10 | Gabelloni [34] | Italy | Case control | European imaging referral guidelines (developed based on ACR-AC) | Imaging for patients with HCC nor CCo | A CDS system (not mentioned the details) |
11 | Ciprut [35] | USA | Before-after | NCCN guidelines | Staging imaging of low-risk prostate cancers | An EMR-based clinical reminder order check (CROC)
12 | Carayon [36] | USA | Case control | Wells’ criteria for PE and the PERC rule | Pulmonary embolism imaging | A CDS system (ran in the Epic ‘playground’, an electronic environment that mimics the actual EHR)
13 | Stopyra [37] | USA | Before-after | The HEART pathway | Stress testing or angiography for patients with possible acute coronary syndrome | A CDS system integrated into ordering process of EHR |
14 | Raja [38] | USA | Before-after | Local guidelines | CT for patients with suspected nephrolithiasis | A CDS system integrated into ordering process of EHR |
15 | Poeran [20] | USA | Time series | ACR-AC | CT and MRI | A CDS system integrated into ordering process of EHR |
16 | Palen [39] | USA | Before-after | ACR-AC | Advanced imaging (CT and MRI) | A CDS system integrated into ordering process of EMR |
17 | Mulders [40] | Netherlands | Before-after | AWRp | Wrist radiography | A mobile-based CDSS
18 | Lee [41] | Korea | Descriptive | Korean clinical imaging guidelines | Brain/head and neck, thyroid, chest, cardiovascular, breast, abdomen, genitourinary, musculoskeletal, pediatric and interventional imaging | A web-based mobile CDSS |
19 | Chan [42] | USA | Descriptive | ACR-AC | Pediatric imaging | A CDS system integrated into ordering process of EMR |
20 | Doyle [43] | USA | RCT | ACR-AC | High-cost imaging | A CDS system integrated into ordering process of EMR |
21 | Mills [44] | USA | Before-after | Wells criteria | Pulmonary embolism imaging (CTPE) | A CDS system integrated into the EMR and CPOE system |
22 | Huber [21] | USA | Before-after | ACR-AC | Advanced imaging (i.e., MRI, CT, nuclear medicine, PET, and ultrasound) | A CDS system integrated into ordering process of EHR |
23 | Cochon [45] | USA | Book chapter | – | – | – |
24 | Calcaterra [46] | Italy | Before-after | Italian DIRGsq | Multiple imaging examinations | A CDS system (not mentioned the details) |
25 | Moriarity [47] | USA | Before-after | ACR-AC | Advanced imaging (nuclear medicine, CT, and MRI) | A CDS system integrated into EHR ordering process |
26 | Min [18] | Canada | Before-after | Choosing Wisely Canada guidelines | Low back pain | A CDS system integrated into electronic order entry forms |
27 | Lacson [48] | USA | Before-after | Professional society guidelines that were preselected for use by the CMSr, including ACR-AC | MRI of the brain, knees, lumbar spine, or shoulders; CT of the abdomen, abdomen and pelvis, brain, lumbar spine, pelvis, sinus, or thorax; and SPECT myocardial perfusion imaging | A CDS system (not mentioned the details) |
28 | Gupta [49] | USA | Cross-sectional | ACR-AC | Outpatient CT, MR, and most nuclear medicine examinations, including noninvasive cardiac studies | Embedded CDS in radiology order entry system |
29 | I. K. Ip [50] | USA | Before-after | Professional society guidelines that had been preselected by CMS, primarily from the ACR and the ACC | 12 common outpatient advanced diagnostic imaging (MRI of the brain, knee, lumbar spine, and shoulder; CT of the abdomen, abdomen and pelvis, brain, lumbar spine, pelvis, sinus, and thorax; and SPECT myocardial perfusion imaging) | A CDS system integrated into computerized ordering process |
30 | Drescher [51] | USA | Before-after | Evidence-based clinical protocol that included the PE rule-out criteria (PERC) rule | CTPAs for acute pulmonary embolism | A CDS system integrated into computerized ordering process |
31 | Prabhakar [52] | USA | Cross-sectional | ACR-AC | Advance imaging (CT and MRI) | A CDS system integrated into computerized ordering process |
32 | Depinet [53] | USA | Time series | A clinical pathway | Acute appendicitis | A CDS system integrated into EMR |
33 | Bookman [54] | USA | Before-after | The Canadian head injury rules, NEXUS C-spine rules, the pulmonary embolism rule-out criteria (PERC), and Wells scores | High-cost CT imaging studies: brain, C-spine, and PE | A CDS system integrated into EHR
34 | Sistrom [55] | USA | Cross-sectional | ACR-AC | Advanced imaging (nuclear medicine, CT, and MRI) | A CDS system integrated into radiology order entry system |
35 | Schneider [56] | USA | Cross-sectional | ACR-AC, published evidence-based medicine from other societies, and best-practice guidelines developed by a local clinical advisory board | Outpatient MRI and CT | A CDS system integrated into computerized ordering process |
36 | Moriarity [57] | USA | Before-after | ACR-AC | Advanced imaging (nuclear medicine, CT, and MRI) | A CDS system integrated into computerized ordering process |
37 | I. K. Ip [58] | USA | Before-after | The New Orleans Criteria, the Canadian CT head rule, and the CT in head injury patients prediction rule | Head CT in patients with mild traumatic brain injury (adults) | A CDS system integrated into computerized ordering process
38 | Dunne [59] | USA | Before-after | A previously validated decision rule | CTPA for acute pulmonary embolism | A CDS system integrated into computerized ordering process |
39 | Broder [13] | USA | Review | – | – | – |
40 | Ranta [60] | New Zealand | Before-after | The New Zealand TIA guideline as well as expert clinician experience | TIA/stroke management (head CT and carotid ultrasound (if anatomic localization supports a carotid territory TIA) | A CDS system integrated into ordering process of EMR |
41 | Thrall 2014 [61] | USA | Descriptive | ACR-AC | Different imaging examinations | A CDS system integrated into computerized ordering process |
42 | Sistrom [62] | USA | Book chapter | ACR-AC | Outpatient imaging | Embedded CDS in radiology order entry system, integrated into EMR
43 | Khorasani [63] | USA | Review | – | – | – |
44 | I. K. Ip [19] | USA | Before-after | ACP/APSt guidelines | Lumbosacral MRI | A CDS system integrated into computerized ordering process |
45 | Gupta [64] | USA | Cohort | Wells criteria | CTPA for acute pulmonary embolism | A CDS system integrated into computerized ordering process |
46 | Gupta [65] | USA | Before-after | The New Orleans Criteria, the Canadian CT head rule, and the CT in head injury patients prediction rule | Head CT | A CDS system integrated into computerized ordering process
47 | Lin [66] | USA | Cohort | ACC AUC | Suspected coronary artery disease | A computerized point-of-order decision support system (DSS) |
48 | I. K. Ip [67] | USA | Before-after | Public domain literature | High-cost imaging | A CDS system integrated into computerized ordering process |
49 | Curry [68] | Canada | Before-after | CARu DIRGs | Diagnostic imaging | A CDS system integrated into computerized ordering process |
50 | Bowen [69] | Canada | Before-after | Pediatric section of CAR guidelines | Pediatric diagnostic imaging | A CDS system integrated into computerized ordering process |
51 | Blackmore [70] | USA | Cohort | Locally derived evidence-based decision rules | Lumbar MRI, brain MRI, and sinus CT | A CDS system integrated into computerized ordering process |
52 | Vartanians [71] | USA | Before-after | ACR-AC | Outpatient CT/MRI and nuclear medicine examinations | A CDS system integrated into radiology order entry system |
53 | Solberg [17] | USA | Before-after | ACR-AC | High-tech diagnostic imaging (CT and MRI of the head, and MRI of the lumbar spine) | A CDS system integrated into EHR |
54 | Sistrom [72] | USA | Time series | ACR-AC and locally developed criteria | Outpatient CT, MRI, and USv procedure | Embedded CDS in radiology order entry system |
55 | Rosenthal [73] | USA | Time series | ACR-AC | CT/MRI, nuclear cardiology imaging | Embedded CDS in radiology order entry system |
aThe American College of Radiology Appropriateness Criteria, bElectronic Health Record, ccomputed tomography, dmagnetic resonance imaging, epositron emission tomography, fElectronic Medical Record, gThe American College of Cardiology Foundation, hThe American Society of Echocardiography, iAppropriate Use Criteria, jThe American College of Cardiology, kThe National Comprehensive Cancer Network, lThe National Emergency X-Radiography Utilization Study, mCanadian C-Spine Rule, nHepatocellular carcinoma, oCholangiocarcinoma, pThe Amsterdam Wrist Rules, qDiagnostic Imaging referral guidelines, rCenters for Medicare & Medicaid Services, scomputed tomographic (CT) pulmonary angiography, tThe American College of Physicians and the American Pain Society, uThe Canadian Association of Radiologists, vultrasonography.
As depicted in Figure 2, there has been a growing interest in DI-DS systems since 2006, with the highest number of studies published in 2014 and 2019, with eight studies each.

Figure 2: Published studies per year.
Features and functions of DI-DS systems
Features
Out of 55 studies, four explicitly discussed the most prominent features of an effective or ideal DI-DS system (Table 2). The other 51 studies are also outlined in Table 2 where they reported features of their implemented DI-DS systems. In total, 22 features were identified for DI-DS systems:
Sensitive and specific A sensitive DI-DS system captures nearly all cases without missing diseases or injuries, and a specific DI-DS system avoids inappropriate imaging in a significant fraction of patients. A DI-DS system should be both highly sensitive and sufficiently specific [13]. This feature was proposed in one study [13].
Integrated with other systems in physician work flow A DI-DS system can work within the ordering process of the EHR or CPOE [13]. This feature was emphasized in four studies [13, 36, 45, 63]. Most of the reviewed studies reported that their implemented DI-DS system has this feature [17], [18], [19], [20], [21, 25], [26], [27], [28], [29], [30], [31], [32, 35, 37], [38], [39, 42], [43], [44, 47, 49], [50], [51], [52], [53], [54, 56], [57], [58], [59], [60], [61], [62, 64, 65, 67], [68], [69], [70], [71], [72], [73].
Deployed as a multidisciplinary program Technical implementation of a DI-DS system is important but insufficient to optimize the use of imaging. A team with leadership of physicians should decide on clinical aspects such as setting clinical goals and checking and approving the embedded evidence. Along with considering the clinical aspects, the leaders of the practice and the institution need to support a quality improvement and change-management culture, such as mandating the use of DI-DS systems as a requirement for payment [63]. This feature was proposed in one study [63].
Diverse sources of embedded evidence Embedding different sources of evidence that are reviewed and vetted by a team of experts helps ensure that the system is clinically valid. The evidence used in the DI-DS system should not be in conflict with the local healthcare standards [63]. This feature was proposed in one study [63].
Transparent strength of evidence at the time of order entry Different sources of evidence embedded in the system may overlap or contradict each other. Therefore, it is helpful to rate the strength of each evidence source to compare competing recommendations that might emerge [63]. This feature was proposed in two studies [45, 63]. Also, two studies have reported that their implemented DI-DS systems have this feature [41, 46].
Targeted A DI-DS system could be more effective when targeting specific clinical situations and settings where there is strong and high-quality evidence. Applying a DI-DS system to all imaging examinations may result in weak recommendations that may reduce the trust and acceptance of the system [63]. This feature was proposed in two studies [45, 63].
Respecting physician work flow A DI-DS system should be designed and implemented in a way that aligns with the sequence of tasks and processes in the practice. For example, the system should support proxy ordering. Proxy ordering is the practice of delegating the task of ordering diagnostic imaging examinations to support staff, such as nurses. The DI-DS system should provide the possibility of approving these orders by the physician through the system [63]. This feature was proposed in two studies [45, 63].
Content chunking/grouping This feature was proposed in one study [36]. Presenting different diagnostic rules separately, or placing DI-DS software in the relevant department navigator of the EHR (such as the emergency department navigator), are examples of content chunking in the context of DI-DS systems [36]. Also, one study reported grouping the software content into 10 different subspecialties (e.g. pediatrics, cardiovascular, etc.) [41].
Visualized values of information using colors This feature has been used in the DI-DS systems implemented in seven of the reviewed studies [21, 29, 32, 41, 49, 56, 57]. Color classification of appropriateness level of each imaging examination, or the use of simple colored bar plots to show the strength of evidence are some examples of visualizing information using colors [41].
A systematic structure for data entry The review of existing DI-DS systems shows that these systems have a systematic structure for data entry, including: “structured options (such as a drop down list/a list box) for indication selection” [18, 26], [27], [28, 33, 39, 42, 45], “options such as “not found” or “others” for cases where the reason for imaging considered by the physician is not among the structured indications” [18, 26], “predefined options or a text field to enter other physician explanations, such as the reason for choosing an imaging” [18, 25, 26, 28, 42, 45, 66], and “options for entering information after performing imaging (such as imaging success/failure, subsequent diagnostic plans, etc.)” [35, 46, 66].
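The systematic data-entry structure described above, structured indication options, an "other" escape with a free-text reason, and post-imaging follow-up fields, can be sketched as a small order form. All field names and the option list are illustrative assumptions, not taken from any of the reviewed systems.

```python
# Illustrative sketch (assumed field names) of the systematic data-entry
# structure described above: a structured indication list, an "other" option
# requiring a free-text reason, and optional post-imaging information fields.
from dataclasses import dataclass
from typing import Optional

# Hypothetical drop-down options; real systems draw these from their guideline.
INDICATION_OPTIONS = ["low back pain", "head trauma", "suspected PE", "other"]

@dataclass
class ImagingOrderForm:
    indication: str                           # chosen from the structured list
    free_text_reason: Optional[str] = None    # required when indication == "other"
    imaging_performed: Optional[bool] = None  # entered after the examination
    next_diagnostic_plan: Optional[str] = None

    def validate(self) -> bool:
        """Check the structured-entry rules before the order is submitted."""
        if self.indication not in INDICATION_OPTIONS:
            return False
        if self.indication == "other" and not self.free_text_reason:
            return False
        return True
```

For example, an order with `indication="other"` and no free-text reason would fail validation, enforcing the rule that unlisted indications must be explained.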
Features of a DI-DS system.
# | Features | Khorasani et al. [63] | Broder et al. [13] | Cochon et al. [45] | Carayon et al. [36] | Studies that have reported one or more features of their DI-DS system
---|---|---|---|---|---|---
1 | Evidence-based | * | * | * | | [17], [18], [19], [20], [21, 25], [26], [27], [28], [29], [30], [31], [32], [33], [34], [35], [36], [37], [38], [39], [40], [41], [42], [43], [44], [45], [46], [47], [48], [49], [50], [51], [52], [53], [54], [55], [56], [57], [58], [59], [60], [61], [62, 64], [65], [66], [67], [68], [69], [70], [71], [72], [73]
2 | Sensitive and specific | | * | | |
3 | Cost effective | | * | | |
4 | Rapid and easy to use | * | * | * | |
5 | Radiation effective | | * | | |
6 | Integrated with other systems in work flow | * | * | * | * | [17], [18], [19], [20], [21, 25], [26], [27], [28], [29], [30], [31], [32, 35, 37], [38], [39, 42], [43], [44, 47, 49], [50], [51], [52], [53], [54, 56], [57], [58], [59], [60], [61], [62, 64, 65, 67], [68], [69], [70], [71], [72], [73]
7 | Deployed as a multidisciplinary program | * | | | |
8 | Enable measurement of its impact | * | | * | |
9 | Diverse sources of embedded evidence | * | | | |
10 | Clinically valid and relevant feedback | * | | * | |
11 | Clinical recommendations/alerts be brief, unambiguous, and actionable | * | | * | |
12 | Embedded evidence be current | * | | | |
13 | Transparent strength of evidence at the time of order entry | * | | * | | [41, 46]
14 | Targeted | * | | * | |
15 | Respecting physician work flow | * | | * | |
16 | Consistency | | | | * |
17 | Flexibility/customizability | | | * | * |
18 | Visibility | | | | * |
19 | Minimization of workload | | | | * |
20 | Content chunking/grouping | | | | * | [41]
21 | Visualized values of information using colors | | | | | [21, 29, 32, 41, 49, 56, 57]
22 | A systematic structure for data entry | | | | | [18, 25], [26], [27], [28, 33, 35, 39, 42, 45, 46, 66]
*This symbol shows which studies have mentioned each of the features.
Functions
None of the reviewed studies explicitly addressed the functions important for an effective or ideal DI-DS system. Therefore, we collected the functions outlined in Table 3 by reviewing the functions of each DI-DS system that was designed, implemented, or evaluated in the included studies. Out of 55 studies, 52 reported at least one function of their DI-DS system. In total, 22 functions were identified (Table 3). Some of the most important functions of a DI-DS system (those dealing with the provision of evidence-based recommendations) are:
Recognizing unnecessary imaging examinations Some DI-DS systems guide physicians to use other diagnostic approaches, such as laboratory tests, rather than unnecessary imaging. The ability to determine the necessity of an imaging examination was reported in 18 studies [19, 31, 33, 35], [36], [37], [38, 40, 44, 51, 53, 54, 58], [59], [60, 64, 65, 69].
Displaying an appropriateness score/category A DI-DS system can display the appropriateness degree of each imaging examination by means of an appropriateness score or category (e.g. inappropriate, may be appropriate, and appropriate). This function was reported in 31 studies [17, 18, 20, 21, 25], [26], [27], [28], [29], [30, 32, 34, 39, 41], [42], [43, 46], [47], [48], [49], [50, 52, 55], [56], [57, 61, 62, 66, 68, 71], [72], [73].
Recommending alternative or more appropriate imaging examination(s) A DI-DS system can suggest alternative or more appropriate imaging examinations if the physician’s initial choice is not the best for the patient’s condition. This function was reported in 17 studies [31, 39, 42, 43, 46, 48, 52, 54, 57, 61, 62, 66, 68], [69], [70, 72, 73].
Characterizing the orders/patients not addressed by the system The DI-DS system should notify the physician when it is not able to analyze a scenario. Seven studies reported that their DI-DS systems could not analyze some scenarios or provide recommendations for some conditions, for reasons such as the scenario not being addressed by the deployed imaging guideline [29, 32, 48, 50, 56, 57, 66].
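The score-to-category display function can be illustrated with a minimal sketch. The 1-3 / 4-6 / 7-9 grouping below follows the ACR Appropriateness Criteria convention ("usually not appropriate", "may be appropriate", "usually appropriate"); the function itself is an illustrative assumption, not code from any reviewed system.

```python
def appropriateness_category(score: int) -> str:
    """Map an ACR-style 1-9 appropriateness score to a display category.

    The 1-3 / 4-6 / 7-9 grouping follows the ACR Appropriateness Criteria
    convention; the function is an illustrative sketch, not code from any
    of the reviewed DI-DS systems.
    """
    if not 1 <= score <= 9:
        raise ValueError("ACR-style appropriateness scores range from 1 to 9")
    if score <= 3:
        return "usually not appropriate"
    if score <= 6:
        return "may be appropriate"
    return "usually appropriate"

# Example: a requested examination scored 8 would be displayed as appropriate.
print(appropriateness_category(8))  # usually appropriate
```

A DI-DS system would typically pair this category with the visual cues mentioned earlier, such as color coding of the appropriateness level.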
Functions of a DI-DS system.
# | Functions | Studies that have reported one or more functions of their DI-DS system
---|---|---
1 | Assigning different levels of access to users according to their roles | [46] |
2 | Login | [46, 66] |
3 | Automatic text generation | [36] |
4 | Automatic data analysis | [36, 64, 66] |
5 | Error prevention | [36] |
6 | Report creation | [42] |
7 | Search by symptoms or clinical problem | [41, 46] |
8 | Storing information related to interaction with users | [52, 66] |
9 | Using a unique identifier for saving every case data (case id) | [46] |
10 | Recognizing unnecessary imaging examination | [19, 31, 33, 35–38, 40, 44, 51, 53, 54, 58–60, 64, 65, 69] |
11 | Displaying an appropriateness score/category | [17, 20, 21, 25], [26], [27], [28], [29], [30, 32, 34, 39, 41], [42], [43, 46], [47], [48], [49], [50, 52, 55], [56], [57, 61, 62, 66, 68, 71], [72], [73] |
12 | Recommending alternative or more appropriate imaging examination(s) | [31, 39, 42, 43, 46, 48, 52, 54, 57, 61, 62, 66, 68], [69], [70, 72, 73] |
13 | Providing recommendation for next diagnostic steps (support of next decision selection) | [36] |
14 | Providing educational feedback | [66] |
15 | Alerting physician if a similar or identical imaging has been done or ordered recently | [31, 61] |
16 | Enabling to request a radiologist consultation | [46] |
17 | Characterizing the orders/patients not addressed by the system | [29, 32, 48, 50, 56, 57, 66] |
18 | Displaying relative radiation level (RRL) of each imaging examination | [34, 39, 41] |
19 | Displaying cost of each imaging examination | [34] |
20 | Displaying the necessity of ordering the use of contrast medium in each imaging examination | [46] |
21 | Displaying waiting time | [46]
22 | Providing safety alerts (e.g. flag issues such as implanted devices in patient’s body) | [61] |
Discussion
There is a lack of comprehensive understanding of the features and functions of DI-DS systems; this review aimed to address this knowledge gap that prevents the effective development and implementation of these systems. In terms of features of a DI-DS system, Khorasani et al. and Cochon et al. focused on the dimensions of technology and quality improvement for increasing the effectiveness of the systems, and suggested the following features: “developed as a multidisciplinary program”, “enabling measurement of its impact”, “diversity of sources of embedded evidence”, “clinically valid and relevant feedback”, “embedding current evidence”, “transparent strength of evidence at the time of order entry”, “targeted”, “respecting physician work flow”, “integrated with other systems in work flow”, and “brief, clear, and actionable clinical recommendations” [45, 63]. On the other hand, Carayon et al. addressed the human-machine or user-system interaction as a dimension by introducing the following features: “visibility”, “minimization of workload”, “chunking/grouping”, “consistency”, and “flexibility” [36]. Considering these features along with the features mentioned by Khorasani and Cochon could be helpful, because human-machine related features play a vital role in improving system usability and thus increasing user satisfaction [74]. In addition, Broder et al. suggested patient safety related features such as “evidence-based”, “sensitivity” and “radiation effective” along with other features such as being “cost effective”, “rapid and easy to use”, and “integrated with other systems in work flow” [13]. A synthesis of the mentioned paradigms could help to develop and implement more advantageous DI-DS systems in terms of utilization management, patient safety and quality of care. 
However, the present study indicates a gap in the literature regarding the extent to which the above features have been considered in the development and implementation of existing DI-DS systems, with the exception of the features “evidence-based” [17–21, 25–62, 64–73] and “integration with other systems in work flow” [17–21, 25–32, 35, 37–39, 42–44, 47, 49–54, 56–62, 64, 65, 67–73]. Unlike for the features, the gap in the literature is less evident for the functions. All of the identified functions were extracted from implemented DI-DS systems discussed in the reviewed studies, indicating that they have been considered in the design and implementation of these systems. However, some functions, such as providing recommendations for next diagnostic steps [36] or providing educational feedback [66], were reported by only one or two studies, suggesting that they are less common, or at least less frequently reported, in the current literature.
The literature on DI-DS systems is vast and diverse, and there are other gaps that need to be addressed. First, it seems that few studies [36, 69] have explored user acceptance of, and satisfaction with, the features and functions of DI-DS systems. Exploring patient-reported outcomes (PROs) may also provide valuable information about patients’ conditions, needs, preferences, and expectations that may not be captured by other sources of data. Second, there was no strong evidence in support of optimizing the developed DI-DS systems using emerging technologies and methods. DI-DS systems could benefit from new technologies and methods that enhance their security and performance, for example, using blockchain technology to ensure the security and integrity of patient data and imaging results, or equipping these systems with a machine learning (ML) algorithm (alongside the guideline-based algorithm) to support guideline updating. If the results of performed imaging are stored as feedback in a database, applying ML techniques to extract lessons and trends could make updating medical imaging guidelines possible. Finally, some studies reported that their DI-DS systems could not address some clinical scenarios [29, 32, 48, 50, 56, 57, 66]. A systematic search to identify which imaging guidelines were used in these systems, and which factors may contribute to this failure, may be helpful.
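The feedback idea above can be sketched in a few lines. This is entirely illustrative and not drawn from any reviewed system: rather than a full ML pipeline, it uses a simple frequency rule that accumulates the diagnostic yield of performed examinations and flags indication-modality pairs whose observed yield disagrees with their current (hypothetical) guideline rating, as candidates for human guideline review. All names and thresholds are assumptions.

```python
from collections import defaultdict

class FeedbackStore:
    """Accumulates outcomes of performed imaging examinations (illustrative sketch)."""

    def __init__(self):
        # (indication, modality) -> [positive findings, total exams]
        self.counts = defaultdict(lambda: [0, 0])

    def record(self, indication: str, modality: str, positive_finding: bool):
        c = self.counts[(indication, modality)]
        c[0] += int(positive_finding)
        c[1] += 1

    def review_candidates(self, guideline_scores: dict, min_n: int = 100,
                          low_yield: float = 0.05, high_yield: float = 0.5):
        """Flag pairs where observed yield and guideline rating disagree.

        Thresholds are arbitrary placeholders; any real update process would
        require clinical review, not automatic guideline changes.
        """
        flags = []
        for pair, (pos, n) in self.counts.items():
            if n < min_n:
                continue  # not enough data yet
            score = guideline_scores.get(pair)
            if score is None:
                continue
            yield_rate = pos / n
            if score >= 7 and yield_rate < low_yield:
                flags.append((pair, "rated appropriate but very low observed yield"))
            elif score <= 3 and yield_rate > high_yield:
                flags.append((pair, "rated inappropriate but high observed yield"))
        return flags
```

The design choice here is deliberate: the system only surfaces discrepancies for expert review, keeping the guideline-based algorithm authoritative while the accumulated feedback informs periodic updates.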
In total, the findings of the present study have important implications for the design, development and evaluation of DI-DS systems in healthcare settings. For the design and development of DI-DS systems, the findings could help to define the scope and objectives of the software project, and to prioritize the features and functions that are essential for meeting users’ needs and expectations. For the evaluation of DI-DS systems, the findings could help to measure the quality and effectiveness of these systems, and to identify and resolve issues or gaps in the functional and non-functional requirements. This study could also support clear communication among stakeholders, such as DI-DS system developers and evaluators, physicians, and policy makers.
This study has some limitations. First, we only included studies published in English, which may have excluded relevant studies in other languages. Second, we did not include a search of gray literature (e.g. dissertations or conference papers), which may have contained valuable insights for our topic. Third, we did not assess the quality of the studies we reviewed. Although this is not a requirement for scoping reviews, it may affect the ability to identify gaps in the literature where evidence exists, but it is of low quality, and more high-quality studies are needed.
Conclusions
This study mapped the literature on DI-DS systems to identify the features and functions of these systems. Given the importance of developing well-designed DI-DS systems to reduce inappropriate imaging and improve outcomes of care, the features and functions specified in this study should be considered in DI-DS system design, development and evaluation.
This review also identified gaps in the literature regarding the extent to which each of the identified features has been considered in the development and implementation of existing DI-DS systems. In addition, some studies reported that their DI-DS systems could not address some clinical scenarios; efforts to identify which imaging guidelines were used in these systems, and which factors may contribute to this failure, may be helpful. Future studies will be needed to examine the sensitivity and specificity of deployed DI-DS systems, to evaluate the best methods and metrics for assessing DI-DS systems, and to study the challenges and barriers to their implementation and adoption. Ultimately, studies will be needed to explore the impact of DI-DS systems on patient outcomes, including satisfaction, safety, and quality of care.
- Research ethics: Not applicable.
- Informed consent: Not applicable.
- Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.
- Competing interests: Authors state no conflict of interest.
- Research funding: None declared.
- Data availability: Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study.
References
1. MedlinePlus. Diagnostic imaging, U.S. National Library of Medicine; 2016. https://medlineplus.gov/diagnosticimaging.html [Accessed 23 Jun 2021].
2. Salari, H, Ostovar, R, Esfandiari, A, Keshtkaran, A, Akbari Sari, A, Yousefi Manesh, H, et al. Evidence for policy making: clinical appropriateness study of lumbar spine MRI prescriptions using RAND appropriateness method. Int J Health Pol Manag 2013;1:17–21. https://doi.org/10.15171/ijhpm.2013.04.
3. Bouëtté, A, Karoussou-Schreiner, A, Ducou Le Pointe, H, Grieten, M, de Kerviler, E, Rausin, L, et al. National audit on the appropriateness of CT and MRI examinations in Luxembourg. Insights Imaging 2019;10:54. https://doi.org/10.1186/s13244-019-0731-9.
4. De Roo, B, Hoste, P, Stichelbaut, N, Annemans, L, Bacher, K, Verstraete, K. Belgian multicentre study on lumbar spine imaging: radiation dose and cost analysis; evaluation of compliance with recommendations for efficient use of medical imaging. Eur J Radiol 2020;125:1–5. https://doi.org/10.1016/j.ejrad.2020.108864.
5. Drumm, BR, Cronin, H. Role of European Society of Cardiology (ESC) syncope guidelines in reducing syncope related admissions. Age Ageing 2016;45:i6. https://doi.org/10.1093/ageing/afw024.24.
6. Callaghan, BC, Kerber, KA, Pace, RJ, Skolarus, L, Cooper, W, Burke, JF. Headache neuroimaging: routine testing when guidelines recommend against them. Cephalalgia 2015;35:1144–52. https://doi.org/10.1177/0333102415572918.
7. Mendelson, RM. Diagnostic imaging: doing the right thing. J Med Imaging Radiat Oncol 2020;64:353–60. https://doi.org/10.1111/1754-9485.13004.
8. Lamb, CR, David, FH. Advanced imaging: use and misuse. J Feline Med Surg 2012;14:1532–2750. https://doi.org/10.1177/1098612X12451550.
9. Otero, HJ, Ondategui-Parra, S, Nathanson, EM, Erturk, SM, Ros, PR. Utilization management in radiology: basic concepts and applications. J Am Coll Radiol 2006;3:351–7. https://doi.org/10.1016/j.jacr.2006.01.006.
10. Owlia, M, Yu, L, Deible, C, Hughes, MA, Jovin, F, Bump, GM. Head CT scan overuse in frequently admitted medical patients. Am J Med 2014;127:406–10. https://doi.org/10.1016/j.amjmed.2014.01.023.
11. American College of Radiology. ACR Appropriateness Criteria rating round information. 2017. https://www.acr.org [Accessed 23 Jun 2021].
12. Mendelson, R, Montgomery, B. Towards appropriate imaging: tips for practice. Aust Fam Physician 2016;45:391–5.
13. Broder, JS, Halabi, SS. Improving the application of imaging clinical decision support tools: making the complex simple. J Am Coll Radiol 2014;11:257–61. https://doi.org/10.1016/j.jacr.2013.10.007.
14. Allen, BJr., Prabhakar Reddy, K, Miller, W, Casale Menier, DR, Lubinus, FG, et al., European Society of Radiology. Summary of the proceedings of the international forum 2016: “Imaging referral guidelines and clinical decision support – how can radiologists implement imaging referral guidelines in clinical routine?”. Insights Imaging 2017;8:1–9. https://doi.org/10.1007/s13244-016-0523-4.
15. European Society of Radiology. Methodology for ESR iGuide content. Insights Imaging 2019;10:1–5. https://doi.org/10.1186/s13244-019-0720-z.
16. The Royal College of Radiologists. Making the best use of clinical radiology; 2022. https://www.irefer.org.uk/ [Accessed 13 Jan 2022].
17. Solberg, LI, Wei, FF, Butler, JC, Palattao, KJ, Vinz, CA, Marshall, MA. Effects of electronic decision support on high-tech diagnostic imaging orders and patients. Am J Manag Care 2010;16:102–6.
18. Min, ACV, Aristizabal, R, Peramaki, ER, Agulnik, DB, Strydom, N, Ramsey, D, et al. Clinical decision support decreases volume of imaging for low back pain in an urban emergency department. J Am Coll Radiol 2017;14:889–99. https://doi.org/10.1016/j.jacr.2017.03.005.
19. Ip, IK, Gershanik, EF, Schneider, LI, Raja, AS, Mar, W, Seltzer, S, et al. Impact of IT-enabled intervention on MRI use for back pain. Am J Med 2014;127:512–8.e1. https://doi.org/10.1016/j.amjmed.2014.01.024.
20. Poeran, J, Mao, LJ, Zubizarreta, N, Mazumdar, M, Darrow, B, Genes, N, et al. Effect of clinical decision support on appropriateness of advanced imaging use among physicians-in-training. Am J Roentgenol 2019;212:859–66. https://doi.org/10.2214/ajr.18.19931.
21. Huber, TC, Krishnaraj, A, Patrie, J, Gaskin, CM. Impact of a commercially available clinical decision support program on provider ordering habits. J Am Coll Radiol 2018;15:951–7. https://doi.org/10.1016/j.jacr.2018.03.045.
22. Canadian Association of Radiologists. Referral guidelines. 2022. https://car.ca/patient-care/referral-guidelines/ [Accessed 26 Jun 2021].
23. Gondal, M, Qureshi, N, Mukhtar, H, Ahmed, H. An engineering approach to integrate non-functional requirements (NFR) to achieve high quality software process. ICEIS 2020;2:377–84. https://doi.org/10.5220/0009568503770384.
24. Tricco, AC, Lillie, E, Zarin, W, O’Brien, KK, Colquhoun, H, Levac, D, et al. PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med 2018;169:467–73. https://doi.org/10.7326/m18-0850.
25. Lee, B, Mafi, J, Patel, MK, Sorensen, A, Vangala, S, Wei, E, et al. Quality improvement time-saving intervention to increase use of a clinical decision support tool to reduce low-value diagnostic imaging in a safety net health system. BMJ Open Qual 2021;10:1–5. https://doi.org/10.1136/bmjoq-2020-001076.
26. Hayatghaibi, SE, Sammer, MBK, Varghese, V, Seghers, VJ, Sher, AC. Prospective cost implications with a clinical decision support system for pediatric emergency head computed tomography. Pediatr Radiol 2021;51:1–7. https://doi.org/10.1007/s00247-021-05159-9.
27. Gaskin, CM, Ellenbogen, AL, Parkhurst, KL, Matsumoto, AH. Use of a commercially available clinical decision support tool to expedite prior authorization in partnership with a private payer. J Am Coll Radiol 2021;18:857–63. https://doi.org/10.1016/j.jacr.2021.01.009.
28. Fried, JG, Pakpoor, J, Kahn, CEJr., Zafar, HM. Lessons from the free-text epidemic: opportunities to optimize deployment of imaging clinical decision support. J Am Coll Radiol 2021;18:467–74. https://doi.org/10.1016/j.jacr.2021.01.002.
29. Chepelev, LL, Wang, X, Gold, B, Bonzel, CL, Rybicki, F, Uyeda, JW, et al. Improved appropriateness of advanced diagnostic imaging after implementation of Clinical Decision Support Mechanism. J Digit Imag 2021;34:397–403. https://doi.org/10.1007/s10278-021-00433-6.
30. Chen, WH, Saxon, DT, Henry, MP, Herald, JR, Holleman, R, Zawol, D, et al. Effects of an electronic medical record intervention on appropriateness of transthoracic echocardiograms: a prospective study. J Am Soc Echocardiogr 2021;34:176–84. https://doi.org/10.1016/j.echo.2020.09.010.
31. Wu, Y, Rose, MQ, Freeman, ML, Richard-Lany, NP, Spaulding, AC, Booth, SC, et al. Reducing chest radiography utilization in the medical intensive care unit. J Am Assoc Nurse Pract 2020;32:390–9. https://doi.org/10.1097/jxx.0000000000000256.
32. Rehani, MM, Melick, ER, Alvi, RM, Khera, RD, Batool-Anwar, S, Neilan, TG, et al. Patients undergoing recurrent CT exams: assessment of patients with non-malignant diseases, reasons for imaging and imaging appropriateness. Eur Radiol 2020;30:1839–46. https://doi.org/10.1007/s00330-019-06551-8.
33. Hynes, JP, Hunter, K, Rochford, M. Utilization and appropriateness in cervical spine trauma imaging: implementation of clinical decision support criteria. Ir J Med Sci 2020;189:333–6. https://doi.org/10.1007/s11845-019-02059-8.
34. Gabelloni, M, Di Nasso, M, Morganti, R, Faggioni, L, Masi, G, Falcone, A, et al. Application of the ESR iGuide clinical decision support system to the imaging pathway of patients with hepatocellular carcinoma and cholangiocarcinoma: preliminary findings. Radiol Med 2020;125:531–7. https://doi.org/10.1007/s11547-020-01142-w.
35. Ciprut, SE, Kelly, MD, Walter, D, Hoffman, R, Becker, DJ, Loeb, S, et al. A clinical reminder order check intervention to improve guideline-concordant imaging practices for men with prostate cancer: a pilot study. Urology 2020;145:113–8. https://doi.org/10.1016/j.urology.2020.05.101.
36. Carayon, P, Hoonakker, P, Hundt, AS, Salwei, M, Wiegmann, D, Brown, RL, et al. Application of human factors to improve usability of clinical decision support for diagnostic decision-making: a scenario-based simulation study. BMJ Qual Saf 2020;29:329–40. https://doi.org/10.1136/bmjqs-2019-009857.
37. Stopyra, JP, Snavely, A, Lenoir, K, Wells, BJ, Herrington, DM, Hiestand, BC, et al. Heart pathway implementation safely reduces hospitalizations at one-year in patients with acute chest pain. Ann Emerg Med 2020;76:555–65. https://doi.org/10.1016/j.annemergmed.2020.05.035.
38. Raja, AS, Pourjabbar, S, Ip, IK, Baugho, CW, Sodickson, AD, O’Leary, M, et al. Impact of a health information technology-enabled appropriate use criterion on utilization of emergency department CT for renal colic. Am J Roentgenol 2019;212:142–5. https://doi.org/10.2214/ajr.18.19966.
39. Palen, TE, Sharpe, REJr., Shetterly, SM, Steiner, JF. Randomized clinical trial of a clinical decision support tool for improving the appropriateness scores for ordering imaging studies in primary and specialty care ambulatory clinics. Am J Roentgenol 2019;213:1015–20. https://doi.org/10.2214/ajr.19.21511.
40. Mulders, MAM, Walenkamp, MMJ, Sosef, NL, Ouwehand, F, van Velde, R, Goslings, CJ, et al. The Amsterdam Wrist Rules to reduce the need for radiography after a suspected distal radius fracture: an implementation study. Eur J Trauma Emerg Surg 2019;46:573–82. https://doi.org/10.1007/s00068-019-01194-2.
41. Lee, JH, Ha, EJ, Baek, JH, Choi, M, Jung, SE, Yong, HS. Implementation of Korean clinical imaging guidelines: a mobile app-based decision support system. Korean J Radiol 2019;20:182–9. https://doi.org/10.3348/kjr.2018.0621.
42. Chan, SS, Francavilla, ML, Iyer, RS, Rigsby, CK, Hernanz-Schulman, M. Clinical decision support: practical implementation at two pediatric hospitals. Pediatr Radiol 2019;49:486–92. https://doi.org/10.1007/s00247-018-4322-6.
43. Doyle, J, Abraham, S, Feeney, L, Reimer, S, Finkelstein, A. Clinical decision support for high-cost imaging: a randomized clinical trial. PLoS One 2019;14:1–13. https://doi.org/10.1371/journal.pone.0213373.
44. Mills, AM, Ip, IK, Langlotz, CP, Raja, AS, Zafar, HM, Khorasani, R. Clinical decision support increases diagnostic yield of computed tomography for suspected pulmonary embolism. Am J Emerg Med 2018;36:540–4. https://doi.org/10.1016/j.ajem.2017.09.004.
45. Cochon, L, Khorasani, R. Clinical decision support tools for order entry. In: Quality and safety in imaging. Cham: Springer; 2018:21–34 pp. https://doi.org/10.1007/174_2017_162.
46. Calcaterra, D, Di Modica, G, Tomarchio, O, Romeo, P. A clinical decision support system to increase appropriateness of diagnostic imaging prescriptions. J Netw Comput Appl 2018;117:17–29. https://doi.org/10.1016/j.jnca.2018.05.011.
47. Moriarity, AK, Green, A, Klochko, C, O’Brien, M, Halabi, S. Evaluating the effect of unstructured clinical information on clinical decision support appropriateness ratings. J Am Coll Radiol 2017;14:737–43. https://doi.org/10.1016/j.jacr.2017.02.003.
48. Lacson, R, Ip, I, Hentel, KD, Malhotra, S, Balthazar, P, Langlotz, CP, et al. Medicare imaging demonstration: assessing attributes of appropriate use criteria and their influence on ordering behavior. Am J Roentgenol 2017;208:1051–7. https://doi.org/10.2214/ajr.16.17169.
49. Gupta, S, Klein, K, Singh, AH, Thrall, JH. Analysis of low appropriateness score exam trends in decision support-based radiology order entry system. J Am Coll Radiol 2017;14:615–21. https://doi.org/10.1016/j.jacr.2016.12.011.
50. Ip, IK, Lacson, R, Hentel, K, Malhotra, S, Darer, J, Langlotz, C, et al. Predictors of provider response to clinical decision support: lessons learned from the medicare imaging demonstration. Am J Roentgenol 2017;208:351–7. https://doi.org/10.2214/AJR.16.16373.
51. Drescher, MJ, Fried, J, Brass, R, Medoro, A, Murphy, T, Delgado, J. Knowledge translation of the PERC rule for suspected pulmonary embolism: a blueprint for reducing the number of CT pulmonary angiograms. West J Emerg Med 2017;18:1091–7. https://doi.org/10.5811/westjem.2017.7.34581.
52. Prabhakar, AM, Harvey, HB, Misono, AS, Erwin, AE, Jones, N, Heffernan, J, et al. Imaging decision support does not drive out-of-network leakage of referred imaging. J Am Coll Radiol 2016;13:606–10. https://doi.org/10.1016/j.jacr.2016.01.004.
53. Depinet, H, von Allmen, D, Towbin, A, Hornung, R, Ho, M, Alessandrini, E. Risk stratification to decrease unnecessary diagnostic imaging for acute appendicitis. Pediatrics 2016;138:e1–10. https://doi.org/10.1542/peds.2015-4031.
54. Bookman, K, West, D, Ginde, A, Wiler, J, McIntyre, R, Hammes, A, et al. Embedded clinical decision support in electronic health record decreases use of high-cost imaging in the emergency department: EmbED study. Acad Emerg Med 2017;24:839–45. https://doi.org/10.1111/acem.13195.
55. Sistrom, CL, Weilburg, JB, Dreyer, KJ, Ferris, TG. Provider feedback about imaging appropriateness by using scores from order entry decision support: raw rates misclassify outliers. Radiology 2015;275:469–79. https://doi.org/10.1148/radiol.14141092.
56. Schneider, E, Zelenka, S, Grooff, P, Alexa, D, Bullen, J, Obuchowski, NA. Radiology order decision support: examination-indication appropriateness assessed using 2 electronic systems. J Am Coll Radiol 2015;12:349–57. https://doi.org/10.1016/j.jacr.2014.12.005.
57. Moriarity, AK, Klochko, C, O’Brien, M, Halabi, S. The effect of clinical decision support for advanced inpatient imaging. J Am Coll Radiol 2015;12:358–63. https://doi.org/10.1016/j.jacr.2014.11.013.
58. Ip, IK, Raja, AS, Gupta, A, Andruchow, J, Sodickson, A, Khorasani, R. Impact of clinical decision support on head computed tomography use in patients with mild traumatic brain injury in the ED. Am J Emerg Med 2015;33:320–5. https://doi.org/10.1016/j.ajem.2014.11.005.
59. Dunne, RM, Ip, IK, Abbett, S, Gershanik, EF, Raja, AS, Hunsaker, A, et al. Effect of evidence-based clinical decision support on the use and yield of CT pulmonary angiographic imaging in hospitalized patients. Radiology 2015;276:167–74. https://doi.org/10.1148/radiol.15141208.
60. Ranta, A, Yang, CF, Funnell, M, Cariga, P, Murphy-Rahal, C, Cogger, N. Utility of a primary care based transient ischaemic attack electronic decision support tool: a prospective sequential comparison. BMC Fam Pract 2014;15:86. https://doi.org/10.1186/1471-2296-15-86.
61. Thrall, JH. Appropriateness and imaging utilization: “computerized provider order entry and decision support”. Acad Radiol 2014;21:1083–7. https://doi.org/10.1016/j.acra.2014.02.019.
62. Sistrom, CL, Weilburg, JB, Rosenthal, DI, Dreyer, KJ, Thrall, JH. Use of imaging appropriateness criteria for decision support during radiology order entry: the MGH experience. In: Radiological safety and quality paradigms in leadership and innovation. Dordrecht: Springer; 2014. https://doi.org/10.1007/978-94-007-7256-4_7.
63. Khorasani, R, Hentel, K, Darer, J, Langlotz, C, Ip, IK, Manaker, S, et al. Ten commandments for effective clinical decision support for imaging: enabling evidence-based practice to improve quality and reduce waste. Am J Roentgenol 2014;203:945–51. https://doi.org/10.2214/AJR.14.13134.
64. Gupta, A, Raja, AS, Khorasani, R. Examining clinical decision support integrity: is clinician self-reported data entry accurate? J Am Med Inf Assoc 2014;21:23–6. https://doi.org/10.1136/amiajnl-2013-001617.
65. Gupta, A, Ip, IK, Raja, AS, Andruchow, JE, Sodickson, A, Khorasani, R. Effect of clinical decision support on documented guideline adherence for head CT in emergency department patients with mild traumatic brain injury. J Am Med Inf Assoc 2014;21:e347–51. https://doi.org/10.1136/amiajnl-2013-002536.
66. Lin, FY, Dunning, AM, Narula, J, Shaw, LJ, Gransar, H, Berman, DS, et al. Impact of an automated multimodality point-of-order decision support tool on rates of appropriate testing and clinical decision making for individuals with suspected coronary artery disease: a prospective multicenter study. J Am Coll Cardiol 2013;62:308–16. https://doi.org/10.1016/j.jacc.2013.04.059.
67. Ip, IK, Schneider, L, Seltzer, S, Smith, A, Dudley, J, Menard, A, et al. Impact of provider-led, technology-enabled radiology management program on imaging. Am J Med 2013;126:687–92. https://doi.org/10.1016/j.amjmed.2012.11.034.
68. Curry, L, Reed, MH. Electronic decision support for diagnostic imaging in a primary care setting. J Am Med Inf Assoc 2011;18:267–70. https://doi.org/10.1136/amiajnl-2011-000049.
69. Bowen, S, Johnson, K, Reed, MH, Zhang, LP, Curry, L. The effect of incorporating guidelines into a computerized order entry system for diagnostic imaging. J Am Coll Radiol 2011;8:251–8. https://doi.org/10.1016/j.jacr.2010.11.020.
70. Blackmore, CC, Mecklenburg, RS, Kaplan, GS. Effectiveness of clinical decision support in controlling inappropriate imaging. J Am Coll Radiol 2011;8:19–25. https://doi.org/10.1016/j.jacr.2010.07.009.
71. Vartanians, VM, Sistrom, CL, Weilburg, JB, Rosenthal, DI, Thrall, JH. Increasing the appropriateness of outpatient imaging: effects of a barrier to ordering low-yield examinations. Radiology 2010:842–9. https://doi.org/10.1148/radiol.10091228.
72. Sistrom, CL, Dang, PA, Weilburg, JB, Dreyer, KJ, Rosenthal, DI, Thrall, JH. Effect of computerized order entry with integrated decision support on the growth of outpatient procedure volumes. Radiology 2009;251:147–55. https://doi.org/10.1148/radiol.2511081174.
73. Rosenthal, DI, Weilburg, JB, Schultz, T, Miller, JC, Nixon, V, Dreyer, KJ, et al. Radiology order entry with decision support: initial clinical experience. J Am Coll Radiol 2006;3:799–806. https://doi.org/10.1016/j.jacr.2006.05.006.
74. Kushniruk, AW, Borycki, EM. Human factors in healthcare IT: management considerations and trends. Healthc Manag Forum 2022;36:72–8. https://doi.org/10.1177/08404704221139219.
Supplementary Material
This article contains supplementary material (https://doi.org/10.1515/dx-2023-0083).
© 2023 Walter de Gruyter GmbH, Berlin/Boston