
Informatics External Quality Assurance (IEQA) Down Under: evaluation of a pilot implementation

Rae-Anne Hardie, Donna Moore, Derek Holzhauser, Michael Legg, Andrew Georgiou and Tony Badrick
Published/Copyright: September 11, 2018

Abstract

External quality assurance (EQA) provides ongoing evaluation to verify that laboratory medicine results conform to the quality standards expected for patient care. While attention has focused predominantly on test accuracy, the diagnostic phases – the pre- and post-laboratory phases of testing – have thus far lacked an appropriate EQA program. One of the challenges faced by Australian EQA has been a lack of standardisation or “harmonisation”, resulting from variations in reporting between different laboratory medicine providers. This may introduce interpretation errors and misunderstanding of results by clinicians, posing a threat to patient safety. While initiatives such as the Australian Pathology Information, Terminology and Units Standardisation (PITUS) program have produced the Standards for Pathology Informatics in Australia (SPIA), conformity to these standards requires regular monitoring to maintain the integrity of data between the systems of sending (laboratory medicine providers) and receiving (physicians, MyHealth Record, registries) organisations. The PITUS 16 Informatics EQA (IEQA) Project, together with the Royal College of Pathologists of Australasia Quality Assurance Programs (RCPAQAP), has created a system to perform quality assurance on the electronic laboratory message when the laboratory sends a result back to the EQA provider. The purpose of this study was to perform a small-scale pilot implementation of an IEQA protocol to test the suitability of the system for checking compliance with existing Health Level-7 (HL7 v2.4) reporting standards localised and constrained by the RCPA SPIA.
Here, we present key milestones from the implementation, including: (1) software development; (2) installation and verification of the system and communication services; (3) implementation of the IEQA program and compliance testing of the received HL7 v2.4 report messages; (4) compilation of a draft Informatics Program Survey Report for each laboratory; and (5) review, consisting of presentation of a report showing the compliance checking tool to each participating laboratory.

Introduction

Researchers from the Royal College of Pathologists of Australasia Quality Assurance Programs (RCPAQAP) and Macquarie University recently published a paper describing the importance of the pre- and post-laboratory phases which form the diagnostic phases of laboratory medicine testing [1]. The issues, however, have been known in Australia since at least 1996 [2]. These phases “beyond the lab” constitute a major source of errors that reduce laboratory effectiveness and threaten patient safety [1], [3]. External quality assurance (EQA) ensures that verification is performed on a recurring basis, and that laboratory results conform to the quality expectations required for patient care [4]; however, most Australian laboratories have previously focused narrowly on laboratory processes such as test accuracy and precision [1]. Until recently, it was very difficult to quantify the number and magnitude of errors in the extra-analytical phases due to a lack of formal EQA measures in these areas.

Detection of these errors requires reliable quality indicators across the total testing process (TTP), from the time the laboratory request is determined until the clinician receives the final report, makes a diagnosis and decides on the appropriate action [3], [5], [6]. Launched in 2008, the Key Incident Monitoring and Management System (KIMMS) is an Australasian-developed quality improvement (QI) program that records incidents (process defects) and episodes (occasions where incidents may occur), while also assigning a quantified risk to each incident type (by multiplying its frequency by a harm rating and a detection difficulty score) using failure mode effects analysis (FMEA) [5]. By 2016, KIMMS had recorded over 200 million episodes and 2.9 million incidents, an overall TTP incident rate of 1.75% [5]. Some incident rates may appear low, but when risks and their frequencies are taken into account, critical incident types emerge that require improvements in management. For example, haemolysis had both the highest incidence (22.6% of total incidents) and the highest risk (26.68% of total risk). However, incident types with low frequency (e.g. “sample suspected to be from wrong patient” had the second lowest score) but high harm ratings (e.g. 10/10) and detection difficulty scores (10/10) are assigned a relatively higher risk, reflecting the severity of the potential risk to the patient [5]. Until recently, KIMMS has focused mainly on the pre-laboratory phase of the TTP cycle; thus there has been a need to create an EQA program that encompasses the post-laboratory phase.
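The FMEA-style risk calculation described above can be sketched as follows. All numbers here are illustrative stand-ins, not actual KIMMS data, and the field names are our own.

```python
from dataclasses import dataclass

@dataclass
class IncidentType:
    name: str
    incidents: int      # observed incident count over the period (illustrative)
    harm: int           # harm rating, 1-10
    detectability: int  # detection difficulty, 1-10 (10 = hardest to detect)

def risk_score(it: IncidentType) -> int:
    # FMEA-style quantified risk: frequency x harm rating x detection difficulty
    return it.incidents * it.harm * it.detectability

# Illustrative numbers only -- not actual KIMMS figures
incident_types = [
    IncidentType("haemolysis", 650_000, 3, 2),
    IncidentType("sample suspected to be from wrong patient", 40, 10, 10),
]

total_risk = sum(risk_score(i) for i in incident_types)
for it in sorted(incident_types, key=risk_score, reverse=True):
    print(f"{it.name}: {100 * risk_score(it) / total_risk:.4f}% of total risk")
```

This mirrors how a rare but severe incident type (high harm, hard to detect) can still accrue a meaningful share of total risk despite a low frequency.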

One of the major post-laboratory areas in urgent need of improved EQA measures has been laboratory reporting, especially given the widespread adoption of electronic health records that aggregate reports from multiple laboratories, such as MyHealth Record in Australia [7]. Standardisation or “harmonisation” of the formats and styles used in clinical chemistry reporting is key to interoperability and safety for electronic health records [8], [9], [10]. Significant variations in reporting policies between different Australian laboratory medicine providers, or even within the same provider, result in different styles of reports for different customers [11]. The clinical chemistry report needs to be clear and unambiguous; however, in Australia there are still differences in reporting, e.g. different names for the same test, different units, different tests included in panels with the same names, and differences in reference intervals (RIs) and flagging of results outside limits [12]. These differences may introduce misunderstanding, resulting in interpretation errors by clinicians or patients at the post-laboratory phase [10], [13], which is a patient safety issue. Activities aiming to increase harmonisation in laboratory medicine include improving the metrological comparability of results, as well as reducing unnecessary between-laboratory variation in test requesting and reporting [9]; the latter is the focus of our trial.

The Australian Pathology Units and Terminology Standardisation (APUTS) project [12] began in 2011 and was the first of three projects completed in a program of laboratory medicine informatics standardisation led by the RCPA, with active involvement from many organisations and individuals. The program is now called PITUS (Pathology Information Terminology and Units Standardisation); a fourth phase has been planned but awaits funding. The consensus standards developed in PITUS have been endorsed and published by the RCPA as college policy; the version at the time of writing was Standards for Pathology Informatics in Australia (SPIA) v3.0 [14], [15]. It includes requesting and reporting terminology (including preferred Australian terms), standardised units, safe report rendering, information models, harmonised RIs and best practice guidance for safe laboratory medicine requesting and reporting. One of the six sub-projects of PITUS 16 was a trial implementation of an Informatics EQA (IEQA) program; the purpose of this study was therefore to evaluate a small-scale pilot implementation of an IEQA program in order to assess the feasibility of large-scale implementation. This sub-project endeavoured to create a system to perform quality assurance on the electronic laboratory message when the laboratory sends a result back to the EQA provider itself. It was a follow-on to a previous (PITUS 14) sub-project that investigated a more manual evaluation of implementations of the requesting and reporting standards, working with the largest private and largest public laboratories in Australia, with active co-operation from the Medical Software Industry Association and the National E-Health Transition Authority [8], [11], [16].

Pre- and post-laboratory errors

Pre- and post-laboratory errors may have serious consequences for patients and place unnecessary costs on the medical system. Pre-laboratory errors include errors in ordering tests, preparing patients and processing samples, while post-laboratory errors may occur during the reporting of results to physicians, interpretation of results by physicians, notification of results to patients, administration and communication [17]. A study of 966 pre- and post-laboratory errors by the American Academy of Family Physicians reported that at least 18% of patients experienced some form of harm [17]. In addition to harm, further outcomes included delays in care (24%), financial consequences and time wasted (22%), pain and suffering (11%) and adverse clinical consequences (2%) [17].

An Australian study has outlined performance criteria for the post-laboratory phase, highlighting a need for this phase to take “quality technical results and provide the means for clinical interpretation in the report” [18]. For example, RIs are often used in interpretation, particularly at diagnosis, but different laboratories may use different RIs, even when using similar methods [18], [19]. An evidence-based approach was developed by scientific consensus at workshops of the Australasian Association of Clinical Biochemists (AACB) between 2012 and 2014, resulting in the “AACB Harmonised Reference Intervals” [19]. While laboratories were consulted, adoption of the RIs still lies with each laboratory. Reaching harmonisation would result in consistent RIs across Australia and New Zealand. Sikaris has expanded on the ISO 15189 standard definition of the post-laboratory phase (the processes following the examination, including review of results) [20] to incorporate the quality of clinical chemistry reports, including the formatting, releasing, reporting and retention of examination results for future access [18]. It is also recommended that quality in post-laboratory interpretation take into account not only quality analytical data, but also its interpretation against the patient’s clinical context [18]. This is crucial because misinterpretation of test results may have many contributors – cognitive factors, RIs, clinical interpretations and notifications from laboratory specialists – all of which may contribute to misdiagnosis [21], [22], [23].
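As a minimal illustration of how a harmonised RI is applied at reporting time, the following sketch flags a numeric result against an interval; the interval used in the example is illustrative, not an endorsed AACB value, and the function is our own simplification.

```python
def flag_result(value: float, low: float, high: float) -> str:
    """Return the flag a rendered report would carry:
    'L' below the interval, 'H' above it, '' if within it."""
    if value < low:
        return "L"
    if value > high:
        return "H"
    return ""

# e.g. a serum sodium result checked against an illustrative 135-145 mmol/L interval
print(flag_result(130.0, 135.0, 145.0))  # → L
```

If every laboratory adopted the same `low`/`high` values for an analyte, the same result would carry the same flag wherever it was reported, which is the point of harmonisation.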

An Informatics EQA (IEQA) program for the post-laboratory phases

With physicians often receiving clinical chemistry reports electronically from different laboratories, there is a risk that variations in reporting formats may add a layer of complexity in interpreting the report, or result in errors which can result in a risk to patient safety [24]. Audit and dissemination of harmonisation guidelines on their own have been shown to be insufficient in managing quality of results interpretation in general practice, and EQA studies have shown variability between clinicians’ interpretation of clinical chemistry results for specific tests [25], [26], [27]. Therefore, it is crucial that: 1) standards are actually implemented; and 2) conformity to the standards and guidelines is continuously assessed. The RCPAQAP aims to implement an IEQA program to ensure ongoing quality and safety in reporting.

A trial implementation of the IEQA program

Compliance and standardisation of laboratory medicine terminology are needed to maintain integrity of data shared between sending (laboratory medicine providers) and receiving (physicians, MyHealth Record, registries) organisations’ digital health information systems. The RCPA PITUS 16 Project Working Group 6 collaborated with RCPAQAP to design and analyse a system for reporting data using an IEQA program, the architecture of which is outlined in Figure 1.

Figure 1: High-level IEQA architecture. IVD, in vitro diagnostic; MCT, message conformance testing.

In 2015, as part of the RCPAQAP Liquid Serum Chemistry (LSC) program, laboratories were invited to supply a routine paper report displaying results. The LSC program is a commutable frozen patient serum program used to assess method differences. The RCPAQAP then analysed these reports against the SPIA (formerly known as the RCPA APUTS v2.3 standard) [28], and variations were identified [29]. This validated the rationale for developing and trialling an IEQA protocol to test the compliance of existing Health Level-7 (HL7) reporting with the reporting standards developed by the RCPA [1], [30]. Accrediting bodies such as the National Association of Testing Authorities, Australia (NATA) could then use this to assist with compliance. Medical Objects Pty Ltd [31], a medical software vendor, was selected after a formal evaluation of responses to a call for expressions of interest, and the evaluation software was co-designed with them. The system built was capable of sending standardised electronic request messages, receiving electronic report messages and analysing each message received. Message services supported Secure Message Delivery (SMD)-based secure messaging. Two laboratories volunteered to send HL7 v2 report messages with atomic clinical chemistry results for the RCPAQAP LSC program. These sites represented two of the major laboratory information systems (LIS) in use in Australia and serviced both hospital and community patients, and hence were sending results to multiple hospital information systems and general practitioner practice systems. The Medical Objects software tool was used by RCPAQAP to test compliance of the HL7 v2 report messages received from each laboratory against the HL7 Messaging Standard Version 2.4 [32], Australian Standard AS 4700.2:2012 [33] and the RCPA APUTS v2.3 standard [34] (an earlier version of SPIA [28], but with most of the same compliance points).
A mock RCPAQAP Informatics Program Survey Report was designed and then compiled for each of the laboratories, reporting on an assessment of the validity, integrity and rendered form of the data received.

Key milestones of the IEQA trial

Key milestones of the IEQA trial are listed and described in Table 1.

Table 1:

Key milestones of the IEQA trial.

Key milestone 1: Development of the software by Medical Objects. Two new software modules were built for the trial implementation:
  a) Multi-component test requests (e.g. liver function test) for electronic requesting of clinical chemistry orders
  b) Quality assurance compliance rule checking module: HL7 v2.4 report messages checked against the HL7 Messaging Standard v2.4 and AS 4700.2:2012; atomic data in HL7 v2.4 report messages checked against SPIA, including LOINC code, preferred term, reference interval, flagging, alignment and units

Key milestone 2: Installation, setup and verification of the system software and communication services
  a) Medical Objects “Explorer” software application installed on the computers used for compliance testing
  b) Medical Objects “Eclipse” communication services used during the trial implementation to electronically send HL7 v2.4 request messages to the participating laboratories and to receive HL7 v2.4 report messages from them

Key milestone 3: Implementation of the IEQA program and compliance testing of the received HL7 v2.4 report messages
  a) Using the bulk orders module, a clinical chemistry test request for the Liquid Serum Chemistry program was created and an HL7 v2.4 request message was electronically transmitted to the two laboratories; a PDF version of the clinical chemistry request form was also sent via email
  b) The participating laboratories electronically transmitted HL7 v2.4 report messages with the results to RCPAQAP for analysis
  c) The quality assurance module performed compliance rule checks on each received HL7 v2.4 report message, assessing:
     i. Compliance of the HL7 v2.4 report message with the HL7 Messaging Standard v2.4 and AS 4700.2:2012, including the conformance points in AS 4700.2:2012
     ii. Compliance of the atomic result data in the HL7 v2.4 report messages with the terminology standards (LOINC code, preferred term, reference interval and units) and the harmonised reference intervals described within SPIA [15] (Figure 2)
     iii. Manual comparison, via windows provided in the quality assurance module, of the rendered clinical chemistry report against the expected SPIA format, which is important for certain SPIA standards that require manual checking (Figure 3)

Key milestone 4: Compilation of a draft Informatics Program Survey Report
  a) Provided to each participating laboratory to assist in identifying compliant areas and areas requiring further improvement

Key milestone 5: Review of trial implementation
  a) Presentation of a report showing the compliance checking tool to each participating laboratory

LOINC, Logical Observation Identifiers Names and Codes.
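The per-analyte check in milestone 3 (assessment ii) can be sketched as below. This is an assumed simplification, not the actual Medical Objects implementation: the rule-table entries are illustrative stand-ins for SPIA values, and the `check_obx` helper is our own. Field positions follow HL7 v2 OBX conventions (OBX-3 observation identifier, OBX-6 units).

```python
# Illustrative SPIA-style rule table keyed by LOINC code:
# code -> (preferred reporting term, expected units)
SPIA_RULES = {
    "2951-2": ("Sodium", "mmol/L"),     # LOINC 2951-2: sodium, serum/plasma
    "2823-3": ("Potassium", "mmol/L"),  # LOINC 2823-3: potassium, serum/plasma
}

def check_obx(obx_segment: str) -> list[str]:
    """Return non-compliance findings for one pipe-delimited OBX segment."""
    fields = obx_segment.split("|")
    identifier = fields[3].split("^")   # OBX-3: code^text^coding system
    code, term = identifier[0], identifier[1]
    units = fields[6].split("^")[0]     # OBX-6: units
    rule = SPIA_RULES.get(code)
    if rule is None:
        return [f"LOINC code {code} not in rule table"]
    preferred_term, expected_units = rule
    findings = []
    if term != preferred_term:
        findings.append(f"term '{term}' differs from preferred term '{preferred_term}'")
    if units != expected_units:
        findings.append(f"units '{units}' differ from expected '{expected_units}'")
    return findings

obx = "OBX|1|NM|2951-2^Sodium^LN||139|mmol/L^^UCUM|135-145||||F"
print(check_obx(obx))  # → []
```

A real conformance tool would cover far more rules (reference intervals, flagging, alignment, segment-level conformance points), but each rule reduces to this pattern: extract an atomic field, compare it with the standard, report the discrepancy.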

Issues around the implementation

In the IEQA trial, the participating laboratories were sent and could receive the electronic request message, but their LIS were not configured to process it. Electronic laboratory requesting was not the main focus of the trial, and although it is technically possible, more work is required by the RCPAQAP to find the best way to communicate the 60+ SNOMED CT-AU (Systematized Nomenclature of Medicine – Clinical Terms, Australian extension) codes for the individual analytes of the RCPAQAP LSC program in an electronic request message. Participating laboratories would also need to configure their laboratory information systems to process these codes.
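One possible shape for such a request, assumed purely for illustration since the trial did not settle on a solution, is one OBR segment per coded analyte. The message framing and the `build_request` helper are our own, and the codes below are placeholders, not real SNOMED CT-AU identifiers.

```python
def build_request(analyte_codes: list[tuple[str, str]]) -> str:
    """Build a minimal HL7 v2.4 ORM-style request carrying one OBR per analyte.

    analyte_codes: (code, display) pairs, e.g. drawn from SNOMED CT-AU.
    """
    # MSH header with the standard HL7 encoding characters |^~\&
    segments = ["MSH|^~\\&|RCPAQAP|QAP|LAB|LAB|20180911||ORM^O01|00001|P|2.4"]
    for i, (code, display) in enumerate(analyte_codes, start=1):
        # one OBR segment per coded analyte; 'SCT' names the coding system
        segments.append(f"OBR|{i}|QAP-{i:04d}||{code}^{display}^SCT")
    return "\r".join(segments)  # HL7 v2 segments are separated by carriage returns

# Placeholder codes -- not real SNOMED CT-AU identifiers
msg = build_request([("CODE-0001", "Sodium measurement"),
                     ("CODE-0002", "Potassium measurement")])
print(msg.count("\rOBR|"))  # → 2
```

With 60+ analytes this approach produces a long but mechanically simple message; the open question flagged above is whether receiving LIS can be configured to process that many coded order items.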

As with other EQA programs, the IEQA still requires a subject matter expert to be involved in running each EQA round; in this case, that means a laboratory medicine informatician familiar with messaging standards and reporting requirements.

Figure 2: Example of compliance rule check for the atomic result data against the RCPA published SPIA.

Figure 3: Example of quality assurance software windows, with report format using SPIA rendered report rules (left) and rendered clinical chemistry report from the HL7 v2.4 report message (right).

The lack of a significant industry driver to encourage laboratories to configure their LIS to receive a standardised electronic request message remains a barrier. This barrier would be overcome if there was an IEQA in place to identify those laboratories that were not using the SPIA standards.

Another issue is that patients have the freedom to select the collection centre at which their samples are collected, and that collection centre may not belong to the laboratory suggested by the requestor. The lack of a centralised laboratory medicine order message broker means there is no direct link between requestors (e.g. physicians) and the laboratory performing the service. Adoption of standards for laboratory medicine informatics by the clinical chemistry accreditation scheme [National Pathology Accreditation Advisory Council (NPAAC)], and/or of a common hub for requesting, would drive standardisation in the right direction.

Use of ICT to support EQA in the diagnostic phases and in following up test results – does information technology (IT) enhance follow-up?

Our team’s previous research has demonstrated that health Information and Communications Technology (ICT) can support EQA through standardised result reporting, and it can also be useful in other clinical applications. For example, electronic test acknowledgement systems may help reduce the incidence of missed test results [35], and electronic decision support systems can improve the quality of result interpretation through patient-specific alerts that support adherence to clinical guidelines or protocols [36]. Thus, we believe it is feasible to use EQA to study compliance with standards for requesting and reporting laboratory tests.

Conclusions

For laboratory medicine services to provide quality post-laboratory services to clinicians and patients, it is essential that programs are in place to ensure the ongoing proficiency of test result reporting as well as the standardisation of test results. Removing barriers to interoperability between sending (laboratories) and receiving (clinicians, MyHealth Record, patients or registries) organisations is particularly important as laboratory medicine reports are incorporated into electronic health records. This paper demonstrates the feasibility of an ICT-supported IEQA program, which has the potential to be used by accrediting bodies to assist with compliance with PITUS standards and HL7 v2 messaging. This is a multi-step process: it first drives standardisation, which reduces variation, which in turn reduces error and thereby harm. Expansion of this IEQA program to a large-scale implementation across Australia will reveal its true benefits in improving communication, standardisation and patient safety in laboratory medicine in the era of electronic health records.

The described IEQA model could be used in any country with electronic transmission of requests and results. In Australia, there are guidelines for the format of reports and the transmission of results; equivalent guidelines would need to be in place elsewhere. This is a key initiative to reduce these under-recognised post-laboratory errors. We believe that EQA providers in each country could develop a similar IEQA in the interests of patient safety.


Corresponding author: Dr. Tony Badrick, The Royal College of Pathologists of Australasia Quality Assurance Programs (RCPAQAP), Suite 201/8 Herbert Street, St Leonards, NSW 2065, Australia

Acknowledgements

ML designed and led the IEQA implementation project and co-designed the evaluation software with Andrew McIntyre and Jared Davidson of Medical Objects and Ray Oreo of RCPAQAP. ML and DM undertook the manual aspects of the conformance testing and the subsequent IEQA reporting.

  1. Author contributions: All the authors have accepted responsibility for the entire content of this submitted manuscript and approved submission.

  2. Research funding: RCPAQAP was funded from the Australian Government PITUS funding. ML was partly funded and DM fully funded by the PITUS Grant.

  3. Employment or leadership: None declared.

  4. Honorarium: None declared.

  5. Competing interests: The funding organisation(s) played no role in the study design; in the collection, analysis, and interpretation of data; in the writing of the report; or in the decision to submit the report for publication.

References

1. Badrick T, Gay S, McCaughey EJ, Georgiou A. External Quality Assessment beyond the analytical phase: an Australian perspective. Biochem Med (Zagreb) 2017;27:73–80.10.11613/BM.2017.009Search in Google Scholar PubMed PubMed Central

2. Khoury M, Burnett L, Mackay MA. Error rates in Australian chemical pathology laboratories. Med J Aust 1996;165:128–30.10.5694/j.1326-5377.1996.tb124883.xSearch in Google Scholar PubMed

3. Badrick T, Gay S, Mackay M, Sikaris K. The key incident monitoring and management system – history and role in quality improvement. Clin Chem Lab Med 2018;56:264–72.10.1515/cclm-2017-0219http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000419865000022&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=b7bc2757938ac7a7a821505f8243d9f3Search in Google Scholar PubMed

4. Miller WG, Jones GR, Horowitz GL, Weykamp C. Proficiency testing/external quality assessment: current challenges and future directions. Clin Chem 2011;57:1670–80.10.1373/clinchem.2011.168641http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000298119600008&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=b7bc2757938ac7a7a821505f8243d9f3Search in Google Scholar PubMed

5. Meier FA, Badrick TC, Sikaris KA. What’s to be done about laboratory quality? Process indicators, laboratory stewardship, the outcomes problem, risk assessment, and economic value: responding to contemporary global challenges. Am J Clin Pathol 2018;149:186–96.http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000425618700001&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=b7bc2757938ac7a7a821505f8243d9f310.1093/ajcp/aqx135Search in Google Scholar PubMed

6. World Health Organization. Classification of Digital Health Interventions v1.0. Geneva, 2018.Search in Google Scholar

7. Australia’s National Digital Health Strategy. Safe, seamless and secure: evolving health and care to meet the needs of modern Australia. 2017.Search in Google Scholar

8. Legg M. Standardisation of test requesting and reporting for the electronic health record. Clin Chim Acta 2014;432:148–56.10.1016/j.cca.2013.12.007http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000338611800024&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=b7bc2757938ac7a7a821505f8243d9f3Search in Google Scholar PubMed

9. Jones GR. The role of EQA in harmonization in laboratory medicine – a global effort. Biochem Med (Zagreb) 2017;27:23–9.10.11613/BM.2017.004Search in Google Scholar PubMed PubMed Central

10. The Royal College of Pathologists of Australasia. Pathology Information, Terminology and Units Standardisation project (PITUS 15-16) Final Report to Commonwealth of Australia Department of Health. 2017.Search in Google Scholar

11. Flatman R, Legg M, Jones GR, Graham P, Moore D, Tate J. Recommendations for reporting and flagging of reference limits on pathology reports. Clin Biochem Rev 2014;35:199–202.Search in Google Scholar PubMed

12. Legg M, Swanepoel C. The Australian Pathology Units and Terminology Standardisation Project – an overview. Clin Biochem Rev 2012;33:103–8.Search in Google Scholar PubMed

13. Valenstein PN. Formatting pathology reports: applying four design principles to improve communication and patient safety. Arch Pathol Lab Med 2008;132:84–94.10.5858/2008-132-84-FPRAFDSearch in Google Scholar PubMed

14. The Royal College of Pathologists of Australasia. PITUS UPDATE: newsletter of the RCPA Pathology Information Terminology and Units Standardisation Project. March 2017. Issue 7.Search in Google Scholar

15. The Royal College of Pathologists of Australasia. Standards for Pathology Informatics in Australia (SPIA) (v3.0) Superseding and incorporating the Australian Pathology Units and Terminology Standards and Guidelines (APUTS). 2017. p. 92.Search in Google Scholar

16. The Royal College of Pathologists of Australasia. PITUS UPDATE: newsletter of the RCPA Pathology Information Terminology and Units Standardisation Project. Australia: 2014.Search in Google Scholar

17. Hickner J, Graham DG, Elder NC, Brandt E, Emsermann CB, Dovey S, et al. Testing process errors and their harms and consequences reported from family medicine practices: a study of the American Academy of Family Physicians National Research Network. Qual Saf Health Care 2008;17:194–200.http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000256366500009&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=b7bc2757938ac7a7a821505f8243d9f310.1136/qshc.2006.021915Search in Google Scholar PubMed

18. Sikaris K. Performance criteria of the post-analytical phase. Clin Chem Lab Med 2015;53:949–58.http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000353794600019&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=b7bc2757938ac7a7a821505f8243d9f310.1515/cclm-2015-0016Search in Google Scholar PubMed

19. Tate JR, Sikaris KA, Jones GR, Yen T, Koerbin G, Ryan J, et al. Harmonising adult and paediatric reference intervals in Australia and New Zealand: an evidence-based approach for establishing a first panel of chemistry analytes. Clin Biochem Rev 2014;35:213–35.Search in Google Scholar PubMed

20. International Organization for Standardization. Medical laboratories – requirements for quality and competence. ISO 15189:2012. Geneva, Switzerland, 2012.Search in Google Scholar

21. Poon EG, Kachalia A, Puopolo AL, Gandhi TK, Studdert DM. Cognitive errors and logistical breakdowns contributing to missed and delayed diagnoses of breast and colorectal cancers: a process analysis of closed malpractice claims. J Gen Intern Med 2012;27:1416–23.10.1007/s11606-012-2107-4http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000310161500008&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=b7bc2757938ac7a7a821505f8243d9f3Search in Google Scholar PubMed PubMed Central

22. Gandhi TK, Kachalia A, Thomas EJ, Puopolo AL, Yoon C, Brennan TA, et al. Missed and delayed diagnoses in the ambulatory setting: a study of closed malpractice claims. Ann Intern Med 2006;145:488–96.10.7326/0003-4819-145-7-200610030-00006Search in Google Scholar PubMed

23. Kachalia A, Gandhi TK, Puopolo AL, Yoon C, Thomas EJ, Griffey R, et al. Missed and delayed diagnoses in the emergency department: a study of closed malpractice claims from 4 liability insurers. Ann Emerg Med 2007;49:196–205.http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000243957800012&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=b7bc2757938ac7a7a821505f8243d9f310.1016/j.annemergmed.2006.06.035Search in Google Scholar PubMed

24. The Royal College of Pathologists of Australasia. Extract from the PITUS 15-16 Report: trial implementation of an Informatics External Quality Assurance (IEQA) Program.Search in Google Scholar

25. Kristoffersen AH, Thue G, Ajzner E, Claes N, Horvath AR, Leonetti R, et al. Interpretation and management of INR results: a case history based survey in 13 countries. Thromb Res 2012;130:309–15.http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000308078800036&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=b7bc2757938ac7a7a821505f8243d9f310.1016/j.thromres.2012.02.014Search in Google Scholar PubMed

26. Skeie S, Perich C, Ricos C, Araczki A, Horvath AR, Oosterhuis WP, et al. Postanalytical external quality assessment of blood glucose and hemoglobin A1c: an international survey. Clin Chem 2005;51:1145–53.10.1373/clinchem.2005.048488Search in Google Scholar PubMed

27. Aakre KM, Thue G, Subramaniam-Haavik S, Bukve T, Morris H, Muller M, et al. Postanalytical external quality assessment of urine albumin in primary health care: an international survey. Clin Chem 2008;54:1630–6.http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000259939900009&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=b7bc2757938ac7a7a821505f8243d9f310.1373/clinchem.2007.100917Search in Google Scholar PubMed

28. Royal College of Pathologists of Australasia. Pathology Terminology and Information Standardisation Downloads 2018 [cited 2018 27 April]. Available from: https://www.rcpa.edu.au/Library/Practising-Pathology/PTIS/APUTS-Downloads.Search in Google Scholar

29. Koetsier S, Jones GR, Badrick T. Safe reading of chemical pathology reports: the RCPAQAP Report Assessment Survey. Pathology 2016;48:357–62. doi:10.1016/j.pathol.2016.02.018

30. Royal College of Pathologists of Australasia. Pathology Terminology and Information Standardisation Standards and Guidelines Australia, 2018 [cited 2018 May 4]. Available from: https://www.rcpa.edu.au/Library/Practising-Pathology/PTIS/APUTS-Downloads/Standards-and-Guidelines.

31. Medical-Objects. Medical-Objects Clinical Applications, Messaging and Integration, 2018 [cited 2018 Apr 27]. Available from: https://www.medical-objects.com.au.

32. Health Level Seven International. HL7 Messaging Standard Version 2.4. Section 3: Clinical and Administrative Domains, 2018.

33. Standards Australia. AS 4700.2-2012 Implementation of Health Level Seven (HL7) version 2.4 – pathology and diagnostic imaging (diagnostics). Standards Australia, 2012.

34. The Royal College of Pathologists of Australasia. Australian Pathology Units and Terminology (APUTS) Standards and Guidelines (v2.3). 2014. p. 56.

35. Georgiou A, Lymer S, Forster M, Strachan M, Graham S, Hirst G, et al. Lessons learned from the introduction of an electronic safety net to enhance test result management in an Australian mothers’ hospital. J Am Med Inform Assoc 2014;21:1104–8. doi:10.1136/amiajnl-2013-002466

36. Georgiou A, Williamson M, Westbrook J, Ray S. The impact of computerised physician order entry systems on pathology services: a systematic review. Int J Med Inf 2007;76:514–29. doi:10.1016/j.ijmedinf.2006.02.004

Received: 2018-05-07
Accepted: 2018-06-12
Published Online: 2018-09-11
Published in Print: 2018-12-19

©2018 Walter de Gruyter GmbH, Berlin/Boston
