
Efficiency Measurement in Healthcare: The Foundations, Variables, and Models – A Narrative Literature Review

Published/Copyright: January 18, 2024

Abstract

Efficiency and productivity analysis have been central themes in the healthcare and economics literature. Despite tremendous innovation in methodology and data availability, a comprehensive literature review on this topic has not been conducted recently. This article provides a three-part literature review of healthcare efficiency and productivity studies. It begins by reviewing the two primary empirical methods used in healthcare efficiency studies, emphasising the treatment of inefficiency persistence. Second, previous contributions to healthcare productivity research are discussed with a focus on methodology and findings. In the third section, various measures of outputs, inputs, and prices in the health literature are explored to determine the extent of consensus in the literature. On the methodological front, the review shows that while Data Envelopment Analysis and Stochastic Frontier Analysis have been used extensively in healthcare productivity and efficiency studies, their application in the context of longitudinal data is limited. Further, no existing study measures total factor productivity (TFP) change and its components using both the primal and dual approaches. There is also considerable variation in the use of input, output, and price variables, suggesting that the choice of variables in the healthcare productivity and efficiency literature rests on the balance between data availability and the research scope.

1 Introduction

The last 30 years have witnessed considerable momentum in the number of studies published on the topic of healthcare efficiency. Current methods of efficiency evaluation draw on the theory of production and cost functions, following the seminal work of Farrell (1957). Many of the empirical methods in the healthcare literature revolve around estimating technical efficiency, allocative efficiency, or both (Worthington, 2004).

Researchers have extensively employed frontier-based efficiency techniques to measure healthcare units’ productivity and efficiency. Frontier techniques are divided into parametric and nonparametric methods. Both methods involve estimating a frontier against which the performance of healthcare providers is compared. A healthcare provider on the frontier is believed to be able to provide a given level of service using the least amount of inputs/minimum cost or the maximum level of services for a given level of inputs/cost (Hollingsworth & Peacock, 2008, p. 2). The degree of deviation from the efficient frontier provides an estimate of the level of inefficiency.

Data Envelopment Analysis (DEA) is a nonparametric methodology based on linear programming tools developed by Charnes et al. (1978) and is one of the commonly used frontier-based methodologies in health efficiency studies. The DEA frontier includes a series of linear segments connecting one efficient decision-making unit (DMU[1]) to another. The frontier’s construction is based on “best-observed practice,” where inefficient DMUs are “enveloped” by the efficiency frontier. A notable feature of the traditional DEA, and often considered a drawback, is that all deviations from the frontier are attributed to inefficiency. However, this issue has been addressed by the bootstrap methods of Simar and Wilson (1998), which provide a mechanism to distinguish between inefficiency and statistical noise, thereby refining the accuracy of the efficiency estimates derived from DEA.
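
To make the mechanics concrete, the input-oriented, constant-returns-to-scale model of Charnes et al. (1978) solves, for each DMU, a small linear programme: minimize θ subject to the requirement that a convex combination of the observed DMUs uses no more than θ times the evaluated DMU's inputs while producing at least its outputs. The sketch below, written in Python with SciPy, is a minimal illustration only; the hospital input/output figures are hypothetical and are not drawn from any study reviewed here.

# Minimal sketch of input-oriented, constant-returns-to-scale (CCR) DEA,
# solved as a linear programme with SciPy. All data are hypothetical.
import numpy as np
from scipy.optimize import linprog

# Rows = DMUs (e.g. hospitals); columns = inputs and outputs, respectively
X = np.array([[5.0, 14.0],    # inputs: e.g. staff FTEs, beds
              [8.0, 15.0],
              [7.0, 12.0]])
Y = np.array([[9.0],          # output: e.g. case-weighted discharges
              [5.0],
              [4.0]])

def ccr_efficiency(X, Y, k):
    """Efficiency of DMU k: min theta s.t. sum_j lambda_j x_j <= theta x_k,
    sum_j lambda_j y_j >= y_k, lambda >= 0 (theta = 1 means k is on the frontier)."""
    n, m = X.shape                                   # number of DMUs, inputs
    s = Y.shape[1]                                   # number of outputs
    c = np.concatenate(([1.0], np.zeros(n)))         # minimise theta
    # Input constraints: sum_j lambda_j * x_ij - theta * x_ik <= 0
    A_in = np.hstack((-X[k].reshape(m, 1), X.T))
    # Output constraints: -sum_j lambda_j * y_rj <= -y_rk
    A_out = np.hstack((np.zeros((s, 1)), -Y.T))
    res = linprog(c,
                  A_ub=np.vstack((A_in, A_out)),
                  b_ub=np.concatenate((np.zeros(m), -Y[k])),
                  bounds=[(None, None)] + [(0.0, None)] * n,
                  method="highs")
    return res.x[0]

scores = [ccr_efficiency(X, Y, k) for k in range(X.shape[0])]
print(scores)

Bootstrap refinements in the spirit of Simar and Wilson (1998) essentially repeat this calculation many times on resampled pseudo-data in order to approximate the sampling distribution of the scores.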

One of the earliest applications of efficiency measurement techniques was undertaken by Nunamaker (1983), who used the DEA to estimate the technical efficiency of 16 hospitals in the state of Wisconsin, USA. Soon after, Borden (1988) and Sherman (1984) also employed the DEA methodology to compute the technical efficiency scores of hospitals in the USA.

A major limitation of using the DEA comes from the fact that it makes an unverifiable and strong assumption of no measurement error or random variation in output (Newhouse, 1994). In particular, interpretation of DEA-based results may be problematic, as frontiers may be affected by stochastic variance, measurement error, or unobserved heterogeneity of data (Hollingsworth & Peacock, 2008, p. 37).

In the healthcare sector, there are occasions where the healthcare unit’s capacity to deliver services is affected by factors outside the healthcare provider’s control. For example, a pandemic may suddenly break out in a region, medical equipment may break down,[2] or there may be errors in measuring the level of resources used. Since the DEA fails to account for random shocks, it can introduce bias into the efficiency scores (Jacobs et al., 2006, p. 153). Nevertheless, the DEA and its variants are still the most widely used tools in healthcare studies, possibly due to their ease of use and versatility (Jacobs et al., 2006, p. 13).

In the last 20 years, various techniques have been put forward that can be used in conjunction with the DEA to deal with the sensitivity of efficiency scores. One of the most popular among these is the bootstrap methodology introduced by Simar and Wilson (1998, 2007). The bootstrap methodology has, to some extent, addressed the sensitivity of efficiency scores to sampling variation and has provided a basis for deriving the statistical properties of the nonparametric estimators. Recent healthcare studies using bootstrap methodology with the DEA include work by Alonso et al. (2015), Andrews (2020a), Andrews (2020b), Chowdhury and Zelenyuk (2016), and Jiang and Andrews (2020).

Stochastic Frontier Analysis (SFA), on the other hand, is a parametric approach developed independently by Aigner et al. (1977) and Meeusen and Van den Broeck (1977). SFA differs from DEA in assuming that discrepancies between actual and optimal organizational performance are due to both inefficiency and random shocks.

In order to incorporate the concepts of stochastic shocks and inefficiency in SFA, the error term is defined as the sum of two components: a one-sided, non-negative term representing inefficiency and a symmetric term representing random (stochastic) fluctuations. In addition to these distributional assumptions, SFA also requires a specification of the production function. DEA, by contrast, requires neither a functional form for the production function nor distributional assumptions, since the efficiency frontier is constructed purely from observed data (Jacobs et al., 2006, p. 90; Nedelea & Fannin, 2013).
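
For illustration, a common production-frontier form of this composed error, written in the spirit of Aigner et al. (1977) and using a half-normal inefficiency term as one of several possible one-sided distributions, is

$$\ln y_{it} = f(\mathbf{x}_{it}; \boldsymbol{\beta}) + v_{it} - u_{it}, \qquad v_{it} \sim N(0, \sigma_v^2), \quad u_{it} \sim N^{+}(0, \sigma_u^2),$$

where $y_{it}$ is the output of provider $i$ in period $t$, $v_{it}$ is symmetric statistical noise, and $u_{it} \ge 0$ captures technical inefficiency (in a cost frontier the sign on $u_{it}$ is reversed).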

Even though there are challenges associated with SFA’s assumptions and specifications, its ability to separate out random fluctuations beyond a hospital’s control has made it very popular. SFA also allows researchers to estimate the relationships between outputs, inputs, and costs and, when longitudinal data are available, to separate healthcare provider-specific effects (heterogeneity) from time-specific effects. Applying SFA to longitudinal data therefore also allows for more robust parameter estimates.

While the majority of studies in healthcare efficiency literature use classical inferences to estimate the model parameters, Koop et al. (1997) employed Bayesian inference to estimate the model parameters and cost efficiency by using longitudinal data on 382 non-teaching U.S. hospitals. More recently, Chen et al. (2016) used Bayesian SFA to estimate hospital cost efficiency in 31 provinces in China.

Although SFA’s implementation is more demanding in terms of modelling and interpretive skills (Jacobs et al., 2006), it has been gaining prominence in healthcare productivity and efficiency studies (Worthington, 2004). Some healthcare studies employing SFA include the works of Al-Amin et al. (2016), Chen et al. (2016), Colombi et al. (2017), and Jiang and Andrews (2020). Recent additions to this body of literature are the analysis of health facility efficiency for non-communicable diseases by Bala et al. (2023), an examination of hospital cost efficiency in US acute care by Linde (2023), a dynamic analysis of cost efficiency in New Zealand healthcare providers by Andrews and Emvalomatis (2023), and an evaluation of the temporal–spatial evolution of Healthcare Services Efficiency in 31 Chinese provinces by Ye and Tao (2023).

Since the introduction of SFA into the efficiency literature, several approaches have been put forward that employ it in the context of longitudinal data in various other sectors. Early research into longitudinal SFA focused either on estimating time-invariant (persistent, or long-run) efficiency (Battese & Coelli, 1988; Kumbhakar, 1987; Pitt & Lee, 1981; Schmidt & Sickles, 1984) or on time-varying (transient, or short-run) efficiency (Battese & Coelli, 1992, 1995; Cornwell et al., 1990; Kumbhakar, 1990). Other studies, such as those by Kumbhakar and Heshmati (1995) and Kumbhakar and Hjalmarsson (1995), estimated both persistent and transient efficiency. On the other hand, studies by Greene (2005a,b), Kumbhakar and Wang (2005), and Wang and Ho (2010) estimated transient efficiency while accounting for heterogeneity, at the cost of ignoring persistent inefficiency.

Currently, two main approaches exist in the efficiency literature that incorporate the idea of persistence in inefficiency. The difference between the two approaches lies in the treatment of the adjustment cost hypothesis in efficiency analysis. Studies such as those by Colombi et al. (2014), Filippini and Greene (2016), Filippini and Hunt (2015), Kumbhakar and Heshmati (1995), Kumbhakar and Hjalmarsson (1995), Kumbhakar et al. (2014), and Tsionas and Kumbhakar (2014) employ SFA without incorporating adjustment cost theory. Instead, they divide total inefficiency into short-run (transient) and long-run (persistent) inefficiency. In the healthcare literature, studies incorporating the persistent nature of inefficiency are scarce. So far, only one study, by Colombi et al. (2017), has used SFA to evaluate the transient and persistent efficiency of 133 Italian hospitals.

An important point to note about these studies is that the authors differentiate between transient (short-run) and persistent (long-run) inefficiency by specifying a one-sided time-varying error term and a one-sided time-invariant error term, respectively. In such specifications, both the short-run and long-run inefficiency terms require one-sided distributional assumptions that only take positive values, such as the half-normal, exponential, and gamma distributions. While these terms often employ the same type of distributional specification, they are treated as independent of each other. Moreover, the short-run inefficiency estimates in these models are assumed to be independent across time periods.
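
One way to write such a specification, in the spirit of the four-component models of Colombi et al. (2014) and Kumbhakar et al. (2014), is

$$\ln y_{it} = \alpha_0 + f(\mathbf{x}_{it}; \boldsymbol{\beta}) + \mu_i - \eta_i + v_{it} - u_{it},$$

where $\mu_i$ captures random provider-specific heterogeneity, $\eta_i \ge 0$ is persistent (time-invariant) inefficiency, $v_{it}$ is statistical noise, and $u_{it} \ge 0$ is transient (time-varying) inefficiency, with the two inefficiency components assumed independent of each other and of the noise term.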

The motivation behind the existence of long-run inefficiency in these models fundamentally rests on the idea that there are long-run factors that give time-invariant characteristics to persistent inefficiency. Examples of such factors include obsolete production equipment and technology, substandard buildings, substandard transport systems, the continuous lack of workforce development leading to underexploited technologies, and other management rigidities associated with administrative practices.

In other words, long-run inefficiency stems from structural issues that constrain the adoption of efficient practices through operational rigidities over a longer time horizon. These operational rigidities are theorized to be related to physical capacity, infrastructural problems, recurring managerial incompetence, and the availability of modern technology. Though the motivation behind persistence in inefficiency makes economic sense, none of these studies incorporates dependence in inefficiency over time.

Another strand of the efficiency literature provides a more comprehensive and economically intuitive way of combining short- and long-run inefficiencies in SFA through a dynamic process. This approach explicitly identifies the adjustment costs of quasi-fixed inputs as the primary reason for persistence in inefficiency over time. Because of high adjustment costs, an organization may prefer to remain partly inefficient in the short run and instead seek to reach its targeted long-run (i.e. steady-state) efficiency level over time. Furthermore, these models are dynamic, allowing short-run inefficiency to be dependent across periods.

Ahn and Sickles (2000) pioneered the dynamic stochastic frontier approach by specifying an autoregressive process to accommodate persistence in inefficiency due to adjustment costs. The dynamic models they specified use the non-linear generalized method of moments (GMM) to estimate the parameters. However, a study by Bun and Windmeijer (2010) highlights that in dynamic longitudinal data models estimated via GMM, weak instruments – variables that correlate only weakly with the endogenous predictors – become particularly problematic near the unit root boundary, where variables exhibit stochastic trends without reverting to a long-term mean. This issue can lead to a skewed variance ratio of errors – a measure comparing the variability due to inefficiency with that due to other random effects – straying from the ideal value of unity. This divergence affects the model’s accuracy, especially in differentiating between short-run and long-run inefficiencies, and emphasizes the need for careful instrument selection to ensure reliable results.

Desli et al. (2003) put forth a version of the dynamic stochastic frontier model estimated using maximum likelihood methods (ML), assuming that the healthcare provider-specific intercept is autoregressive, with a set of covariates that influence a healthcare provider’s production frontier over time. However, as Khalaf and Saunders (2016) highlighted, this specification is prone to incidental parameter bias due to the correlation between unobserved heterogeneity and efficiency-specific covariates in the latent equation.

Using the Bayesian approach, Tsionas (2006) presented a dynamic model in which an autoregressive process is applied to a transformation of efficiency that can take any value on the real line, so that a standard autoregressive process can be imposed on it. Similarly, using the Bayesian approach, Emvalomatis (2012) used the inverse of the logistic function of technical efficiency as the transformation in the autoregressive process. Building on Tsionas (2006), many other Bayesian dynamic model versions have been presented in studies such as those by Emvalomatis et al. (2011), Galán et al. (2015), Lambarraa et al. (2015), and Skevas et al. (2018).
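
An illustrative specification in this spirit, broadly following Emvalomatis (2012), applies an autoregressive process to the inverse-logistic (logit) transformation of technical efficiency:

$$s_{it} = \ln\!\left(\frac{TE_{it}}{1 - TE_{it}}\right), \qquad s_{it} = \delta + \rho\, s_{i,t-1} + \xi_{it},$$

where $\rho$ measures the persistence of inefficiency and, provided $|\rho| < 1$, the transformed efficiency converges to a long-run (steady-state) value of $\delta/(1-\rho)$, which maps back to a long-run efficiency level through the logistic function. The covariates entering $\delta$ and the estimation details differ across the Bayesian studies cited above.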

These dynamic models are motivated by the adjustment cost hypothesis, where short-run efficiency is derived from the organization’s performance relative to the production possibility frontier, while long-run efficiency corresponds to the long-run equilibrium value of efficiency implied by the autoregressive process. Hence, the dynamic model is more flexible, as it accommodates the dependence of transient efficiency across periods. No such dependence is allowed in models where transient efficiency is assumed to be a one-sided time-varying error component. Further, in the non-dynamic model, short-run efficiency is obtained from a system that is always assumed to be in equilibrium. The assumption of constant equilibrium is unrealistic in the presence of adjustment costs and other rigidities arising from the sector’s regulatory framework.

To the best of current knowledge, no studies have incorporated the idea of dynamic models in assessing healthcare providers’ efficiency or productivity performance. Jacobs et al. (2006, pp. 174–176) argue that in the short run, healthcare providers might only be able to perform relative to the various constraints imposed by infrastructure and available inputs (e.g. quality of clinical equipment and technology). Therefore, short-run efficiency levels should only be assessed based on the configuration of the inputs that a healthcare provider has available. On the other hand, healthcare providers may reconfigure their resources to bring about efficiency improvements in the long run. This implies that the healthcare production process should be modelled through a dynamic link between its present and past performance (Jacobs et al., 2006, pp. 177–178).

Specifying a dynamic link makes it possible for the current output of a healthcare provider to depend on the ease with which inputs can be reconfigured or new technology adopted in the presence of adjustment costs. Given the prevalence of public finance as a fundamental source of healthcare services in the majority of developed countries and the existence of a highly regulated operating environment (Jacobs et al., 2006, p. 3), it is surprising how little attention is paid to this dynamic link in healthcare studies.

A possible reason for this might be the complexity associated with the estimation of dynamic longitudinal models and the small sample sizes that are prevalent in healthcare efficiency studies (Jacobs et al., 2006, pp. 37–38). Nevertheless, Colombi et al. (2017) and Hollingsworth and Street (2006) argue that identifying the nature and form of inefficiency in healthcare systems is critical for formulating appropriate policy measures. For example, if a high degree of persistence in inefficiency exists, especially among public healthcare providers, then unless the current organizational structure is reconfigured and/or there is a significant change of government policy towards the overall system, efforts to improve efficiency will not yield the expected outcomes.

A selective list of healthcare efficiency studies from the early 1990s to 2020 that have used frontier-based approaches is presented in Table A1. While not a complete list, it provides a fair representation of the methodology used in the last three decades. A comprehensive list of frontier-based healthcare efficiency studies can be found in Hollingsworth and Peacock (2008, pp. 102–117) and Worthington (2004).

It is also worth noting that, of the 40 studies listed in Table A1, 36 applied DEA with or without bootstrapping, and 17 of these used longitudinal data with no control for unobserved or unit-specific heterogeneity. This is likely to result in a substantial bias in the measure of efficiency. Additionally, of the 13 studies that used SFA on longitudinal data, only Barros et al. (2013), Chen et al. (2016), Colombi et al. (2017), and Koop et al. (1997) controlled for unobserved heterogeneity.

2 Previous Contributions to TFP Studies in Healthcare

When longitudinal data are available, it is insightful to investigate the changes in productivity over time and to decompose it into its components to investigate the relative contributions. In the healthcare sector, such a study will help analyse the effect of targeted policies on the provision of health services and ultimately determine the impact of various initiatives on the population’s health outcomes. For example, suppose the intention is to examine the productivity of a group of healthcare providers. In that case, one could determine whether productivity change for a specific healthcare provider is driven by improvement in the provider’s relative efficiency, scale improvement, or technological progress.

In practice, TFP can be estimated either by using index number methods or econometric techniques. Examples of index number methods include the Malmquist productivity index, the Hicks–Moorsteen productivity index, the Törnqvist productivity index, and the Fisher productivity index (Jacobs et al., 2006, p. 129). In the econometric approach, regression analysis or SFA is used to estimate a production or cost function, with appropriate distributional assumptions, from which the TFP change and its components are derived.

A selective list of healthcare studies focusing on TFP change and its components is provided in Table A2. This list summarizes, among other things, the methodology, variables, and results of several healthcare studies related to the assessment of TFP and its components. While the studies in Table A2 decompose TFP changes into efficiency, scale, and technological components, they tend to concentrate on the relative contribution of efficiency and technological change to changes in the TFP.

The list of studies in Table A2 shows the Malmquist productivity index (Malmquist, 1953) to be the most common approach to healthcare productivity. Malmquist’s productivity index was introduced into the literature through the seminal study by Caves et al. (1982), which adopted Malmquist’s approach to constructing quantity indices as distance function ratios.
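
For reference, the output-oriented Malmquist productivity index between periods $t$ and $t+1$ is commonly computed as the geometric mean of two distance-function ratios and decomposed into efficiency change and technical change (Färe et al., 1992):

$$M_o = \left[\frac{D_o^{t}(x_{t+1}, y_{t+1})}{D_o^{t}(x_{t}, y_{t})} \cdot \frac{D_o^{t+1}(x_{t+1}, y_{t+1})}{D_o^{t+1}(x_{t}, y_{t})}\right]^{1/2} = \frac{D_o^{t+1}(x_{t+1}, y_{t+1})}{D_o^{t}(x_{t}, y_{t})} \left[\frac{D_o^{t}(x_{t+1}, y_{t+1})}{D_o^{t+1}(x_{t+1}, y_{t+1})} \cdot \frac{D_o^{t}(x_{t}, y_{t})}{D_o^{t+1}(x_{t}, y_{t})}\right]^{1/2},$$

where $D_o^{t}(\cdot)$ denotes an output distance function relative to the period-$t$ frontier, the first term of the right-hand decomposition measures efficiency change, the bracketed term measures technical change, and values of $M_o$ greater than one indicate productivity growth.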

Färe et al. (1992) undertook one of the earliest applications of the Malmquist productivity index in healthcare. The study analysed the productivity changes of a group of pharmacies in Sweden and concluded that most of the improvements in the TFP were due to technological progress. Subsequent healthcare studies using the Malmquist productivity index (Linna, 1998; Maniadakis et al., 1999; Ng, 2011; Tambour, 1997) also found positive technological progress.

However, following the study by Färe et al. (1992), another application of the Malmquist productivity index was undertaken by Burgess and Wilson (1995). Their study found that technological decline dominated the effect of technical efficiency on TFP for a group of 137 hospitals in the USA. Other studies, such as those by Giuffrida (1999), González and Gascón (2004), and Jiménez et al. (2003), have also either reported technological regress or no significant impact of technological change on the TFP.

As for the effect of efficiency changes on TFP, studies by Dismuke and Sena (1999), Gannon (2008), Linna (1998), and Maniadakis and Thanassoulis (2000) found that both efficiency and technological progress contributed to the increase in the TFP. On the other hand, Giuffrida (1999), who assessed the TFP growth of 90 English family health service authorities over the period 1991–1995, found that only technical and scale efficiency contributed to TFP growth, while there was no noticeable technological progress. Further, the study highlighted that improvements in the TFP were minimal, suggesting limited scope for productivity growth in the healthcare sector.

Two Malmquist index studies have incorporated the quality of healthcare outputs in the assessment of the TFP. The earliest study was by Färe et al. (1995), which showed that the incorporation of quality significantly affects the measure of TFP change. More recently, Karmann and Roesel (2017), using a sample of German hospital data, found that quality improvements contributed more to TFP growth than output volumes did.

While Malmquist indices are frequently utilized in healthcare studies and demand data on inputs and outputs to be consistently measured over time (Jacobs et al., 2006, p. 137), their application in healthcare is challenging due to frequent policy shifts and variable data collection practices. It is important to note that, although the computation of Malmquist indices does not require the assumption of constant returns to scale (CRS), they are often calculated under this assumption. This common practice, albeit not a necessity, is primarily to facilitate the evaluation of productivity changes inclusive of scale effects. This approach, while not always ideal, allows for a simplified analysis of scale efficiency alongside efficiency and technical changes (Coelli et al., 2005, p. 293).

However, very few healthcare studies use econometric approaches to assess improvements in productivity and its components. Morikawa (2010) used a fixed-effects panel-data model to estimate the effects of increasing hospital size on TFP, based on data from 239 Japanese medical facilities. The study found that TFP increases by more than 10% when the size of a hospital doubles. Using time-series regression analysis, Blank and Eggink (2014) analysed productivity improvements using Dutch hospital data for the period 1972–2010. Their study examined the effects of regulation on changes in the TFP and found that hospital competition reform failed to improve productivity among hospitals. Both of these studies excluded inefficiency from the cost/production functions. Within the econometric approach, only one study, by Dismuke and Sena (1999), used the SFA to decompose the TFP change for a group of hospitals in Portugal; they found technical progress for most of the hospitals.

One of the advantages of using SFA is that it allows for the control of unobserved time-invariant heterogeneity when computing cost and input elasticities. Further, it offers an opportunity to decompose productivity changes into parts that have a straightforward economic interpretation. Despite these advantages, SFA is currently not widely used in the healthcare sector to assess changes in the TFP and its components.

Another noteworthy point is that all the studies in Table A2, except for Linna (1998), have used the primal approach to estimate the TFP changes and their components. No study has undertaken a TFP analysis under both the primal and dual approaches. In TFP analysis, the primal and dual approaches offer distinct methodologies for estimating productivity growth. The primal approach, or output-based method, focuses on quantifying the output growth relative to the growth of inputs, directly measuring the changes in the quantities of inputs and outputs. It essentially examines the production function to derive productivity changes.

The dual approach, in contrast, is often characterized by the use of total cost as the dependent variable, with the prices of inputs and outputs serving as independent variables. This input-based or price-based method derives TFP growth by analysing how input and output prices impact the cost structure. It looks at the cost function, focusing on the relationship between costs and prices to deduce productivity changes.
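
In growth-rate form, and under standard assumptions such as competitive markets and constant returns to scale, the two approaches can be summarized as

$$\text{(primal)} \quad \widehat{TFP} = \hat{y} - \sum_{j} s_j \hat{x}_j, \qquad \text{(dual)} \quad \widehat{TFP} = \sum_{j} s_j \hat{w}_j - \hat{p},$$

where hats denote growth rates, $\hat{y}$ and $\hat{p}$ are the growth rates of output quantity and output price, $\hat{x}_j$ and $\hat{w}_j$ are the growth rates of input quantities and input prices, and $s_j$ are the corresponding cost shares. Under these assumptions the two expressions coincide, which is the sense in which the approaches are theoretically consistent.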

The duality theory, as outlined by Jorgenson and Griliches (1967), suggests that these two approaches should theoretically yield consistent results since they are both grounded in the same economic behaviour but viewed from different perspectives. However, practical differences can arise in their outcomes due to factors like measurement errors, model misspecification, or data inconsistencies (Kee, 2004). For instance, the primal method might be skewed by inaccurate measurements of physical quantities, whereas the dual method could be influenced by price fluctuations or market changes that affect costs and the pricing of inputs and outputs. Hence, although the primal and dual methods are theoretically aligned, their empirical implementation can yield divergent conclusions, underscoring the need for meticulous data scrutiny and precise model specification in TFP research. Thus, applying both methods can provide a comprehensive view and help identify the reasons for variations in TFP estimations.

3 Variables Used in Healthcare Efficiency and Productivity Studies

3.1 Output Variables

The measurement of output in the healthcare sector is not straightforward, as the demand for healthcare services arises from the need to improve health status. A healthcare institution combines resources such as labour and capital to provide healthcare services, which individuals then consume, leading to improved health (Hollingsworth & Peacock, 2008, p. 21). Therefore, production and efficiency analysis should ideally be based on improving the population’s health status (Jacobs et al., 2006, p. 22).

Proponents of using health outcomes in efficiency and performance analyses argue that improving health outcomes is the primary purpose of delivering health services. While the argument is convincing, it is difficult to apply consistently in practice. Quality-of-life measures are often formulated using different key indicators and methodologies (Hollingsworth & Peacock, 2008, p. 24). Furthermore, healthcare outcomes may take years to be realized, and the collection of healthcare outcome data may impose impractically high costs on the health system (Jacobs et al., 2006, p. 27). Additionally, the expected improvement in an individual’s health status depends on other factors which may be outside the healthcare providers’ control.

Due to the practical difficulties involved in measuring healthcare outcomes and the associated costs of collection, various measures of healthcare activity are used as proxies for healthcare outcomes (Jacobs et al., 2006, p. 27). These proxies measure healthcare outputs in terms of inpatient care episodes, outpatient visits, and the length of inpatient stay (Hollingsworth & Peacock, 2008, p. 24). For example, measures of healthcare activity can include a count of the patients admitted, surgical procedures performed, outpatient numbers, or immunizations given.

The studies in Tables A1 and A2 show that almost all healthcare productivity and efficiency studies use healthcare activities to measure healthcare outputs. The exceptions are studies that focus on measuring healthcare productivity at the regional or cross-country level. For example, Cozad and Wichmann (2013) used state-level data from U.S. hospitals, including survival rates, health status, and the population share without disabilities, to measure technical efficiency. Similarly, Kinfu (2013) used data on under-five mortality rates for 52 districts in South Africa. In a cross-country analysis, Cetin and Bahce (2016) used life expectancy and infant mortality rates from 26 OECD countries to assess those countries’ relative technical efficiency using DEA.

From Tables A1 and A2, it is also evident that the majority of studies used inpatient admissions or discharges as one of the measures, along with some version of outpatient visits. A handful of studies also used ancillary services, such as the number of X-rays taken (Pilyavsky & Staat, 2008), laboratory tests performed (Athanassopoulos & Gounaris, 2001; Pilyavsky & Staat, 2008), and ambulatory visits (Ancarani et al., 2009; Burgess & Wilson, 1995; Chowdhury & Zelenyuk, 2016), as measures of healthcare outputs.

The number of inpatients can be considered the most critical measure of hospital output in terms of resource consumption. The measurement of inpatient services can be further divided into the number of admissions, the number of inpatient bed days, and the number of separations.[3] While the majority of studies use separations as a measure of output, a few studies, such as those by Mutter et al. (2008), Pilyavsky and Staat (2008), and Pilyavsky et al. (2006), use inpatient admissions as a measure of output.

In an attempt to incorporate case complexity and resource use into the measurement of healthcare outputs, studies often use the number of inpatient days. One of the earliest studies to use this variable is by Grosskopf and Valdmanis (1987), who used acute and intensive care inpatient bed days, along with other variables, to assess the efficiency of 80 hospitals in California, USA. More recently, Giménez et al. (2019) and Jiang et al. (2017) included inpatient days as one of the output variables in evaluating efficiency levels.

However, the use of inpatient days still does not fully capture case complexity and can only be considered a crude measure (Hollingsworth & Peacock, 2008, p. 24). For example, a one-day inpatient stay in a geriatric ward cannot be counted as equal to a one-day stay by a newborn in a paediatric ward; the treatments and costs differ greatly depending on patients’ health conditions and characteristics. Nevertheless, inpatient days and treatments provide some reliability in terms of output measurement, even though they do not fully reflect the heterogeneity of outputs (Hollingsworth & Peacock, 2008, p. 25).

In assessing healthcare efficiency, studies like Thanassoulis et al. (2016, 2020) take an alternative approach by examining the inpatient episode at the individual patient level. Here, each case is inherently homogenized by its diagnosis, negating the need for “case mix” adjustments. This method treats each inpatient episode as a distinct entity categorized by a specific diagnostic group, thus simplifying comparisons. Nonetheless, it is crucial to ensure the average length of stay is accurately considered, as this can significantly impact the assessment of healthcare productivity.

However, it is important to recognize that not all studies can utilize such a fixed categorization. As Hollingsworth and Peacock (2008) explain, individual inpatient stays often differ greatly due to the complexity and severity of the patient’s condition. This variation has necessitated the development of “Case-Mix” adjusted outputs, which standardize the outputs by accounting for the varying severity of patient conditions. Commonly, these adjusted measures are based on Diagnosis Related Groups (DRGs), providing a more equitable comparison across diverse patient groups. While DRGs offer a solution to output measurement inconsistency, they also introduce an additional layer of complexity to the analysis, hence the merit in considering both methodologies for a comprehensive review.

The DRG system was pioneered by Fetter et al. (1980) who stated that “the primary objective in the construction of the DRGs was a definition of case types, each of which could be expected to receive outputs or services from a hospital” (p. 5). In other words, DRG is essentially a statistical system of classifying any inpatient stay by considering the diagnosis involved and the hospital resources necessary to treat the condition. Each DRG is assigned a specific price based on at least five characteristics: the person’s age, their primary and secondary diagnosis, the primary and secondary surgical procedures and, in some cases, the gender of the patient (Fetter et al., 1980).

A study by Rosko and Chilingerian (1999) found that the inclusion of case-mix output variables reduces the mean efficiency score by more than 50%. Another study, by Björkgren et al. (2004), showed that efficiency scores vary considerably depending on the case-mix adjustments used for inpatient services. The popularity of case-mix output measures has grown since they were first used by Wagstaff (1989) to assess the efficiency of 49 Spanish hospitals.

Similarly, using DRG-based case-mix measures, Brown (2003) separated discharges into three classes to account for relative resource use and the complexity of treatments. Soon after, Linna et al. (2006) used DRG-based output measures to assess hospitals’ efficiency in Norway and Finland. A more recent study on hospitals in Ontario used case-mix adjusted weighted inpatient bed days and ambulatory visits as outputs (Chowdhury & Zelenyuk, 2016). In New Zealand, studies by Andrews (2020a,b) and Jiang and Andrews (2020) have used DRG-based case-weighted inpatients and price-weighted outpatient visits as measures of outputs.

3.2 Inputs and Price Variables

In healthcare literature, the measurement of inputs tends to be relatively less challenging than outputs, as physical inputs can often be measured more precisely than outputs (Jacobs et al., 2006, p. 29). In its simplest form, the production of healthcare services involves combining resources such as labour, capital, and other intermediary inputs to produce healthcare services, which the individuals then consume to improve their health status (Hollingsworth & Peacock, 2008, p. 21).

As healthcare is a labour-intensive sector, the contribution of medical and non-medical staff is crucial in providing services to the population. The measures of labour input used in efficiency and productivity studies vary significantly. Of the 40 studies on technical efficiency presented in Table A1, 18 used staff numbers and 15 used full-time equivalents (FTEs) to account for labour consumption. The use of headcounts may not be appropriate, as the number of labour units does not account for actual workforce use; most importantly, headcounts do not reflect the actual time spent doing tasks. Moreover, counts obscure the mix of staff who are employed on a part-time or casual basis or who work overtime (Peacock et al., 2001). In such cases, the FTE measure is more appropriate than headcounts for accounting for the mix of various types of staff time.

Another consideration in the healthcare efficiency literature relates to the level of labour disaggregation, based on skill level, that is deemed appropriate. Jacobs et al. (2006, p. 30) argue that unless there is a particular interest in analysing the input relationships or addressing specific policy-related questions, it may be reasonable to aggregate labour inputs by weighting them according to their relative wages. However, such data on labour prices might not always be available. Studies such as those by Mitropoulos et al. (2015) and Sommersguter-Reichmann (2000) appear to have used an unweighted aggregate measure of labour inputs. Andrews (2020b) used disaggregated labour data on medical, nursing, allied, support, and management staff, while Andrews (2020a) and Jiang and Andrews (2020) weighted and aggregated the FTEs of nurses, allied, support, and management staff based on their relative prices.

If the research interest is to extract input elasticities or to measure how each labour group’s input interacts with the others or with outputs, it may be appropriate to use disaggregated data. This may be particularly important when the SFA is used to estimate efficiency and the relationships between various inputs and outputs.

According to the studies in Tables A1 and A2, most hospital-based efficiency and productivity studies have at least disaggregated labour inputs into doctors, nurses, and all other staff. On the other hand, studies such as those by Ahmed et al. (2019), Alonso et al. (2015), and Ancarani et al. (2016) have included only doctors and nurses as labour inputs, ignoring the contribution of administrative and other labour groups. Once again, this might be due to the unavailability or inconsistency of data for various non-medical skill groups.

Jacobs et al. (2006, p. 30) suggest that labour inputs can also be measured in terms of expenditure, as physical inputs fail to capture variations in the wage rates between different labour groups and organizations. Studies by Giokas (2001) and Steinmann and Zweifel (2003) used labour expenditure data as a proxy for input consumption to estimate technical efficiency. However, due to commercial and political sensitivity, access to financial data may be limited in many cases.

Capital is the second crucial input factor needed to undertake any kind of productivity and efficiency study. However, measuring capital in the healthcare sector is more complicated than measuring labour input, owing to the difficulty of distinguishing and measuring the flow of capital services from the capital stock at any given time. As a result, researchers often rely on very rudimentary measures, such as hospital beds, depreciation, or hospital floorspace. Ideally, the best indicator of capital input is the flow of capital services from the capital stock (Jacobs et al., 2006, pp. 31–32). However, such services are hard to measure in practice, and the associated data are challenging to obtain.

In the healthcare literature, the number of beds is the most widely used proxy for capital stock (Worthington, 2004). Based on the studies listed in Tables A1 and A2, 26 out of 40 studies on technical efficiency and nine out of 19 studies on TFP decomposition use beds to measure capital stock. Some recent studies that use hospital beds include those by Ahmed et al. (2019), Colombi et al. (2017), and Sultan and Crispim (2018). Though widely used, hospital beds are far from an ideal measure and can lead to an overestimation of capital use, which may result in biased estimates of efficiency (Jacobs et al., 2006, p. 32).

Another, rarely used, measure of capital is the “capital charge,” which was first used by Parkin and Hollingsworth (1997) to evaluate the technical efficiency levels of 75 Scottish hospitals. According to The Treasury (2001), the capital charge is a cost levied on the Crown’s investment in various government agencies. It implies that capital is not costless and should be managed in the same manner as any other cost of production. The capital charge is usually levied on the organization’s net worth (assets minus liabilities). A New Zealand study by Jiang and Andrews (2020) also used the capital charge as a measure of capital input to estimate the technical efficiency of 20 District Health Boards (DHBs).

While incorporating capital in efficiency and productivity analysis can take various forms, Coelli et al. (2005, pp. 149–150) recommend conducting a sensitivity analysis of efficiency scores to different choices of capital measure, which can help establish the reliability of the chosen measure. In both DEA and SFA models, capital and other expenses can be treated as distinct inputs, allowing analysts to explore the flexibility in input trade-offs and their implications for efficiency assessments.

According to the healthcare studies in Table A1, while the majority undertake technical efficiency analysis, only a handful evaluate cost and allocative efficiency. Estimates of cost and allocative efficiency can offer insights into how successfully healthcare providers minimize costs. However, evaluating cost/allocative efficiency requires information on input prices for labour and capital.

The measurement of labour prices in healthcare productivity and efficiency studies takes different forms, given that researchers often have to work with data whose availability is limited or varies significantly across healthcare providers. Based on the studies listed in Tables A1 and A2, the most common way of computing the price of labour is to divide labour expenditure by the corresponding FTEs, which gives the price per FTE for a particular professional group. Some recent cost efficiency studies that use the price per FTE as a proxy for the labour price include those of Al-Amin et al. (2016), Jiang and Andrews (2020), and Widmer (2015).

Another set of studies, such as Araújo et al. (2014), Friesner et al. (2008), and Vitaliano and Toren (1994), used the average wages paid to the staff as a measure of labour price. In contrast, Blank and Valdmanis (2005), Koop et al. (1997), and Medin et al. (2011) used wage indices as a proxy for labour price. Further, whether labour price should be disaggregated by skill mix once again depends on the research question and the context of the study.

As with capital input, the incorporation of the capital price into efficiency analysis is more problematic than the labour price (Folland & Hofler, 2001). Ideally, the price of capital should reflect the price of the flow of capital services, which is not straightforward to measure. Further, according to the neoclassical theory of investment, the price of capital is the rental price of capital, which, under a profit-maximization assumption, is equal to the marginal product of capital (Eisner & Nadiri, 1968). Therefore, the rental price of capital can be considered the sum of the interest rate, or borrowing cost of capital, and the depreciation rate (Harcourt & Riach, 1997). Surprisingly, the neoclassical concept of the rental price is seldom used in healthcare studies as a proxy for the capital price. The only exception is the study by Chen et al. (2016), where the ratio of depreciation to total assets (the depreciation rate) is used to estimate the capital price. The standard approach in healthcare studies appears to be to divide capital expenditure by a real measure of the capital stock.
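
In its simplest textbook form, abstracting from capital gains and taxes, this rental (user-cost) price can be written as

$$r_K = P_K\,(i + \delta),$$

where $P_K$ is the acquisition price of the capital asset, $i$ is the interest or borrowing rate, and $\delta$ is the depreciation rate. The healthcare proxies described below effectively replace these components with observed depreciation and interest (or capital-charge) expenditure scaled by beds, floor area, or discharges.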

According to Webster et al. (1998), the user cost of capital is often reflected in depreciation and the opportunity cost of capital (capital charge and interest expenditure). Following the user-cost idea to compute the capital price, Al-Amin et al. (2016), Mutter et al. (2008), Nedelea and Fannin (2013), and Rosko (2001) used the sum of depreciation and interest costs per bed to estimate the capital price. Another study, by Friesner et al. (2008), used the sum of depreciation and interest costs divided by the hospital building area in square feet to estimate the price of capital. Similarly, in New Zealand, Jiang and Andrews (2020) used both depreciation and the capital charge per inpatient discharge to estimate the capital price and conducted a sensitivity analysis of the two price measures.

The third category of prices needed to estimate cost efficiency relates to intermediate inputs such as supplies (clinical and non-clinical) and other operating costs. Since data on the volume of intermediate inputs are often unavailable, the standard approach is to divide the aggregate expenditure by the number of hospital beds (Webster et al., 1998). Herr (2008) used the cost of clinical expenditure per installed bed to measure the price of clinical materials. On the other hand, Widmer (2015) computed the operating price per hospital admission, whereas Jiang and Andrews (2020) estimated the price per inpatient discharge. In another study, Friesner et al. (2008) used a producer price index to estimate the price of hospital supplies.

4 Summary

The literature review examines various methodologies applied in healthcare efficiency and productivity studies, revealing a variety of analytical approaches yet punctuated by significant gaps. While Data Envelopment Analysis (DEA) and Stochastic Frontier Analysis (SFA) have been prominent tools in this research domain, their utilization with longitudinal data, which is critical for understanding performance over time, remains limited. This gap suggests a potential underutilization of available data and methodologies to capture the dynamic nature of healthcare efficiency.

Moreover, the literature lacks studies that concurrently apply primal and dual approaches to measure Total Factor Productivity (TFP) changes. The primal approach, which focuses on output relative to input growth, is commonly contrasted with the dual approach that examines the impact of input prices and output on cost structures. The simultaneous application of both methods could offer a comprehensive analysis, reconciling the quantity-based and price-based perspectives of productivity growth. It could also inform discrepancies arising from external influences, including policy shifts and economic conditions, that singularly employed methodologies might overlook.

The review also highlights the variability in the application of inputs, outputs, and price variables in healthcare productivity and efficiency studies. This variability is not merely a methodological preference but often a response to the challenges of data acquisition and the particular objectives of research. The sector’s complexity necessitates carefully selecting these variables, where data availability and consistency constrain choices. This limitation impacts not only the precision of efficiency assessments but also the validity of subsequent policy recommendations.

A significant literature gap is identified in the control (or lack thereof) for cross-sectional unobserved heterogeneity in longitudinal studies. This methodological oversight can yield efficiency measurements fraught with bias, a concern highlighted by Greene (2005a) and others. The nuanced understanding of healthcare provision, influenced by a myriad of patient, provider, and systemic factors, calls for an analytical framework that accounts for these unobserved elements.

In the limited instances where studies like that of Colombi et al. (2017) have differentiated between short-run and long-run inefficiency, there remains an absence of analyses that model efficiency dynamically, acknowledging the inter-temporal dependencies of inefficiencies. Such dynamic modelling is crucial for capturing the true nature of efficiency evolution within healthcare providers, where past performances can cast long shadows on future productivity.

The review identifies that the majority of healthcare efficiency studies have traditionally focused on technical efficiency, often using inpatient admissions or discharges as proxies for output. However, there is an emerging recognition of the need to evaluate cost and allocative efficiency, which requires detailed information on input prices for labour and capital. The latter is particularly challenging in the healthcare sector, where capital is not just a static measure of infrastructure but a dynamic resource that evolves with technological advancements and strategic investments.

The complexity of healthcare services, which combine labour, capital, and materials to produce outcomes that impact health status, is mirrored in the complexity of measuring these very inputs and outcomes. Labour inputs, from medical to non-medical staff, contribute vitally to service delivery, yet their measurement varies from headcounts to full-time equivalents (FTEs), with the latter providing a more nuanced representation of labour utilization.

Capital, as the second crucial input, has typically been measured by proxies such as the number of beds or floor space, but these measures fail to capture the nuances of capital service flows. The “capital charge” model is an alternative that reflects the economic cost of capital, yet its adoption is not widespread in healthcare studies. Such measures are necessary to reflect the actual use and cost of capital more accurately in healthcare settings.

In conclusion, this literature review not only maps out the existing methodologies and applications in healthcare efficiency studies but also casts a spotlight on the avenues for methodological improvement. The identified gaps underscore the need for a more dynamic, nuanced approach to efficiency and productivity analysis that can accommodate the sector’s complexity. By addressing these gaps, future research can offer more detailed insights into the temporal dynamics of healthcare efficiency, ultimately informing policy decisions that could lead to better health outcomes and more efficient use of resources.

  1. Funding information: Funding for open access to this research was provided by Ajman University under request number 1560631.

  2. Conflict of interest: The authors report no competing interests to declare.

  3. Article note: As part of the open assessment, reviews and the original submission are available as supplementary files on our website.

Appendix

Table A1 and Table A2

Table A1

Previous contribution to healthcare technical, allocative, and cost-efficiency studies

Study Country Efficiency type Facility type and period Methodology Variables
Vitaliano and Toren (1994) USA Cost efficiency 604 Nursing and other health-related facilities. Data relates to the years 1987 and 1990 SFA Outputs: patient days, admissions and transfers. Prices: wages of nursing aides, nurses and property expenses per square foot of a nursing home. An indicator variable for the type of owner and a variable for controlling quality was also used
Koop et al. (1997) USA Cost efficiency 382 Non-teaching hospitals from 1987–1991 Bayesian SFA Outputs: number of discharges, inpatient days, beds, outpatient visits and case mix index. Prices: wage-price index
Fried et al. (1999) USA Technical efficiency 990 Nursing homes. Data relates to the year 1993 DEA Outputs: inpatient days. Labour inputs: registered nurses (FTEs), licensed practical nurses (FTEs), other personnel (FTEs). Other input: non-payroll expenses
Maniadakis and Thanassoulis (2000) Scotland Technical, allocative and scale efficiency 75 Acute hospitals over the period 1992 to 1996 DEA Outputs: accident and emergency attendances, case-mix adjusted outpatient attendances, day cases and inpatient discharges. Labour inputs: FTEs of doctors, nurses and other personnel. Capital inputs: hospital beds and the cubic metres of the hospital buildings
Giokas (2001) Greece Technical efficiency 91 Hospitals (72 general and 19 teaching hospitals) for the year 1992 DEA & SFA Outputs: inpatient days, outpatient visits and ancillary services. Labour input: total staff earnings. Other input: expenditure on operating services and supplies
Rosko (2001) USA Cost efficiency 1631 Urban hospitals for the period 1990–1996 SFA Outputs: outpatient visits and case-mix adjusted inpatient discharges. Labour prices: average annual salary per FTE employee. Capital price: depreciation and interest expenses per bed
Athanassopoulos and Gounaris (2001) Greece Technical and allocative efficiency 98 Public hospitals in the year 1992 DEA Outputs: medical patients, surgical patients, medical examinations and laboratory tests. Labour inputs: a count of medical, administrative and nursing personnel. Other inputs: operating and pharmaceutical costs, medical supply and other supply costs. Prices: only the labour price: average annual costs per hospital employee
Steinmann and Zweifel (2003) Switzerland Technical efficiency 89 Swiss hospitals covering the years 1993–1996 DEA Outputs: inpatients days. Labour inputs: expenditure on academic, nursing and administrative staff. Other input: non-labour expenditure
Brown (2003) USA Technical efficiency 613 Hospitals relating to years 1992–1996 SFA Outputs: Case-mix discharges. Labour inputs: The FTEs of employees. Capital input: total beds and total expenses minus labour expenses are proxies for capital equipment. Indicator variables for year-specific effects, profit and public hospitals were used
Chang et al. (2004) Taiwan Technical efficiency In 1996, 43 regional hospitals and 440 district hospitals; in 1997, 44 regional hospitals and 429 district hospitals DEA Outputs: patient days, outpatient visits and surgeries. Labour inputs: the number of physicians, nurses and ancillary service personnel. Capital input: number of beds
Blank and Valdmanis (2005) The Netherlands Cost efficiency 71 Homes for the disabled. Data for the year 1998 DEA Outputs: number of patient days. Inputs: number of general personnel, nursing and medical personnel, auxiliary personnel and weighted material supplies costs. Input prices: the regional price index was used
Pilyavsky et al. (2006) Ukraine Technical efficiency 61 Community hospitals DEA-bootstrap Outputs: number of medical admissions and surgical admissions. Labour inputs: number of physicians and nurses. Capital inputs: number of hospital beds
Aletras et al. (2007) Greece Technical and scale efficiency 51 General hospitals for the years 2000 and 2003. DEA Outputs: case-mix adjusted inpatient cases, outpatient visits and surgical operations. Labour inputs: FTEs of medical and other staff. Capital input: staffed hospital beds
Linna et al. (2006) Norway & Finland Cost efficiency 47 Finnish and 51 Norwegian public hospitals in 1999 were studied DEA Outputs: weighted discharges, bed days, daycare and outpatient visits. Prices: wage expenditure per FTE employee and an input price index for operating costs
Herr (2008) Germany Technical & cost efficiency 1556–1635 General hospitals each year for 2000 and 2003 SFA Output: weighted hospital cases. Labour inputs: FTE counts of doctors, nurses, and other staff. Capital input: number of beds. Labour input prices: cost for each labour group divided by respective FTEs. Capital price: costs for all medical requirements (pharmaceutical drugs, medical instruments, transplants, etc.) divided by the number of installed beds. Various exogenous variables are included to control for observable heterogeneity and to measure the effects on inefficiency
Pilyavsky and Staat (2008) Ukraine Technical efficiency 193 Community hospitals and polyclinics for the years 1997–2001 Order-m estimator (related to FDH/DEA) Hospital Outputs: admissions and surgical procedures. Polyclinics Outputs: admissions, surgical procedures, laboratory tests and X-rays. Hospital inputs: a count of nurses, physicians and beds. Polyclinics Inputs: a count of nurses and physicians
Mutter et al. (2008) USA Cost efficiency 1,290 Urban hospitals in 20 states operating in 2001. SFA Outputs: inpatient admissions, outpatient visits and patient days in nonacute care units. Labour price: average salary and benefits per FTE employee. Capital price: depreciation and interest expenses per bed. Quality variable: teaching and the excess in-hospital mortality rate index
Friesner et al. (2008) USA Technical, allocative & scale efficiency 80 Hospitals and 1076 observations, balanced longitudinal data for the period 1998–2001 DEA Outputs: case-mix outpatient visits, inpatient days. Inputs: hospital beds, square feet of hospital, and paid labour hours. Labour price: average real wage paid by the hospital. Intermediate input price: supply expenses divided by the number of licensed beds and the producer price index. Capital price: the sum of interest and depreciation expenses divided by the square footage of the hospital and the producer price index
Shimshak et al. (2009) USA Technical efficiency 38 Rest homes for the year 2003 DEA Outputs: number of residents, classified by their need for assistance with bathing, dressing, transferring, toileting and eating. Inputs: FTEs of nurses, nursing aids, ancillary and administrative staff
Ancarani et al. (2009) Italy Technical efficiency 48 Hospital wards for the year 2004. DEA Outputs: ambulatory visits, discharges and day surgeries. Inputs: number of physicians, non-medical personnel, number of beds, shifts of surgery rooms and maintenance costs of medical equipment.
Herr et al. (2011) Germany Cost, technical & profit efficiency 541 Hospitals for the period 2002–2006 (unbalanced longitudinal data) SFA Outputs: weighted hospital cases. Inputs: FTEs of doctors, nurses, and other staff. Labour input prices: salary of doctors, nurses, and other staff divided by FTEs. Other input prices: administration costs per bed and material cost per bed. Capital input: installed beds
Ng (2011) China Scale and technical efficiency Data for 2004–08 on 463 hospitals (balanced longitudinal) DEA Outputs: outpatients and inpatient cases. Inputs: number of doctors, nurses, pharmacists and other staff. Capital input: number of beds
Medin et al. (2011) Norway, Finland, Denmark and Sweden. Cost efficiency 70 university hospitals in the Nordic countries over 3 years (2002–2004). Unbalanced longitudinal data DEA-bootstrap Outputs: case-mix medical daycare and inpatient discharges, surgical daycare and inpatient discharges, and clinical teaching activities. Inputs: operating costs, costs for physicians and nurses. Input prices: wage index of respective countries. The authors also used quality indicators
Hu et al. (2012) China Technical efficiency Province-level hospital data for 30 provinces for the years 2002–2008 DEA Outputs: number of outpatient and emergency room visits and the total number of inpatient days. Undesirable output: patient mortality. Labour inputs: number of doctors, medical technicians (nurses and physicians), and other personnel (mainly administrative staff). Capital inputs: hospital beds and value of fixed assets
Nedelea and Fannin (2013) USA Cost efficiency Unbalanced longitudinal data for a set of Critical Access Hospitals in the period 1999–2006 DEA-bootstrap Outputs: outpatient visits, admissions, post-admission days, emergency room visits, outpatient surgeries, and total births. Labour input: FTEs of personnel. Capital input: staffed and licensed beds. Labour prices: the price of labour (payroll expenses + employee benefits) divided by total FTEs. Capital price: depreciation expenses plus interest expenses divided by the number of beds in each facility. A quality proxy variable was used in the second stage of truncated regression
Ferrier and Trivitt (2013) USA Technical efficiency 1,074 General acute-care hospitals operating in 2005 DEA Outputs: case-mix measure of inpatient days, emergency room visits, outpatient visits, outpatient surgeries and inpatient surgeries. Inputs: FTEs of registered nurses, licensed practical nurses, medical residents and other labour. Various measures of quality were also used
Barros et al. (2013) Portugal Cost efficiency 51 Hospitals relating to the year 1997–2008 (balanced longitudinal data) Latent class SFA Outputs: number of discharged patients, external consultations and emergency visits. Input prices: ratio of wages to the number of employees and the regional price index. Capital input proxied by the number of beds
Cozad and Wichmann (2013) USA Technical efficiency 48 State-level balanced longitudinal data from 2000 to 2007. DEA-bootstrap Outputs: survival rates, health status, and population share without disabilities. Labour input: number of general practitioners and registered nurses. Capital input: number of hospital beds
Kinfu (2013) South Africa Technical efficiency 52 Districts in South Africa for the year 2001 SFA Outputs: under-five mortality and coverage of birth care. Inputs: per-capita public expenditures on health, health insurance coverage, the proportion of the population with access to safe drinking water, sanitation and waste disposal, the density of hospital beds and the number of health workers
Yang and Zeng (2014) China Technical efficiency 46 Public hospitals for the period 2006–2010 (balanced longitudinal) DEA Outputs: number of outpatient visits and inpatients. Labour inputs: number of doctors, nurses, administrative staff and other staff. Capital input: number of beds
Alonso et al. (2015) Spain Technical efficiency 25 Public hospitals, in the year 2009 DEA-bootstrap Desirable outputs: case-mix adjusted number of discharges and the number of outpatient visits. Undesirable outputs: in-hospital mortality rate and the ratio between patient readmissions and discharges. Labour inputs: FTEs of physicians and nursing staff. Capital input: number of beds
Mateus et al. (2015) England, Portugal, Spain and Slovenia Technical efficiency Portugal (2002–2009) for 102 hospitals. England (2005–2008) for 163 hospitals. Spain (2003–2009) for 287 hospitals. Slovenia (2005–2009) for 19 hospitals SFA Outputs: weighted hospital discharges. Labour inputs: headcounts of physicians, nurses and other employees. Capital input: number of beds
Gok and Altındağ (2015) Turkey Technical efficiency 251 Hospitals for the period 2008–2011 (balanced longitudinal data) DEA Outputs: bed utilization rate, bed turnover rate, total surgical operations, number of births, total outpatient visits, average facility inpatient days, and number of discharges. Labour inputs: number of specialized physicians and non-specialized physicians. Capital inputs: number of hospital beds
Mitropoulos et al. (2015) Greece Technical efficiency 117 General public hospitals for the year 2009. DEA Outputs: numbers of inpatient admissions and aggregated scheduled and emergency outpatient visits. Labour inputs: number of doctors as an aggregation of all specialties of doctors in the hospital, number of other personnel as an aggregation of nurses, administrative and support staff in the hospital. Capital input: number of hospital beds
Cordero et al. (2015) Spain Technical efficiency 132 Primary care providers in the year 2010 DEA Outputs: hospitalization rates. Inputs: number of GPs, nurses and number of prescriptions
Ancarani et al. (2016) UAE Technical efficiency 48 Wards of three main hospitals in Dubai for the year 2013 DEA Outputs: inpatient surgery discharges, inpatient non-surgery discharges and outpatients. Labour inputs: number of doctors and nurses. Capital input: number of beds
Widmer (2015) Switzerland Cost efficiency 333 Hospitals for the period 2004–2009 Bayesian SFA Outputs: number of case-mix adjusted inpatient cases and revenue from outpatient treatment. Labour prices: labour expenditure divided by FTEs. Other input prices: price of other inputs such as energy, material, and purchased services, computed by dividing total costs by the number of admissions
Chowdhury and Zelenyuk (2016) Canada Technical efficiency 113 Acute-care hospitals in Ontario for the years 2003 and 2006 DEA-bootstrap Outputs: ambulatory visits and case-mix weighted inpatient days. Labour inputs: FTEs of nurses and administrative workers. Other inputs: medical/surgical supplies costs and equipment costs. Capital input: number of staffed beds
Cetin and Bahce (2016) OECD countries Technical efficiency 26 OECD countries in the year 2016 DEA Outputs: life expectancy and infant mortality rates. Inputs: number of doctors, beds and health expenditure per capita
Al-Amin et al. (2016) USA Cost efficiency 1108 Hospitals that reported HCAHPS data in both August 2008 and July 2009 SFA Outputs: ratio of emergency department visits to total outpatient visits, the ratio of outpatient surgeries to total outpatient visits, the proportion of total hospital beds classified as acute care, and the ratio of births to total admissions. Labour input prices: the price of labour was approximated by the area average annual salary per full-time-equivalent employee. Capital price: depreciation and interest expenses per bed
Chen et al. (2016) China Cost efficiency Provincial-level hospital data for 31 provinces, 2002–2011 Bayesian SFA Outputs: number of surgeries and total revenue. Input prices: salary expenditure divided by the number of hospital staff. Capital price: total depreciation divided by total assets
Jiang et al. (2017) China Technical efficiency 1105 Hospitals across 31 provinces for period 2008–2012 DEA Outputs: outpatient & emergency visits and inpatient days. Labour inputs: number of physicians, nurses, medical technicians. Capital input: number of open beds
DePuccio and Ozcan (2017) USA Technical efficiency 2212 General medical-surgical hospitals in the year 2012 DEA Outputs: Medicare case-mix adjusted inpatient admissions, outpatient visits, and ED visits. Labour inputs: hospital service-mix, non-physician FTEs. Other input: non-labour operating expenses. Capital input: number of staffed and set-up beds
Colombi et al. (2017) Italy Technical efficiency 133 Acute hospitals during the period 2008–2013 SFA Outputs: hospital annual acute discharges corrected by treatment cost. Labour inputs: annual working hours of physicians, nurses and other workers. Capital input: total beds for acute discharges
Stefko et al. (2018) Slovak Republic Technical efficiency 8 Regions during the period 2008–2015 DEA Outputs: use of beds and average nursing time. Labour inputs: number of medical staff. Other input: quantity of medical equipment, magnetic resonance and computed tomography. Capital input: number of beds
Sultan and Crispim (2018) Palestine Technical efficiency 11 Public hospitals from 2010 to 2015 DEA Outputs: total number of annual care days, annual outpatient visits and cases served without admission. Inputs: FTEs of nurses, technicians, and other employees in paramedical departments and the administrative staff. Capital input: number of hospital beds
Ferreira and Marques (2019) Portugal Technical efficiency 7 Hospitals and 20 hospital centres, operating between 2013 and 2016 DEA Outputs: number of inpatient discharges, emergency cases, first medical appointments, follow-up medical appointments, outpatient surgeries, conventional surgeries, urgent surgeries and number of births. Labour inputs: FTEs of doctors and nurses, and hospital days. Various expenditure items were also used as inputs
Giménez et al. (2019) Mexico Technical efficiency 606 public and 182 private hospitals DEA Outputs: surgical medical procedures, medical consultations, days of stay and hospital discharges. Labour inputs: number of doctors in direct contact with the patient and nurses. Capital inputs: operating rooms and licensed beds
Ahmed et al. (2019) Bangladesh Technical efficiency 62 District hospitals for the year 2015 DEA Outputs: number of women receiving ANC services, regular deliveries, caesarean-section services, PNC services, outpatient visits and inpatient admissions. Labour inputs: number of doctors and nurses. Capital input: number of beds
Jiang and Andrews (2020) New Zealand Technical efficiency and cost efficiency 20 District health boards for period 2011–2017. SFA & DEA Outputs: case-weighted inpatient discharges and price-weighted outpatient visits. Labour inputs: FTEs of medical and weighted nurses and other staff. Capital input: depreciation and capital charges. Intermediate inputs: expenditure on clinical supplies. Labour price: total expenditure divided by FTEs. Capital price: capital charges divided by inpatient discharges. Intermediate input price: total expenditure divided by inpatient discharges
Andrews (2020a) New Zealand Technical efficiency 20 District health boards for the period 2011–2017 DEA-bootstrap Outputs: case-weighted inpatient discharges and price-weighted outpatient visits. Labour inputs: FTEs of medical, nurses, allied, support and management staff. Capital input: capital assets value. Intermediate inputs: clinical supply expenditure
Andrews (2020b) New Zealand Technical efficiency 20 District health boards for period 2011–2018 DEA-bootstrap Outputs: case-weighted inpatient discharges and price-weighted outpatient visits. Labour inputs: FTEs of medical and weighted nurses & other staff. Capital input: capital assets value. Intermediate input: clinical supply expenditure
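For illustration, the following is a minimal sketch of the input-oriented, constant-returns-to-scale DEA model used by many of the studies summarised above, solved as a linear programme with SciPy. The data are hypothetical: the two inputs (FTE staff and beds) and two outputs (case-mix adjusted discharges and outpatient visits) simply mirror variable choices that recur in the table, and none of the figures are drawn from the studies reviewed.

```python
# Minimal illustrative sketch of input-oriented, constant-returns-to-scale DEA,
# solved as a linear programme with SciPy. All figures below are hypothetical.
import numpy as np
from scipy.optimize import linprog

# Rows = hospitals (DMUs); inputs = (FTE staff, beds);
# outputs = (case-mix adjusted discharges, outpatient visits).
inputs = np.array([[120.0, 200.0],
                   [300.0, 450.0],
                   [150.0, 220.0],
                   [500.0, 800.0]])
outputs = np.array([[5000.0, 20000.0],
                    [9000.0, 38000.0],
                    [7000.0, 26000.0],
                    [16000.0, 70000.0]])

def dea_input_oriented(inputs, outputs, dmu):
    """Farrell technical-efficiency score of one DMU (1.0 = on the frontier)."""
    n, m = inputs.shape              # number of DMUs, number of inputs
    s = outputs.shape[1]             # number of outputs
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimise theta.
    c = np.concatenate(([1.0], np.zeros(n)))
    # Input constraints: sum_j lambda_j * x_ij <= theta * x_i,dmu
    A_in = np.hstack((-inputs[dmu].reshape(m, 1), inputs.T))
    # Output constraints: sum_j lambda_j * y_rj >= y_r,dmu
    A_out = np.hstack((np.zeros((s, 1)), -outputs.T))
    res = linprog(c,
                  A_ub=np.vstack((A_in, A_out)),
                  b_ub=np.concatenate((np.zeros(m), -outputs[dmu])),
                  bounds=[(0, None)] * (n + 1),
                  method="highs")
    return res.x[0]

for j in range(len(inputs)):
    print(f"Hospital {j}: technical efficiency = {dea_input_oriented(inputs, outputs, j):.3f}")
```

A score of one places a hospital on the best-practice frontier, while a score below one indicates the proportion to which all inputs could, in principle, be reduced while still producing the observed outputs.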
Table A2

Previous contributions to healthcare TFP studies

Study Country Facility type and period Methodology Variables Findings
Färe et al. (1992) Sweden 42 Swedish group pharmacies for period 1980–1989 Malmquist index Outputs: number of drug deliveries, prescription drugs, medical appliances and over the counter goods. Labour inputs: number of hours of pharmacists, technical staff, other building and equipment services staff. Capital input: depreciation amount. Average TFP increased in seven periods and decreased in two periods. On average, progress in TFP during the latter part of the 1980s was due to the positive shifts in the frontier
Burgess and Wilson (1995) USA 137 Nonpsychiatric hospitals for the period 1985–1988 DEA-Malmquist index Outputs: number of acute care inpatient days, case-mix weighted acute care inpatient discharges, long-term care inpatient days, outpatient visits, ambulatory surgical procedures and inpatient surgical procedures. Labour inputs: FTEs of registered nurses, licensed practical nurses, other clinical labour, nonclinical labour, and long-term care labour. Capital input: number of acute-care beds and long-term hospital beds On average, there was technical regress, which dominated changes in inefficiency in determining changes in TFP
Färe et al. (1995) Sweden 257 Pharmacies in cities and suburban areas for the period 1990–1991 Malmquist index Outputs: number of prescriptions, drug deliveries, prescription drugs, medical appliances and over-the-counter goods. Labour inputs: number of hours of pharmacists, technical staff, building and equipment services staff Capital input: depreciation amount. The results suggest that the incorporation of quality makes a difference in measured productivity change
Tambour (1997) Sweden 20 Ophthalmology departments in various hospitals from 1988 to 1993 DEA-bootstrap Malmquist index Outputs: number of performed operations for cataract, glaucoma, squint diseases and number of physician visits. Labour inputs: FTEs of specialists and other physicians. Capital input: number of beds The positive changes in TFP are mainly due to positive changes in production technology rather than an overall positive change in relative (technical) efficiency or scale efficiency
Linna (1998) Finland 43 Acute hospitals in period 1988–1994. Malmquist index Outputs: DRG weighted inpatient episodes, number of outpatients, emergency visits, residents, research outputs and nursing students. Labour price: personnel price index Results showed a 3–5% annual average increase in TFP, half of which was due to an improvement in cost efficiency and the other half due to technological change
Dismuke and Sena (1999) Portugal 58 Hospitals during the years 1992–1994 SFA and DEA-Malmquist index Outputs: DRG weighted desirable and undesirable discharges. Inputs: authors concentrate on diagnostic technology utilization on three technological inputs: the computerized axial tomography scanner, the electrocardiogram and the echocardiogram in the production of discharges Improvement of technical efficiency has not been accompanied by an equivalent improvement in the quality of output in district hospitals. The parametric frontiers show technical progress in most outputs, except echocardiograms which experienced technical regress
Giuffrida (1999) United Kingdom 90 English Family Health Service Authorities over the period 1991–1995 DEA-Malmquist index Outputs: the total number of people registered with a general practitioner, broken down by various demographics. A measure of intermediate outputs, such as pre-determined targets for children, was also included. Labour inputs: number of general practitioners and practice nurses The improvement in TFP was very small. The rise was due to pure progress in technical efficiency and positive changes in scale efficiency, although the technology shows no noticeable change. The analysis indicates a very limited scope for productivity growth in this sector
Maniadakis et al. (1999) United Kingdom 72 Acute Scottish hospitals for the period 1992–1996 DEA-Malmquist index Outputs: number of accident and emergency attendances, case-mix adjusted outpatient attendances, day cases and inpatient discharges. Labour inputs: number of doctors, nurses, other personnel. Capital input: number of beds The improvement in TFP was dominated by technical change rather than hospital-relative efficiency changes
Maniadakis and Thanassoulis (2000) United Kingdom 75 Scottish hospitals for the period 1992–1996 DEA-Malmquist index Outputs: number of accident and emergency attendances, case-mix adjusted outpatient attendances, day cases and inpatient discharges. Labour inputs: number of doctors, nurses, other personnel. Capital input: number of beds The improvement in TFP is due to overall progress in efficiency, which, in turn, is primarily attributed to an increase in allocative efficiency. Technical progress resulted in a small reduction in the number of inputs used, but also a higher cost of production due to the worsening of the match between input mixes and relative input prices
Sommersguter-Reichmann (2000) Austria 22 Austrian hospitals for the period 1994–1998 DEA-Malmquist index Outputs: the total number of patients treated in the outpatient units and the number of credit points reported by each hospital, multiplied by a steering factor. Labour inputs: FTEs of labour. Intermediate input: expenses for external medical services. Capital input: number of beds TFP decreased from 1994 to 1995, while it increased from 1995 to 1996. The results showed a positive shift in technology between 1996 and 1998, without any technical efficiency improvement
Jiménez et al. (2003) United Kingdom 39 English county council hospitals for period 1992–1995 DEA-Malmquist index Outputs: number of people who receive residential care at day centres; the number of hours of domiciliary care delivered; the number of meals delivered to people at home; and the magnitude of the user charges raised from those in care. Inputs: gross cost of all services for older people The TFP shows a steady increase, from 0.7% in year 2 to 2.3% in year 5. There was minimal improvement in any of the components in 1993/1994. Subsequently, a dip of 3.5% in technological progress in 1994/1995 was offset by a 13.3% rise in the following year. Conversely, both pure and scale efficiencies fell back in the final year
González and Gascón (2004) Spain 80 pharmaceutical labs for the period 1994–2000 DEA-Malmquist index Outputs: net sales. Labour input: labour costs. Capital input: fixed assets depreciation (capital). Intermediate input: other costs The results indicate that improvements in technical efficiency and changing technology explain most of the observed TFP growth. However, the contribution of technological improvements to productivity growth is minimal
Gannon (2008) Ireland Set of hospitals from 1995 to 1998. DEA-Malmquist index Outputs: number of case-mix adjusted inpatients, outpatients and day cases. Labour inputs: FTEs of people employed in each hospital. Capital input: number of beds in each hospital Results show that, on average, both technical and efficiency changes contribute to a higher TFP in larger hospitals but lead to lower productivity levels in smaller hospitals. However, the contribution of these productivity components varies over time, and technical improvements play a more critical role in increasing the productivity of larger hospitals
Pilyavsky and Staat (2008) Ukraine 193 Community hospitals for the years 1997–2001 DEA-Malmquist index Outputs: number of admissions and surgical procedures. Labour inputs: FTEs of people employed in each hospital. Capital input: number of beds in each hospital The overall average TFP did not change throughout the observation period. However, substantial deviations from unity can be observed depending on the period and the region
Morikawa (2010) Japan 239 Secondary medical areas for the period 1998–2007 Fixed-effects regression Outputs: number of inpatient days and outpatient visits. Labour inputs: FTEs of the physicians and the ratio of physicians to the total number of other staff. Capital input: number of beds multiplied by the utilization rate TFP increases by more than 10% when the size of the hospital doubles
Ng (2011) China 463 Hospitals from Guangdong province for the period 2004–2008. DEA-Malmquist index Outputs: number of inpatient and outpatient cases. Labour inputs: number of doctors, nurses, pharmacists and other staff. Capital input: number of hospital beds TFP grew between 2004 and 2008, mainly driven by technological progress. However, technical efficiency deteriorated in the period under study
Blank and Eggink (2014) The Netherlands Aggregated hospital data over the period 1972–2010 which yielded 39 observations Time series regression Outputs: number of surgeries and total revenue. Input prices: price of personnel per FTE, price of material supplies is proxied by the consumer price index. Capital price: total capital costs divided by depreciation and investment The results indicate that the average productivity of the hospital sector in different periods varies and that these differences are related to the structure of regulation in those periods. Further, the authors argue that competition reform failed to improve hospital sector productivity
Kittelsen et al. (2015) Denmark, Finland, Norway, and Sweden Public acute somatic hospitals for the period 2005–2007 DEA-Malmquist index Outputs: number of outpatient visits, DRG weighted inpatients and day patients. Inputs: real operating costs The results show small differences in scale and technical efficiency between countries but significant differences in production possibilities (frontier position). The country-specific Finnish frontier is the key source of the Finnish productivity advantage
Karmann and Roesel (2017) Germany Hospitals from 16 federal states for the period 1993–2013 Frontier-based Malmquist approach and Non-frontier Tornqvist approach Outputs: the number of discharges, a quality index, and the quality‐adjusted number of discharges (outcome). Labour inputs: FTEs of physicians, nurses, and other staff. Intermediate inputs: deflated costs of energy, materials, and service expenses. Capital input: proxied by the amount of deflated capital stocks The authors find that quality improvements rather than increases in quantity volumes generate TFP growth in hospital care. Also, reducing the length of stay is a proper way to enhance hospital TFP
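Most of the TFP studies in Table A2 compute a DEA-based Malmquist index and decompose it into efficiency change (catching up to the frontier) and technical change (frontier shift). The sketch below illustrates that decomposition under constant returns to scale, reusing the same linear programme as in the earlier DEA example but with an arbitrary reference period; the two-period hospital data are again hypothetical, and, with the Farrell input-efficiency convention used here, values above one indicate improvement.

```python
# Minimal illustrative sketch of the DEA-based Malmquist TFP index under constant
# returns to scale, decomposed into efficiency change and technical change.
# The two-period hospital data are hypothetical; inputs = (FTE staff, beds),
# outputs = (case-mix adjusted discharges, outpatient visits).
import numpy as np
from scipy.optimize import linprog

def crs_distance(ref_x, ref_y, x0, y0):
    """Farrell input efficiency of (x0, y0) against the CRS frontier built from (ref_x, ref_y)."""
    n, m = ref_x.shape
    s = ref_y.shape[1]
    c = np.concatenate(([1.0], np.zeros(n)))                      # minimise theta
    A_ub = np.vstack((np.hstack((-x0.reshape(m, 1), ref_x.T)),    # inputs:  sum lambda*x <= theta*x0
                      np.hstack((np.zeros((s, 1)), -ref_y.T))))   # outputs: sum lambda*y >= y0
    b_ub = np.concatenate((np.zeros(m), -y0))
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

def malmquist(x_t, y_t, x_t1, y_t1, dmu):
    """Malmquist TFP index for one DMU between t and t+1, with its decomposition."""
    d_tt   = crs_distance(x_t,  y_t,  x_t[dmu],  y_t[dmu])     # period-t point, period-t frontier
    d_t1t1 = crs_distance(x_t1, y_t1, x_t1[dmu], y_t1[dmu])    # period-t+1 point, period-t+1 frontier
    d_t_t1 = crs_distance(x_t,  y_t,  x_t1[dmu], y_t1[dmu])    # period-t+1 point, period-t frontier
    d_t1_t = crs_distance(x_t1, y_t1, x_t[dmu],  y_t[dmu])     # period-t point, period-t+1 frontier
    eff_change = d_t1t1 / d_tt                                  # catching up
    tech_change = np.sqrt((d_t_t1 / d_t1t1) * (d_tt / d_t1_t))  # frontier shift
    return eff_change * tech_change, eff_change, tech_change

# Hypothetical two-period data for three hospitals.
x_t  = np.array([[100.0, 150.0], [250.0, 300.0], [400.0, 600.0]])
y_t  = np.array([[4000.0, 15000.0], [9000.0, 30000.0], [15000.0, 55000.0]])
x_t1 = np.array([[95.0, 150.0], [240.0, 290.0], [410.0, 620.0]])
y_t1 = np.array([[4300.0, 16000.0], [9500.0, 33000.0], [15200.0, 56000.0]])

for j in range(len(x_t)):
    tfp, eff, tech = malmquist(x_t, y_t, x_t1, y_t1, j)
    print(f"Hospital {j}: TFP change = {tfp:.3f}, "
          f"efficiency change = {eff:.3f}, technical change = {tech:.3f}")
```

The product of the two components recovers the overall index, so a TFP change of, say, 1.05 may reflect catching up, a frontier shift, or a mix of both, which is precisely the distinction the studies in Table A2 report.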

References

Ahmed, S., Hasan, M. Z., Laokri, S., Jannat, Z., Ahmed, M. W., Dorin, F., Vargas, V., & Khan, J. A. M. (2019). Technical efficiency of public district hospitals in Bangladesh: a data envelopment analysis. Cost Effectiveness and Resource Allocation, 17(1), 15. doi: 10.1186/s12962-019-0183-6.

Ahn, S. C., & Sickles, R. C. (2000). Estimation of long-run inefficiency levels: a dynamic frontier approach. Econometric Reviews, 19(4), 461–492. doi: 10.1080/07474930008800482.

Aigner, D., Lovell, C. A. K., & Schmidt, P. (1977). Formulation and estimation of stochastic frontier production function models. Journal of Econometrics, 6(1), 21–37. doi: 10.1016/0304-4076(77)90052-5.

Al-Amin, M., Makarem, S. C., & Rosko, M. (2016). Efficiency and hospital effectiveness in improving hospital consumer assessment of healthcare providers and systems ratings. Health Care Management Review, 41(4), 296–305. doi: 10.1097/hmr.0000000000000076.

Aletras, V., Kontodimopoulos, N., Zagouldoudis, A., & Niakas, D. (2007). The short-term effect on technical and scale efficiency of establishing regional health systems and general management in Greek NHS hospitals. Health Policy, 83(2–3), 236–245. doi: 10.1016/j.healthpol.2007.01.008.

Alonso, J. M., Clifton, J., & Díaz-Fuentes, D. (2015). The impact of new public management on efficiency: An analysis of Madrid’s hospitals. Health Policy, 119(3), 333–340. doi: 10.1016/j.healthpol.2014.12.001.

Ancarani, A., Di Mauro, C., & Giammanco, M. D. (2009). The impact of managerial and organisational aspects on hospital wards’ efficiency: Evidence from a case study. European Journal of Operational Research, 194(1), 280–293. doi: 10.1016/j.ejor.2007.11.046.

Ancarani, A., Di Mauro, C., Gitto, S., Mancuso, P., & Ayach, A. (2016). Technology acquisition and efficiency in Dubai hospitals. Technological Forecasting and Social Change, 113, 475–485. doi: 10.1016/j.techfore.2016.07.010.

Andrews, A. (2020a). The efficiency of New Zealand District Health Boards in administrating public funds: An application of bootstrap DEA and beta regression. International Journal of Public Administration, 44(14), 1297–1308. doi: 10.1080/01900692.2020.1755685.

Andrews, A. (2020b). Investigating technical efficiency and its determinants: The case of New Zealand District Health Boards. Health Policy and Technology, 9(3), 323–334. doi: 10.1016/j.hlpt.2020.04.006.

Andrews, A., & Emvalomatis, G. (2023). Dynamic analysis of healthcare providers’ cost efficiency. Applied Economics, 1–18. doi: 10.1080/00036846.2023.2257031.

Araújo, C., Barros, C. P., & Wanke, P. (2014). Efficiency determinants and capacity issues in Brazilian for-profit hospitals. Health Care Management Science, 17(2), 126–138. doi: 10.1007/s10729-013-9249-8.

Athanassopoulos, A., & Gounaris, C. (2001). Assessing the technical and allocative efficiency of hospital operations in Greece and its resource allocation implications. European Journal of Operational Research, 133(2), 416–431. doi: 10.1016/S0377-2217(00)00180-6.

Bala, M. M., Singh, S., & Gautam, D. K. (2023). Stochastic frontier approach to efficiency analysis of health facilities in providing services for non-communicable diseases in eight LMICs. International Health, 15(5), 512–525. doi: 10.1093/inthealth/ihac080.

Barros, C. P., de Menezes, A. G., & Vieira, J. C. (2013). Measurement of hospital efficiency, using a latent class stochastic frontier model. Applied Economics, 45(1), 47–54. doi: 10.1080/00036846.2011.579061.

Battese, G. E., & Coelli, T. J. (1988). Prediction of firm-level technical efficiencies with a generalised frontier production function and panel data. Journal of Econometrics, 38(3), 387–399. doi: 10.1016/0304-4076(88)90053-X.

Battese, G. E., & Coelli, T. J. (1992). Frontier production functions, technical efficiency and panel data: With application to paddy farmers in India. Journal of Productivity Analysis, 3(1), 153–169. doi: 10.1007/bf00158774.

Battese, G. E., & Coelli, T. J. (1995). A model for technical inefficiency effects in a stochastic frontier production function for panel data. Empirical Economics, 20(2), 325–332. doi: 10.1007/bf01205442.

Björkgren, M. A., Fries, B. E., Häkkinen, U., & Brommels, M. (2004). Case-mix adjustment and efficiency measurement. Scandinavian Journal of Public Health, 32(6), 464–471. doi: 10.1080/14034940410028235.

Blank, J. L. T., & Eggink, E. (2014). The impact of policy on hospital productivity: A time series analysis of Dutch hospitals. Health Care Management Science, 17(2), 139–149. doi: 10.1007/s10729-013-9257-8.

Blank, J. L. T., & Valdmanis, V. (2005). A modified three-stage data envelopment analysis. The European Journal of Health Economics, 6(1), 65–72. doi: 10.1007/s10198-004-0260-3.

Borden, J. P. (1988). An assessment of the impact of diagnosis-related group (DRG)-based reimbursement on the technical efficiency of New Jersey hospitals using data envelopment analysis. Journal of Accounting and Public Policy, 7(2), 77–96. doi: 10.1016/0278-4254(88)90012-9.

Brown, H. S. (2003). Managed care and technical efficiency. Health Economics, 12(2), 149–158. doi: 10.1002/hec.712.

Bun, M. J. G., & Windmeijer, F. (2010). The weak instrument problem of the system GMM estimator in dynamic panel data models. The Econometrics Journal, 13(1), 95–126. doi: 10.1111/j.1368-423X.2009.00299.x.

Burgess, J. F., & Wilson, P. W. (1995). Decomposing hospital productivity changes, 1985–1988: A nonparametric Malmquist approach. Journal of Productivity Analysis, 6(4), 343–363. doi: 10.1007/BF01073525.

Caves, D. W., Christensen, L. R., & Diewert, W. E. (1982). The economic theory of index numbers and the measurement of input, output, and productivity. Econometrica, 50(6), 1393–1414. doi: 10.2307/1913388.

Cetin, V. R., & Bahce, S. (2016). Measuring the efficiency of health systems of OECD countries by data envelopment analysis. Applied Economics, 48(37), 3497–3507. doi: 10.1080/00036846.2016.1139682.

Chang, H., Cheng, M. A., & Das, S. (2004). Hospital ownership and operating efficiency: Evidence from Taiwan. European Journal of Operational Research, 159(2), 513–527. doi: 10.1016/S0377-2217(03)00412-0.

Charnes, A., Cooper, W. W., & Rhodes, E. L. (1978). Measuring the efficiency of decision making units. European Journal of Operational Research, 2(6), 429–444. doi: 10.1016/0377-2217(78)90138-8.

Chen, Z., Barros, C. P., & Hou, X. (2016). Has the medical reform improved the cost efficiency of Chinese hospitals? The Social Science Journal, 53(4), 510–520. doi: 10.1016/j.soscij.2016.04.006.

Chowdhury, H., & Zelenyuk, V. (2016). Performance of hospital services in Ontario: DEA with truncated regression approach. Omega, 63, 111–122. doi: 10.1016/j.omega.2015.10.007.

Coelli, T. J., Rao, D. S., O’Donnell, C., & Battese, G. E. (2005). An introduction to efficiency and productivity analysis. New York: Springer Science + Business Media, Inc.

Colombi, R., Kumbhakar, S., Martini, G., & Vittadini, G. (2014). Closed-skew normality in stochastic frontiers with individual effects and long/short-run efficiency. Journal of Productivity Analysis, 42(2), 123–136. doi: 10.1007/s11123-014-0386-y.

Colombi, R., Martini, G., & Vittadini, G. (2017). Determinants of transient and persistent hospital efficiency: The case of Italy. Health Economics, 26(S2), 5–22. doi: 10.1002/hec.3557.

Cordero, J. M., Alonso-Morán, E., Nuño-Solinis, R., Orueta, J. F., & Arce, R. S. (2015). Efficiency assessment of primary care providers: A conditional nonparametric approach. European Journal of Operational Research, 240(1), 235–244. doi: 10.1016/j.ejor.2014.06.040.

Cornwell, C., Schmidt, P., & Sickles, R. C. (1990). Production frontiers with cross-sectional and time-series variation in efficiency levels. Journal of Econometrics, 46(1), 185–200. doi: 10.1016/0304-4076(90)90054-W.

Cozad, M., & Wichmann, B. (2013). Efficiency of health care delivery systems: effects of health insurance coverage. Applied Economics, 45(29), 4082–4094. doi: 10.1080/00036846.2012.750420.

DePuccio, M. J., & Ozcan, Y. A. (2017). Exploring efficiency differences between medical home and non-medical home hospitals. International Journal of Healthcare Management, 10(3), 147–153. doi: 10.1080/20479700.2015.1101913.

Desli, E., Ray, S. C., & Kumbhakar, S. C. (2003). A dynamic stochastic frontier production model with time-varying efficiency. Applied Economics Letters, 10(10), 623–626. doi: 10.1080/1350485032000133291.

Dismuke, C. E., & Sena, V. (1999). Has DRG payment influenced the technical efficiency and productivity of diagnostic technologies in Portuguese public hospitals? An empirical analysis using parametric and nonparametric methods. Health Care Management Science, 2(2), 107–116. doi: 10.1023/A:1019027509833.

Eisner, R., & Nadiri, M. I. (1968). Investment behavior and neo-classical theory. The Review of Economics and Statistics, 50(3), 369–382. doi: 10.2307/1937931.

Emvalomatis, G. (2012). Adjustment and unobserved heterogeneity in dynamic stochastic frontier models. Journal of Productivity Analysis, 37(1), 7–16. doi: 10.1007/s11123-011-0217-3.

Emvalomatis, G., Stefanou, S. E., & Oude Lansink, A. (2011). A reduced-form model for dynamic efficiency measurement: Application to dairy farms in Germany and The Netherlands. American Journal of Agricultural Economics, 93(1), 161–174. doi: 10.1093/ajae/aaq125.

Färe, R., Grosskopf, S., Lindgren, B., & Roos, P. (1992). Productivity changes in Swedish pharmacies 1980–1989: A nonparametric Malmquist approach. Journal of Productivity Analysis, 3(1), 85–101. doi: 10.1007/BF00158770.

Färe, R., Grosskopf, S., & Roos, P. (1995). Productivity and quality changes in Swedish pharmacies. International Journal of Production Economics, 39(1), 137–144. doi: 10.1016/0925-5273(94)00063-G.

Farrell, M. J. (1957). The measurement of productive efficiency. Journal of the Royal Statistical Society. Series A (General), 120(3), 253–290. doi: 10.2307/2343100.

Ferrier, G. D., & Trivitt, J. S. (2013). Incorporating quality into the measurement of hospital efficiency: a double DEA approach. Journal of Productivity Analysis, 40(3), 337–355. doi: 10.1007/s11123-012-0305-z.

Ferreira, D. C., & Marques, R. C. (2019). Do quality and access to hospital services impact on their technical efficiency? Omega, 86, 218–236. doi: 10.1016/j.omega.2018.07.010.

Fetter, R. B., Shin, Y., Freeman, J. L., Averill, R. F., & Thompson, J. D. (1980). Case mix definition by diagnosis-related groups. Medical Care, 18(2 Suppl), iii, 1–53.

Filippini, M., & Greene, W. (2016). Persistent and transient productive inefficiency: a maximum simulated likelihood approach. Journal of Productivity Analysis, 45(2), 187–196. doi: 10.1007/s11123-015-0446-y.

Filippini, M., & Hunt, L. C. (2015). Measurement of energy efficiency based on economic foundations. Energy Economics, 52, S5–S16. doi: 10.1016/j.eneco.2015.08.023.

Folland, S. T., & Hofler, R. A. (2001). How reliable are hospital efficiency estimates? Exploiting the dual to homothetic production. Health Economics, 10(8), 683–698. doi: 10.1002/hec.600.

Fried, H. O., Schmidt, S. S., & Yaisawarng, S. (1999). Incorporating the operating environment into a nonparametric measure of technical efficiency. Journal of Productivity Analysis, 12, 249–267. doi: 10.1023/A:1007800306752.

Friesner, D., Roseman, R., & McPherson, M. Q. (2008). Are hospitals seasonally inefficient? Evidence from Washington State. Applied Economics, 40(6), 699–723. doi: 10.1080/00036840600749730.

Galán, J. E., Veiga, H., & Wiper, M. P. (2015). Dynamic effects in inefficiency: Evidence from the Colombian banking sector. European Journal of Operational Research, 240(2), 562–571. doi: 10.1016/j.ejor.2014.07.005.

Gannon, B. (2008). Total factor productivity growth of hospitals in Ireland: A nonparametric approach. Applied Economics Letters, 15(2), 131–135. doi: 10.1080/13504850600706115.

Giménez, V., Keith, J. R., & Prior, D. (2019). Do healthcare financing systems influence hospital efficiency? A metafrontier approach for the case of Mexico. Health Care Management Science, 22(3), 549–559. doi: 10.1007/s10729-019-9467-9.

Giokas, D. I. (2001). Greek hospitals: How well their resources are used. Omega, 29(1), 73–83. doi: 10.1016/S0305-0483(00)00031-1.

Giuffrida, A. (1999). Productivity and efficiency changes in primary care: A Malmquist index approach. Health Care Management Science, 2(1), 11–26. doi: 10.1023/A:1019067223945.

Gok, M. S., & Altındağ, E. (2015). Analysis of the cost and efficiency relationship: experience in the Turkish pay for performance system. The European Journal of Health Economics, 16(5), 459–469. doi: 10.1007/s10198-014-0584-6.

González, E., & Gascón, F. (2004). Sources of productivity growth in the Spanish pharmaceutical industry (1994–2000). Research Policy, 33(5), 735–745. doi: 10.1016/j.respol.2003.12.004.

Greene, W. (2005a). Fixed and random effects in stochastic frontier models. Journal of Productivity Analysis, 23(1), 7–32. doi: 10.1007/s11123-004-8545-1.

Greene, W. (2005b). Reconsidering heterogeneity in panel data estimators of the stochastic frontier model. Journal of Econometrics, 126(2), 269–303. doi: 10.1016/j.jeconom.2004.05.003.

Grosskopf, S., & Valdmanis, V. (1987). Measuring hospital performance: A nonparametric approach. Journal of Health Economics, 6(2), 89–107. doi: 10.1016/0167-6296(87)90001-4.

Harcourt, G., & Riach, P. (1997). A second edition of the general theory: Volume 1. London: Routledge.

Herr, A. (2008). Cost and technical efficiency of German hospitals: does ownership matter? Health Economics, 17(9), 1057–1071. doi: 10.1002/hec.1388.

Herr, A., Schmitz, H., & Augurzky, B. (2011). Profit efficiency and ownership of German hospitals. Health Economics, 20(6), 660–674. doi: 10.1002/hec.1622.

Hollingsworth, B., & Peacock, S. (2008). Efficiency measurement in health and health care. New York, NY: Routledge. doi: 10.4324/9780203486566.

Hollingsworth, B., & Street, A. (2006). The market for efficiency analysis of health care organisations. Health Economics, 15(10), 1055–1059. doi: 10.1002/hec.1169.

Hu, H. H., Qi, Q., & Yang, C. H. (2012). Analysis of hospital technical efficiency in China: Effect of health insurance reform. China Economic Review, 23(4), 865–877. doi: 10.1016/j.chieco.2012.04.008.

Jacobs, R., Smith, P. C., & Street, A. (2006). Measuring efficiency in health care: Analytic techniques and health policy. Cambridge: Cambridge University Press. doi: 10.1017/CBO9780511617492.

Jiang, N., & Andrews, A. (2020). Efficiency of New Zealand’s District Health Boards at providing hospital services: A stochastic frontier analysis. Journal of Productivity Analysis, 53(1), 53–68. doi: 10.1007/s11123-019-00550-z.

Jiang, S., Min, R., & Fang, P.-Q. (2017). The impact of healthcare reform on the efficiency of public county hospitals in China. BMC Health Services Research, 17(1), 838. doi: 10.1186/s12913-017-2780-4.

Jiménez, J. S., Chaparro, F. P., & Smith, P. C. (2003). Evaluating the introduction of a quasi-market in community care. Socio-Economic Planning Sciences, 37(1), 1–13. doi: 10.1016/S0038-0121(02)00042-3.

Jorgenson, D. W., & Griliches, Z. (1967). The explanation of productivity change. The Review of Economic Studies, 34(3), 249–283. doi: 10.2307/2296675.

Karmann, A., & Roesel, F. (2017). Hospital policy and productivity – Evidence from German states. Health Economics, 26(12), 1548–1565. doi: 10.1002/hec.3447.

Kee, H. L. (2004). Estimating productivity when primal and dual TFP accounting fail: An illustration using Singapore’s industries. The B.E. Journal of Economic Analysis & Policy, 4(1), 1–38. doi: 10.2202/1538-0653.1193.

Khalaf, L., & Saunders, C. J. (2016). Dynamic technical efficiency. In W. Greene, L. Khalaf, R. Sickles, M. Veall, & M.-C. Voia (Eds.), Productivity and efficiency analysis (Springer Proceedings in Business and Economics). Cham: Springer. doi: 10.1007/978-3-319-23228-7_6.

Kinfu, Y. (2013). The efficiency of the health system in South Africa: evidence from stochastic frontier analysis. Applied Economics, 45(8), 1003–1010. doi: 10.1080/00036846.2011.613787.

Kittelsen, S. A. C., Winsnes, B. A., Anthun, K. S., Goude, F., Hope, Ø., Häkkinen, U., Kalseth, B., Kilsmark, J., Medin, E., Rehnberg, C., & Rättö, H. (2015). Decomposing the productivity differences between hospitals in the Nordic countries. Journal of Productivity Analysis, 43(3), 281–293. doi: 10.1007/s11123-015-0437-z.

Koop, G., Osiewalski, J., & Steel, M. F. J. (1997). Bayesian efficiency analysis through individual effects: Hospital cost frontiers. Journal of Econometrics, 76(1), 77–105. doi: 10.1016/0304-4076(95)01783-6.

Kumbhakar, S. C. (1987). The specification of technical and allocative inefficiency in stochastic production and profit frontiers. Journal of Econometrics, 34(3), 335–348. doi: 10.1016/0304-4076(87)90016-9.

Kumbhakar, S. C. (1990). Production frontiers, panel data, and time-varying technical inefficiency. Journal of Econometrics, 46(1), 201–211. doi: 10.1016/0304-4076(90)90055-X.

Kumbhakar, S. C., & Heshmati, A. (1995). Efficiency measurement in Swedish dairy farms: An application of rotating panel data, 1976–88. American Journal of Agricultural Economics, 77(3), 660–674. doi: 10.2307/1243233.

Kumbhakar, S. C., & Hjalmarsson, L. (1995). Labour-use efficiency in Swedish social insurance offices. Journal of Applied Econometrics, 10(1), 33–47. doi: 10.1002/jae.3950100104.

Kumbhakar, S. C., Lien, G., & Hardaker, J. B. (2014). Technical efficiency in competing panel data models: a study of Norwegian grain farming. Journal of Productivity Analysis, 41(2), 321–337. doi: 10.1007/s11123-012-0303-1.

Kumbhakar, S. C., & Wang, H.-J. (2005). Estimation of growth convergence using a stochastic production frontier approach. Economics Letters, 88(3), 300–305. doi: 10.1016/j.econlet.2005.01.023.

Lambarraa, F., Stefanou, S., & María Gil Roig, J. (2016). The analysis of irreversibility, uncertainty and dynamic technical inefficiency on the investment decision in Spanish olive sector. European Review of Agricultural Economics, 43(1), 59–77. doi: 10.1093/erae/jbv006.

Linde, S. (2023). Hospital cost efficiency: An examination of US acute care inpatient hospitals. International Journal of Health Economics and Management, 23(3), 325–344. doi: 10.1007/s10754-022-09314-3.

Linna, M. (1998). Measuring hospital cost efficiency with panel data models. Health Economics, 7(5), 415–427. doi: 10.1002/(sici)1099-1050(199808)7:5<415::Aid-hec357>3.0.Co;2-9.

Linna, M., Häkkinen, U., & Magnussen, J. (2006). Comparing hospital cost efficiency between Norway and Finland. Health Policy, 77(3), 268–278. doi: 10.1016/j.healthpol.2005.07.019.

Malmquist, S. (1953). Index numbers and indifference surfaces. Trabajos de Estadistica, 4(2), 209–242. doi: 10.1007/BF03006863.

Maniadakis, N., Hollingsworth, B., & Thanassoulis, E. (1999). The impact of the internal market on hospital efficiency, productivity and service quality. Health Care Management Science, 2(2), 75–85. doi: 10.1023/A:1019079526671.

Maniadakis, N., & Thanassoulis, E. (2000). Assessing productivity changes in U.K. hospitals reflecting technology and input prices. Applied Economics, 32(12), 1575–1589. doi: 10.1080/000368400418970.

Mateus, C., Joaquim, I., & Nunes, C. (2015). Measuring hospital efficiency—comparing four European countries. European Journal of Public Health, 25(suppl_1), 52–58. doi: 10.1093/eurpub/cku222.

Medin, E., Anthun, K. S., Häkkinen, U., Kittelsen, S. A. C., Linna, M., Magnussen, J., Olsen, K., & Rehnberg, C. (2011). Cost efficiency of university hospitals in the Nordic countries: a cross-country analysis. The European Journal of Health Economics, 12(6), 509–519. doi: 10.1007/s10198-010-0263-1.

Meeusen, W., & Van den Broeck, J. (1977). Efficiency estimation from Cobb-Douglas production functions with composed error. International Economic Review, 18(2), 435–444. doi: 10.2307/2525757.

Mitropoulos, P., Talias, M. A., & Mitropoulos, I. (2015). Combining stochastic DEA with Bayesian analysis to obtain statistical properties of the efficiency scores: An application to Greek public hospitals. European Journal of Operational Research, 243(1), 302–311. doi: 10.1016/j.ejor.2014.11.012.

Morikawa, M. (2010). Economies of scale and hospital productivity: An empirical analysis of medical area level panel data. The Research Institute of Economy, Trade and Industry. http://www.rieti.go.jp/jp/publications/dp/10e050.pdf.

Mutter, R. L., Rosko, M. D., & Wong, H. S. (2008). Measuring hospital inefficiency: the effects of controlling for quality and patient burden of illness. Health Services Research, 43(6), 1992–2013. doi: 10.1111/j.1475-6773.2008.00892.x.

Nedelea, C. I., & Fannin, M. J. (2013). Analysing cost efficiency of critical access hospitals. Journal of Policy Modeling, 35(1), 183–195. doi: 10.1016/j.jpolmod.2012.10.002.

Newhouse, J. P. (1994). Frontier estimation: How useful a tool for health economics? Journal of Health Economics, 13(3), 317–322. doi: 10.1016/0167-6296(94)90030-2.

Ng, Y. C. (2011). The productive efficiency of Chinese hospitals. China Economic Review, 22(3), 428–439. doi: 10.1016/j.chieco.2011.06.001.

Nunamaker, T. R. (1983). Measuring routine nursing service efficiency: A comparison of cost per patient day and data envelopment analysis models. Health Services Research, 18(2 Pt 1), 183–208. https://pubmed.ncbi.nlm.nih.gov/6874357.

Parkin, D., & Hollingsworth, B. (1997). Measuring production efficiency of acute hospitals in Scotland, 1991–94: validity issues in data envelopment analysis. Applied Economics, 29(11), 1425–1433. doi: 10.1080/000368497326255.

Peacock, S., Chan, C., Mangolini, M., & Johansen, D. (2001). Techniques for measuring efficiency in health services (Staff Working Paper). Productivity Commission.

Pilyavsky, A. I., Aaronson, W. E., Bernet, P. M., Rosko, M. D., Valdmanis, V. G., & Golubchikov, M. V. (2006). East–West: Does it make a difference to hospital efficiencies in Ukraine? Health Economics, 15(11), 1173–1186. doi: 10.1002/hec.1120.

Pilyavsky, A., & Staat, M. (2008). Efficiency and productivity change in Ukrainian health care. Journal of Productivity Analysis, 29(2), 143–154. doi: 10.1007/s11123-007-0070-6.

Pitt, M. M., & Lee, L.-F. (1981). The measurement and sources of technical inefficiency in the Indonesian weaving industry. Journal of Development Economics, 9(1), 43–64. doi: 10.1016/0304-3878(81)90004-3.

Rosko, M. D. (2001). Cost efficiency of U.S. hospitals: A stochastic frontier approach. Health Economics, 10(6), 539–551. doi: 10.1002/hec.607.

Rosko, M. D., & Chilingerian, J. A. (1999). Estimating hospital inefficiency: Does case mix matter? Journal of Medical Systems, 23(1), 57–71. doi: 10.1023/A:1020823612156.

Schmidt, P., & Sickles, R. C. (1984). Production frontiers and panel data. Journal of Business & Economic Statistics, 2(4), 367–374. doi: 10.2307/1391278.

Sherman, H. D. (1984). Hospital efficiency measurement and evaluation. Empirical test of a new technique. Medical Care, 22(10), 922–938. doi: 10.1097/00005650-198410000-00005.

Shimshak, D. G., Lenard, M. L., & Klimberg, R. K. (2009). Incorporating quality into data envelopment analysis of nursing home performance: A case study. Omega, 37(3), 672–685. doi: 10.1016/j.omega.2008.05.004.

Simar, L., & Wilson, P. W. (1998). Sensitivity analysis of efficiency scores: How to bootstrap in nonparametric frontier models. Management Science, 44(1), 49–61. doi: 10.1287/mnsc.44.1.49.

Simar, L., & Wilson, P. W. (2007). Estimation and inference in two-stage, semi-parametric models of production processes. Journal of Econometrics, 136(1), 31–64. doi: 10.1016/j.jeconom.2005.07.009.

Skevas, I., Emvalomatis, G., & Brümmer, B. (2018). Heterogeneity of long-run technical efficiency of German dairy farms: A Bayesian approach. Journal of Agricultural Economics, 69(1), 58–75. doi: 10.1111/1477-9552.12231.

Sommersguter-Reichmann, M. (2000). The impact of the Austrian hospital financing reform on hospital productivity: Empirical evidence on efficiency and technology changes using a nonparametric input-based Malmquist approach. Health Care Management Science, 3(4), 309–321. doi: 10.1023/A:1019022230731.

Stefko, R., Gavurova, B., & Kocisova, K. (2018). Healthcare efficiency assessment using DEA analysis in the Slovak Republic. Health Economics Review, 8(1), 6. doi: 10.1186/s13561-018-0191-9.

Steinmann, L., & Zweifel, P. (2003). On the (in)efficiency of Swiss hospitals. Applied Economics, 35(3), 361–370. doi: 10.1080/00036840210167183.

Sultan, W. I. M., & Crispim, J. (2018). Measuring the efficiency of Palestinian public hospitals during 2010–2015: An application of a two-stage DEA method. BMC Health Services Research, 18(1), 381. doi: 10.1186/s12913-018-3228-1.

Tambour, M. (1997). The impact of health care policy initiatives on productivity. Health Economics, 6(1), 57–70. doi: 10.1002/(sici)1099-1050(199701)6:1<57::Aid-hec243>3.0.Co;2-#.

Thanassoulis, E., Portela, M. C. A. S., & Graveney, M. (2016). Identifying the scope for savings at inpatient episode level: An illustration applying DEA to chronic obstructive pulmonary disease. European Journal of Operational Research, 255(2), 570–582. doi: 10.1016/j.ejor.2016.05.028.

Thanassoulis, E., Takamura-Tweedy, A., Patel, M., & Rao, S. (2020). A method for identifying cost-efficient practices in the treatment of thoracic empyema. British Journal of Healthcare Management, 26(7). doi: 10.12968/bjhc.2019.0010.

The Treasury. (2001). Capital charge: Formula and rates for 2002/03. https://www.treasury.govt.nz/sites/default/files/2007-11/tc-2001-16.pdf.

Tsionas, E. G. (2006). Inference in dynamic stochastic frontier models. Journal of Applied Econometrics, 21(5), 669–676. doi: 10.1002/jae.862.

Tsionas, E. G., & Kumbhakar, S. C. (2014). Firm heterogeneity, persistent and transient technical inefficiency: A generalised true random-effects model. Journal of Applied Econometrics, 29(1), 110–132. doi: 10.1002/jae.2300.

Vitaliano, D. F., & Toren, M. (1994). Cost and efficiency in nursing homes: A stochastic frontier approach. Journal of Health Economics, 13(3), 281–300. doi: 10.1016/0167-6296(94)90028-0.

Wagstaff, A. (1989). Estimating efficiency in the hospital sector: a comparison of three statistical cost frontier models. Applied Economics, 21(5), 659–672. doi: 10.1080/758524897.

Wang, H.-J., & Ho, C.-W. (2010). Estimating fixed-effect panel stochastic frontier models by model transformation. Journal of Econometrics, 157(2), 286–296. doi: 10.1016/j.jeconom.2009.12.006.

Webster, R., Kennedy, S., & Johnson, L. (1998). Comparing techniques for measuring the efficiency and productivity of Australian private hospitals (98/3). Canberra, ACT: Australian Bureau of Statistics. http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.197.1525&rep=rep1&type=pdf.

Widmer, P. K. (2015). Does prospective payment increase hospital (in)efficiency? Evidence from the Swiss hospital sector. The European Journal of Health Economics, 16(4), 407–419. doi: 10.1007/s10198-014-0581-9.

Worthington, A. C. (2004). Frontier efficiency measurement in health care: A review of empirical techniques and selected applications. Medical Care Research and Review, 61(2), 135–170. doi: 10.1177/1077558704263796.

Yang, J., & Zeng, W. (2014). The trade-offs between efficiency and quality in the hospital production: Some evidence from Shenzhen, China. China Economic Review, 31, 166–184. doi: 10.1016/j.chieco.2014.09.005.

Ye, Y., & Tao, Q. (2023). Measurement and characteristics of the temporal-spatial evolution of China’s healthcare services efficiency. Archives of Public Health, 81(1), 197. doi: 10.1186/s13690-023-01009-4.

Received: 2022-10-28
Revised: 2023-11-22
Accepted: 2023-12-11
Published Online: 2024-01-18

© 2024 the author(s), published by De Gruyter

This work is licensed under the Creative Commons Attribution 4.0 International License.
