
The Effects of Enforcement on Corporate Environmental Performance: The Role of Perceived Fairness

  • Dietrich Earnhart, Donna Ramirez Harrington and Robert Glicksman
Published/Copyright: January 24, 2020

Abstract

Several empirical studies explore the effects of regulatory enforcement on environmental behavior and performance. Within this literature, very little empirical research examines the role of fairness, which we interpret broadly to include multiple dimensions, e.g., similar treatment of similarly situated regulated entities. Our study empirically examines the effect of perceived enforcement fairness on the extent of compliance with wastewater limits imposed on chemical manufacturing facilities regulated under the Clean Water Act. Our study also explores the influence of perceived fairness on the effectiveness of enforcement efforts – government inspections and enforcement actions – at inducing better compliance. For our analysis, we use a subjective measure of the degree of “fair treatment” of regulated facilities by environmental regulators, as perceived by facilities and reported as survey responses. Results reveal that enforcement perceived as more fair raises compliance, but only under limited enforcement conditions; in most instances, it lowers compliance. As importantly, results show that greater perceived fairness improves the effectiveness of federal inspections and informal enforcement, but undermines the effectiveness of state inspections and formal non-penalty enforcement.

Award Identifier / Grant number: DFF – 4180-00147

Funding statement: This work was supported by the U.S. Environmental Protection Agency (Funder Id: http://doi.org/10.13039/100000139, Grant Number: STAR Research Assistance Agreement No. R-82882801).

Acknowledgements

The research described in this article was conducted as part of a larger project financed by the U.S. Environmental Protection Agency (EPA) pursuant to STAR Research Assistance Agreement No. R-82882801-0. This article has not been formally reviewed by EPA. The views expressed in this article are solely those of Robert Glicksman and Dietrich Earnhart. EPA does not endorse any products or commercial services mentioned in this manuscript. The authors thank Donald Haider-Markel and Tatsui Ebihara for their participation in the EPA STAR grant research project. The authors also thank Chris Drahozal, Joel Mintz, and Cliff Rechtschaffen for their very helpful insight. Dietrich Earnhart thanks Dylan Rassier, J. Mark Leonard, and Trisha Shrum for their valuable research assistance.

Appendix

A Incomplete response to survey of chemical manufacturing facilities

This appendix assesses the incomplete response to our original survey of chemical manufacturing facilities. Given the survey’s non-response rate of 73%, the potential for sample selection bias is a valid concern. As an initial assessment of this concern, we compare the original sample of 1,003 survey recipients to the 267 facilities that completed our survey. Based on this comparison, we find no systematic state or regional bias in survey participation. For example, the Midwest region is slightly over-represented in the response group, while the Northeast region is slightly under-represented; however, these differences are small. Moreover, across most of the states, the difference between representation in the set of survey recipients and representation in the response group averages less than two percent. In contrast, our initial assessment reveals some difference in the participation of major facilities versus minor facilities. In the sample of survey recipients, 69% of facilities are minor facilities and 31% are major facilities. In the group of survey respondents, major facilities are somewhat over-represented at 39%, a difference that proves statistically significant.

As a stronger assessment, we test for sample selection bias by assessing whether any relevant factors appear to affect a facility’s decision to complete our survey after being contacted. For this assessment, we use a probit estimator to capture the relationship between the binary decision whether or not to complete our survey and a set of explanatory factors, including major versus minor status, inspections, enforcement actions, and EPA region. This assessment reveals a bias in a single dimension: major facilities were more likely to respond to the survey than were minor facilities. Put differently, the analysis indicates that only the distinction between minor and major facilities proves important for explaining whether or not a contacted facility completed our survey. The analysis demonstrates that neither the preceding history of inspections nor the preceding enforcement actions against a particular facility explains whether or not a contacted facility responded to the survey. Moreover, the analysis demonstrates that the decision to respond is not explained by the EPA region in which a particular facility resides.
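
A selection test of this kind can be sketched as follows in Python; the data, variable names, and coefficient values are synthetic illustrations, not the study’s actual survey data, and the fitting routine is a simple gradient-ascent maximum-likelihood sketch rather than a production probit estimator.

```python
import math
import random

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def probit_mle(X, y, steps=300, lr=0.1):
    """Fit P(y=1|x) = Phi(x'beta) by gradient ascent on the log-likelihood.
    Each row of X starts with a 1 for the intercept."""
    k, n = len(X[0]), len(y)
    beta = [0.0] * k
    for _ in range(steps):
        grad = [0.0] * k
        for xi, yi in zip(X, y):
            z = sum(b * x for b, x in zip(beta, xi))
            p = min(max(norm_cdf(z), 1e-10), 1.0 - 1e-10)
            # Score contribution: (y - Phi) * phi / (Phi * (1 - Phi)) * x
            w = (yi - p) * norm_pdf(z) / (p * (1.0 - p))
            for j in range(k):
                grad[j] += w * xi[j]
        for j in range(k):
            beta[j] += lr * grad[j] / n
    return beta

# Synthetic illustration: "major" status raises the response probability,
# while a hypothetical inspection history has no effect (true coef = 0).
random.seed(0)
X, y = [], []
for _ in range(2000):
    major = 1.0 if random.random() < 0.3 else 0.0
    inspections = random.gauss(0.0, 1.0)
    latent = -0.5 + 1.0 * major  # true index; inspections excluded
    y.append(1 if random.gauss(0.0, 1.0) < latent else 0)
    X.append([1.0, major, inspections])

beta_hat = probit_mle(X, y)  # expect roughly [-0.5, 1.0, 0.0]
```

In this sketch, a clearly positive estimated coefficient on the major/minor indicator, alongside a coefficient near zero on the inspection history, mirrors the pattern of results described above.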

Thus, based on our analysis, it appears that a sample selection bias exists in only a single dimension: the distinction between a major facility and a minor facility. This single distinction proves irrelevant for our final sample of analysis since it includes only major facilities.

As one last form of sample selection assessment, we incorporate information on wastewater limits and discharges, for which data are publicly available only for major facilities, for both survey respondents and non-respondents. Consistent with our final sample of analysis, this last form of assessment focuses exclusively on major facilities. Using two-sample means t-tests, we demonstrate that the sample of survey respondents and the sample of survey non-respondents generated extremely similar discharge ratios for the time period covered by the survey instrument: January 1999 to March 2003. This analysis considers separately the two most prominent wastewater pollutants: Total Suspended Solids (TSS) and Biological Oxygen Demand (BOD). For the TSS discharge ratio, both sample means equal 0.267 and the t-test p-value is 0.969. For the BOD discharge ratio, the two sample means are nearly identical – 0.261 and 0.256 – and the t-test p-value is 0.616.
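
The two-sample comparison described above can be sketched with a Welch t statistic; the discharge-ratio values below are hypothetical placeholders, not the study’s data, chosen so that the two groups share the same mean.

```python
import math

def welch_t(a, b):
    """Welch two-sample t statistic and Satterthwaite degrees of freedom."""
    n1, n2 = len(a), len(b)
    m1, m2 = sum(a) / n1, sum(b) / n2
    v1 = sum((x - m1) ** 2 for x in a) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in b) / (n2 - 1)
    se2 = v1 / n1 + v2 / n2
    t = (m1 - m2) / math.sqrt(se2)
    df = se2 ** 2 / ((v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1))
    return t, df

# Hypothetical discharge ratios for respondents vs. non-respondents;
# equal group means imply a t statistic near zero.
respondents = [0.25, 0.27, 0.29, 0.26, 0.28]
nonrespondents = [0.26, 0.28, 0.25, 0.27, 0.29]
t_stat, df = welch_t(respondents, nonrespondents)
```

A t statistic near zero, as in the paper’s TSS and BOD comparisons, corresponds to a large p-value and thus no evidence of a respondent/non-respondent difference in mean discharge ratios.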

For all these reasons, our study does not correct for any potential sample selection bias. This lack of correction is consistent with prominently published studies of environmental management practices (Anton et al., 2004; Arimura et al., 2008).

B Focus on major facilities

Our exclusive focus on major facilities obviously constrains the generalizability of our results. Still, our analysis remains highly policy-relevant since the EPA focuses its regulatory efforts on major facilities (Earnhart, 2004a, 2004b, 2009; Earnhart and Segerson, 2012). Moreover, major facilities vary among themselves with respect to several dimensions, such as their environmental management practices and product mix, as reflected in the standard deviation values reported in Table 1. Therefore, our empirical analysis exploits variation even within a sample of major facilities. In addition, major facilities represented 21% of the chemical manufacturing facilities regulated under the NPDES program in 2001. Given their relative size, we suspect that major facilities were responsible for the bulk of wastewater discharges from this sector during the sample period. (The EPA does not systematically maintain data on discharges from minor facilities for any sector. The data coverage of the EPA Discharge Monitoring Report (DMR) Pollutant Loading Tool, which is designed to provide data on discharge loadings, confirms this point; see http://cfpub.epa.gov/dmr. Thus, we are not able to substantiate our assertion.)

Finally, all previous studies of U.S. wastewater discharges exclusively examine major facilities (e.g., Earnhart, 2004a; Shimshack and Ward, 2005, 2009; Earnhart and Segerson, 2012). Nevertheless, we acknowledge that major and minor facilities may respond differently to perceived enforcement fairness. Due to our data constraints, we cannot assess this possibility. Fortunately, our analysis of the full sample of surveyed facilities reveals no statistical link from major/minor classification to perceived enforcement fairness. (This conclusion is robust across multiple econometric estimators, regressor sets, and sub-sample time periods.) Therefore, this concern does not seem relevant, at least not in our sample. In addition, we acknowledge that major and minor facilities may face different variation in enforcement fairness. If true, the extent of enforcement fairness is correlated with the NPDES major/minor classification. This correlation is not problematic since we measure enforcement fairness, at least its perception, and focus exclusively on major facilities.

C Relevance and validity of instruments

This appendix assesses the relevance and validity of the instruments used in our instrumental variables estimation. Appendix Tables 5 and 6 display the first-stage estimates for the TSS and BOD samples, respectively. To assess the relevance of our instruments, we primarily test for under-identification in the first stage of estimation. Based on both the Angrist-Pischke χ2 test statistics and the Anderson canonical correlation Lagrange multiplier test statistics, we reject the null hypothesis of under-identification given p-values of 0.008 and 0.016, respectively, for the TSS regression sample and p-values of 0.0004 and 0.007, respectively, for the BOD regression sample. Moreover, we conduct a partial F-test based on the coefficients estimated for the instruments used in the first stage. The test statistics demonstrate that the instruments appear relevant. For the TSS sample, the partial F-test statistic equals 2.34 (p = 0.036); therefore, the instruments are jointly significant at the 5% level. One instrument proves individually significant at the 5% level: lagged enforcement is associated with perceived enforcement fairness. For the BOD sample, the partial F-test statistic equals 2.48 (p = 0.076); therefore, the instruments are jointly significant, but only at the 8% level. Again, one instrument proves significant: lagged enforcement is associated with perceived enforcement fairness.
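
The partial F-test logic (comparing the first-stage regression with and without the excluded instruments) can be sketched as follows; the data-generating process, coefficient values, and variable names are illustrative assumptions, not the study’s data.

```python
import random

def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def ols(X, y):
    """OLS coefficients via the normal equations."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

def rss(X, y, beta):
    return sum((yi - sum(b * x for b, x in zip(beta, xi))) ** 2
               for xi, yi in zip(X, y))

# Synthetic first stage: perceived fairness driven by two instruments
# plus one control factor (all coefficients are illustrative).
random.seed(1)
n = 1000
X_unrestricted, X_restricted, fairness = [], [], []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)   # instruments
    control = random.gauss(0, 1)
    f = 0.3 * z1 + 0.3 * z2 + 0.5 * control + random.gauss(0, 1)
    X_unrestricted.append([1.0, z1, z2, control])
    X_restricted.append([1.0, control])
    fairness.append(f)

rss_u = rss(X_unrestricted, fairness, ols(X_unrestricted, fairness))
rss_r = rss(X_restricted, fairness, ols(X_restricted, fairness))
q, k = 2, 4  # excluded instruments; regressors in the unrestricted model
partial_f = ((rss_r - rss_u) / q) / (rss_u / (n - k))
```

A large partial F statistic indicates that the excluded instruments jointly explain the endogenous regressor; the paper’s values of 2.34 and 2.48 are far more modest than this deliberately strong synthetic example.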

To assess the validity of our instruments, we primarily utilize the Sargan-Hansen test of overidentifying restrictions. For both the TSS and BOD regression samples, the Sargan-Hansen test statistic fails to reject the null hypothesis of valid orthogonality conditions given p-values of 0.118 and 0.765, respectively. Moreover, we conduct weak-instrument robust inference tests: the Anderson-Rubin Wald test and the Stock-Wright Lagrange multiplier test. These tests also assess whether the overidentifying restrictions are valid. Unlike the tests offered by Stock and Yogo (2005) and Staiger and Stock (1997), these tests do not rely on the F-statistic exceeding a specific value (22.3 or 13.91, respectively). Instead, these tests are robust to the presence of weak instruments, i.e., each test statistic has the correct size even when the instruments are weak, and robust to accidental exclusion of relevant instruments (Dufour and Taamouti, 2005). For both regression samples, based on both tests, the statistics fail to reject the null hypothesis of valid orthogonality conditions, given p-values of 0.272 and 0.177, respectively, for the TSS sample and p-values of 0.884 and 0.686, respectively, for the BOD sample.
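
A minimal sketch of the Sargan over-identification statistic follows, computed from a hand-rolled two-stage least squares fit on synthetic data; all names and values are illustrative assumptions. With one over-identifying restriction, the statistic is compared against a χ²(1) distribution.

```python
import random

def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def ols(X, y):
    """OLS coefficients via the normal equations."""
    k = len(X[0])
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    return solve(XtX, Xty)

# Synthetic data: x is endogenous (it shares the confounder u with the
# error), z1 and z2 are valid instruments; true coefficient is 1.5.
random.seed(2)
n = 2000
Z, X, Y = [], [], []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    u = random.gauss(0, 1)
    x = z1 + z2 + u
    y = 1.5 * x + u + random.gauss(0, 1)
    Z.append([1.0, z1, z2])
    X.append([1.0, x])
    Y.append(y)

# 2SLS: first stage fits x on the instruments; second stage uses fitted x.
g = ols(Z, [row[1] for row in X])
xhat = [sum(b * zi for b, zi in zip(g, z)) for z in Z]
b_2sls = ols([[1.0, xh] for xh in xhat], Y)
b_ols = ols(X, Y)  # biased benchmark

# Sargan statistic: n * R^2 from regressing the structural residuals
# (computed with the actual x) on the full instrument set.
resid = [yi - b_2sls[0] - b_2sls[1] * xi[1] for xi, yi in zip(X, Y)]
gr = ols(Z, resid)
fitted = [sum(b * zi for b, zi in zip(gr, z)) for z in Z]
mean_r = sum(resid) / n
tss = sum((r - mean_r) ** 2 for r in resid)
j_stat = n * (1.0 - sum((r - f) ** 2 for r, f in zip(resid, fitted)) / tss)
```

When the instruments are valid, the statistic should typically fall below the χ²(1) critical value of 3.84 at the 5% level, so the test fails to reject the orthogonality conditions, as in the paper’s TSS and BOD samples.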

This assessment uses the relevant regressors in levels. When our assessment instead addresses the endogeneity of the lagged dependent variable, we use the regressors in first-differences. This alternative assessment generates test statistics that similarly support our conclusion that the chosen instruments appear both relevant and not invalid.

All of these points notwithstanding, we acknowledge that our instruments may prove weak. In this case, our ability to test properly the exogeneity of the primary regressor – perceived enforcement fairness – is similarly weak.

Table 5:

First-stage estimation of perceived enforcement fairness: based on TSS sample.

Variable                                              Coeff    p-value
Instruments
  Lagged Immunity Audit Policy                         0.22     0.22
  Lagged Privilege Audit Policy                        0.031    0.89
  Lagged Immunity and Privilege Audit Policy          −0.084    0.40
  Lagged Enforcement                                   0.30     0.02
Control Factors
  Audits                                              −0.006    0.16
  Lagged TSS Discharge Ratio                           0.52     0.01
  Year 2001                                            0.02     0.78
  Region 1                                             0.97     0.01
  Region 2                                             0.73     0.07
  Region 3                                             0.91     0.02
  Region 4                                             0.88     0.01
  Region 5                                             1.11     0.01
  Region 6                                             0.80     0.03
  State Inspections                                    0.16     0.18
  Federal Inspections                                  0.76     0.63
  Formal Non-penalty Enforcement                      −0.64     0.38
  Informal Enforcement                                 0.34     0.54
  Penalty Enforcement                                  0.009    0.71
  Firm Environmental Employees                        −0.028    0.20
  Organic Chemical Sector                             −0.12     0.27
  Inorganic Chemical Sector                           −0.23     0.07
  Firm Ownership Structure                             0.078    0.61
  Treatment Technology                                 0.55     0.02
  Local Community Pressure                             0.17     0.29
  Constant                                            −0.96     0.04

Test                                              Statistic    p-value
Exogeneity Tests
  Wu-Hausman Exogeneity Test                           0.525    0.470
  Durbin-Wu-Hausman Exogeneity Test                    0.627    0.428
Partial F-test of Instruments                          2.34     0.036
Underidentification Tests
  Angrist-Pischke χ2 Test                             17.47     0.008
  Anderson Canonical Correlation Lagrange Multiplier  15.58     0.016
Sargan-Hansen Test of Overidentifying Restrictions     8.78     0.118
Weak Instrument Robust Inference Tests
  Anderson-Rubin Wald Test                             1.28     0.272
  Stock-Wright Lagrange Multiplier Test                8.94     0.177
  1. p-values lying at or below 0.10 shown in bold.

  2. Null hypotheses for tests shown above:

  3. Exogeneity: enforcement fairness variable is exogenous

  4. Instrument relevance: instruments’ coefficients jointly equal zero

  5. Underidentification: system is under-identified

  6. Overidentifying restrictions: instrumental variables estimation is valid

  7. Weak instrument: orthogonality conditions are valid

  8. Estimation also includes lagged federal and state inspections as regressors.

Table 6:

First-stage estimation of perceived enforcement fairness: based on BOD sample.

Variable                                              Coeff    p-value
Instruments
  Lagged Immunity Audit Policy                        −0.44     0.13
  Lagged Privilege Audit Policy                        0.05     0.86
  Lagged Immunity and Privilege Audit Policy          −0.23     0.24
  Lagged Enforcement                                −110.40     0.02
Control Factors
  Audits                                              −0.003    0.65
  Lagged BOD Discharge Ratio                           0.05     0.23
  Year 2001                                            0.02     0.78
  Region 1                                            14.79     0.19
  Region 2                                            44.18     0.05
  Region 3                                            −7.08     0.04
  Region 4                                            −7.77     0.15
  Region 5                                            −7.06     0.05
  State Inspections                                    0.31     0.10
  Federal Inspections                                 12.27     0.81
  Formal Non-penalty Enforcement                     744.51     0.02
  Informal Enforcement                               −82.97     0.05
  Penalty Enforcement                                 −0.003    0.50
  Firm Environmental Employees                        −0.10     0.01
  Organic Chemical Sector                              0.08     0.66
  Inorganic Chemical Sector                           −0.29     0.32
  Firm Ownership Structure                             2.56     0.03
  Treatment Technology                                −0.44     0.22
  Local Community Pressure                            −0.72     0.11
  Constant                                             6.90     0.01

Test                                              Statistic    p-value
Exogeneity Tests
  Wu-Hausman Exogeneity Test                           0.02     0.886
  Durbin-Wu-Hausman Exogeneity Test                    0.04     0.841
Partial F-test of Instruments                          2.48     0.076
Underidentification Tests
  Angrist-Pischke χ2 Test                             20.76     0.000
  Anderson Canonical Correlation Lagrange Multiplier   4.10     0.007
Sargan-Hansen Test of Overidentifying Restrictions     1.15     0.765
Weak Instrument Robust Inference Tests
  Anderson-Rubin Wald Test                             0.29     0.884
  Stock-Wright Lagrange Multiplier Test                2.27     0.686
  1. p-values lying at or below 0.10 shown in bold.

  2. Null hypotheses for tests shown above:

  3. Exogeneity: enforcement fairness variable is exogenous

  4. Instrument relevance: instruments’ coefficients jointly equal zero

  5. Underidentification: system is under-identified

  6. Overidentifying restrictions: instrumental variables estimation is valid

  7. Weak instrument: orthogonality conditions are valid

  8. Estimation excludes lagged federal and state inspections as regressors, unlike the first-stage estimation based on the TSS sample.

References

Anderson, T.W., and Cheng Hsiao. 1982. “Formulation and Estimation of Dynamic Models Using Panel Data.” 18 Journal of Econometrics 47–82. doi:10.1016/0304-4076(82)90095-1

Andreen, William. 2007. “Motivating Enforcement: Institutional Culture and the Clean Water Act.” 24 Pace Environmental Law Review 67–98. doi:10.58948/0738-6206.1054

Anton, Willma Rose, George Deltas, and Madhu Khanna. 2004. “Incentives for Environmental Self-Regulation and Implications for Environmental Performance.” 48 Journal of Environmental Economics and Management 632–654. doi:10.1016/j.jeem.2003.06.003

Arimura, Toshi, Nicole Darnall, and Hajime Katayama. 2011. “Is ISO 14001 a Gateway to More Advanced Voluntary Action? The Case of Green Supply Chain Management.” 61 Journal of Environmental Economics and Management 170–182. doi:10.1016/j.jeem.2010.11.003

Arimura, Toshi, Akira Hibiki, and Hajime Katayama. 2008. “Is a Voluntary Approach an Effective Environmental Policy Instrument? A Case for Environmental Management Systems.” 55 Journal of Environmental Economics and Management 281–295. doi:10.2139/ssrn.1001325

Ayres, Ian, and John Braithwaite. 1992. Responsive Regulation: Transcending the Deregulation Debate. New York: Oxford University Press. doi:10.1093/oso/9780195070705.001.0001

Becker, Gary. 1968. “Crime and Punishment: An Economic Approach.” 76 Journal of Political Economy 169–217. doi:10.1086/259394

Burby, Raymond. 1995. “Coercive vs. Cooperative Pollution Control: Comparative Study of State Programs to Reduce Erosion and Sedimentation Pollution in Urban Areas.” 19 Environmental Management 359–361. doi:10.1007/BF02471978

Burby, Raymond, and Robert Paterson. 1993. “Improving Compliance with State Environmental Regulations.” 12 Journal of Policy Analysis and Management 753–756. doi:10.2307/3325349

Costle, Douglas. 1982. “Environmental Regulation and Regulatory Reform.” 57 Washington Law Review 409–431.

Craig, Robin Kundis. 2010. “The Public Health Aspects of Environmental Enforcement.” 4 Pittsburgh Journal of Environmental Public Health Law 1–71. doi:10.5195/pjephl.2010.17

Dasgupta, Susmita, Hemamala Hettige, and David Wheeler. 2000. “What Improves Environmental Compliance? Evidence from Mexican Industry.” 39 Journal of Environmental Economics and Management 39–66. doi:10.1006/jeem.1999.1090

Deily, Mary, and Wayne Gray. 1991. “Enforcement of Pollution Regulations in a Declining Industry.” 21 Journal of Environmental Economics and Management 260–274. doi:10.26509/frbc-wp-198912

Dion, C., Paul Lanoie, and Benoît Laplante. 1998. “Monitoring of Pollution: Do Local Conditions Matter?” 13 Journal of Regulatory Economics 5–18. doi:10.1023/A:1007970031068

Dufour, Jean-Marie, and Mohamed Taamouti. 2005. “Projection-Based Statistical Inference in Linear Structural Models with Possibly Weak Instruments.” 73 Econometrica 1351–1365. doi:10.1111/j.1468-0262.2005.00618.x

Earnhart, Dietrich. 2004a. “Panel Data Analysis of Regulatory Factors Shaping Environmental Performance.” 86 Review of Economics and Statistics 391–401. doi:10.1162/003465304323023895

Earnhart, Dietrich. 2004b. “Regulatory Factors Shaping Environmental Performance at Publicly-Owned Treatment Plants.” 48 Journal of Environmental Economics and Management 655–681. doi:10.1016/j.jeem.2003.10.004

Earnhart, Dietrich. 2004c. “The Effects of Community Characteristics on Polluter Compliance Levels.” 80 Land Economics 408–432. doi:10.2307/3654729

Earnhart, Dietrich. 2009. “The Influence of Facility Characteristics and Permit Conditions on the Effects of Environmental Regulatory Deterrence.” 36 Journal of Regulatory Economics 247–273. doi:10.1007/s11149-009-9095-2

Earnhart, Dietrich, and Robert Glicksman. 2015. “Coercive vs. Cooperative Enforcement: Effect of Enforcement Approach on Environmental Management.” 42 International Review of Law and Economics 135–146. doi:10.1016/j.irle.2015.02.003

Earnhart, Dietrich, and Kathleen Segerson. 2012. “The Influence of Financial Status on the Effectiveness of Environmental Enforcement.” 96 Journal of Public Economics 670–684. doi:10.1016/j.jpubeco.2012.05.002

EPA. 1990. “A Primer on the Office of Water Enforcement and Permits and Its Programs,” Office of Water, Environmental Protection Agency, Washington, DC. March 1990.

EPA. 1997. “Chemical Industry National Environmental Baseline Report 1990–1994,” Office of Enforcement and Compliance Assurance, Environmental Protection Agency. EPA 305-R-96-002, October 1997.

EPA. 1999. “EPA/CMA Root Cause Analysis Pilot Project: An Industry Survey,” Environmental Protection Agency. EPA-305-R-99-001, May 1999.

EPA. 2000. U.S. EPA Strategic Plan 2000. Washington, DC: Environmental Protection Agency.

EPA. 2011. Technical Support Document for the 2010 Effluent Guidelines Program Plan. Washington, DC: Environmental Protection Agency.

Ervin, David, JunJie Wu, Madhu Khanna, Cody Jones, and Teresa Wirkkala. 2013. “Motivations and Barriers to Corporate Environmental Management.” 22 Business Strategy and the Environment 390–409. doi:10.1002/bse.1752

Evans, Mary, Lirong Liu, and Sarah Stafford. 2011. “Do Environmental Audits Improve Long-Term Compliance? Evidence from Manufacturing Facilities in Michigan.” 40 Journal of Regulatory Economics 279–302. doi:10.1007/s11149-011-9163-2

Glicksman, Robert, and Dietrich Earnhart. 2007. “The Comparative Effectiveness of Government Interventions on Environmental Performance in the Chemical Industry.” 26 Stanford Environmental Law Journal 112–139.

Harrington, Donna Ramirez. 2013. “Effectiveness of State Pollution Prevention Programs and Policies.” 31 Contemporary Economic Policy 255–278. doi:10.1111/j.1465-7287.2011.00312.x

Harrington, Donna Ramirez, Madhu Khanna, and George Deltas. 2008. “Striving to Be Green: The Adoption of Total Quality Environmental Management.” 40 Applied Economics 2995–3007. doi:10.1080/00036840600994005

Harrington, Winston. 1988. “Enforcement Leverage When Penalties Are Restricted.” 37 Journal of Public Economics 29–53. doi:10.1016/0047-2727(88)90003-5

Harrison, Kathryn. 1995. “Is Cooperation the Answer? Canadian Environmental Enforcement in Comparative Context.” 14 Journal of Policy Analysis and Management 221–223. doi:10.2307/3325151

Helland, Eric. 1998. “The Enforcement of Pollution Control Laws: Inspections, Violations, and Self-Reporting.” 80 Review of Economics and Statistics 141–153. doi:10.1162/003465398557249

Henriques, Irene, and Perry Sadorsky. 1996. “The Determinants of an Environmentally Responsive Firm: An Empirical Approach.” 30 Journal of Environmental Economics and Management 381–395. doi:10.1006/jeem.1996.0026

Hsu, Shi-Ling. 2004. “Fairness versus Efficiency in Environmental Law.” 31 Ecology Law Quarterly 303–401. doi:10.2139/ssrn.442420

Islam, Nazrul. 2001. “Small Sample Performance of Dynamic Panel Data Estimators in Estimating the Growth-Convergence Equation: A Monte Carlo Study,” in Badi Baltagi, Thomas Fomby, and R. Carter Hill, eds. Nonstationary Panels, Panel Cointegration, and Dynamic Panels (Advances in Econometrics, Volume 15). Bingley, England: Emerald Group Publishing Limited. 317–339. doi:10.1016/S0731-9053(00)15012-1

Jaffe, Adam, and Robert Stavins. 1995. “Dynamic Incentives of Environmental Regulation: The Effects of Alternative Policy Instruments on Technology Diffusion.” 29 Journal of Environmental Economics and Management S43–63. doi:10.1006/jeem.1995.1060

Johnson, Stephen. 1999. “Economics v. Equity: Do Market-Based Environmental Reforms Exacerbate Environmental Justice?” 56 Washington & Lee Law Review 111–166.

Kagan, Robert. 1994. “Regulatory Enforcement,” in David Rosenbloom, and Richard D. Schwartz, eds. Handbook of Administrative Law and Regulation. New York, NY: Marcel Dekker.

Kagan, Robert, Neil Gunningham, and Dorothy Thornton. 2003. “Explaining Corporate Environmental Performance: How Does Regulation Matter?” 37 Law and Society Review 51–89. doi:10.1111/1540-5893.3701002

Khanna, Madhu, George Deltas, and Donna Ramirez Harrington. 2009. “Adoption of Pollution Prevention Techniques: The Role of Management Systems and Regulatory Pressures.” 44 Environmental and Resource Economics 85–106. doi:10.1007/s10640-009-9263-y

Khanna, Madhu, Patricia Koss, Cody Jones, and David Ervin. 2007. “Motivations for Voluntary Environmental Management.” 35 The Policy Studies Journal 751–772. doi:10.1111/j.1541-0072.2007.00246.x

Khanna, Madhu, and Diah Widyawati. 2011. “Fostering Regulatory Compliance: The Role of Environmental Self-Auditing and Audit Policies.” 7 Review of Law and Economics 129–163. doi:10.2202/1555-5879.1483

Kuehn, Robert. 1996. “The Limits of Devolving Enforcement of Federal Environmental Laws.” 70 Tulane Law Review 2373–2393.

Kuehn, Robert. 2015. “Bias in Environmental Agency Decision Making.” 45 Environmental Law 957–1019.

Laplante, Benoît, and Paul Rilstone. 1996. “Environmental Inspections and Emissions of the Pulp and Paper Industry in Quebec.” 31 Journal of Environmental Economics and Management 19–36. doi:10.1006/jeem.1996.0029

Lazarus, Richard. 1993. “Pursuing ‘Environmental Justice’: The Distributional Effects of Environmental Protection.” 87 Northwestern University Law Review 787–857.

Lazarus, Richard. 1997. “Fairness in Environmental Law.” 27 Environmental Law 705–739.

Lazarus, Richard, and Stephanie Tai. 1999. “Integrating Environmental Justice into EPA Permitting Authority.” 26 Ecology Law Quarterly 617–678. doi:10.2139/ssrn.209190

Lederman, Leandra. 2003. “Tax Compliance and the Reformed IRS.” 51 University of Kansas Law Review 971–1011. doi:10.2139/ssrn.391134

Magat, Wesley, and W. Kip Viscusi. 1990. “Effectiveness of the EPA’s Regulatory Enforcement: The Case of Industrial Effluent Standards.” 33 Journal of Law and Economics 331–360. doi:10.1086/467208

Markell, David. 2000. “The Role of Deterrence-Based Enforcement in a ‘Reinvented’ State/Federal Relationship: The Divide between Theory and Reality.” 24 Harvard Environmental Law Review 1–5.

Markell, David. 2005. “‘Slack’ in the Administrative State and Its Implications for Governance: The Issue of Accountability.” 84 Oregon Law Review 1–22.

Mintz, Joel. 1995. Enforcement at the EPA: High Stakes and Hard Choices. Austin: University of Texas Press.

Nakamura, Masao, Takuya Takahashi, and Ilan Vertinsky. 2001. “Why Japanese Firms Choose to Certify: A Study of Managerial Responses to Environmental Issues.” 42 Journal of Environmental Economics and Management 23–52. doi:10.1006/jeem.2000.1148

Pargal, Sheoli, and David Wheeler. 1996. “Informal Regulation of Industrial Pollution in Developing Countries: Evidence from Indonesia.” 104 Journal of Political Economy 1314–1327. doi:10.1086/262061

Paxson, M.C. 1992. “Response Rates for 183 Studies,” Working Paper, Washington State University.

Peltzman, Samuel. 1976. “Toward a More General Theory of Regulation.” 19 The Journal of Law and Economics 211–240. doi:10.3386/w0133

Posner, Richard. 1974. “Theories of Economic Regulation.” 5 Bell Journal of Economics 335–358. doi:10.3386/w0041

Rechtschaffen, Clifford. 1998. “Deterrence vs. Cooperation and the Evolving Theory of Environmental Enforcement.” 71 Southern California Law Review 1181–1188.

Rechtschaffen, Clifford, and David Markell. 2003. Reinventing Environmental Enforcement and the State/Federal Relationship. Washington, DC: Environmental Law Institute.

Sah, Raaj. 1991. “Social Osmosis and Patterns of Crime.” 99 Journal of Political Economy 1272–1295. doi:10.1086/261800

Scholz, John. 1984. “Cooperation, Deterrence, and the Ecology of Regulatory Enforcement.” 18 Law and Society Review 179–180. doi:10.2307/3053402

Schroeder, Christopher. 1993. “Cool Analysis Versus Moral Outrage in the Development of Federal Environmental Criminal Law.” 35 William & Mary Law Review 251–269.

Shimshack, Jay, and Michael Ward. 2005. “Regulator Reputation, Enforcement, and Environmental Compliance.” 50 Journal of Environmental Economics and Management 519–540. doi:10.1016/j.jeem.2005.02.002

Shimshack, Jay, and Michael Ward. 2008. “Enforcement and Over-Compliance.” 55 Journal of Environmental Economics and Management 90–105. doi:10.1016/j.jeem.2007.05.003

Short, Jodi, and Michael Toffel. 2008. “Coerced Confessions: Self-Policing in the Shadow of the Regulator.” 24 Journal of Law, Economics and Organization (May) 45–71. doi:10.1093/jleo/ewm039

Short, Jodi, and Michael Toffel. 2010. “Making Self-Regulation More than Merely Symbolic: The Critical Role of the Legal Environment.” 55 Administrative Science Quarterly 361–396. doi:10.2189/asqu.2010.55.3.361

Silverman, S.L. 1990. “Federal Enforcement of Environmental Laws.” 75 Massachusetts Law Review 95–98.

Stafford, Sarah. 2011. “Outsourcing Enforcement: Principles to Guide Self-Policing Regimes.” 32 Cardozo Law Review 2293–2323.

Staiger, Douglas, and James Stock. 1997. “Instrumental Variables Regression with Weak Instruments.” 65 Econometrica 557–586. doi:10.3386/t0151

Stigler, George. 1971. “The Theory of Economic Regulation.” 2 The Bell Journal of Economics and Management Science 3–21. doi:10.4324/9781315495811-8

Stock, James, and Motohiro Yogo. 2005. “Testing for Weak Instruments in Linear IV Regression,” Chapter 5 in J.H. Stock, and D.W.K. Andrews, eds. Identification and Inference for Econometric Models: Essays in Honor of Thomas J. Rothenberg. Cambridge, UK: Cambridge University Press. doi:10.1017/CBO9780511614491

Stoughton, Mark, Jeanne Herb, Jennifer Sullivan, and Michael Crow. 2001. “Toward Integrated Approaches to Compliance Assurance.” 31 Environmental Law Reporter 11266–11283.

Tarlock, A. Dan. 1992. “Environmental Protection: The Potential Misfit between Equity and Efficiency.” 63 University of Colorado Law Review 871–900.

Toffel, Michael, and Jodi Short. 2011. “Coming Clean and Cleaning Up: Does Voluntary Self-Reporting Indicate Effective Self-Policing?” 54 Journal of Law and Economics (August) 609–649. doi:10.1086/658494

Wasserman, Cheryl. 1984. “Improving the Efficiency and Effectiveness of Compliance Monitoring and Enforcement of Environmental Policies, United States: A National Review,” OECD.

White, Christen Carlson. 1996. “Regulation of Leaky Underground Fuel Tanks: An Anatomy of Regulatory Failure.” 14 UCLA Journal of Environmental Law and Policy 105–177. doi:10.5070/L5141018909

Wiener, Jonathan Baert. 1999. “Global Environmental Regulation: Instrument Choice in Legal Context.” 108 Yale Law Journal 677–800. doi:10.2307/797394

Wooldridge, Jeffrey. 2003. Introductory Econometrics: A Modern Approach, 2nd ed. Mason, OH: South-Western.

Zinn, Matthew. 2002. “Policing Environmental Regulatory Enforcement: Cooperation, Capture and Citizen Suits.” 21 Stanford Environmental Law Journal 81–90.


Supplementary Material

The online version of this article offers supplementary material (https://doi.org/10.1515/rle-2019-0012).


Published Online: 2020-01-24

© 2020 Walter de Gruyter GmbH, Berlin/Boston
