Summary
Scientific fraud damages the credibility of science. Benford’s law, a distribution describing the probability of leading significant digits in many empirical data sets, has been proposed as an instrument for detecting such deceit. If the digits of a data set are expected to be Benford-distributed but the empirical observations deviate from this law, the deviation provides evidence of fraud.
This article analyses the practicability and efficiency of the digit distribution as a tool for investigating scientific fraud. In our context, efficiency means that only little data is required to discover forgery. Furthermore, we present a Benford-based method that detects deceit more effectively and can be extended to several other fields of digit analysis. We restrict the analysis to the research area of non-standardized regressions. The results reproduce and extend the finding that non-standardized regression coefficients follow Benford’s law. Moreover, the data show that investigating regressions pooled across different researchers demands more observations, and is hence less efficient, than investigating the regressions of a single person. Consequently, the digit distribution can reveal indications of fraud, but only if the proportion of forged values in the data is large. As the proportion of fabricated values decreases, the number of cases required to detect a significant difference between genuine and fraudulent regressions rises. If only few scientists forge results, the investigation method therefore becomes ineffective and inapplicable.
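The first-digit test underlying the approach described above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors’ implementation: it computes the Benford probabilities P(d) = log10(1 + 1/d) for the leading digits d = 1, …, 9 and a Pearson chi-square statistic comparing observed first-digit frequencies (e.g. of regression coefficients) against the Benford expectation. All function names are hypothetical.

```python
import math
from collections import Counter

def benford_probs():
    # Benford's law: P(d) = log10(1 + 1/d) for the first digit d = 1..9
    return {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit(x):
    # Leading significant digit of a nonzero number, via scientific notation
    # (e.g. 0.0032 -> '3.200000e-03' -> 3)
    return int(f"{abs(x):e}"[0])

def chi_square_stat(values):
    # Pearson chi-square statistic with 8 degrees of freedom:
    # sum over digits of (observed - expected)^2 / expected,
    # where expected = n * P(d) under Benford's law
    n = len(values)
    observed = Counter(first_digit(v) for v in values)
    return sum(
        (observed.get(d, 0) - n * p) ** 2 / (n * p)
        for d, p in benford_probs().items()
    )
```

A large statistic (relative to the chi-square distribution with 8 degrees of freedom) would flag a deviation from Benford’s law; as the summary notes, the power of such a test drops quickly when only a small fraction of the values is fabricated.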
© 2011 by Lucius & Lucius, Stuttgart
Articles in the same Issue
- Titelei
- Inhalt / Contents
- Guest Editorial
- Abhandlungen / Original Papers
- The Production of Historical “Facts”: How the Wrong Number of Participants in the Leipzig Monday Demonstration on October 9, 1989 Became a Convention
- “True Believers” or Numerical Terrorism at the Nuclear Power Plant
- One-eyed Epidemiologic Dummies at Nuclear Power Plants
- Are Most Published Research Findings False?
- What Fuels Publication Bias?
- The Identification and Prevention of Publication Bias in the Social Sciences and Economics
- Benford’s Law as an Instrument for Fraud Detection in Surveys Using the Data of the Socio-Economic Panel (SOEP)
- When Does the Second-Digit Benford’s Law-Test Signal an Election Fraud?
- Difficulties Detecting Fraud? The Use of Benford’s Law on Regression Tables
- Plagiarism in Student Papers: Prevalence Estimates Using Special Techniques for Sensitive Questions
- Pitfalls of International Comparative Research: Taking Acquiescence into Account
- Buchbesprechungen / Book Reviews