Abstract
Philosophers of science are divided over the interpretation of scientific normativity. Larry Laudan defends a goal-directed conception of the rules of scientific methodology. In contrast, Gerard Doppelt thinks methodological rules are a mixed batch, in that some are goal-oriented hypothetical rules while others are goal-independent categorical rules. David Resnik thinks that the debate between them has reached a standstill. He further holds that certain rules, such as the rule of consistency, are goal-independent. Instead, he proposes a holistic understanding of scientific methodology. Taking a cue from Resnik, the present paper also advocates a holistic understanding of scientific methodology. Given that many scientific practices deal with systems, the focus will be on such systems, each taken as a constellation of methodological norms. By treating each system as a set of mutually supportive methodological rules whose instrumental values underwrite the coherence relation among them, the paper aims to provide a viable holistic epistemological account that can explain scientific normativity at work in a scientific system. The paper will lay down specific holistic criteria for understanding scientific methodology. They will be used to show how a holistic account could satisfactorily explain the success of the Compact Muon Solenoid (CMS) experiment in discovering the Higgs boson, and how it can account for the instrumental error behind the apparent faster-than-light neutrino anomaly of the Oscillation Project with Emulsion-tRacking Apparatus (OPERA) experiment.
1 Introduction
Methodology discussions have mainly stressed the nature of individual methodological rules in isolation, asking whether they are categorical or instrumental. Many instances of actual scientific practice, however, deal with a constellation of methodological commitments or rules. We may therefore ask what the nature of the individual methodological rules could be in light of the roles they play in a system, and what the nature of the system itself is. By assuming an instrumental conception of individual methodological rules, in the sense of the instrumental roles they play in furthering the system’s goal, the present paper proposes to understand the nature of the system as holistic, where the holism is expressed in terms of coherence. That is, the justification of the individual methodological rules is expressed in terms of their instrumental roles, and that of the system in holistic terms. The holistic account attempts to understand how scientific normativity works in the methodological system.
It may be noted that various attempts[1] have been made to address the foundational question of scientific normativity. Taking normativity as the state of being subject to norms, which are rules that tell us what we should and should not do, these attempts have sought to understand the source of scientific normativity. By grounding scientific normativity in pragmatic terms, thinkers such as Laudan (1984) have attempted to address the long-held division among philosophers over the different treatments of scientific normativity, either as monolithic (or universal) or as pluralistic.[2] On one extreme, the early Thomas Kuhn,[3] Feyerabend, and the like advocate radical methodological relativism. On the other extreme, there are universalists such as Popper, whose falsificationism takes the scientific method to be a ruthless attempt to falsify the hypotheses that scientists propose to explain phenomena. John Worrall (1988, 1989) can also be taken to advocate a specific form of the universalist view when he claims that there is an invariant core of methodological rules which are fixed, unchanging, and universal. This methodological division invites a re-examination of scientific normativity by looking at its social foundation. It helps us appreciate how a pragmatic account of scientific normativity can justify a methodological choice by appealing to the values, goals, interests, and assumptions of the circumstances or groups in question. Laudan’s (1987, 24) formulation that “if one’s goal is X, then one ought to do Y” can be re-expressed as “if C obtains, then one ought to do Y,” where C comprises the values, interests, goals, and assumptions that characterize the circumstances or group in question and Y the method to be adopted.
Laudan’s instrumental understanding of scientific methodology does not sit well with Doppelt and Resnik. Doppelt (1990) has maintained that the instrumental rendition of scientific methodology does not extend to basic methodological rules such as the rule of predesignation and the principle of consilience of induction (1990, 9). He claims they are best treated as categorical imperatives because they are justified in and of themselves, irrespective of any cognitive ends. He has criticized the conclusion Laudan draws from the underdetermination thesis, namely that the connection between the means and the associated cognitive ends embodied by methodological rules is contingent; on Doppelt’s view, the connection between means and ends embodied by basic methodological rules is criteriological (Doppelt 1990). Resnik (1992) agrees with Laudan (1990) in asserting that Doppelt’s underdetermination argument is implausible, because at least some of the basic methodological rules, such as the rules of experimental design, are hypothetical imperatives. Against Laudan, Resnik argues that some methodological rules, such as the rule of consistency, are categorical because they are justified irrespective of their capacity to achieve certain ends. Finding the debate between Laudan and Doppelt inconclusive, Resnik proposes a holistic approach as a third alternative account of scientific methodology. Calling for the abandonment of the hypothetical/categorical distinction, the holistic understanding maintains that the rules that play a central role in a system be treated as categorical while those at its periphery be treated as hypothetical (Resnik 1992). The present paper is interested in Resnik’s focus on the system and takes it further to develop an account of scientific methodology at work in the scientific system. We will consider the methodological rules of the system as instrumental rules.
The paper will make use of the instrumental rule “if C obtains, then one ought to do Y” to explicate the methodological rules at work in the scientific system. In other words, the rules associated with the components of the system can be expressed in instrumental form. The system can be analyzed in terms of its components, or the corresponding rules, each with a certain instrumental role in furthering the system. The instrumental roles of the components set the terms for consideration of whether they can be accepted into the system. However, the final test of whether a rule can be accepted into a system is how well it coheres with the rest or contributes to the system’s coherence. The system has the ultimate bearing upon the epistemic significance accorded to any of its parts (Elgin and Cleves 2013, 245). We will show that the rules are not freely admitted but are constrained by a certain set of criteria against which the methodological rules and the mutual relations among them can be evaluated. We will examine these holistic criteria for scientific methodology and how they can collectively define a system. They rule out the addition of unwanted or irrelevant rules to the system, making it a realizable one. Any rule contributing to the system’s overall coherence will carry a certain epistemic weight, while irrelevant ones will not. We will also argue that, besides fulfilling these holistic criteria, there are certain conditions under which the methodological rules have to operate if they are to collectively achieve the system’s goal.
The paper attempts to reconstruct the methodological debates in the history of science to arrive at a holistic understanding of scientific methodology at work in scientific experiments or systems. As in the naturalistic accounts of Laudan and others, the resulting holistic account is naturalistic, because it is grounded in the empirical sciences and the history of science, and normative, because it retains the normative dimension of traditional epistemology. The focus is on establishing a holistic understanding of a system of methodological rules, its justification, and that of the individual methodological rules. The paper will make use of Laudan’s instrumental conception of methodological rules and its critiques to establish that one possible identity of the methodological system is its holism, expressed in terms of the coherence of the methodological rules. The thesis is that the instrumental value of the methodological rules of a system in promoting its cognitive end one way or another, and the instrumental support they lend to one another, determine their justification and the system’s coherence. We will lay down the holistic criteria for scientific methodology. They will be used to show how the holistic account could satisfactorily explain the success of the Compact Muon Solenoid (CMS) experiment in discovering the Higgs boson and the instrumental error behind the apparent faster-than-light neutrino anomaly of the Oscillation Project with Emulsion-tRacking Apparatus (OPERA) experiment.
2 Revisiting Laudan’s Instrumental Theory of Methodological Rules
Laudan understands methodological rules as hypothetical imperatives which relate cognitive ends with means that are efficacious for achieving those ends (1984). His typical examples of methodological rules are “propound only falsifiable theories,” “reject inconsistent theories,” and “avoid ad hoc modifications.” All methodological rules can be formulated as hypothetical imperatives: “If one’s cognitive goal is x, then one ought to do y” (Laudan 1987, 24). That is, a consideration of the plausibility of certain means (do Y) in terms of how efficacious they are in achieving certain cognitive goals (achieve X) results in the instrumental form of the methodological rules. Laudan asserts that “Popper’s familiar rule, ‘avoid ad hoc hypothesis,’ is more properly formulated as a rule: ‘if one wants to develop theories which are very risky, then one ought to avoid ad hoc hypotheses’” (1987, 24). The force of a methodological rule will depend upon the theories about X and Y and their connection. If they tell us that Y is the most effective means to achieve X, then acting on this particular methodological rule (to achieve X) is rational. The rules are employed to achieve various goals. For instance, the rule that advises propounding only falsifiable theories promotes testable scientific theories, and the rule that asks us to reject inconsistent theories promotes truth.
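To make the logical form of such rules explicit, the schema can be regimented as follows; this is a schematic rendering introduced here for illustration, not Laudan’s own notation:

\[
R_{X,Y}:\quad \mathrm{Goal}(X) \;\rightarrow\; \mathrm{Ought}\!\left(\mathrm{do}(Y)\right),
\]

where, on the instrumental-empirical reading, the rule $R_{X,Y}$ is warranted to the degree that the evidence indicates that doing $Y$ raises the chance of achieving $X$:

\[
\Pr\!\left(X \mid \mathrm{do}(Y)\right) \;>\; \Pr\!\left(X \mid \neg\,\mathrm{do}(Y)\right).
\]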
Laudan asserts that the justification of methodological rules is both instrumental and empirical. Consider the methodological rule: “If one wants to learn whether a drug or therapy is genuinely effective, prefer double-blind to single-blind experiments” (Laudan 1984, 38–39). The choice of this methodological rule is instrumentally justified because the double-blind experiment is a more efficacious means than the single-blind experiment for testing whether a drug or therapy is genuinely effective. The single-blind experiment suffers from a placebo effect: the unconscious transmission of the researchers’ expectations to the subjects biases the subjects’ responses. So, if one wants to know the efficacy of the drug or therapy under test, one should rule out the placebo effect. This is achieved by using the double-blind methodology, in which the researcher does not know whether a subject is receiving the drug or therapy under test and so has no relevant expectations to convey to the subject. The rule is also empirically justified, as the observed instrumental relationship between the double-blind experiment and its efficacy in testing the genuineness of the drug provides evidence that supports the methodological choice of the double-blind experiment. The same can be said of the rest of the methodological rules.
Laudan’s argument is based on the underdetermination thesis that methodological rules are radically underdetermined by the evidence. As he states, “So far as we know, there may be equally viable methods for achieving all the cognitive goals usually associated with science” (Laudan 1984, 36). This implies that multiple rules could be efficacious for a specific aim. It is a historical fact that scientists with the same ends differ on which rules to adopt to realize those ends. So, if the scientific methodology were fixed, the disagreement among scientists who share the same ends over which rules to adopt to achieve them would have to be judged irrational (Laudan 1984, 35). It is a historical fact that there has been disagreement over whether the rule of predesignation, which asserts that a hypothesis has to be tested only by the new predictions it entails, not by its ability post hoc to explain already known phenomena, is a legitimate rule to adopt. Although they share the same aim of seeking true, general, and explanatory theories, Whewell, Peirce, and Popper prefer the rule of predesignation, whereas Mill and Keynes do not (Laudan 1984, 36). If we assume that scientific methodology is fixed, then we will not be able to rationally account for their disagreement. We can draw from this that it is plausible to maintain that there is a diversity of instrumentally efficacious rules for achieving a specific aim.
Doppelt has questioned Laudan’s instrumental interpretation of methodological rules. He claims that the history of science itself reveals evidence that basic methodological standards, such as the rule of predesignation and the principle of consilience of induction, are categorical imperatives (Doppelt 1990, 9), because they are justified even if we do not have conclusive evidence that they are the most effective means for attaining cognitive ends. That is, he agrees with Laudan in assuming that many methodological rules are rightly interpreted as hypothetical imperatives, but he disagrees by arguing that basic methodological rules, such as the rule of induction and the rule of predesignation, do not rest on empirical knowledge and are to be treated as categorical imperatives (Doppelt 1990, 9). He takes the example of the rule of predesignation, which states that a hypothesis is tested only by the new predictions drawn from it, not by its ability post hoc to explain what was already known.
Doppelt finds Laudan’s underdetermination argument implausible, as there is no empirical evidence, for instance, that the rule of predesignation is even “an appropriate means” for reaching the cognitive ends associated with it. The rule of predesignation asserts a criteriological connection, not a contingent one, between the means and the associated cognitive ends. That is why we have no empirical evidence that it is or is not an effective means to the associated aims even after 150 years of inquiry and debate (Doppelt 1990, 13). Treating methodological rules as hypothetical imperatives in such cases fails to yield meta-methodological advice that would help scientists resolve disagreements over these rules. Laudan, of course, can maintain that in these cases the methodological choice is underdetermined by cognitive aims and empirical evidence. But this would imply that his naturalism allows us to use any one of the “equally effective” methodological rules, which would show that methodological choices cannot be made on empirical grounds. Doppelt asserts that treating all methodological rules as hypothetical imperatives misrepresents the way scientists treat basic methodological standards in actual scientific practice (1990, 14). He concludes that certain basic methodological standards in actual scientific practice are to be treated as categorical rather than hypothetical.
Laudan (1990) asserts that Doppelt’s argument is misleading, as it is based on cases of methodological disagreement in science, such as the rule of predesignation, that scientists have not been able to resolve. He claims that Doppelt’s conclusion from this, that methodology is not an empirical matter, is unfounded, for it overlooks many cases of agreement in methodology (Laudan 1990, 57). Resnik (1992) has also found Doppelt’s underdetermination argument unconvincing. He claims that the fact that some methodological rules are radically underdetermined by the evidence does not imply that they are not hypothetical imperatives (Resnik 1992, 449). This means that even if the underdetermination argument may be taken to defeat the justification of a particular methodological rule or rules, it does not follow that it undermines the instrumental view of methodology. Doppelt’s underdetermination argument fails because, although he claims to have defended a categorical view of scientific methodology, it appeals to empirical support for the justification of methodological rules (Resnik 1992, 499). Resnik thinks that Doppelt essentially assumes Laudan’s position that the justification of methodological rules is an empirical matter. Contrary to Laudan’s full-fledged instrumental view, Resnik claims that some methodological rules are categorical because they are justified independently of their ability to realize certain ends. As an illustrative example, Resnik uses the rule of consistency (RL), which states that “if the inconsistencies from inconsistent theories cannot be eliminated, then reject the inconsistent theories provided some viable alternative theories are available” (1992, 500). He claims that if RL is considered justified irrespective of one’s cognitive aims, then there are basic methodological standards that are not hypothetical imperatives. Resnik (1992) claims that an argument that relies on uncontroversial rules (i.e., the rule of consistency) is more convincing than Doppelt’s underdetermination argument, which uses controversial rules (i.e., the rule of predesignation) to defeat Laudan’s instrumental interpretation.
Resnik has argued that Laudan’s view works in cases where the ends are clearly defined but fails in cases where they are vague (1992, 502). For example, while RL may be used to obtain truth, it could also be used to achieve empirical adequacy, explanatory power, and predictive success. So, it is more plausible to maintain that the rule is justified regardless of the ends it serves. Resnik also presents a regress argument against Laudan. He states that if aims change and the process of change is governed by some rules, then the justification of these rules could not depend on their ability to promote certain aims, since this would imply a regress of aims and rules (Resnik 1992, 502). He maintains that there are rules that are best understood as categorical imperatives because they are good in and of themselves and are not justified by reference to certain ends.
Contrary to Doppelt’s assertion that all basic methodological rules are categorical imperatives, Resnik argues that at least some of them are hypothetical imperatives (1992, 503). He focuses on Doppelt’s distinction between basic methodological rules and other methodological rules (e.g., rules of experimental design), which treats the latter as hypothetical imperatives while the former are not. In contrast to Doppelt, Resnik does not distinguish rules of experimental design from basic methodological rules, for two reasons (1992, 503). First, experimentation cannot be sharply separated from other epistemically more “fundamental” processes such as confirmation and acceptance. Even if it is logically independent of them, logical independence does not entail that they are independent of one another in practice. Because, in practice, experimentation is more or less continuous with other scientific activities, we cannot reasonably maintain that a distinct set of rules governs it while other rules govern other activities. Second, some basic methodological standards seem to apply both to experimental design and to matters of epistemic choice. While the principle of simplicity is helpful in experimental design, as it decreases the chances of errors, it is also used as a basis for choosing a simpler theory over a complex one. Therefore, Doppelt’s sharp distinction between experimental and basic methodological rules is unconvincing.
Resnik concludes that the debate between Laudan and Doppelt is inconclusive (1992, 506), because while one can justify an epistemic reason as instrumental by citing a specific goal, another can justify it as categorical by claiming that a belief or action’s justification does not depend on its ability to promote a goal. So, he thinks the hypothetical/categorical distinction is dubious and should be abandoned. He proposes a holistic approach that focuses on the role of methodological rules in a system of beliefs, such that those that play major roles may be considered categorical while those with marginal roles may be considered hypothetical. No rule is considered foundational or absolute; all are revisable in principle.[4] Taking a cue from Resnik’s holistic turn in the treatment of methodological rules, we want to establish a holistic understanding of scientific methodology. The paper proposes that this holistic understanding can be expressed in the coherence relation among the methodological rules of a system. We differ from Resnik in taking an instrumental interpretation of the methodological rules. Thus, we take the instrumental value of a methodological rule in furthering the system’s goal as the key to the idea of coherence, and thus to the holism of a system of methodological rules. The thesis is that the scientific system can be analysed as a system of methodological rules that are mutually interdependent so as to constitute a coherent body. Catherine Elgin (Elgin and Cleves 2013) has developed a coherence-based holistic epistemological account, albeit in a slightly different context, which can help explicate the position defended in the present paper. While using her insights relevant to the current concern, the paper will go further by taking instrumentally based coherence to underwrite the holistic epistemological account.
3 Holistic Criteria for Scientific Methodology
The paper will attempt to establish a holistic account of a methodological system whose holism is expressed in terms of coherence. The holistic approach constitutes a transition from focusing on individual commitments to taking the organized system of commitments as the primary unit of analysis. This means that individual methodological commitments are evaluated in light of how well they cohere with the rest of the system. In other words, the epistemic status of a given methodological rule or commitment is primarily a property of a fairly comprehensive system of mutually supportive commitments (Elgin and Cleves 2013, 245). The term coherence implies that for a methodological rule to be accepted, it must fit into an organized system comprising mutually supportive commitments. The term instrumental implies that the acceptability of a norm into the system is partially set by its instrumental role, that is, by how efficacious it is in furthering the system’s cognitive goal. In a similar vein to Elgin (Elgin and Cleves 2013, 245), the acceptability of a particular methodological rule stems from its (instrumental) role in a system.
In line with Elgin (Elgin and Cleves 2013, 245), the issue at hand is to explain how systematic interconnections among the methodological rules of a system give rise to the justification of the system itself and of the methodological rules themselves. That is, the task is to explain how the fact that a specific methodological rule is instrumental in the system provides a reason for the justification of its acceptance into the system. We assume that being part of a coherent system is insufficient to justify accepting a methodological rule. The position taken in this paper is that coherence underwritten by the instrumental value of a specific methodological rule is what justifies its acceptability in a system. We want to show how the mutual accord among a system’s methodological rules indicates their justification only if they are instrumental in promoting its goal, not only individually but also collectively. The set of criteria that may collectively define a system are spelled out below:
Instrumental criterion (I): A methodological rule X is justified if it has instrumental value in promoting the system’s cognitive goal.
Coherence criterion for a methodological rule (Cc): A methodological rule X is said to be justified if it coheres with the rest of the coherent system.
Coherence criterion for two methodological rules (Cpc): If both X and Y advance a given goal, and if X and Y are instrumentally connected, X and Y are coherent (X and Y are instrumentally connected if one advances the other or if together they advance a common goal).
Acceptability criterion (A): The acceptability of a methodological rule in a system of normative commitments depends on how well it coheres with them and on its instrumental value in advancing the system’s goal.
Coherence criterion for the system (Cs): A system S is said to be coherent if its methodological rules are mutually consistent and supportive in advancing its goal.
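To make these criteria concrete, the following toy model renders them as executable checks. This is an illustrative sketch only: the constructs Rule, promotes_goal, and supports are hypothetical simplifications introduced here, not a formalism found in the literature, and the model deliberately abstracts away the empirical work of establishing instrumental value.

```python
# A toy rendering of the holistic criteria (I, Cc, Cpc, A, Cs).
# Illustrative sketch only; not a formalism proposed in the literature.
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    name: str
    promotes_goal: bool                 # criterion I: instrumental toward the system's goal
    supports: frozenset = frozenset()   # names of rules this rule instrumentally advances

def instrumentally_connected(x: Rule, y: Rule) -> bool:
    # Criterion Cpc: X and Y cohere if one advances the other,
    # or if together they advance the common goal.
    return (y.name in x.supports
            or x.name in y.supports
            or (x.promotes_goal and y.promotes_goal))

def acceptable(rule: Rule, system: list) -> bool:
    # Criteria I, Cc, and A: a rule is acceptable only if it promotes
    # the goal and coheres with every other rule in the system.
    return rule.promotes_goal and all(
        instrumentally_connected(rule, other)
        for other in system if other is not rule)

def coherent(system: list) -> bool:
    # Criterion Cs: the system is coherent if all its rules are
    # mutually consistent and supportive in advancing its goal.
    return all(acceptable(r, system) for r in system)

# CMS-style toy instance: magnet feeds tracker, tracker feeds ECAL, etc.
RM = Rule("RM", True, frozenset({"RT"}))
RT = Rule("RT", True, frozenset({"RE"}))
RE = Rule("RE", True, frozenset({"RH"}))
RH = Rule("RH", True)
print(coherent([RM, RT, RE, RH]))                # True: the rules mesh

# An OPERA-style failure: a rule whose instrumental role breaks down.
faulty_clock = Rule("R_clock", promotes_goal=False)
print(coherent([RM, RT, RE, RH, faulty_clock]))  # False: one error upsets the whole
```

On this toy rendering, the CMS-style ensemble passes the coherence check, while adding a single rule that fails its instrumental role renders the whole system incoherent, anticipating the OPERA case discussed below.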
These holistic criteria constitute the basic elements of the holistic conception of scientific normativity. They may not fully specify a system, but they can satisfactorily explain its constitution and its success or failure in performance. To a satisfactory extent, they can collectively define a goal-oriented system as a set of methodological rules. In other words, they collectively set the terms for what a methodological system should be, such that a given methodological rule is justified only in light of how it satisfies them. A justified methodological rule contributes to the system’s overall coherence. Two points are in view. The first is to use the holistic criteria to explain the success of the CMS collaboration in detecting the Higgs boson. The second is to see how the instrumental error behind the apparent faster-than-light anomaly of the OPERA experiment can be interpreted as a failure to meet the holistic criteria. As the paper proceeds, it will become clear why the whole of the system matters. It will also inform us of the epistemic standings and dispensability of the methodological considerations and how the holistic account supports a plurality of systems. In what follows, we present a brief overview of the methodological descriptions of the components of these experiments to see how they fare with the holistic criteria.
3.1 The Successful Discovery of the Higgs Boson: The CMS Experiment
The CMS experiment and its counterpart, the A Toroidal LHC Apparatus (ATLAS) experiment, at the European Organization for Nuclear Research (CERN) are designed with the main goals of finding evidential support for the Higgs boson and supersymmetry, studying aspects of heavy-ion collisions, and searching for particles that could make up dark matter. With these shared goals in mind, the CMS and ATLAS experiments, although composed of sub-detector systems that employ different technologies and different methods of calibration and reconstruction, are designed to complement each other by providing corroboration of findings. This is evident in the roles they played in discovering the Higgs boson, and hence in confirming the existence of the Higgs field (Gianotti and Virdee 2015). Since Higgs bosons are extremely difficult to produce and detect, particle collisions at sufficiently high energies, as in the Large Hadron Collider (LHC), are necessary to produce them. The constituent quarks and gluons interact when two protons collide within the LHC. These high-energy interactions can produce a Higgs boson through well-predicted quantum effects, and it would immediately transform, or decay, into lighter particles that ATLAS and CMS could observe. We will provide a rough sketch of the CMS experiment by highlighting the aims of its subsystems and the corresponding rules.
3.1.1 An Overview of the CMS Experiment
For the detection of the Higgs boson, the CMS experiment requires a large magnet system, the best possible electromagnetic calorimeter, a high-quality tracking system, a hadron calorimeter with sufficient energy resolution, and a steel flux-return yoke outside the solenoid that hosts gas ionization detectors for the identification and reconstruction of muons (CMS Collaboration 2008). The superconducting solenoid magnet is the central feature of the CMS experiment (CMS Collaboration 2008, 6). The charged particles emerging from the LHC collisions must be identified and separated into positive and negative. The CMS magnet’s method (call it MM) achieves this by bending the trajectories of these particles in opposite directions according to the charges they carry. It thereby facilitates the CMS tracker’s tracking of the paths of the particles through its magnetic field.
The CMS tracker consists of many concentric layers of silicon sensors (CMS Collaboration 2008, 29). It also allows the reconstruction of vertices, both the primary proton-proton interaction points and secondary vertices due to particle decays. This requires a detector with high channel density and high spatial precision so that close-by tracks can be distinguished. When a charged particle traverses the silicon sensors, it creates an electrical signal that can be detected. Dividing the sensors into strips or pixels allows an estimation of the incidence position of the charged particles (CMS Collaboration 2008, 33). By combining the information from many layers, a “track” is reconstructed. The tracker records particle paths by measuring their positions at several vital points so accurately that tracks can be reliably reconstructed from just a few of them. It can reconstruct the paths of high-energy muons, electrons, and hadrons, and can see tracks coming from the decay of very short-lived particles such as beauty or “b” quarks. Once the track path is reconstructed, measuring the radius of curvature of the track gives an estimation of the particle’s momentum. The momentum of particles is crucial to building up a picture of events at the heart of the collision: the more curved the path, the less momentum the particle had.
The Electromagnetic Calorimeter (ECAL) then measures the energies of the incident electrons, positrons, and photons by completely stopping them (CMS Collaboration 2008, 90). ECAL fits this methodological description since it can deposit almost all the energy of electrons and photons in its crystal volume. The coherence relation stretches further still, because the energies of the hadrons that pass through the ECAL undetected must also be measured. The Hadronic Calorimeter (HCAL) has a method (MH) to achieve this by completely stopping them (CMS Collaboration 2008, 122). It also aids in the identification of electrons, photons, and muons in conjunction with the other sub-detectors. The CMS has been specifically optimized for the detection and measurement of muons. This is achieved by using drift tubes (DTs) located outside the solenoid in the barrel region and cathode strip chambers (CSCs) in the forward region (CMS Collaboration 2008, 165). The CMS muon system is also equipped with resistive plate chambers (RPCs) dedicated to triggering purposes (CMS Collaboration 2008, 216). We expect muons to be produced in the decay of several potential new particles; for instance, one of the most apparent signatures of the Higgs boson is its decay into four muons. Because muons can penetrate several meters of iron without interacting, unlike most particles, they are not stopped by any of CMS’s calorimeters. Therefore, chambers to detect muons are placed at the very edge of the experiment, where they are the only particles likely to register a signal.
In addition, the CMS experiment has a trigger and data acquisition system that runs with the various sub-detectors (CMS Collaboration 2008). To produce a rare particle such as a Higgs boson, a tremendous number of collisions is required. However, most collision events in the detector do not produce interesting effects, and the amount of raw data from each crossing is enormous, on the order of megabytes; it has to be reduced to a manageable level. The full trigger system accomplishes this by using a series of trigger stages. The data from each crossing are held in buffers within the detector while a small amount of key data is used to identify features of interest, such as muons. The data that have passed the triggering stages and are stored on tape are duplicated via the Grid to additional sites around the world, so that physicists can use the Grid to access the data and perform analyses on them.
3.1.2 Accounting for the CMS Experiment’s Successful Discovery of the Higgs Boson
The CMS components have complementary relations that can be expressed in the terms set by the holistic criteria. The components are instrumentally related by carrying out their assigned functions, and they collectively contribute to detecting the Higgs boson. Without any one of them, the CMS experiment would not achieve its goal. Information about the energies of the various particles produced in each collision, which each part of the CMS experiment measures separately, is crucial to understanding what occurred at the collision point. The CMS magnet, for example, satisfies the instrumental criterion (I), because by bending the particles in its field, it plays its part in detecting the Higgs boson. It also satisfies the (Cc) criterion, because it coheres with the tracking method (MT) by facilitating the tracking of the paths of the particles through its magnetic field. Both cohere with the Electromagnetic Calorimeter (ECAL), which measures the incident particles’ energies.
The instrumental rule “if C obtains, then one ought to do Y” can be used to formulate the rules governing the methodological choices associated with the various components of the CMS collaboration. For example, the rule (RM): “if one wants to separate the charged particles emerging from high-energy collisions in the LHC, one ought to use the CMS magnet to bend them in opposite directions per the charges they carry” can be taken to govern the choice of the CMS magnet, which fulfills the methodological descriptions. Again, (RT): “if one wants to identify and measure the momentum of the charged particles emerging from the LHC, one ought to use the CMS tracker to record the paths they take” can be the rule governing the choice of the CMS tracker. Similarly, (RE): “if one wants to measure the energies of the incident electrons/positrons and photons emerging from the LHC, one ought to use ECAL to stop them completely” and (RH): “if one wants to detect and measure the energies of the hadrons emerging from the LHC, one ought to use HCAL to stop them completely” can be the rules governing the choices of ECAL and HCAL respectively. The corresponding rules for the other domains can be formulated in the same way. Now, RM, RT, RE, RH, etc., are said to jointly promote a goal G, i.e., detecting the Higgs boson, if, and only if:
(1) each of them promotes G in one way or another;
(2) the effectiveness of each rule in promoting G in context C, the CMS experimental set-up for detecting the Higgs boson, primarily depends on and increases with the rest of the rules;
(3) if all rules in C promote G, then there is no rule in C such that, were it replaced by a rule that does not cohere with the rest of the rules, the replacing rule would have been more effective in promoting G;
(4) if all rules in C promote G, then no rule in C would have been more effective in promoting G had some combination of the other rules in C impeded G.
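One way to regiment these four conditions, on the simplifying assumption (introduced here for illustration only) that each rule $R_i$ in the context $C=\{R_1,\dots,R_n\}$ has a context-relative effectiveness $\mathrm{eff}(R_i;\,C)\geq 0$ toward $G$, is:

\begin{align*}
&(1)\quad \forall i:\ \mathrm{eff}(R_i;\,C) > 0;\\
&(2)\quad \forall i:\ \mathrm{eff}(R_i;\,C)\ \text{increases with}\ \mathrm{eff}(R_j;\,C)\ \text{for each}\ j\neq i;\\
&(3)\quad \forall i\ \neg\exists R':\ R'\ \text{does not cohere with}\ C\setminus\{R_i\}\ \text{and}\ \mathrm{eff}\big(R';\,(C\setminus\{R_i\})\cup\{R'\}\big) > \mathrm{eff}(R_i;\,C);\\
&(4)\quad \forall i:\ \mathrm{eff}(R_i;\,C^{-}) \leq \mathrm{eff}(R_i;\,C),\ \text{where}\ C^{-}\ \text{is any variant of}\ C\ \text{in which some combination of the other rules impedes}\ G.
\end{align*}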
For a system such as the CMS collaboration, its rules are said to promote its goal by means of a certain combination of actions that are similar or different in relevant ways. In this sense, the rules RM, RT, RE, RH, etc., with their distinctive instrumental roles, are said to work collectively in a certain way to bring about the detection of the Higgs boson. The context C is one in which the optimal methodological action issuing from any one rule depends on the actions issuing from the other rules. It is a context in which multiple methods promote the system’s objective. Thus, the CMS collaboration can meaningfully be viewed as a systematic interconnection of diverse commitments with instrumental accord among them.
Considered individually, none of the methodological rules will account for much (Elgin and Cleves 2013, 245), because the chance of achieving the collaboration’s goal would be too low; this, in turn, would make the goal too utopian to pursue. But collectively, they give us reason to believe in pursuing the goal. One or some of the considerations among RM, RT, etc., taken alone, would not matter much. For example, without ECAL (RE), HCAL would receive such an avalanche of particles that the CMS would fail in its purpose, and the same holds across the rest of the collaboration. So, the mutual accord among the methodological rules, in terms of the extent to which they individually and collectively contribute to advancing the collaboration’s goal, enhances the epistemic standing of each of them. This enhanced standing of each methodological rule, when considered jointly with the rest of the collaboration, gives us more reason to believe in the justification of each of them in light of the rest than we have to believe in the justification of each of them separately. This explains how the instrumental relationship among the rules entails their justification. The fact that the description of each methodological rule fits the collaboration’s instrumental requirements explains the agreement among them. That the justification of a given methodological rule makes sense only in the presence of the rest of the ensemble of commitments shows that they stand or fall together. The above example indicates that there is no point in talking about the epistemic justification of RE, or of any other rule, considered separately from the rest of the ensemble. They have to be woven together, in the words of Elgin (Elgin and Cleves 2013, 246), but instrumentally, with the rest of the ensemble before we can begin meaningfully talking about it.
The methodological considerations do not individually acquire their credibility.[5] Only when they are woven together do they gain credibility. This suggests that the epistemic standing of the several parts derives from their mutual supportiveness, because it is only in light of the rest that a given methodological rule is acceptable and gains credibility (criterion Cs). A given methodological rule will be acceptable only if the rest of the ensemble of supporting methodological rules is. The epistemic status of a given methodological rule is inseparable from the rest of the collaboration. Even if the supporting methodological rules are separately acquired, they stand or fall together. Such an explanation indicates that the epistemic status of a given methodological rule is primarily a property of a fairly comprehensive collaboration of mutually supportive commitments (Elgin and Cleves 2013, 245), where the best explanation of the coherence is expressed in instrumental terms, which require that each of the methodological rules must in one way or another promote the system’s objective (criterion Cs). The epistemic justification of individual methodological rules derives from their being part of the coherent collaboration. The collaboration cannot be so open as to entertain every methodological rule deemed relevant to its goal; otherwise it would become too idealistic. Entertaining every consideration, especially the irrelevant ones, would be contravening, as it would impede progress. So, there is every reason to exclude them if the system is to be realistically workable.
3.2 The Apparent Faster-Than-Light Neutrino Velocity Anomaly: OPERA Experiment
The OPERA experiment has been used to measure the neutrino velocity precisely, although its original goal was to observe how neutrinos switch between different identities (Adam, T., OPERA Collaboration 2012, 1). The principle of the OPERA neutrino velocity measurement is to compare the travel time of neutrinos with the travel time of light: the distance between CERN, where the neutrinos are produced, and the OPERA detector, where they are detected, is divided by the speed of light in a vacuum to obtain the light travel time, and this value is compared with the measured neutrino travel time (Adam, T., OPERA Collaboration 2012). We will sketch a rough outline of the OPERA detector by highlighting the aims of its components and the corresponding rules.
3.2.1 An Overview of the OPERA Experiment
The OPERA detector consists of two identical supermodules (SM). Each has a target section composed of walls filled with lead/emulsion bricks alternated with walls of scintillator strips that constitute the Target Tracker (TT) (Agafonova, N., OPERA Collaboration 2011, 5). The main objective of the Target Tracker is to locate the lead/emulsion brick in which a neutrino interaction has occurred. A muon spectrometer at the end of each SM is used to identify the penetrating muons and measure their momentum and the sign of their charge. Each spectrometer consists of a large iron magnet instrumented with plastic Resistive Plate Chambers (RPC) to provide coarse tracking, a range measurement of the stopping particles, and a calorimetric analysis of the hadrons that escape the target along the incoming neutrino direction. Its drift tubes (Precision Trackers, PT) measure the deflection of the charged particles and track them inside the magnetized iron.
By means of additional resistive plate chambers (XPC) placed after each target section, the left/right ambiguities in the track pattern recognition inside the PT are removed (Agafonova, N., OPERA Collaboration 2011, 6). Together with the RPC, the XPC provides an external trigger to the PT. Finally, a VETO system with glass RPCs is installed in front of the first SM to exclude fake events generated by neutrino interactions with the rock and concrete around the OPERA detector and upstream of the target (Di Giovanni, Candela, Di Marco, D’Incecco, Gustovino, Lindozzi, Orlandi, and Tatanani 2006, 1). The bricks in which neutrino interactions have occurred are identified in real time by the event reconstruction in the trackers and the spectrometers. The candidate bricks are extracted from the walls on a regular basis. After X-ray marking and exposure to cosmic rays for alignment, the emulsion films are developed and sent to the emulsion scanning laboratories to accurately scan the event (Acquafredda, R., OPERA Collaboration 2009).[6]
The OPERA experiment can be visualized as a system of rules and not as a case of a single rule. Each of its several components has a specific goal, with methodological rules that govern its operation. The instrumental rule “if C obtains, then one ought to do Y” can be used to formulate the methodological rules governing the choices associated with the various components of the OPERA collaboration. For example, the rule (RTT): “if one wants to locate the lead/emulsion brick where a neutrino interaction has occurred, one ought to use the Target Tracker” can be taken to govern the choice of the Target Tracker, which fulfills the methodological descriptions. Again, (RMS): “if one wants to identify the penetrating muons and measure their momentum and the sign of their charge, one ought to use the muon spectrometer at the end of each SM” can be the rule governing the choice of the muon spectrometer. Similarly, (RPT): “if one wants to measure the deflection of the charged particles and track them inside the magnetized iron, one ought to use the Precision Tracker” can be the rule governing the choice of the Precision Tracker. The corresponding rules for the other domains can be formulated in the same way. Now, RTT, RMS, RPT, etc., are said to jointly promote the goal G in question, for instance, that of measuring the neutrino velocity. It is in the light of these rules, which work collectively, that talk about the OPERA experiment makes sense. So, the OPERA experiment is not truly about an individual rule but about a system of rules which stand or fall together and are interdependent in one way or another in achieving its goal. The measurement of the neutrino velocity is briefly presented below.
The OPERA scientists observe the neutrino beam traveling continuously from CERN to LNGS, the CERN Neutrinos to Gran Sasso (CNGS) beam (Acquafredda, R., OPERA Collaboration 2009). CERN generates neutrinos by accelerating protons to high speeds and slamming them into carbon targets. The collisions produce intermediate positively charged particles, kaons and pions, which decay into muons and muon-neutrinos in a 1000 m long vacuum pipe, after which the muons are filtered out while the neutrinos continue to the laboratory at Gran Sasso. Most of the resulting beam consists of muon-neutrinos, which are detected by the OPERA detector; the detector not only locates neutrino interactions in its target but also measures the arrival time of the neutrinos (Acquafredda, R., OPERA Collaboration 2009).
The neutrino velocity has to be measured. For this, the time of flight of the neutrinos (TOFn) is measured and compared with the time of flight assuming the speed of light (TOFc), which yields the deviation dt = TOFc − TOFn (Adam, T., OPERA Collaboration 2012). TOFn cannot be measured at the level of single interactions because it is not clear which proton produced a given neutrino. Instead, the time distributions of the protons at CERN are measured for each sample of neutrino interactions observed in the detector. This gives the probability density function (PDF) of the time of emission of the neutrinos at CERN. These time distributions can then be compared to the time distributions of the interactions detected at OPERA. GPS receivers and cesium (Cs) atomic clocks at both ends of the CNGS beam perform the timing measurements, which are needed for accurate relative time tagging. Accurate knowledge of the neutrino baseline between the CERN and Gran Sasso facilities is also required in order to determine dt. Other measurements, such as neutrino event timing, are also made (Adam, T., OPERA Collaboration 2012, 8).
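A back-of-the-envelope computation conveys the scales involved. The figures below are rounded assumptions for illustration (a baseline of roughly 730 km and the widely reported early arrival of roughly 60 ns), not the collaboration’s precise values:

```python
# Illustrative order-of-magnitude check of the OPERA timing logic.
# The baseline and delay figures are rounded assumptions, not the
# collaboration's precise values.
C = 299_792_458.0        # speed of light in vacuum, m/s
BASELINE_M = 730_000.0   # approximate CERN-to-Gran Sasso baseline, ~730 km

tof_c = BASELINE_M / C   # light travel time over the baseline, seconds
print(f"TOFc ~ {tof_c * 1e3:.3f} ms")     # roughly 2.435 ms

dt_s = 60e-9             # the reported ~60 ns early arrival
print(f"(v - c)/c ~ {dt_s / tof_c:.1e}")  # roughly 2.5e-05
```

The ratio of a ~60 ns shift to a ~2.4 ms flight time is on the order of 10⁻⁵, which is the order of magnitude of the anomaly that OPERA reported.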
After the data has been collected, contaminated events are filtered out and excluded from the final analysis. The selected data is sorted into different categories using a classification algorithm for further analysis (Bertolin and Tran 2009). The data is then analysed using statistical methods, such as a maximum likelihood procedure for the proton extraction at CERN, for the different components of the experiment. Data analysis is also done to exclude or minimize possible systematic effects. Finally, on 22 September 2011, the OPERA collaboration communicated its results concerning the neutrino velocity measurement, which appeared to have exceeded the speed of light (c): the neutrinos seemed to arrive roughly 60 ns earlier than light would over the same baseline, a deviation of about six standard deviations from c (Adam, T., OPERA Collaboration 2012). This would imply a serious anomaly for the theory of relativity in particular and for physics in general.
It took until July 2012 for the OPERA collaboration to figure out that the anomaly arose from two internal errors in the experimental set-up (Strassler 2012). The first was a time shift due to an improper connection of an optical cable: a link from a GPS receiver to the OPERA master clock was loose, which increased the delay through the fiber. The master clock therefore issued its pulse with a delay, making the neutrinos appear to have traveled in less time than they actually had, and hence to be fast. Second, a clock oscillator was ticking too fast: a clock on an electronic board ticked faster than its expected 10 MHz frequency, which lengthened the reported flight time of the neutrinos and hence reduced the seeming faster-than-light effect. The net effect of the two errors explains the faster-than-light anomaly. After accounting for these two sources of error, the OPERA collaboration found dt = TOFc − TOFn = (6.5 ± 7.4 (stat.) +8.3/−8.0 (sys.)) ns and (v − c)/c = (2.7 ± 3.1 (stat.) +3.4/−3.3 (sys.)) × 10⁻⁶, a neutrino speed consistent with that of light (Adam, T., OPERA Collaboration 2012, 29). They concluded, at last, that the original measurement could be written off as owing to a faulty element of the experiment’s fiber-optic timing system.
3.2.2 Accounting for the Apparent Faster-than-Light Velocity Anomaly
The faster-than-light neutrino anomaly gives us an opportunity to examine the scientific method in action. In addition, it conflicts with previous findings. Kalbfleisch, Baggett, Fowler, and Alspector (1979) measured the maximum deviation of the neutrino velocity from c as (v − c)/c < 4 × 10⁻⁵. The neutrinos from the SN1987A supernova yielded a maximum deviation of |v − c|/c < 2 × 10⁻⁹ (Longo 1987). The MINOS collaboration reported in 2007 a measurement of (v − c)/c = (5.1 ± 2.9) × 10⁻⁵ (Adamson, P., MINOS Collaboration 2007). That is, all these results are in agreement with the theory of relativity. As noted above, the OPERA collaboration reported a statistically significant deviation of the neutrino velocity from c. Theoretical physicists did not believe the result of the OPERA experiment, because speeds greater than that of light in a vacuum are generally thought to violate special relativity, considered a cornerstone of the modern understanding of physics for over a century. In terms of Quinean holism, the theory of relativity is part of the core of theoretical physicists’ web of belief, as the laws of quantum physics and the laws of logic are (Quine 1970, 100).
One can see that had the OPERA master clock operated according to the appropriate methodological rule, the apparent faster-than-light anomaly could have been avoided. The methodological rule concerning how it should be set up should have been correctly followed; its violation introduced an instrumental error that contravenes the experiment. In the manner of Elgin (Elgin and Cleves 2013, 247), but here in a methodological context, a contravening consideration or error (E) can be roughly stated as follows: a methodological action is said to contravene a system if it impedes the promotion of its goal G. That is, we should avoid E if we are to achieve G.
A contravening methodological action should be avoided if the epistemic credibility of the system is to remain intact. So, seemingly worthy methodological actions are not immediately admitted into a system; an empirical test is conducted on them. The test bears on their instrumental role, on whether they really function so as to further the objective of the system of which they are supposed to be a part (criterion I). An empirical test should also be done on whether an action meets the coherence criterion (Cc), that is, on its contribution to coherence, or its relation to the already credible parts of the system. If the empirical test shows that it does not cohere with the other credible parts of the system, then it has to be rejected. The optic-fiber system failed on these counts due to the instrumental error that resulted in the apparent faster-than-light neutrino velocity anomaly. It violated a methodological prescription concerning the master clock system, however mundane: “if one wants to measure the timing correctly, then one ought to screw the master clock connection in tightly and properly.” This introduced an instrumental malfunction. The OPERA collaboration was quick to publish the results partly due to the absence of a competing collaboration, unlike the CMS, which has ATLAS as its competitor. For the same reason, they could have continued the examination a bit longer. In other words, they did not conduct enough empirical examination along the lines required by the holistic criteria to ensure that the experimental set-up worked appropriately.
The OPERA collaboration re-examined the experimental set-up to see whether it operated correctly (Adam, T., OPERA Collaboration 2012). They identified the methodological lapse, the fiber cable not being fully screwed in during data gathering, that resulted in the erroneous measurements. Every other part of the OPERA experiment worked well, and its methods were applied correctly. However, the master clock did not work properly owing to a violation of a methodological prescription. In failing its instrumental role, the master clock failed to cohere with the rest of the properly functioning parts of the system. Even minor methodological negligence can prove so contravening that it fails the system and calls its credibility into question. That is why the utmost methodological attention should be given when setting up experiments, especially sophisticated ones such as the OPERA experiment. A violation of a methodological rule, and hence of the holistic criteria, is unacceptable. Therefore, the apparent faster-than-light neutrino velocity anomaly can be explained by construing the instrumental error that gave rise to it as a violation of the holistic criteria, specifically the instrumental criterion (I), resulting from the failure of the master clock to provide the correct timing of the neutrinos, information that is instrumentally vital for precisely measuring the neutrino velocity.
We can observe that the individual rules governing the subsystems of experiments such as the CMS and OPERA experiments can be taken to assume the instrumental form “if C obtains, then one ought to do Y.” It is in the context of the experiment, in which they play instrumental roles, that the individual rules are said to be interdependent. The interdependence relation among the rules is such that without some of them, especially the indispensable ones such as those of the CMS magnet and the Electromagnetic Calorimeter of the CMS experiment, the system may not come about. Because of this, we can assume that the indispensable ones constitute the system’s core and carry heavier epistemic weight than the rest. Without one such rule, the system’s goal may not be achieved. That is why the rules are said to jointly promote the system when they operate collectively under the conditions (1)–(4) laid out above. This is evident in the violation of the rule concerning the clock system in the OPERA experiment. The resulting erroneous measurements in the clock system eventually broke the supposed coherence of the OPERA system. Upsetting the system produced an erroneous overall outcome, viz., the apparent greater-than-light velocity of the neutrinos. The OPERA experiment failed to measure the neutrino velocity accurately just because one of its rules got violated. That is, when one of the rules gets violated or ignored, no matter how well the rest of the system works, it will hinder the system from achieving its intended goal. This shows how the interdependence among the rules is crucial to the system’s coherence and to achieving its goal.
We can also observe that since an error, even in one part of the system, can upset the whole system, violations of rules must be avoided. However, an error does not pose a severe threat of introducing an irreparable breakdown of the system’s coherence, for an empirical check is in place. An experiment has to be carefully examined to identify any malfunctioning components in order to eliminate any possible sources of error (Chalmers 1999, 200). That is, if an error arises, the task is to identify the particular part(s) of the system whose rule has been violated and rectify them accordingly. For the same reason, it is not a problem for the holistic view if two or more errors happen to mask each other. Since the rules are admitted into the system on the basis of their instrumental value, errors may arise from violating one or more rules; the causes, viz., the malfunctioning components, can be identified, segregated, and amended accordingly. The coherence of the system ultimately gets restored. A further worry may arise as to whether coherence is enough, or whether ultimately a focus on the individual rules is required. Although the focus is primarily on the system as a whole, the focus on the individual rule is still retained, because the instrumental value of the individual rule serves as the basis for considering its acceptability into a system. However, its eventual acceptance rests upon how well it coheres with the other parts of the system. It is a matter of epistemic priority that the system comes first, against which the individual rules get considered. The individual rules are evaluated in light of how they can contribute to the system’s coherence.
4 An Empirical Consideration, Epistemic Standings, and Pluralism of Systems
In a naturalistic scheme of things, empirical judgment enjoys a certain privilege in that any claim, to be epistemically plausible, must withstand empirical evaluation. Even the laws of quantum physics and logic are best considered empirical hypotheses in that they are open to refutation or revision in light of empirical findings (Quine 1970, 100). So it is not surprising that the initial starting point from which to assess a methodological consideration is an empirical matter. Establishing the instrumental significance of a methodological rule and its coherence with the rest of the system is an empirical matter. Prior to empirical consideration, there are no intrinsically privileged kinds of methodological rules. No methodological consideration is accepted uncritically prima facie; rather, an empirical check is conducted on the kind and extent of the instrumental roles it plays in the system. The Hadron Calorimeter, HCAL (and its method, M_H), is acceptable to the CMS collaboration because there is empirical evidence of its ability to stop hadrons and measure their energies. Again, the fact that an empirical finding tells us that a methodological rule promotes the system's goal gives it an initial measure of plausibility. Even that small measure of empirical plausibility is epistemically significant. For instance, when the CMS scientists pondered over which tracking method and device to use, they would have chosen the particular CMS tracking method they used even if it had only a small measure of initial plausibility. Only because of initial empirical plausibility can we imagine beginning to construct the system in the first place. Without initial empirical plausibility, however small the measure may be, we cannot begin to entertain the idea of incorporating a methodological rule into a coherent system.
A theory or system can be considered an evolving one (Elgin and Van Cleve 2013, 253). More specifically, it can be visualized as an empirically incremental process once the cognitive goal is set. Over time, its epistemic status can be amended depending on how its parts cohere with one another as new elements are introduced. So any plausible system of commitments is empirically grounded. With the discovery of new instrumentally vital facts, new methodological rules may be introduced and old ones discarded or modified, which makes the system more comprehensive. Since any possible addition has to meet the holistic criteria, we are prevented from adding to the system any methodological rule that is irrelevant, in one way or another, to the overall purpose of the system or its parts. This is especially the case if the new fact is indispensable to achieving the goal. Regardless of how effective a methodological rule may look, unless it can be conjoined with a coherent system, it cannot be a part of it. Conversely, no matter how comprehensive and integrated an account may look, following Quine (1970), an empirical judgment has the capacity to discredit it. Nevertheless, it does not follow that empirical considerations are themselves utterly immune to revision or rejection. Nor does it imply that the epistemic privilege granted to empirical considerations is independent of considerations of coherence. It only means that the credibility of any methodological consideration must be established through empirical means.
Again, even though a given methodological rule may be unacceptable as it stands, its instrumental value may become worthwhile with suitable modifications, and the modifications themselves may become acceptable (Elgin and Van Cleve 2013, 247). Methodological rules may thus be modified in light of further empirical findings. Sometimes the indispensability of the instrumental value of a given methodological rule is so obvious that coherence is achieved directly. The instrumental value of other considerations can be so vague that coherence is achieved only by modifying them. Still other considerations may be so disruptive in character that coherence is achieved by discarding them immediately. Again, mere coherence with an organized body of commitments does not justify a given methodological rule. Coherence can yield epistemic acceptability only when the best explanation of the coherence of a constellation of rules is their mutual instrumental connection. For example, the instrumental values of the rules R_M, R_T, and R_E undergird the mutual connection that explains why they form a coherent body, namely the CMS collaboration. Since the best explanation of the resulting meshing is expressed in instrumental terms, the acceptance into the CMS collaboration of the rules with the requisite instrumental values is justified. We cannot simply insist that all the rules cohere, since a single disruptive consideration, however remote from the rest of the constellation (Elgin and Van Cleve 2013, 248) or from the collaboration's goal, could disrupt the whole coherent system; the minor methodological negligence in setting up the fiber-optic timing system in the OPERA experiment is a case in point.
The CMS experiment also reveals that the holistic approach has the resources to recognize that different methodological rules can have different epistemic weights, some more credible than others. As the CMS experiment's central feature, the methodological consideration of the CMS magnet (M_m) will have a higher epistemic weight than the rest; some rules will have higher epistemic weight than others in a system (Elgin and Van Cleve 2013, 252). In extreme cases, there may be utterly irrelevant methodological rules, rules with no instrumental value at all. They lack justification or suitable connections (Elgin and Van Cleve 2013, 252), namely instrumental connections, to the other parts of a coherent body. Hence, there is no reason to credit them with any epistemic significance in light of the system's epistemic cause, for they are either irrelevant to the system or would introduce inconsistencies if incorporated. They are epistemically indefensible owing to their lack of instrumental ties to the rest of the system. Rejecting a methodological rule that plays a dispensable, marginal role is justified. If a rule's role is so central that its contribution is indispensable to the system, then its acceptance is justified; being indispensable, it can immediately fit into the system. Again, there can be subsidiary methodological considerations. The relevance of these subsidiary considerations to the system can come in one way or another, for instance, by being conducive to developing the indispensable parts. Accepting these subsidiary considerations into the system is justified to the extent that they subserve the system indirectly in advancing its overarching objective. So such considerations may still carry some epistemic warrant.
A methodological rule may have epistemic weight in one system but not in another. It may have instrumental value in one system but not in all systems, and so it may not stand in a coherence relation to all systems. For its acceptability, a rule must be reinforced by the other parts of the system. On its own, the instrumental value of the tracking part (hence of its method, M_T) makes no coherent contribution; only when its instrumental value is considered in light of the rest of the system does the tracking part, or any other part for that matter, make a coherent contribution. Since the goals of different systems generally differ, the exact way the instrumental condition of coherence is laid out differs from one system to another. No methodological consideration is a priori or universal. All that we have are particular methodological rules that stand or fall together; their mutual accord enhances their epistemic standing (Lewis 1946, 346). Even if we may not be able to demonstrate that a methodological rule is epistemically justified by its initial empirical plausibility, that plausibility nonetheless gives us reason to think that the rule has a claim on our epistemic allegiance. Following Laudan's (1984) inductive requirement, its instrumental or explanatory success in the past may serve as a reason for its epistemic plausibility, and we have better reason to incorporate it into a system of commitments than the irrelevant ones. Considerations of overall coherence often require revision or rejection of initially plausible parts: a methodological rule that is initially plausible can later turn out to be irrelevant to the overall coherence of the system. It may be further examined to determine whether it can give support to, or gain support from, the other components of the system.
As a particular cognitive goal can be achieved with different methods (Laudan 1984), it can also be achieved with different cognitive systems. This is because there can be many configurations of methodological rules, each of which can be interpreted as an instrumentally efficacious system for achieving the same goal. The Higgs boson, for instance, was detected by both the CMS and ATLAS collaborations, each of which can be construed as a separate, independent system in its own right. The possibility of many systems with the same cognitive end, each representing one of the many configurations of the various requisite considerations, calls for a minimal coherence value. Considerations that do not conflict may stand in mutual support to one another in various ways. Among a set of instrumentally coherent methodological rules, following Quine (1970), some will be more central to a particular system while others lie further out in the periphery. We thereby find something like Quinean holism at work, but in an instrumental-coherentist orientation. The extent to which methodological rules on balance support one another and the system's goal is a measure of the coherence relation. This shows how a methodological rule's instrumental efficacy can be seen as a coherence requirement for a particular system: instrumental efficacy moderates how a methodological rule may fit into a coherent body, so whether or not a methodological rule occupies a place in a coherent system depends on how efficacious it is in achieving the system's goal. Instrumental efficacy may therefore figure as a threshold requirement for coherence and serves as the key to the idea of the coherence of a system.
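The threshold idea admits a similar schematic gloss. The efficacy function e, the support relation σ, and the thresholds θ_S and κ_S below are illustrative devices, not measures proposed by Laudan, Quine, or the collaborations:

\[
\text{Admit}(r, S) \iff e(r, G_S) \geq \theta_S,
\qquad
\mathrm{Coh}(S) = \binom{n}{2}^{-1} \sum_{i<j} \sigma(r_i, r_j) \geq \kappa_S.
\]

On this gloss, a rule's instrumental efficacy e(r, G_S) toward the system's goal G_S functions as the threshold requirement for its admission into the system S, while the balance of mutual support σ over all pairs of rules measures the coherence of S as a whole; a rule's centrality or peripherality can then be read off its aggregate support Σ_{j≠i} σ(r_i, r_j).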
5 Conclusion
With a holistic account of scientific methodology, we have a normative framework answerable to many actual scientific systems, each involving the collaboration of different normative enterprises. Many scientific systems can be construed as interdisciplinary matrices of different methodological rules, all of which are amenable to empirical observation. The holistic account is naturalistic in that it is empirically grounded. It requires methodological commitments to be instrumental in focus, since it is in virtue of their instrumental values that they are said to cohere in the system. It is normative in that it retains the normative dimension of traditional epistemology: it provides guidelines as to which methodological commitments could be entertained in a coherent system, and how.
The case study of the CMS and OPERA experiments has lent credence to the holistic account's claim that scientific experiments can be considered systems of methodological rules. We have argued that such a system can be treated as an evolving one, with components added according to how they are instrumentally relevant to it. The interdependence of the rules, in the sense of how they collectively further the system, makes up its coherence. More specifically, a system can be seen as a composite of interdependent rules that are instrumentally connected into a coherent body. Although the rules can be individually evaluated in terms of their instrumental values, it is ultimately on the basis of how they cohere with the rest of the system that they carry their epistemic weights. Any error that breaks down the coherence of the system can eventually be rectified by identifying the relevant rules that have been violated. No error is so serious a threat as to cause an irreparable loss of the system's coherence, for it can eventually be amended and the system's coherence restored. The significance of the system's coherence is such that talk of the methodological rules makes sense only in light of it. That is why it is only when the rules fulfill the holistic criteria and jointly function under conditions (1)–(4) laid out in the paper that it makes sense to talk about the system and its cognitive goal.
We have also seen how the justification of methodological rules in a system is primarily a property of the system as a whole, such that the rules stand or fall together. This holistic view yields the exclusion of contravening methodological rules from a system. It has also been observed how the holistic view entails that different methodological rules may have different epistemic standings in a system by virtue of their differential instrumental values. By understanding methodological rules in this way, and by maintaining that the mutual instrumental support among them underwrites the coherence of a system, the holistic account offers an effective and naturalistically respectable analysis of a scientific system. It satisfactorily accounts for the success of the CMS collaboration in discovering the Higgs boson in terms of the fulfillment of its holistic criteria. Again, it explains the apparent faster-than-light neutrino velocity anomaly of the OPERA collaboration by interpreting the instrumental error giving rise to it as a violation of the holistic criteria. It could similarly account for many other scientific pursuits. Thus, the holistic account could serve as a normative framework for many scientific systems and account for their apparent successes or failures.
Acknowledgments
I would like to thank the anonymous reviewer and the editor for their insightful and constructive feedback. The paper benefited greatly from their comments. I am also hugely indebted to Prof. Vikram Singh Sirola (Indian Institute of Technology Bombay, India) and Prof. Prasanta Bandyopadhyay (Montana State University, USA) for their extensive help and guidance in developing the paper.
References
Acquafredda, R., and OPERA Collaboration. 2009. "The OPERA Experiment in the CERN to Gran Sasso Neutrino Beam." Journal of Instrumentation 4: P04018, https://doi.org/10.1088/1748-0221/4/04/P04018.
Adam, T., and OPERA Collaboration. 2012. "Measurement of the Neutrino Velocity with the OPERA Detector in the CNGS Beam." Journal of High Energy Physics 2012 (10): 93.
Adamson, P., and MINOS Collaboration. 2007. "Measurement of Neutrino Velocity with the MINOS Detectors and NuMI Neutrino Beam." Physical Review D 76 (7): 072005, https://doi.org/10.1103/PhysRevD.76.072005.
Agafonova, N., and OPERA Collaboration. 2011. "Study of Neutrino Interactions with the Electronic Detectors of the OPERA Experiment." New Journal of Physics 13 (5): 053051, https://doi.org/10.1088/1367-2630/13/5/053051.
Bertolin, A., and N. T. Tran. 2009. OpCarac: An Algorithm for the Classification of the Neutrino Interactions Recorded by OPERA. OPERA Public Note 100. http://operaweb.lngs.infn.it:2080/Opera/publicnotes/note100.pdf.
Chalmers, A. 1999. What is This Thing Called Science? St. Lucia, Queensland: University of Queensland Press.
CMS Collaboration, et al. 2008. "The CMS Experiment at the CERN Large Hadron Collider." Journal of Instrumentation 3: S08004.
Di Giovanni, A., A. Candela, N. Di Marco, M. D'Incecco, C. Gustovino, M. Lindozzi, D. Orlandi, and E. Tatanani. 2006. "The Veto System of the OPERA Experiment." Nuclear Physics B, Proceedings Supplements: 40–3, https://doi.org/10.1016/j.nuclphysbps.2006.07.015.
Doppelt, G. 1990. "The Naturalist Conception of Methodological Standards: A Critique." Philosophy of Science 57 (1): 1–19, https://doi.org/10.1086/289527.
Elgin, C., and J. Van Cleve. 2013. "Can Belief be Justified through Coherence Alone?" In Contemporary Debates in Epistemology, edited by M. Steup, J. Turri, and E. Sosa, 244–73. Chichester, West Sussex, UK: Wiley-Blackwell, https://doi.org/10.1002/9781394260744.ch10.
Gianotti, F., and T. S. Virdee. 2015. "The Discovery and Measurements of a Higgs Boson." Philosophical Transactions of the Royal Society A 373: 20140384, https://doi.org/10.1098/rsta.2014.0384.
Giere, R. 1989. "Scientific Rationality as Instrumental Rationality." Studies in History and Philosophy of Science 20 (3): 377–84, https://doi.org/10.1016/0039-3681(89)90013-7.
Goldman, A. 1986. Epistemology and Cognition. Cambridge: Harvard University Press.
Goldman, A. 1988. "Strong and Weak Justification." In Philosophical Perspectives 2: Epistemology, edited by J. Tomberlin. Atascadero, CA: Ridgeview, https://doi.org/10.2307/2214068. Reprinted in A. Goldman, Liaisons: Philosophy Meets the Cognitive and Social Sciences. Cambridge: MIT Press.
Kalbfleisch, G. R., N. Baggett, E. C. Fowler, and J. Alspector. 1979. "Experimental Comparison of Neutrino, Antineutrino and Muon Velocities." Physical Review Letters 43 (19): 1361–4, https://doi.org/10.1103/PhysRevLett.43.1361.
Kitcher, P. 1990. "The Division of Cognitive Labor." Journal of Philosophy 87 (1): 5–22, https://doi.org/10.2307/2026796.
Kitcher, P. 1992. "The Naturalist's Return." Philosophical Review 101 (1): 53–114, https://doi.org/10.2307/2185044.
Kornblith, H. 1993. "Epistemic Normativity." Synthese 94 (3): 357–76, https://doi.org/10.1007/bf01064485.
Kornblith, H. 2002. Knowledge and its Place in Nature. New York: Oxford University Press, https://doi.org/10.1093/0199246319.001.0001.
Kuhn, T. S. 1962. The Structure of Scientific Revolutions. Chicago: University of Chicago Press.
Kuhn, T. S. 1970. "Logic of Discovery or Psychology of Research." In Criticism and the Growth of Knowledge, edited by I. Lakatos and A. Musgrave. London: Cambridge University Press, https://doi.org/10.1017/CBO9781139171434.003.
Kuhn, T. S. 1974. "Second Thoughts on Paradigms." In The Structure of Scientific Theories, edited by F. Suppe, 459–82. Urbana, IL: University of Illinois Press.
Laudan, L. 1984. Science and Values. Berkeley: University of California Press.
Laudan, L. 1987. "Progress or Rationality? The Prospects for Normative Naturalism." American Philosophical Quarterly 24 (1): 19–31.
Laudan, L. 1990. "Normative Naturalism." Philosophy of Science 57 (1): 44–59, https://doi.org/10.1086/289530.
Lewis, C. I. 1946. An Analysis of Knowledge and Valuation. La Salle: Open Court.
Longo, M. J. 1987. "Tests of Relativity from SN1987A." Physical Review D 36 (10): 3276–7, https://doi.org/10.1103/physrevd.36.3276.
Nozick, R. 1993. The Nature of Rationality. Princeton: Princeton University Press, https://doi.org/10.1515/9781400820832.
Quine, W. V. O. 1951. "Two Dogmas of Empiricism." Philosophical Review 60 (1): 20–43, https://doi.org/10.2307/2181906.
Quine, W. V. O. 1970. Philosophy of Logic. Cambridge, MA: Harvard University Press.
Quine, W. V. O. 1990. Pursuit of Truth. Cambridge, MA: Harvard University Press.
Resnik, D. 1992. "Are Methodological Rules Hypothetical Imperatives?" Philosophy of Science 59 (3): 498–507, https://doi.org/10.1086/289688.
Rosenberg, A. 1996. "A Field Guide to Recent Species of Naturalism." The British Journal for the Philosophy of Science 47 (1): 1–29, https://doi.org/10.1093/bjps/47.1.1.
Strassler, M. 2012. OPERA: What Went Wrong. https://profmattstrassler.com/articles-and-posts/particle-physics-basics/neutrinos/neutrinos-faster-than-light/opera-what-went-wrong/.
Verhaegh, S. 2018. Working from Within: The Nature and Development of Quine's Naturalism. Oxford: Oxford University Press, https://doi.org/10.1093/oso/9780190913151.001.0001.
Worrall, J. 1988. "The Value of a Fixed Methodology." The British Journal for the Philosophy of Science 39: 263–75, https://doi.org/10.1093/bjps/39.2.263.
Worrall, J. 1989. "Fix it and Be Damned: A Reply to Laudan." The British Journal for the Philosophy of Science 40: 376–88, https://doi.org/10.1093/bjps/40.3.376.
© 2022 the author(s), published by De Gruyter, Berlin/Boston
This work is licensed under the Creative Commons Attribution 4.0 International License.