# Philosophy of Science 101: Models of Explanation

*Foreword*

*The Philosophy of Science series explores both general questions about the nature of science and specific foundational issues related to the individual sciences. When applied to such subject areas, philosophy is particularly good at illuminating our general understanding of the sciences. This 101 series will investigate what kinds of serious—often unanswered—questions a philosophical approach to science exposes through its heuristic lens. This series, more specifically, will look at the ‘Scientific Realism’ debate throughout, which questions the very content of our best scientific theories and models.*

*Philosophy of Science 101* will be divided into the following chapters of content:

1. Philosophy of Science 101: The Relationship Between Philosophy and Science

2. Philosophy of Science 101: Scientific Realism

3. Philosophy of Science 101: Anti-Realism

4. Philosophy of Science 101: Realism and Anti-Realism ‘Compromise’

5. Philosophy of Science 101: Causation

6. Philosophy of Science 101: Scientific Models

**7. Philosophy of Science 101: Models of Explanation**

8. Philosophy of Science 101: Laws of Nature

9. Philosophy of Science 101: Science and Social Context

*Philosophy of Science 101: Models of Explanation*

Previously, *Philosophy of Science 101* explored scientific models. In particular, the previous article in this series investigated how models in science represent their target systems (i.e., ‘T’, a particular part or aspect of the world that a given model is concerned with, such as the atom in Bohr’s (1913) model of the atom). The article considered the different types of models that *represent* T, and the various accounts of representation that have arisen from growing philosophical interest in this (crucial) part of scientific practice. Indeed, philosophical queries concerning representation are just one area of the philosophy of science interested in modelling. The series now turns to another area, namely models of *scientific explanation*.

Issues concerning scientific explanation have been a focus of philosophical attention from Pre-Socratic times through the modern period (Woodward and Ross, 2021). As this article will discuss, modern interest really begins with the development of the Deductive-Nomological (hereafter, ‘DN’) model. Thereafter, the philosophy of science sees not only well-known objections to the DN model but also (i) various extensions of it, and (ii) subsequent, notable attempts to develop alternative models of explanation, oftentimes via statistical laws. Competing models of *scientific explanation* encounter interrelated issues, which this article will introduce and investigate in relation to various conditions of adequacy. More specifically, this article will examine the presupposition of most recent philosophical discussion: that science sometimes provides explanations, and that the task of a theory or model of scientific explanation is to characterise the structure of such explanations (Woodward and Ross, 2021).

**Background: The Basic Idea**

First of all, some brief background is needed. Science is expected to answer so-called ‘why’ questions. For example, *why* does uranium-235 decay? *Why* did the solar eclipse occur? Answers to such questions take the form of scientific explanations. Not all explanations, however, are scientific, nor do all questions demand scientific explanations. Hence, the philosopher asks: what makes scientific explanations different? Before examining various proposed answers to these questions and the models available for scientific explanation, consider some terminology which will be used throughout the article:

*Explanandum *(Em): the ‘thing’ that has to be explained.

*Explanans *(Es): the ‘thing’ that *does *the explaining.

So, for example:

*Explanandum:* why does uranium-238 not undergo fission in a nuclear reactor?

*Explanans:* because of the large amount of energy needed.

(Cheifetz, Fraenkel, Galin, Lefort, Peter and Tarrago, 1970).

**The DN Model of Explanation**

Made famous by its proponents Carl Hempel and Paul Oppenheim (1948), the DN model of explanation holds, broadly speaking, that one may explain something by subsuming it under a general law; hence ‘nomological’ (i.e., referring to *laws*). One may then show that Em is an instance of a general pattern. On the DN model, the *explanans* and *explanandum* together constitute the explanation, with the following general structure (Hempel and Oppenheim, 1948):

L1, L2, …, Lk – Laws

A1, A2, …, Am – Auxiliary Assumptions

B1, B2, …, Bn – Boundary Conditions

–––––––––––––––––––––––––––

Implies, via logical deduction

E (conclusion).

The Es above (i.e., the laws, auxiliary assumptions, and boundary conditions) together imply Em, and this deductive argument constitutes the explanation on the DN model.

Consider a simple example of a solar eclipse to illustrate this general structure:

*Laws:* celestial laws of motion (i.e., both describing and showing how (a) planets move in elliptical orbits with the sun as a focus, (b) a planet covers the same area of space in the same amount of time no matter where it is in its orbit, and (c) a planet’s orbital period is proportional to the size of its orbit) (Russell, 1964)

*Auxiliary assumptions:* linear optics (i.e., the behaviour and properties of light)

*Boundary conditions:* the position of objects (i.e., the moon interposes itself between the Sun and the Earth, casting its shadow over the Earth) (Brown and Brown, 2017)

–––––––––––––––––––––––––––

Implies, via logical deduction

The solar eclipse occurrence *x* at *t*.

Most importantly, the conclusion is reached by logical deduction; hence the name ‘deductive-nomological’, whereby an explanation is a deductively valid argument. Whether this is sufficient as a model of explanation, however, requires further investigation into the conditions of adequacy *for* such a model.
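The deductive character of the schema above can be sketched in code. The following is a toy illustration only: the ‘laws’ are hypothetical one-line conditionals standing in for genuine celestial mechanics, and the sketch simply forward-chains from laws, auxiliary assumptions, and boundary conditions to whatever can be deduced.

```python
# Toy sketch of the DN schema: the explanandum is whatever can be
# deduced from the laws given the auxiliary assumptions and boundary
# conditions. The "laws" below are illustrative stand-ins.

def dn_derive(laws, auxiliary_assumptions, boundary_conditions):
    """Forward-chain over the laws until no new statements follow."""
    premises = set(auxiliary_assumptions) | set(boundary_conditions)
    derived = set(premises)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in laws:
            if set(antecedents) <= derived and consequent not in derived:
                derived.add(consequent)
                changed = True
    return derived - premises  # everything deduced beyond the premises

laws = [
    (("moon between sun and earth", "light travels in straight lines"),
     "moon casts shadow on earth"),
    (("moon casts shadow on earth",), "solar eclipse occurs at x at t"),
]
explananda = dn_derive(
    laws,
    auxiliary_assumptions=["light travels in straight lines"],
    boundary_conditions=["moon between sun and earth"],
)
print(sorted(explananda))
```

Because the conclusion follows deductively, true premises guarantee the explanandum; the statistical models discussed later in this article relax exactly this feature.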

**Conditions of Adequacy**

*Adequacy* is the state of being sufficient for the purpose concerned (Fay and Moon, 1977) and is of utmost importance here, for an argument of this sort (i.e., the DN model) is an explanation *if* the following four conditions hold (Hempel and Oppenheim, 1948):

1. Em is a *logical consequence* of Es (i.e., the explanation is a valid deductive argument).

2. Es must contain at least one *law*, and this law must be used in the derivation of Em.

3. Es must have *empirical content* (i.e., it must at least in principle be empirically testable).

4. The sentences contained in Es must be *true*.

This is a *deductive* (i.e., deductively structured) *nomological* (i.e., law-containing) explanation. On the DN model, explanations are, as such, arguments. The DN model can also, therefore, explain laws by appealing to more general laws (Cartwright, 1979) (as this article will turn to later). Moreover, Hempel and Oppenheim (1948) assume that the DN model applies equally well to both scientific explanation and scientific prediction. Arguably, it would not matter whether the DN model is used to show how theories *explain* certain events or how theories *predict* results. If the event has already occurred, it can be explained by the antecedent conditions and the theoretical laws; if the event has not yet occurred, it can be predicted by the same conditions and laws (Pitt, 1988). Whether applied to explanation or prediction, and before considering statistical laws and probabilities, it is first worth considering the types of problems that the DN model encounters. There are two kinds of difficulties here: those against sufficiency and those against necessity. The first kind concerns arguments that satisfy all the requirements but nevertheless fail to be explanations; such arguments show that the requirements are not sufficient. The second kind concerns explanations that *are* considered genuine but do not satisfy the above requirements; such explanations show that the requirements are not necessary.

**Against Sufficiency**

The first problem of this kind results from the asymmetry of explanation. Intuitively, explanation is asymmetric (Hausman, 1998): if A explains B, then B does not explain A. This stands in contradiction to the DN model, which lacks the means to rule out spurious, symmetric ‘explanations’. A famous example of this difficulty comes from the *flagpole problem* (see figure 4). Given a flagpole standing vertically, and the sun shining brightly, a shadow will be cast by the flagpole. If one knows the height of the flagpole and the position of the sun, then one can deduce the length of the shadow (Hausman, 1998). Imagine the sun has an elevation of 53.13° and the shadow is 9 feet long: one can compute that the flagpole is 12 feet tall. If someone asks why the shadow is 9 feet long, one can explain this by saying that the flagpole is 12 feet in height (Hausman, 1998). Likewise, if someone asks why the flagpole is 12 feet tall, one can compute the height of the flagpole from the length of the shadow and the position of the sun. Since it is a universal law that light travels in (roughly) a straight line, the angle of the sun and the height of the flagpole entail the length of the shadow, and this therefore explains the length of the shadow. Yet, given the same law, the length of the shadow and the height of the flagpole equally entail the angle of the sun above the horizon, so on the DN model one may ‘explain’ that angle too (Hausman, 1998). However, the height of a flagpole and the length of a shadow obviously do not (and cannot) explain why the sun is at a certain angle, thus presenting a serious problem for the DN model (Woodward and Ross, 2021).
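The symmetry at issue can be made concrete with a little trigonometry. The sketch below is illustrative only: it uses the same law-like relation, tan(elevation) = height / shadow, to ‘derive’ any one of the three quantities from the other two, and the DN model by itself counts each derivation as an explanation.

```python
import math

# The flagpole relation: tan(elevation) = height / shadow.
# Each function below "derives" one quantity from the other two.

def shadow_length(height_ft, elevation_deg):
    return height_ft / math.tan(math.radians(elevation_deg))

def flagpole_height(shadow_ft, elevation_deg):
    return shadow_ft * math.tan(math.radians(elevation_deg))

def sun_elevation(height_ft, shadow_ft):
    return math.degrees(math.atan(height_ft / shadow_ft))

print(round(sun_elevation(12, 9), 2))       # ~53.13 (degrees)
print(round(flagpole_height(9, 53.13), 1))  # ~12.0 (feet)
print(round(shadow_length(12, 53.13), 1))   # ~9.0 (feet)
```

Intuitively, only the first two derivations are explanatory (the flagpole and the sun cause the shadow, not vice versa), yet all three satisfy the DN conditions equally well.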

Another problem against sufficiency comes from common causes (Pearl, 2000) (see figure 5), which the DN model does not rule out. The so-called ‘barometer example’ provides the famous illustration here. Imagine there is a sharp drop in barometric pressure, suggesting (on the basis of some law tying the two together) that a storm is on the way. Indeed, one can predict the storm on the basis of the barometric pressure. One does not want to claim that the storm is *explained* by the drop in barometric pressure, however, since both the storm and the drop in pressure are caused by atmospheric conditions (Salmon, 2006). They have a common (i.e., the same) cause. Yet the DN model does not rule this out: the laws given in the theory allow for predictions of all kinds, and the problem is that those predictions are not the same thing as explanations. The storm does not explain the barometer reading, nor does the barometer reading explain the storm; rather, both are explained by a third factor, the atmospheric conditions, which both cause *and* explain the two. The same issue around prediction versus explanation arises in the flagpole example, for the shadow does not explain the height of the flagpole; it is instead the combination of the sun and the flagpole that *causes* the shadow (Hausman, 1998). Before moving on to the second kind of problem that the DN model encounters, it is worth noting that Hempel (1965) introduces the ‘thesis of structural identity’ to overcome these issues around prediction versus explanation. Hempel’s (1965) thesis supposes that (a) every adequate explanation is potentially a prediction, and (b) every adequate prediction is potentially an explanation. Hempel in fact defends (a) generally but acknowledges that (b) cannot be maintained in general (i.e., only in some cases). Overall, however, the ‘explanation and prediction’ problem against sufficiency remains and deserves attention elsewhere.

**Against Necessity**

As discussed, the DN model is in trouble due to a second kind of problem too. To reiterate, problems of this second kind result from explanations that are genuine (or are considered to be genuine) but do not satisfy the conditions of adequacy (i.e., that the *explanandum* is a logical consequence of the *explanans*, that the *explanans* contains at least one law and has empirical content, and that the sentences contained in the *explanans* are true). Such explanations show that the requirements are not necessary. This article will briskly consider these problems before turning to *statistical laws* as the basis of a model of explanation.

First, it is important to remember just what a significant role *laws* play in the *nomological* model. According to the DN model, one may explain something by subsuming it under a general law; one may then show that Em is an instance of a general pattern, which is why a logically deductive argument constitutes an explanation. Further, the DN model can also, supposedly, explain laws by appealing to more general laws. As Nancy Cartwright (1979) argues, however, regularities do not *explain*. The problem for the DN model is that subsumption under a general law does not explain anything. Cartwright (1979) claims that what are needed instead are *causes*, which *do* have explanatory power. On this view, the DN model is simply wrong to focus on general regularities (if this is what laws consist of): nothing is explained by laws, thus undermining the DN model as a whole.

Another problem of this kind, somewhat similar to the one Cartwright raises, comes from singular events. Singular events, as the name suggests, are not regularly occurring events; hence there are no regularly occurring laws (or patterns) involved, posing a threat to the DN model. Consider, for example, a severe worldwide economic crisis like the stock market crash of 2008. Though the 2008 financial crisis was notably the most serious financial crisis since the Great Depression (Barro and Ursúa, 2009), and there are of course other financial crises resulting from events ‘similar’ to those of 2007-2008, the stock market crash was nevertheless a singular event (with various Es *explaining* the Em, i.e., the crash) that does not satisfy the DN model’s requirement to contain a law. Indeed, it is an example of an explanation that *is* considered genuine, showing that the DN model’s requirements are not necessary. Perhaps the use of *statistical* laws in a model of explanation fares better?

**Statistical Laws – The Inductive and Deductive Statistical Models of Explanation**

Not all laws have the form (x)(Fx → Gx). Some laws, instead, have the form ‘*some* F are G’; more specifically, if x is an F then there is a *probability* (p) that x is also a G. Laws of this sort play an important role in many of the sciences: the probability that a plutonium atom decays within an hour, for instance, or the probability of tunnelling in quantum physics (to name just two examples of many). Probabilities are undeniably essential across the sciences and in the process of *explaining*. Probabilistic, or statistical, laws are therefore used in various alternative models of explanation, namely the inductive statistical model (‘IS’) and the deductive statistical model (‘DS’).


According to Hempel (1965), an IS explanation is good or successful to the extent that its Es confers high probability on its Em outcome. Unlike in the DN model, the relation between Es and Em is inductive. The difference is that IS explanations involve the subsumption of individual events under statistical laws (Woodward and Ross, 2021). Thus, the IS model explains a particular occurrence by subsuming it under a statistical law, unlike a DN or DS explanation, in which the Em is *deduced* from the Es. Consider an example to illustrate how the IS model works (Woodward and Ross, 2021):

There is a 0.95 probability that patients with a streptococcus infection recover quickly after the administration of penicillin.

James had a streptococcus infection and received treatment with penicillin.

–––––––––––––––––––––––––––
––––––––––––––––––––––––––– [0.95]

James recovers quickly.

(The double line marks inductive, rather than deductive, support; the bracketed value gives the strength of that support.)

So:

Statistical law: if x is an F then there is a probability, p, that x is also a G.

Particular condition: object a is an F.

–––––––––––––––––––––––––––
––––––––––––––––––––––––––– [p]

Object a is a G.

Here, it was to be expected that object a is a G (i.e., that James recovers quickly) given certain explanatory facts and laws (i.e., the high probability of penicillin successfully treating streptococcus infections). Indeed, on the IS model, the value of p must be high, as in the streptococcus example. Crucially, not just any probability explains: it must be a high (or practically certain) probability (Hempel, 1965). Above all, the premises (i) make the conclusion highly probable, but (ii) do not imply the conclusion. The argument is not deductively valid like that of the DN model, hence ‘inductive’ (Woodward and Ross, 2021).
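A quick simulation can illustrate the inductive gap the IS model leaves open. The sketch below assumes the hypothetical 0.95 recovery law from the example: the premises make recovery highly probable for any given patient, yet they do not entail it.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

P_RECOVERY = 0.95  # hypothetical statistical law from the example:
                   # P(quick recovery | strep infection + penicillin)

n_patients = 10_000
recovered = sum(random.random() < P_RECOVERY for _ in range(n_patients))

print(recovered / n_patients)  # close to 0.95, but not exactly
# The law confers high probability without guaranteeing any single case:
assert recovered < n_patients  # some simulated patients fail to recover
```

This is precisely why the argument form is inductive: unlike in the DN schema, true premises are compatible with a false conclusion in any individual case.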

The IS model thus differs from the DN model, and it encounters problems of its own, as this article will briefly discuss. First, there is the problem of *ambiguity*: the premises of two IS explanations can be mutually compatible even though their conclusions contradict each other (Salmon, 1971). The problem is that not all relevant information is included in the premises, allowing for contradictory conclusions. Second, there is the problem of *low probabilities*: it is not only high probabilities that have the ability to explain (Salmon, 1971). Such cases go against the necessity of high, or practically certain, probabilities for IS explanations. Particular events may go against all the odds (i.e., have a low, perhaps near-impossible, probability) and still be *explained*, thus threatening the IS model. Last but not least, *mechanistic explanation* creates another issue for IS explanations. Mechanisms produce a variety of outcomes with both high and low probabilities (Salmon, 1971); the point is that the values of these probabilities simply do not matter. Arguably, probabilities do not have anything to do with explanation (consider figure 7, the example of a roulette wheel, to reinforce this point).

On the one hand, IS explanations are inductive and apply to individual events or occurrences. On the other hand, DN explanations are deductively valid, whereby the Es imply the Em. Now consider a third alternative: a deductive *statistical* model of explanation *for laws*. Deriving from the DN model, the DS model subsumes a statistical regularity under a more general regularity (i.e., a more general statistical law). Here is a quick example, whereby the laws of nuclear physics explain, *statistically*, the time it takes for carbon-14 (14C) to radioactively decay (described by its half-life):

Laws of nuclear physics

------------------------------------------

------------------------------------------

14C has a half-life of 5,730 years.

After 5,730 years, only half of the original amount of 14C remains in a sample of organic material; after an additional 5,730 years (11,460 years in total), only a quarter remains (Engelkemeir, Hamill, Inghram, and Libby, 1949). The DS model is indeed another variety of statistical explanation (Woodward and Ross, 2021). It involves the deduction of a narrower statistical uniformity from a more general set of premises, at least one of which is a more general statistical law. In this sense, the DS model conforms to the same general pattern as the DN explanation of regularities, since it involves the deduction of the Em from a law (Woodward and Ross, 2021). One problem here is a pseudo (i.e., empty) DS ‘explanation’ of a statistical law from itself: deducing a law from itself arguably does not provide an explanation, meaning that the DS model faces a similar issue, as argued by Cartwright (1979), to that of the DN model. One must decide whether any model of explanation discussed is necessary and sufficient with regard to the conditions of adequacy, amongst all the (oftentimes similar) difficulties addressed.
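The statistical uniformity deduced in the DS example can be written out as the exponential-decay formula it summarises. The short sketch below is only a numerical illustration of the half-life regularity, not of the underlying nuclear laws.

```python
# Fraction of carbon-14 remaining after t years, given its 5,730-year
# half-life: N(t)/N0 = (1/2) ** (t / half_life).

HALF_LIFE_YEARS = 5_730

def fraction_remaining(years):
    return 0.5 ** (years / HALF_LIFE_YEARS)

print(fraction_remaining(5_730))   # 0.5  (one half-life)
print(fraction_remaining(11_460))  # 0.25 (two half-lives)
```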

**Concluding discussion**

What does *explanation* amount to? This article has explored philosophical endeavours to answer this very question. As mentioned, modern philosophical discussion very much begins with Hempel and Oppenheim’s (1948) *deductive-nomological* model. The article thus started by introducing some background and the model itself, noting how, on this model, an explanation is a deductively valid argument made up of laws, auxiliary assumptions, and boundary conditions which imply a conclusion. The model encounters various (quite serious) issues, meaning that the conditions of adequacy were important to consider throughout. Problems come in two kinds, going either against sufficiency or against necessity, and such problems, interestingly, apply to other models of explanation too. Suffice it to say, the article went on to inquire into models which use *statistical laws*, both inductive and deductive, to account for explanation. Laws, whether statistical or understood simply as regular occurrences, prove to be a particularly important part of most models of explanation. The 101 series next investigates the laws of nature as such.

**Bibliographical References**

Barro, R. J., & Ursúa, J. F. (2009). Stock-market crashes and depressions. *NBER Working Paper* (14760). National Bureau of Economic Research.

Bohr, N. (1913). On the constitution of atoms and molecules. *The London, Edinburgh, and Dublin Philosophical Magazine and Journal of Science*, *26*(151), 1-25.

Brown, T., & Brown, K. (2017). In the shadow of the Moon, what type of solar eclipse will we see?. *Science Activities*, *54*(2), 48-57.

Cartwright, N. (1979). Causal Laws and Effective Strategies, *Noûs*, *13*(4), 419–437. doi:10.2307/2215337

Cheifetz, E., Fraenkel, Z., Galin, J., Lefort, M., Peter, J., & Tarrago, X. (1970). Measurement of the Prompt Neutrons Emitted in the Fission of Bi 209 and U 238 Induced by 155-MeV Protons. *Physical Review C*, *2*(1), 256.

Engelkemeir, A. G., Hamill, W. H., Inghram, M. G., & Libby, W. F. (1949). The half-life of radiocarbon (C 14). *Physical Review*, *75*(12), 1825.

Fay, B., & Moon, J. D. (1977). What would an adequate philosophy of social science look like?. *Philosophy of the Social Sciences*, *7*(3), 209-227.

Hausman, Daniel M. (1998). *Causal Asymmetries*, Cambridge: Cambridge University Press. doi:10.1017/CBO9780511663710

Hempel, C. G. (1965). *Aspects of Scientific Explanation and Other Essays in the Philosophy of Science. *Free Press.

Hempel, C. G., & Oppenheim, P. (1948). Studies in the logic of explanation. *Philosophy of Science*, *15*(2), 135-175.

Pearl, J. (2000). *Causality: Models, Reasoning and Inference*. Cambridge University Press.

Pitt, J. C. (ed.) (1988). *Theories of Explanation. *Oxford University Press.

Russell, J. L. (1964). Kepler's laws of planetary motion: 1609–1666. *The British journal for the history of science*, *2*(1), 1-24.

Salmon, W. C. (1971). Statistical explanation. In W. C. Salmon (Ed.), *Statistical Explanation and Statistical Relevance* (pp. 29-87). University of Pittsburgh Press.

Salmon, W. C. (2006). *Four decades of scientific explanation*. University of Pittsburgh Press.

Woodward, J., & Ross, L. (2021). Scientific explanation. *The Stanford Encyclopedia of Philosophy*.

**Visual Sources**

Cover Image. Rothstein, D. (2014). *Albert Einstein* [Photograph]. WBUR. https://www.wbur.org/cognoscenti/2014/03/14/ask-the-question-dan-rothstein

Figure 1. Rogers, K. (2012, May 21). *Scientific modelling* [Diagram]. *Encyclopedia Britannica*. https://www.britannica.com/science/scientific-modeling

Figure 2. *Carl Hempel*. (2023). [Photograph]. Princeton University. https://philosophy.princeton.edu/about/great-and-good/carl-g-hempel

Figure 3. *Karl Popper*. (1991). [Photograph]. Encyclopædia Britannica. https://www.britannica.com/biography/Karl-Popper#/media/1/470154/234973

Figure 4. Roberts, B. (2014). *The Flagpole Problem* [Diagram]. Personal LSE, Philosophico-Scientific Adventures. https://personal.lse.ac.uk/robert49/ebooks/philsciadventures/lecture19.html

Figure 5. Hitchcock, C., & Redei, M. (2021). *Common Causes* [Diagram]. Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/cgi-bin/encyclopedia/archinfo.cgi?entry=physics-Rpcc

Figure 6. Duffy, A. (2019). *Quantum Tunnelling* [Diagram]. Cosmos. https://cosmosmagazine.com/science/physics/quantum-tunnelling-is-instantaneous-researchers-find/

Figure 7. Chambers, A. (2013). *Roulette Wheel* [Photograph]. Intermathematics. Retrieved March 11, 2023, from https://ibmathsresources.com/2013/10/12/the-gamblers-fallacy-and-casino-maths/

Figure 8. Kitcher, P. (1981). [Photograph]. *Philosophy of science*, *48*(4), 507-531.