
Invariance in James Woodward's Account of Causation

Philosopher of science James Woodward proposes a manipulability account of causation. Causes, according to Woodward (2016), are to be regarded as handles or devices for manipulating effects; in other words, causes are a means of producing an effect. Manipulability accounts typically involve an agent who can intervene on the cause c to bring about the effect e. Woodward’s account takes this idea, that c causes e if and only if (iff) someone manipulates c and thereby brings about e, further still, since his account also includes the notion of invariance. This article explicates the role of invariance in Woodward’s account to reflect its importance. It will (i) outline the relationship between causation and manipulability, (ii) introduce and explore Woodward’s particular account, including his use of intervention and counterfactual dependence, and finally (iii) spell out the subsequent role of, and need for, invariance.


Causation and Manipulability


As put by Woodward (2005), “a commonsensical idea about causation is that causal relationships are relationships that are potentially exploitable for purposes of manipulation and control” (p.7). So, if c truly causes e, then manipulating (or changing) c is also a way of manipulating (or changing) e; broadly speaking, to manipulate c is also to manipulate e. Hence manipulability accounts assert that c causes e iff someone manipulates c and thereby brings about e (Cartwright, 2006). This is the relationship between ‘manipulability’ and ‘causation’ in accounts like Woodward’s.


Figure 1. Photograph of Louis Vervoort (2014) whose philosophy applies manipulability accounts of causation to physical systems.

Of course, more is required. Such a simple account of causation is not viable, since there are instances where human manipulation is not possible (Craver et al., 2021). Humans cannot manipulate past instances of c, for example. Further, phenomena are often too large or too far away (in space, for instance) to manipulate (Eberhardt, 2007). Since causation can occur where humans are unable to manipulate or change anything, serious elaboration is required for a manipulability account to explain causation successfully.


Woodward’s Account: Making Things Happen


Woodward (2005) develops a manipulationist theory of causation and explanation in his book Making Things Happen. His account is considerably more thorough than the simple manipulability accounts described above, since Woodward incorporates counterfactual dependence and intervention to explain causal relations. Woodward (2005) claims that his account avoids the difficulties and counterexamples that have beset alternative accounts of causation and explanation, writing the following:


“...an explanation ought to be such that it can be used to answer what I call a what-if-things-had-been-different question: the explanation must enable us to see what sort of difference it would have made for the explanandum [i.e., what is to be explained] if the factors cited in the explanans [i.e., what does the explaining] had been different in various possible ways.” (p.11)


Woodward’s account involves counterfactuals to explain both intervention and causation: the relevant alternatives are hypothetical, answers to the ‘what-if-things-had-been-different’ question. First, consider a simple example with variables a and b, where a is the accelerator pedal position in a car and b is the speedometer reading displaying the car’s speed. Since they are variables, both a and b can take on different values. On Woodward’s account, a causes b iff the value of b would change if one were to intervene on the value of a. As the driver pushes down on the accelerator pedal (an intervention on a), the speed increases and the value of b changes to show the car’s increased speed.
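The interventionist criterion just described can be put in the form of a minimal sketch (a toy illustration written for this article, not Woodward’s own formalism; the function relating pedal position to displayed speed is an invented assumption):

```python
# A toy two-variable model: a is the accelerator pedal position (0 to 1),
# b is the speedometer reading. The linear relation is purely illustrative.

def b_given(a):
    return 120 * a                      # displayed speed as a function of pedal position

a = 0.5                                 # the pedal position actually taken
baseline_b = b_given(a)                 # b takes the value 60

# Interventionist test: would b change if a were set to a different value?
intervened_b = b_given(0.8)             # intervene on a, setting it to 0.8
print(intervened_b != baseline_b)       # True: b would change, so a counts as a cause of b
```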


Figure 2. Peter Fazekas (2022): Mutual Manipulability in Flat Mechanisms.

Woodward’s account develops the so-called ‘simple’ manipulability account discussed briefly earlier: via counterfactuals and intervention, Woodward explains causal relations more rigorously. A more developed example follows, showing Woodward’s account of causal relations via counterfactual conditionals (note that intervention is itself a causal concept in Woodward’s account, and is in turn formulated in terms of counterfactual conditionals, thus making his theory non-reductive). Let a, b, c, d, and e be the factors involved in a car’s journey from London to Bristol, each corresponding to the following:


a is the accelerator pedal

b is the speedometer

c is the clutch pedal

d is the brake pedal

e is the engine mount

As mentioned, the driver can change the speed by pushing down the accelerator pedal and thereby ‘intervening’ on a (thus altering the value of b too). If a had been different on the journey to Bristol, then b would also have been different. This is therefore a causal relation (Woodward, 2005). Likewise, the driver can alter the speed of the car by engaging the clutch. The clutch disengages the engine from the transmission, allowing the driver to shift gears and change speed accordingly (i.e., increased or decreased speed depending on the gear). Hence, if c had been different, then b would have been too; another causal relation. The driver could also push down on the brake pedal, d. By pressing the brake pedal, the driver would either reduce the speed of the car or stop it altogether. In either case, if d had been different, then b would have differed too. Yet another causal relation.


There are non-causal relations at play in the journey from London to Bristol too. The car is also fitted with an engine mount, e. This component matters for the journey since e supports the engine, isolating it from the vehicle’s frame and absorbing movement between the engine and the frame. If damaged or absent, it can affect the safety, and perhaps the comfort, of the passengers during the journey. Whether damaged, needing replacement, or working perfectly well, however, the engine mount leaves the speed unaffected. If e had been different, b would not have been different, so this relation is not causal.
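The pattern of relations in the journey example can be made concrete by extending the sketch above (again a toy model invented for this article, not Woodward’s formalism; the functional form of the speed and the default values are assumptions). Interventions on a, c, and d change the value of b, while an intervention on e does not:

```python
# A toy structural model of the London-to-Bristol example (illustrative only).
def speedometer(accelerator, gear, brake):
    """b: displayed speed as a function of a (accelerator), c (gear, via the clutch), d (brake)."""
    speed = 120 * accelerator * (gear / 3)   # speed scales with pedal position and gear
    return max(0, speed - 40 * brake)        # pressing the brake reduces speed

# Default values for the factors a, c, d, e on the journey.
default = dict(accelerator=0.5, gear=3, brake=0.0, engine_mount="intact")

def intervene(**changes):
    """Set some factors to new values (a crude stand-in for an intervention)
    and return the resulting value of b. Note that e never enters the speed function."""
    v = {**default, **changes}
    return speedometer(v["accelerator"], v["gear"], v["brake"])

baseline = intervene()                                 # b = 60 at the default values
print(intervene(accelerator=1.0) != baseline)          # True: intervening on a changes b (causal)
print(intervene(gear=2) != baseline)                   # True: intervening on c changes b (causal)
print(intervene(brake=1.0) != baseline)                # True: intervening on d changes b (causal)
print(intervene(engine_mount="damaged") != baseline)   # False: intervening on e leaves b unchanged
```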


Figure 3. Correlation does not imply causation (Plotly, 2016).

That being so, Woodward formulates his account of causation in terms of counterfactual conditionals: if the value of a had been different, then the value of b would have been different. Hence the “what-if-things-had-been-different” question is answered (Woodward 2005, p. 11). Now recall the issues surrounding the ‘simpler’ manipulability account of causation, namely that causation can occur in domains where humans cannot manipulate or change things. Woodward’s account deals with such issues because, in an account structured with counterfactuals, intervention need not be attainable or practicable. Counterfactual dependence is enough (one event e counterfactually depends on another event c just in case, had c not occurred, e would not have occurred) (Menzies, 2005), and this allows for the domains where humans de facto cannot manipulate or alter circumstances. Put differently, causal relations can still be accounted for where human intervention is not feasible, such as for events in the past or phenomena too large or too far away to manipulate. Since interventions are structured via counterfactuals, Woodward’s account reworks and elaborates on the simple manipulability view of causation nicely. There is a remaining requirement, however: invariance.


Figure 4. In interpretable machine learning, counterfactual explanations can be used to explain predictions of individual instances (Dandl and Molnar, 2022).

Role of Invariance


Counterfactual dependence is crucial in Woodward’s account. There is an additional requirement, however, since not every counterfactual dependence indicates causality. The remainder of this article notes the important role of invariance. This is not to suggest that the inclusion of invariance makes Woodward’s account uncontroversial: critics point out, for example, that the account assumes repeated interventions on c would lead to a change in e even in cases where such interventions are physically impossible, which is problematic if the account is meant as a theory of causal inference or testing. The aim here is simply to recognise the significant role that invariance plays.


Firstly, it is true that counterfactual dependence ‘works’ in the previous example: if a had been different on the car journey from London to Bristol, then b would also have been different. Now consider an example concerning the speedometer. In a 60mph speed limit zone, anything over 60mph is legally too fast. On the journey to Bristol, imagine a long stretch of motorway with this 60mph limit. If the speedometer reading were too high (i.e., above 60mph), then the driver would get caught and would perhaps receive some kind of penalty. There are thus two counterfactuals:

  (i) If the accelerator pedal, a, had been different in the car, then the speedometer, b, would also have been different.

  (ii) If the speedometer reading b had been higher than 60mph, the driver would have been caught, therefore receiving a penalty too.

It has already been established that (i) is a causal relation. It is not the case, however, that (ii)—despite being a counterfactual—is causal. Woodward (1997) thus includes invariance in his account to distinguish between the counterfactuals that are causal and the counterfactuals that are not. Invariance is the dividing criterion.


Figure 5. Photograph of Wolfgang Spohn (2019). Spohn's 'Laws of Belief' interventionist account of causation makes for an interesting contrast to Woodward's.

Woodward (1997) writes the following:


“...the account sketched requires that the generalisation appealed to in a [causal] explanation continues to hold as we change in various ways the system whose behaviour we are trying to explain... I will say that a generalisation that continues to hold or is stable in this way under some class of interventions that change the conditions described in its antecedent and that tells us how the conditions described in its consequent would change in response to these interventions is invariant under such interventions.” (p.31)


Woodward proposes that causal relationships must be invariant, meaning that they continue to hold under intervention. This is why counterfactual (ii) is non-causal: the relation does not correctly describe how the value of the dependent variable would change under an intervention on b. Unlike in counterfactual (i), the event of the driver being caught and fined would not necessarily differ were the speedometer reading different. If the driver is travelling faster than 60mph, this is illegal; at 60mph or less, it is within the limit and legal. Yet intervening on the speedometer b so that it displays 55mph, even while the driver is travelling faster than 60mph (so that the speedometer is simply incorrect), does not produce driving that is within the speed limit. This is a failure of invariance: the driver is still likely to get caught and is still driving above the speed limit despite the reading of 55mph. Counterfactual dependence describes the outcomes of interventions, whereas invariance plays an explanatory role: a generalisation (such as one linking speedometer readings to penalties) is invariant if it continues to hold (i.e., remains stable) as various other conditions change and under interventions on its variables (Woodward, 2000). The counterfactual dependence in (ii) is not invariant; it does not tell us how the value of the dependent variable would change under an intervention, and hence the relation is not causal (Bühlmann, 2020). Woodward’s (1997) account of causation therefore says that c causes e if intervening on c can bring about a change in e and the relation between c and e is invariant under that intervention.
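The failure of invariance in counterfactual (ii) can also be illustrated by extending the toy model (again an invented sketch, not Woodward’s formalism: the assumption here is simply that whether the driver is caught depends on the car’s actual speed rather than on what the speedometer displays):

```python
# Being caught depends on the actual speed, not on what the speedometer displays.
SPEED_LIMIT = 60

def caught(actual_speed):
    """The driver is caught iff the actual speed exceeds the limit."""
    return actual_speed > SPEED_LIMIT

actual_speed = 70                              # the car really is travelling at 70mph
b = actual_speed                               # an accurate speedometer: b tracks the actual speed

# Without intervention, b and being caught go together: high readings accompany penalties.
print(b > SPEED_LIMIT, caught(actual_speed))   # True True

# Now intervene directly on b (say, a faulty display showing 55mph). The actual speed,
# and hence the outcome, is untouched: the generalisation linking b to being caught
# does not continue to hold under this intervention, so it is not invariant.
b = 55
print(b > SPEED_LIMIT, caught(actual_speed))   # False True
```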


The importance of invariance must be noted. Without this criterion, Woodward’s account would allow intervention to be formulated counterfactually, yet some counterfactuals are not actually causal (regardless of their truth). The sorts of phenomena that humans simply cannot manipulate would be accounted for, but one would still be unable to distinguish between causal and non-causal counterfactuals under intervention, resulting in an insufficient account of causation.


Figure 6. A lunar eclipse is an example illustrating why Aristotle is not committed to the view that everything has all four kinds of causes (Dobrijevic, 2023).

Conclusion


Three features prove particularly important for the plausibility of Woodward’s account: intervention, counterfactual dependence, and invariance. This article explained, albeit briefly, why a simple manipulability account needs to be revamped. Woodward’s account elaborates on this ‘simple’ account of causation by formulating intervention via counterfactuals. His account thus starts by overcoming the problems posed by phenomena that humans de facto cannot manipulate. As mentioned, Woodward (2005) addresses the ‘what-if-things-had-been-different’ question, and the car journey from London to Bristol was a useful tool to illustrate his counterfactual account. Since the account is structured via counterfactuals, intervention does not have to be a real possibility. However, with the London to Bristol example still in mind, it becomes clear that a further requirement is needed to explain the difference between counterfactuals that are causal and those that are not. Invariance is that further requirement: a counterfactual dependence must be invariant for the relation to be causal, which is to say that the relationship continues to hold under the relevant interventions. Invariance thus plays a central role in Woodward’s account, since intervention structured on counterfactual dependence alone is not quite enough. The relationship Woodward draws between intervention, counterfactuals, and invariance makes up his manipulability account of causation, and the notion of invariance in particular strengthens the account. Invariance by no means creates an entirely unproblematic theory, but it nevertheless contributes to a stronger theory on the whole.


Bibliographical References

Bühlmann, P. (2020). Invariance, causality and robustness. Statistical Science, 35(3), 404-426.


Cartwright, N. (2006). From metaphysics to method: Comments on manipulability and the causal Markov condition. The British Journal for the Philosophy of Science.


Craver, C. F., Glennan, S., & Povich, M. (2021). Constitutive relevance & mutual manipulability revisited. Synthese, 199(3-4), 8807-8828.


Eberhardt, F. (2007). Causation and intervention. Unpublished doctoral dissertation, Carnegie Mellon University, 93.


Menzies, P. (2005). Causation as counterfactual dependence. In Causation: Further themes. The Routledge Encyclopedia of Philosophy. Taylor and Francis. Retrieved 19 Feb. 2023, from https://www.rep.routledge.com/articles/thematic/causation-further-themes/v-1/sections/causation-as-counterfactual-dependence. doi:10.4324/9780415249126-N114-1


Woodward, J. (1997). Explanation, Invariance, and Intervention. Philosophy of Science, 64, S26-S41. doi:10.1086/392584


Woodward, J. (2000). Explanation and invariance in the special sciences. The British Journal for the Philosophy of Science, 51(2), 197-254.


Woodward, J. (2005). Making things happen: A theory of causal explanation. Oxford University Press.


Woodward, J. (2016). Causation and Manipulability. In The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University.


Visual Sources

Cover image. MHJ (2012) Little Man Puppet Manipulated by Giant Businessman's Hand, iStock. Available at: https://www.istockphoto.com/vector/little-man-puppet-manipulated-by-giant-businessmans-hand-gm165980431-21929959 (Accessed: February 19, 2023).


Figure 1. Vervoort, L. (2014). The Manipulability Account of Causation Applied to Typical Physical Systems. Lato Sensu, revue de la Société de philosophie des sciences, 1(1), 63-70. ISSN 2295-8029


Figure 2. Fazekas, P. (2022). Flat mechanisms: a reductionist approach to levels in mechanistic explanations. Philosophical Studies, 179(7), 2303-2321.


Figure 3. Plotly. (2016). Spurious Correlations: Sour Cream Consumption and Motorcycle Deaths. Plotly Graphs. Retrieved February 15, 2023, from https://plotlygraphs.medium.com/spurious-correlations-56752fcffb69.


Figure 4. Dandl, S., & Molnar, C. (2022). Counterfactual Explanations. Interpretable Machine Learning. Retrieved February 15, 2023, from https://christophm.github.io/interpretable-ml-book/counterfactual.html.


Figure 5. Spohn, W. (2019). Reply to Jim Woodward’s Comments on Wolfgang Spohn’s Laws of Belief. Philosophy of Science, 86(4), 773-784.


Figure 6. Dobrijevic, D. (2023). Lunar Eclipse Photograph. Space.com. Retrieved February 15, 2023, from https://www.space.com/33786-lunar-eclipse-guide.html.



