Cognitive Philosophy 101: How Irrational is Rationality?

Foreword

Cognitive Philosophy 101 is a series of articles on Cognitive Psychology and Philosophy of Mind that aims to provide a quick and accessible overview of some of the topics currently being debated in both fields. It focuses particularly on areas in which the two fields intersect and provides discussions of some of the theories currently being put forward by some of the most important practitioners of both areas of analysis.


The series is divided into the following articles:

  1. Cognitive Philosophy 101: Can We Read Minds?

  2. Cognitive Philosophy 101: Spectatorialist Mind Reading

  3. Cognitive Philosophy 101: How Irrational is Rationality?

  4. Cognitive Philosophy 101: Emotions and Decision-Making

  5. Cognitive Philosophy 101: A Cognitively Effective Approach to Explaining Global Warming


How Irrational is Rationality?


When asked the question "Are human beings rational?", most people’s first instinct is to answer with a resounding "Yes". This certainty rests on a long-held assumption that humans are inherently rational beings. In his Nicomachean Ethics, Aristotle states that "reason is the best thing in us" (10.8.1177a20) and identifies the application of rationality to life as an important requirement for living well (1.4.1095a18-20). Descartes reinforces this primacy of reason with his statement cogito ergo sum [I think, therefore I am], which first appears in his 1637 Discourse on Method.


Figure 1: Study for The Thinker, by Auguste Rodin.

Nowadays, however, this assumption is being challenged by numerous studies in cognitive psychology (Kunda, 1990); rationality is far from a given. In order to explore whether rationality takes precedence in human thought and behaviour, we must first define rationality. Then, in light of this definition, we can examine arguments put forward by cognitive psychologists as to why humans are not the rational beings we assume we are.


The concept of rationality has been defined in numerous ways. For the sake of simplicity, this article will focus on the definition suggested by Jonathan Evans and Keith Frankish (2009). They divide the concept of rationality into two types: Instrumental Rationality and Normative Rationality. The former is the ability to choose the means most likely to achieve one’s goals. The latter is the ability to follow the rules of a normative theory, that is, a theory that follows the rules of logic (Baron, 2004). Normative Rationality therefore describes what a perfectly logical agent might do, say, or think in a given situation.


Traditionally, we incorporate both of these concepts into our view of rational behaviour - coolly logical and unaffected by emotions or predispositions. Starting from a set of premises, a human adult examines them and reaches a logical conclusion. And yet, even when engaged in seemingly rational analysis, humans can introduce illogical, irrational elements into their thinking. And these occurrences are not as rare as we would like to believe. The two main types of irrational elements discussed here are motivated reasoning and implicit attitudes.



Figure 2: Facts can't fight beliefs

Motivated reasoning is something everyone has seen in action, both in themselves and in others. It occurs when the subject has a particular goal to reach within the reasoning. The ironic thing is that this goal, while "irrationally favoured", has to be socially recognized as a justified conclusion: the person to whom the argument is presented must believe that it is a rational conclusion to reach (Epley and Gilovich, 2016). Although motivated reasoning in some ways follows the logical path of traditional rationality, in that it examines premises and draws justified conclusions, it is geared towards one preferred conclusion. There are three mechanisms commonly used to reach it. Two relate to the type of evidence selected for the argument (Epley and Gilovich, 2016): one can either ignore a set of evidence entirely or, if that is impossible, concentrate on a specific subset of evidence that more closely supports the desired conclusion. The third is to modify what is required of the evidence: in effect, the agent asks the evidence to justify an answer to the question "Can I believe this?" rather than "Should I believe this?", and an affirmative answer to the first question requires far less convincing evidence than the second. These are some of the very subtle ways in which reasoning can be biased and yet still appear justified and logical.


Let’s imagine a situation that is commonplace enough for most people to have experienced it. While reading this article (or, for that matter, writing it) in a café, we overhear two girls talking heatedly. One of them, let’s call her A, is complaining about how her boyfriend is neglecting her and making her feel underappreciated. When A is finished venting, her friend, B, offers a simple solution: break up with him. The logic behind B's advice is quite simple: if his neglectful behaviour is unacceptable, then A should break up with him. The moment this possibility is brought up, however, A’s whole attitude changes. She moves from listing all of her boyfriend’s faults to disregarding them and citing all of the nice things he has done for her. She cannot ignore the damning evidence, but she can shift the focus to evidence that supports her preferred conclusion. And this is motivated reasoning in action.


The same mechanism is at work in many everyday situations. As much as economists love the idea of the rational consumer (Hall, 1990), even shopping decisions are not safe from motivated reasoning. Frozen chicken nuggets are not a healthy dinner, but when making our choice in the supermarket, perhaps we push that fact to the back of our minds and tell ourselves that nuggets are filling, tasty, and a good source of protein. We ignore the evidence we don't want to consider and shift our focus to the evidence that supports our preferred conclusion.


Figure 3: Tasty and easy to justify

The second type of irrational element within seemingly rational analyses is implicit attitudes, which can be defined as subconscious attitudes, more or less favourable, towards a certain social object, such as a person, a thing, or a concept (Greenwald and Banaji, 1995). These implicit attitudes are examined in the Implicit Association Test (IAT), as described by Rudman and Ashmore (2007). The test works by measuring the time participants take to connect good or bad attributes to certain concepts, thereby uncovering unconscious biases towards, for example, racial groups, gender, age, sexual orientation, or religious affiliation. The faster the connection is made, the more likely it is to be subconscious rather than conscious, and therefore an indication of an implicit attitude.


The IAT has been shown to be internally consistent, valid, and generally more useful than self-reports in identifying implicit attitudes (Rudman and Ashmore, 2007). This is significant because it means that even at the subconscious level, our reasoning can be greatly affected by attitudes over which we have little control. In addition, Antonio Damasio’s work on emotions (1994) has shown the part played by emotions in decision-making. Motivated reasoning and implicit attitudes further demonstrate that what we traditionally see as rational, logical thinking is rarely that - it usually incorporates biases, emotions, and predispositions that are not governed by logic.


It is not that deductive reasoning has no place in our decision making. But the idea that rational thought is based solely on cold, hard logic provides an insufficient explanation of how we make decisions. Recent examinations of unconscious biases, motivated reasoning, and the role of emotions in rational decision making show that there are more irrational elements in rationality than we previously supposed.

Bibliography:

  • Jowett, B., Davis, H. W. C. (1920), Aristotle's Politics, Oxford: Clarendon Press.

  • Aristotle (2009), The Nicomachean Ethics, (W. D. Ross, trans.), Oxford University Press.

  • Baron, J. (2004), "Normative Models of Judgment and Decision Making," in Blackwell Handbook of Judgment and Decision Making, (D. J. Koehler, N. Harvey, eds.), pp. 19-36, Blackwell.

  • Damasio, A. (1994 [2021]), L’errore di Cartesio: Emozione, ragione e cervello umano, (Macaluso, F., trans.), Adelphi.

  • Descartes, R. (1912), A Discourse on Method, London and Toronto: J. M. Dent and Sons.

  • Epley, N., Gilovich, T. (2016), "The Mechanics of Motivated Reasoning," Journal of Economic Perspectives, Vol. 30, No. 3, pp. 133-140.

  • Veitch, J. (1901), The Method, Meditations and Philosophy of Descartes, New York: Tudor Publishing Company.

  • Hall, R. E. (1990), The Rational Consumer, MIT Press.

  • Greenwald, A. G., Banaji, M. R. (1995), "Implicit Social Cognition: Attitudes, Self-Esteem and Stereotypes," Psychological Review, Vol. 102, No. 1, pp. 4-27.

  • Kunda, Z. (1990), "The Case for Motivated Reasoning," Psychological Bulletin, Vol. 109, No. 3, pp. 480-493.

  • Frankish, K., Evans, J. (2012), "The Duality of Mind: An Historical Perspective," Oxford Scholarship.

  • Pavco-Giacca, O., Fitch Little, M., Stanley, J., Dunham, Y. (2019), "Rationality is Gendered," Psychology, Vol. 5, No. 1.

  • Rudman, L. A., Ashmore, R. D. (2007), "Discrimination and the Implicit Association Test," Group Processes and Intergroup Relations, Vol. 10, No. 3, pp. 359-372.


Giulia Domiziana Toffoli

Arcadia
