Submitted by Prof. Sterling
Image by Gerd Altmann from http://Pixabay.com
Recent psychological research has been interpreted as casting serious doubts on many crucial aspects of the human experience: that we have “free will” (it’s complicated, hence the scare quotes), that we are at least capable of rational thinking, and even that we are conscious. Indeed, it has become both fashionable and a bit of a cottage industry to “show,” scientific data in hand, that all those facets of mentation simply do not exist: they are illusions, figments of our imagination (though nobody has really provided an account of why on earth we have them, as metabolically costly as the apparatus that makes them possible is). All of this, of course, despite the staggering crisis in the replicability of results from psychology, which ought to make anyone reading anything in that field a bit cautious before agreeing that we are lumbering, rationalizing, self-deluded robots.
The latest salvo on this topic that I’ve come across is an article by Keith Frankish, an English philosopher and writer, published in Aeon magazine with the title “Whatever you think, you don’t necessarily know your mind.” Let’s take a look.
To begin with, the title itself is interesting (and I’m perfectly aware that authors often don’t get to pick the titles of their articles or books). “Whatever you think, you don’t necessarily know your mind.” Well, no, of course we don’t necessarily know it. That would be like arguing, say, that whatever we see with our eyes is necessarily a true reflection of the external world. But we know better: we understand about illusions, mirages, the unreliability of our senses under certain environmental conditions, and how internal states (e.g., being inebriated, or under the influence of drugs) may alter our visual perceptions, sometimes drastically so. Heck, people sitting in sensory deprivation tanks often develop very vivid hallucinations that appear terrifyingly real to them, even though they know that there is nothing out there. So, taken at face value, the title of Frankish’s article argues for close to nothing: the question isn’t, and never has been, whether our access to our own thoughts is always reliable, but only whether it is reliable enough for the purposes of reflecting on what we do and why.
Frankish tells us that many philosophers think that we have privileged access to our inner thoughts, and that moreover this access is largely immune from error. I think the first part is hard to doubt (though people have tried), while the value of the second part hinges on just what “largely” means. There is no reason to think that our inner sense of awareness is more reliable than our outer senses, and it may be less so. Indeed, even our regular senses differ among themselves in both precision and reliability, just as they do for other animals. Our sense of smell, for instance, is poor compared to our vision, but for dogs it is the other way around.
Frankish briefly summarizes the ideas of two philosophers who fall outside of the mainstream as he defines it: Gilbert Ryle and Peter Carruthers. Ryle thought that we don’t actually learn about our inner thoughts via an inner sense, but rather from our own behavior, which means that other people, somewhat paradoxically, may know our mind better than we do. This, of course, is the behaviorist position that has (justifiably, in my opinion) been the butt of a number of jokes, such as this one: two behaviorists have just had sex; one turns to the other and says, “That was great for you, darling. How was it for me?”
Carruthers’ idea relies on empirical results in experimental social psychology (see caveat above!) demonstrating that, at least sometimes, not only are we mistaken about what we think we think, but we confabulate, i.e., make up explanations for our behaviors that cannot possibly be true. A typical experiment, for instance, shows that when people are offered a choice of several identical items they tend to pick the one on the right. When asked to justify their (unjustifiable, since the items are all identical!) choice, they invent some story to make sense of what they have done.
This shouldn’t be particularly surprising, since the brain is trying to make sense of a situation in which it is faced with a series of facts that appear to contradict each other. It then produces a hypothesis about what happened: well, those objects look identical, but I picked one over the others, so there must have been a reason; they cannot possibly be identical after all. Confabulation is a very interesting phenomenon, and something of which we all have to be aware. But is it enough to support the stronger claims that Carruthers, Ryle, and Frankish want to make?
In The Opacity of Mind, Carruthers speculates that we and other primates evolved systems to reliably guess at other people’s thoughts and intentions, not our own, and that we then began to direct those same inferential tools toward our inner mental processes. Since we have additional sensory data when it comes to ourselves (not just our outward behavior, but also feelings, pains, perceptions, and so on), we think we can more reliably tell what is going on inside our own minds.
The genesis part of the theory is speculative, of course, and, as with many other evolutionary psychological scenarios, there probably is no way to actually test it. But I don’t have any problem with the idea that part of what constitutes our conscious thinking is an interpretation of our largely unconscious thoughts, making them explicit. The issue is that this isn’t the only thing we do consciously. We can also challenge our own subconscious thoughts, deliberately pursue their logical implications, evaluate how they square with our beliefs and priorities, and so forth.
Which brings me to the major example brought forth by Frankish in support of Carruthers-type interpretations of conscious thinking. Turns out that we are all, deep down, “racists.” Meaning that psychological experiments (again, see caveat above!) seem to show that — when we are not paying attention — even people who claim to be opposed to racism behave in ways that indicate a subconscious level of racial bias. From this, Frankish concludes: “Such behaviour is usually said to manifest an implicit bias, which conflicts with the person’s explicit beliefs. But [Carruthers’] theory offers a simpler explanation. People think that the stereotypes are true but also that it is not acceptable to admit this and therefore say they are false. Moreover, they say this to themselves too, in inner speech, and mistakenly interpret themselves as believing it. They are hypocrites but not conscious hypocrites.”
I beg to differ. First off, it isn’t clear by what measure of “simpler” this second interpretation would satisfy Occam’s razor better than the implicit bias explanation. Most importantly, though: no, sorry, when I say that I firmly believe people should be treated equally regardless of their ethnic background I’m not lying, nor am I being a hypocrite, unwittingly or not. What I’m doing is consciously overriding my unconscious biases, on the basis of rational deliberation over the issue. That is what makes human beings so different from any other animal on earth, so far as we know, and it is a precious thing indeed. But of course if you don’t believe that we are conscious, and if you believe that we always confabulate, then you must conclude that people are latent hypocrites about everything. Which raises the obvious self-referential question: was Frankish just confabulating when he wrote the Aeon article?
https://platofootnote.wordpress.com/2016/10/03/you-dont-really-know-your-mind-or-do-you/#more-1342