Matt Perryman

Two Minds and a Flame War

I’m fascinated by the human mind. The mind, insofar as there is a single thing we can refer to with that word, is where all the interesting things about humankind go on. It’s also poorly understood, even by the legions of bright people who have studied and reflected upon it for thousands of years.

Vagueness aside, you’ll notice that we’ve got a Pretty Good intuitive grasp of thoughts and sensations, such that we can communicate reasonably well most of the time. The fact that you can read my words and (hopefully) understand what I want to convey attests to that. Sometimes, Pretty Good is good enough.

Knowing how people operate is a crucial skill in any field that involves other people. Case in point: fitness training and nutrition. These fields are applied science, and on paper we should be able to craft perfect workout programs and diets; at least, that’s what you’d think from reading much of the internet.

Often, though, and likely more often than not, these perfection-seeking schemes fail. Why can’t people just do what we know is right? Why do all these pig-headed people disagree with my perfectly-designed workout? Why don’t people eat according to these scientifically-derived principles that ensure success?

It’s human nature. I don’t mean anything so trivial as the ever-popular “people are stupid” meme; “stupid”, in the exasperated lament of the frustrated, really means “doesn’t agree with my presumptions”. You might jump to the conclusion that everyone around you is irrational and you, of course, are the only bright light of reason.

This betrays an unacknowledged failing of rationality. Reason, contrary to increasingly popular belief, is not a cold tool of rational calculation that can subsume the irrational, emotional self. Reason, to paraphrase David Hume, is a slave of the passions.

I’ve recently read two books which explore the premise of emotionally-driven rationality. The first, “Thinking, Fast and Slow” by psychologist Daniel Kahneman (review here), explores some 50 years of research into what Kahneman calls the “two systems” of thought. We don’t literally have two brains, nor anything that we could even identify as separate reasoning processes, but psychological research has nevertheless found that we have two distinctive “modes” of thought.

The first, called System 1, is fast, intuitive, and makes quick associations. System 1 is a storyteller and a pattern-matcher; it’s how you see a snake on the ground and then, only after you’ve jumped and taken a second glance, notice that it’s really a twig. System 1 makes connections, preferring what Kahneman calls “causal stories”, and weighting information by how easily it comes to mind.

Then we have System 2. This is a relatively new feature of the brain, exclusive to humans and, perhaps to a lesser degree, other primates and cetaceans. System 2 is where math, science, and language come from, and it’s what we’d consider our bona-fide reasoning. The only problem is that System 2 isn’t refined, and it’s not all that powerful.

System 1 has had millions of years, give or take a few orders of magnitude, to evolve its way into a powerful form. It does its job well, pushing us and pulling us with emotional intuitions (or “gut feelings”) that drive much of our behavior. System 2, as it exists now, might have a million years behind it. It’s still working out the kinks, so to speak, and it relies heavily on System 1’s unconscious processing to do what it does (for more on how disturbingly involved your emotions are in rational thinking, see Antonio Damasio’s research into psychopaths).

Jonathan Haidt uses a wonderful metaphor for the two systems. He likens System 1 to an elephant: huge, powerful, and not all that bright. System 2 rides the elephant as a tiny man with the reins, and he does all the talking. If you come across this pair, you’ll think the rider’s in charge, but that’s a three-ton elephant, and it’s going where it wants.

Haidt, author of “The Righteous Mind” (review here), begins his book by reinforcing Kahneman’s findings. We are capable of rationality, but this isn’t the same as saying we’re rational beings. We are instead driven by the unconscious emotional processing of our elephants — System 1 — which, in broad scope, determines how we think. The rider, representing System 2, steps up and works out justifications for these unconsciously-made decisions. The elephant moves, and the rider explains why he’d always meant to do that.

This doesn’t mean we can’t ever reason our way to a decision, or whisper in the elephant’s ear to coax it in a different direction, but on the whole we go where the elephant wants. As Haidt puts it, intuitions come first and strategic reasoning second. Partly this is because System 2 is costly. Think about how physically exhausted you feel after studying for three hours. System 2 actually costs physical energy and, as a growing body of research shows, cognitive effort is eerily similar to physical exertion: it can be measured through heart rate variability, pupil dilation, and glucose usage, a model that jibes with Baumeister’s findings about willpower.

Thinking is literally difficult, and the effort of thinking can exhaust our mental resources. This is why most of us default to System 1 and the “cognitive ease” it provides.

Haidt’s book focuses on moral psychology, the way we arrive at our ideas of right and wrong. According to mainstream moral psych, we come with a range of built-in moral intuitions that create that visceral sensation of right and wrong, and some of us are more sensitive to these than others. Haidt notes that a good chunk of our beliefs are socially-driven, no doubt thanks to that “cognitive ease” factor again — the information that is immediate and ever-present, like the beliefs of parents and peers, is what we “know” to be true.

This leaves us at a curious impasse: yes, we are capable of rationality, but we’re also cliquish, biased, and outright lazy thinkers who are never as rational as we think (think again before deciding how “stupid” everyone else is). More frightening, this applies to the educated and the intelligent as much as anyone; Kahneman suggests that a more robust System 2 only leads to better stories and justifications, not any better guarantee of factual correctness.

This has clear relevance to all of us who talk about finding effective ways to train and eat. It’s easy to say “read the science”. Yet there are profound, and rarely acknowledged, issues with that approach.

Ever since reading Kahneman’s book, these questions have been at the forefront of my mind. Where is the line between fact and nonsense? We, the readers and interpreters, are the first and last step in translating the abstract findings of science into a meaningful story, and it’s in our brains that objective reality gets skewed.

It’s all the rage these days to hate on “Bros” and their “stupid” training and diet practices. In fairness, some of this is deserved. People lift and exercise and eat according to feelings, blatantly pseudo-scientific ideas, and ideas that don’t even make sense when said out loud. But plenty of “stupid” methods really aren’t stupid, at least not with the kind of certainty we’d like, and most arguments of this nature are really two elephants fighting.

The problem, to me, isn’t that science is used as a tool, but rather the reverence for, and certainty given to, the findings of published research with no further context. There’s an awful lot of confidence there for an awfully shaky set of assumptions, especially given the realities of exercise science and nutrition research. This is not to say that the research is useless (far from it), but more often than not it becomes fodder for a rider looking to justify the elephant’s intuitive choice. Constructing a story out of PubMed abstracts does not make for a particularly compelling case.

Personally speaking, I’m less confident than ever that looking at published research can provide all the answers; likewise, I’m much less willing to be so vocal in pushing arguments that carry so much inherent uncertainty (and where I can’t be sure of eliminating my own biases).

[On to Part II.]