tl;dr: Read this book, even if it’s the only one you ever read.
Earlier this year I read The Undoing Project by Michael Lewis (author of Moneyball and The Big Short), an account of the unique relationship between the author of the above book, Daniel Kahneman, and his colleague Amos Tversky. It details their collaboration in redefining theories of decision making and behavioral economics in the 70s and 80s. Thinking, Fast and Slow, which I read last year, is an excellent compendium of Kahneman and Tversky’s research, and I think it should be required reading in high school.
The short of Thinking, Fast and Slow is that you don’t make most of your decisions, big or small, for the reasons you think you do, and that this property of human behavior is a consequence of the way our brains are wired. We have “many brains” in our heads, or rather many subsystems in our brains, each vying for control over our behavior. At the most abstract level (the level at which the book makes its primary distinction) there are two main subsystems that operate in parallel. When the decisions these subsystems make are in conflict, one decision must win out over the other, since we only have one body to control. More often than not the “right” decision is made based on the context—our brains wouldn’t be much good if they were wrong most of the time. But often, especially for modern humans, brains make decisions that seem like the best option to our conscious minds, but are actually suboptimal or detrimental, either immediately or down the road.
Our brains work this way because of how they came to be. Evolution is a necessarily greedy algorithm. It can’t go back to the drawing board when a major restructuring would produce a much better outcome, precisely because it can’t have such a realization. It can only make small changes to existing solutions, either modifying a piece of what’s already there slightly, or adding something new on top of it. Of course these small changes accumulate over time to produce an incredibly diverse array of creations, which is what makes it such a powerful algorithm. When it comes to brains, this greedy process necessitates designing new modes of behavior on top of all the existing modes. The result is a cacophony of voices constantly shouting their orders, with the loudest voices at any given time winning control over the muscles. Marvin Minsky called this The Society of Mind, though there are countless theories and interpretations of this principle in psychology, neuroscience, cognitive science, and artificial intelligence.
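The greedy property is easy to see in code. Here’s a minimal toy sketch (my own illustration, not from the book): a one-dimensional fitness landscape with a low local peak and a much higher global peak, and a climber that only ever accepts changes that help immediately. It gets stuck on the nearby peak because reaching the better one would require temporarily getting worse.

```python
import random

# A toy fitness landscape: a low local peak at x=2 (height 5) and a
# much higher global peak at x=8 (height 10), with a valley between.
def fitness(x):
    return max(5 - 2 * abs(x - 2), 10 - 2 * abs(x - 8))

def greedy_climb(x, steps=100):
    """Try small random mutations, keeping one only if it helps right now."""
    for _ in range(steps):
        candidate = x + random.choice([-1, 1])  # a small change to what exists
        if fitness(candidate) > fitness(x):     # greedy: never accept a loss
            x = candidate
    return x

random.seed(0)
x = greedy_climb(0)
print(x, fitness(x))  # stuck on the local peak at x=2, fitness 5
```

Crossing the valley (x=3 through x=7 all score below 5) would require accepting a worse intermediate state, which a greedy process never does — hence new structure gets layered on top of old structure rather than replacing it.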
What this means for the way we behave, unfortunately, is a whole lot of inner conflict, both conscious and subconscious. The reflexes and impulses that are excellent at catching flies to eat and running away from murderous predators aren’t sensible solutions to complex logical problems that require weighing alternatives from multiple, very deep branches of a possibility space. Yet the parts of our brains that evolved the ability to solve the more complex problems had to be bootstrapped from the older ones that solved the simpler problems. Since the older parts don’t always get kicked to the curb as the new ones come online, all of the parts cast votes for moving our arms and legs and tongues every second of our lives. What makes humans special is that our brains evolved enough new technology to recognize this fact and have it significantly influence the voting process. We can stop, reflect, and invalidate the votes of the older parts of the brain in some cases. This doesn’t come naturally though. It has to be learned and practiced.
Acknowledging this fact and adjusting our behavior accordingly is one of the most important things humans can learn to do, and why the concepts in this book are so important. No one will ever be able to completely overcome the biases built into our brains or the way we learn and perceive our world; that’s a biological impossibility. In the coming decades we will likely design machines that are better at this than we are, or perhaps augment our brains with machinery that makes this feasible. But for now, just recognizing that these biases exist and taking the extra few seconds or minutes to think more objectively through important decisions (even small ones) can have a profound impact on our lives for the better.
Unfortunately the very neural structures that allow us to think slowly and deliberately about complex problems in this way have provided us the means to invent technology that reinforces exactly the opposite behavior. Our current ability to communicate instantly with anyone and everyone, anywhere, at any time has produced a culture of sound bites, instant gratification, and 140-character summaries of topics that should take pages to explain properly. The deluge of information we receive daily precludes taking the time to understand it properly. We form opinions instantly based on very little information and tout them as fact, and many are proud of their “talent” for making these quick decisions, never doubting their (often low) accuracy.
This type of thinking is epitomized, personified, and glorified by our current president, who reasons almost exclusively using what Kahneman calls System 1—the subconscious, subjective, reactive, quick-acting, emotion-driven decision-making system governed primarily by the evolutionarily older parts of the brain; the fly-catching, predator-escaping, sex-obsessed parts. This is not meant to be a political post. I use Trump only to illustrate the following point. As soon as you read the words “our current president”, you immediately formed a subconscious (and subsequently conscious) opinion about this post. If you lean left, it was likely to some extent a “fuck yes” feeling that resulted in some shade of agreement. If you lean right, or for some other reason are a Trump supporter, it was likely a subconscious eye-roll or middle finger which blossomed into a “this is pretentious bullshit” conclusion that you feel is entirely justified by the fact that I wear gauges and live in San Francisco. The point is, you likely determined your interest in reading the above book based on this reaction, when in fact it should have little to no bearing on that decision.
The initial subconscious reactions that led to this conclusion were unavoidable. System 1 is always running. You can’t turn it off. You can only override it. My choice of the word “likely” instead of “definitely” in the previous paragraph was made by my System 2—the slow-moving, deliberative, cautious, uncertain, logical, and statistically-aware parts of my brain. If I were generating this post off-the-cuff (or under the influence), my System 1 would have produced something like “All Trump supporters are ignorant System 1 zombies that have no fucking idea what they’re talking about”. This is the immediate, visceral reaction that happens in my brain when I see his name because of the associations with him I have built up over time, and the kind of thing you see in most internet comment sections. That immediate reaction is unavoidable (barring deliberate, long-term reconditioning). But it would be horrifically irrational for me to let those parts of my brain control my fingers while typing this, just as it would be horrifically irrational of me to grab the crotch of someone I find attractive, but who hasn’t given me permission to do so.
Trump supporters are not Trump. It is irrational to equate the two and their ideologies without knowing more about each person individually. Of course it is generally prohibitively expensive to acquire that amount of information, which is exactly why System 1 exists, and why it evolved before System 2. System 1 operates on heuristics—general rules of thumb that hold more often than not. Heuristics (e.g., stereotypes) are extremely useful when high-stakes decisions must be made in seconds or less. These rules mean the difference between life and death for nearly every animal on the planet, but not for most modern humans. Yet most modern humans still use System 1 to make their high-stakes decisions, even though there is plenty of time to let System 2 do its thing.
Part of this has to do with culture. In America at least there seems to be a bizarre marriage of two diametrically opposed attitudes toward decision-making: anti-intellectualism and fear of appearing ignorant. Mainstream media often paints rational thinkers, scientists, and scholars as bookish, intellectual elites who sit in ivory towers in lab coats and disseminate indisputable facts; a separate portion of society from which we obtain some information needed to set policy, but which doesn’t know anything about living in the “real world”. It would require a much longer post to list all of the reasons why this is completely ridiculous. At the same time, it also paints anyone who hesitates in their explanation of complex topics, or provides probabilistic answers conditional on further information or study, as incompetent, unconvincing, and wrong. The direct outcome of this is that many people speak with extreme confidence on matters about which they have spent very little, if any, time contemplating, because they are afraid to say “I don’t know”. Yet they also can’t be bothered to spend the time to understand the issues better because they’re “not a scientist”.
Getting past this barrier is a matter of education. People need to understand not only basic probability and statistics, but all the ways in which their brains conspire against them to subvert the laws of basic probability and statistics. This is precisely what Thinking, Fast and Slow attempts to do. Only through understanding how their brains function can people recognize when System 1 is making their decisions for them, and instead take the time to think slower and engage System 2. Hint: it’s pretty often.
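To make the “brains conspire against basic probability” point concrete, here’s a small sketch of one bias the book discusses, base-rate neglect, worked out with Bayes’ rule. The numbers are my own illustrative choices, not from the book: a rare condition and a test that sounds impressively accurate.

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test), computed with Bayes' rule."""
    true_pos = sensitivity * prior                  # actually sick AND test positive
    false_pos = false_positive_rate * (1 - prior)   # healthy AND test positive
    return true_pos / (true_pos + false_pos)

# A condition affecting 1 in 1,000 people; a test that catches 99% of
# cases but false-alarms on 5% of healthy people.
p = posterior(prior=0.001, sensitivity=0.99, false_positive_rate=0.05)
print(f"{p:.1%}")  # about 1.9%
```

System 1 anchors on “99% accurate” and concludes a positive result means near-certain illness; System 2, doing the arithmetic, finds the chance is under 2%, because false positives from the huge healthy population swamp the true positives.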
Do I think that introducing the principles of this book into high school curricula will produce a significant difference in the behavior of subsequent generations? I don’t know. But I’ve got a good feeling about it. So maybe we should think (slowly) on it.