You Are Now Less Dumb
Particularly violent football game in 1951, Princeton vs Dartmouth. Psychologists got interested when they read two radically different accounts of the game. So they showed film of the game to undergrads of each school and asked them to record every infraction they saw. Even though they were watching the same game, each individual scored twice as many infractions for the opposition as for their own team. And, when asked, 90% were certain it was the other side which started the fighting.
The world is not what it seems - every animal sees it differently. And all it takes is one optical illusion to show that we don't actually see what is there. The stars are in the sky all the time, but we don't perceive them when the sun is shining.
These are simple examples, but they illustrate a big problem - eyewitness testimony is grossly unreliable - our memories are representations, not replicas. A memory is least accurate when most reflected upon, and most accurate when least reflected upon. This isn't what people believe. A 2011 survey found most (63%) think memory works like a video camera, and half (48%) thought memories were permanent.
We convince ourselves that when we stop and think about a memory, our recollections are 100% accurate. But we have no way of telling what is 'real' because everything that has happened to you, happens inside your skull. Every brain constructs its own version of reality, broadly similar to others, but infinitely different and fundamentally flawed.
With most arguments, you begin with a conclusion already in mind, and work towards proving that you were not stupid to have drawn that conclusion.
We tell ourselves stories to keep ourselves sane and stable, and cling to those stories even when they are pretty far from the truth. In the fields of science, specifically neuroscience, psychology and economics, the major ways our brains fool us have been known for 50 years or so. They are now starting to trickle down into lay knowledge, so we now have a chance of getting wider recognition.
Some of it is to do with the wiring of the brain, some of it is cultural influences, and some is just ancient behavioural routines (e.g. we all react to snakes, even those who have lived away from snakes for generations).
We are grossly over-confident in our judgement and blind to our gullibility. Through this giant, easily opened back door waltz con artists, magicians, PR flacks, advertisers, pseudoscientists and fake healers.
Psychologist Milton Rokeach wrote the book The Three Christs of Ypsilanti describing his experience running a mental ward in Michigan in the 1950s. He had three men who each believed themselves to be Jesus, so he roomed them together and set them tasks where they had to work together every day. He hoped that the confrontations would spur at least one of them to reconsider. But it didn't work out that way. Each man's story of how Jesus came to be locked up in an asylum in Michigan was a maze of logic that made sense until Rokeach poked a hole in it. But as each story fell apart, it would be swiftly rebuilt without any self-doubt. Yet each man remembered the intricate details of the other men's explanations, and gleefully picked them apart, all the time remaining convinced that his personal story was completely reasonable. Every time Rokeach brought up the impossibility of three Christs, none of them came to a realization that they were deluded; they simply dismissed the others' claims.
They were not rational men, but they were rationalizing men.
We all work the same way. We can see the inconsistencies and fallacies and rationalizations in others' reasoning, but are blind to our own.
Centrifuge training for jet pilots to resist G forces in combat. As pilots go in and out of consciousness, they report all the same things that accident or hospital patients do when having 'near death experiences' - tunnel of light, friends and family greeting you, mixed-up memories. Both of these circumstances are actually the brain trying to make sense of what is happening as its systems go haywire with oxygen deprivation. Narrative is so important to our survival that it is the last thing the brain gives up on.
Cotard's syndrome is part of a family of symptoms found in patients who have lost their ability to connect emotionally with others. Their sensory input is weird and nonsensical, so they make up weird stories to account for it - everyone else has been replaced by robots or aliens and I'm the only one who can tell the difference.
There is only a degree of difference between their stories and yours. Your explanations could also be wrong, but you aren't being fact-checked. These false accounts are called confabulations - unintentional lies. They are not true, but the person making them does not realize that.
People are capable of amazing confabs - blind people will confidently claim they can see, and the doctor only recognizes it when they start banging into the furniture. A patient with a paralyzed arm will deny that it is theirs ("My mum's hiding under the table, playing a trick on me").
A confused mind comes up with a new story asap, and it doesn't matter at all if it is several time zones away from being likely. All these stories originate in our left brain. The right brain's job is to check the stories against reality. But if something goes wrong with that part of the brain, or with the connections between the two, the supervisor isn't available.
We need satisfying stories. But we don't usually come by them via logic and careful reasoning. We seek causes and effects that will explain the world in a way that suits our self-image. Life makes sense with a narrative, because you can edit it into a coherent story. If you look back on an action, you have an intense desire to explain it, and that explanation affects your future actions. And narratives need characters to be explanatory, so inanimate objects take on agency (we think of them as acting like people would, with emotions and motivations).
This sense of agency is so strong that people throughout history have assumed the sun, moon, winds and seas were conscious beings, gods to be propitiated with sacrifices.
You don't experience thinking, you experience the result of thinking. You don't have access to the processes that came up with your current narrative explanation of what is happening to you today. But you are convinced that that narrative is true. If three men couldn't agree as to who among them was or was not really Jesus, your chances of swaying someone on the Internet to change his belief system about religion, politics, art or favorite sports team are pretty slim.
We are strongly biased to prefer 'truth' delivered by story. Raw data may be more accurate, but an emotional appeal gets into our head faster. Truth and accuracy lose out to a riveting tale delivered with pictures and a bit of humour.
Your 'self' is nothing more than a story. It is the explanation for your own memories. For the three Christs of Ypsilanti, it is their story that lets them cope with their madness.
Belief is a fragile thing. People used to believe that letting blood cured disease. A century ago, Americans believed women should never have the vote. Fifty years ago, many were certain black people should not be allowed to marry whites. Today many attack the belief that gays can marry.
Much of what we know is not individual intelligence but shared cultural inheritance. If a timewarp suddenly transported you back to a medieval village, what modern improvements would you be capable of providing?
Before we had a method for examining reality, 'truth' was a slippery thing. Even the great thinkers of antiquity were pretty dumb by modern standards. Until the scientific method was invented, it was very difficult to change anything because 'everybody knew' that the existing belief was true (even though that belief might differ radically from one culture to the next). Every explanation was as good as the next one, but one would become the official explanation for generations. Why thunder and lightning? Because you'd done something to upset the gods.
Science was invented because we are so bad at coming up with explanations that work. We prefer easy-to-understand stories, so complicated problems get simplified. The stories, full of biases and fallacies, come up against facts, and crash.
Do you wear a suit because you are a professional, or do you act professionally when you put on a suit? Do you vote Democrat because you believe in helping the worse off, or do you decide you should help them because you voted Democrat? It's always the latter - we become what we pretend to be. The Benjamin Franklin Effect: Franklin asked a rival to lend him a rare book, and the rival then reasoned that he wouldn't have lent a rare book to someone he didn't like, so he must like Franklin.
In most offices, the thermostats on the wall are not connected to anything - owners install dummy controls to stop people constantly adjusting the temperature (and costing them money).
The mind is like a spacecraft. As long as it is travelling in a straight line it burns little fuel. But as soon as the pilot takes over, to make any choice about where to go, it burns fuel at an alarming rate. So any attempt to make choices, avoid temptation, suppress emotion or conform to social expectations depletes your ego. It takes willpower fuel, and once the fuel is depleted, it becomes harder to resist the next time.
The fuel is glucose. Ego depleting tests run down the body's supply.
Experiment where subjects had to choose one of two pictures to keep. Half the group were told the choice was irrevocable; the other half were told they could change their minds for a couple of days. The first group were happy with their choice and never looked back. The second group were almost paralysed with regret, even when they changed their minds several times.
For most of humanity's existence we have had very little choice about job, wife, where we lived, or our place in society. There was an overwhelming advantage to being able to make the most of it and just get on with it. And there was little exposure to other options - no TV of the rich and famous.
We look at other people and think "Why did you make that choice?" "You would have been far better off doing XYZ". But the other person has made the quite healthy decision that he's quite satisfied with outcomes. Sure it could have gone this way or that, but then he wouldn't have had ABC, which he is grateful for. And of course the reverse applies - he's judging you for your 'mistakes'.
Backfire Effect. We think we are rational enough to change our beliefs when confronted with evidence. But in fact, when your deepest beliefs are challenged with contradictory facts, those beliefs get stronger.
Website LiterallyUnbelievable.org records Facebook reactions to stories from The Onion. (Takeaway: there's a lot of ignorant, gullible people out there, but it's fun to point at them and laugh.)
Anyone who's got into an Internet argument about gun control, abortion, climate change, vaccinations, evolution etc knows this - the other party doesn't thank you for the enlightening and educative exchange; he eventually descends into a vitriolic ad hominem attack. When confronted with something that challenges a deeply held belief, you pore over it, trying to find a weakness, a way to deal with it that keeps your ego intact. Once you finally move on, your original convictions are stronger than ever.
Plus, a simple myth is much more attractive than a complex explanation. The harder it is to process a line of statements, the less credence you give them.
Pluralistic Ignorance. We misjudge what the majority of people believe. Study of segregation in the US in the 1960s: only one in five actually wanted it, but nearly half thought that most people did. This weights cultural behaviour towards conservatism, because we misjudge how much support there is for a change.
Great comedians short-circuit this process by addressing forbidden topics and subverting accepted norms, forcing the audience to rethink their attitudes.
The more masculine the clothes a woman wears to a job interview, the more likely she is to be hired.
When you put on a costume or a uniform, join a crowd, or even sit isolated in a car, you drop the cultural chains of inhibition.