Great Myths of the Brain

Christian Jarrett

We are in the middle of a brainwave. The development of functional magnetic resonance imaging about 20 years ago has allowed researchers to see how the brain works with far greater accuracy than ever before. The result has been an increasing focus on such things as neural pathways, synapses, serotonin and the limbic system - all words that crop up frequently not just in news reports but also in 'neuromarketing' claims by advertisers and entrepreneurs. But just because neuroscience research is having a media-sexy moment doesn't mean everything you learn about the brain is true. As the author of Great Myths of the Brain, Christian Jarrett, says, 'A little brain knowledge can be a dangerous thing.'

Jarrett is a man with substantial knowledge of the brain. Great Myths has been hailed by Ben Goldacre (author of Bad Science and Bad Pharma and scourge of sloppy statisticians and dodgy pharmaceutical marketers) as 'a masterful catalogue of neurobollocks'. Professor William Flew calls it 'a tour de force of critical thinking from ... one of the world's great communicators'.

In the book, Jarrett tackles the most popular, enduring and dangerous of brain myths and misconceptions, from the persistent and widely accepted notion that we use just 10% of our brains to more specific and harmful misunderstandings, such as the mistaken idea that you should place an object in the mouths of people having epileptic fits to stop them swallowing their tongues. Among the myths he busts are that right-brained people are more creative; that women lose their minds when they are pregnant; that mood disorders arise from a chemical imbalance in the brain; that the internet is making us stupid; that co-ordination exercises can improve the integration of function between the brain hemispheres; and that computerised brain-training games can make you smart.

We should all be on the lookout, he says, for gratuitous neuro references, which he calls 'neurobunk'. He cites the way clinical neuropsychologist Vaughan Bell, writing in the Observer in 2013, called out a politician who tried to bolster an argument by claiming unemployment is a problem because it has 'physical effects on the brain'. In February 2013, the UK's Daily Mail reported on research by a German neurologist who claimed to have discovered a tell-tale 'dark patch' in the central lobes of the brains of killers and rapists ('There is no such thing as a central lobe,' says Jarrett). He mentions an American educational speaker who has reportedly been telling audiences around the country that girls have a 'crockus' that's four times larger than in boys - 'an astonishing claim given the crockus is an entirely imaginary brain area'.

Sitting in an Italian cafe in Brighton on the south coast of England, Jarrett appears to be almost the archetype of the 'boy scientist'. At 35, he has that puppyish look common to many of the world's science and technology celebrities such as Larry Page, co-founder of Google, and Brian Cox, the astrophysicist and one-time rock-band keyboard player who has appeared on a host of BBC science shows. With his floppy hair and shy smile, he's all bright-eyed, bushy-tailed enthusiasm.

These days, Jarrett is a psychologist and a neuroscientist, but at school the man rapidly gaining a reputation as 'Mr Neuroscience' didn't even specialise in science. After completing his psychology degree, he did his PhD in cognitive neuroscience at the University of Manchester and his day job now is editor of the British Psychological Society's Research Digest. In the past 10 years, the digest has become an internationally respected resource, explaining and elucidating the latest research in neuroscience and psychology. And it's not just for students and psychologists: more than 30,000 subscribe to its fortnightly emails, which seek to demonstrate how fascinating and useful psychological science can be, while also casting a critical eye over the methods used.

Out of this came Great Myths, which aims to sniff out the stuff and nonsense and provide the evidence for the facts as rigorously as Jarrett can. As Thomas Jefferson once put it, 'He who knows nothing is closer to the truth than he whose mind is filled with falsehoods and errors.'

So, let's start with Myth No 11: we only use 10% of our brains. 'It's easy to see the appeal,' says Jarrett. 'Who wouldn't want to believe they have vast reserves of latent brain power just waiting to be unlocked?' He cites the recent Scarlett Johansson movie Lucy. The poster proclaims 'The average person uses 10% of their brain capacity. Today she will hit 100%.' Marketers love the statistic - even airlines have used it. And, alarmingly, a 2012 survey of primary and secondary schoolteachers in England and the Netherlands found more than four out of 10 of them believed it to be true.

It's not. We use all the brain we have: 'There is no spare neural matter lying around waiting to be given a job to do.' This has been confirmed by thousands of brain scans, in which waves of activity can be seen coursing through the entire brain, even when participants are asked to think of nothing.

'In fact, there is an entire network of areas - dubbed the "default mode network" - that becomes more active when people do their best to disengage from the outside world,' says Jarrett. And if you are injured and part of your brain can't work, over time the brain can work its way around lost neurons, adapting to change.

'The idea that we use only a small fraction of our brains makes no sense from an evolutionary point of view. The brain is a notorious gas-guzzler, accounting for 20% of our energy consumption even though it only makes up 2% of our body mass. Evolution by natural selection tends to weed out the inefficient, so it's implausible that we'd have such a costly organ that's mostly redundant. Imagine a company in which most of the staff sat around doing nothing - they'd be fired. It's the same with our brain cells. Do we have huge potential to learn new skills and recover from injury? Definitely. Do we only use 10% of our brains? No way.'

The 10% myth seems to have started with William James, often referred to as the father of American psychology (and older brother of esteemed novelist Henry James, himself a pretty astute observer of human behaviour). His remark about the untapped potential of the brain was warped into a line in the preface to Dale Carnegie's How to Win Friends and Influence People that read: 'Prof William James of Harvard used to say that the average person develops only 10% of his latent mental ability'. When misinterpretations of 20th-century neuroscientific research became mixed up with reports on patients with hydrocephalus - who had smaller-than-normal brains but appeared to function without difficulty - the myth grew.

Many of the myths, though, are new, and Jarrett says it's often because people are trying to sell us stuff. 'How many times have you seen a headline referring to which part of your brain is active when you fall in love or use your iPhone? Or why your brain is addicted to junk food?'

So how are we to tell fact from mad tabloid fiction? Jarrett recommends scepticism, and suggests we get to know how research works. Most studies represent a single snapshot in time, so whatever has been found is likely to be a correlation, though it will often mistakenly be reported as a causation. For example, a study of the effects of gaming on teenagers' behaviour may find that gamers are less social than teenagers who don't play. But that does not prove that gaming changes players' brains; it may only show that people with certain kinds of brains like gaming. If you want to know more than that, you need to do more research.

Without meta-analysis, in which the results of a large number of similar studies are put together and analysed, 'there's not enough to tell anything definitively. Journalists can still use the research information to make a story, but they shouldn't extrapolate certainties. Any single study has errors in it, and by pooling research you reduce that error.'
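The claim that pooling studies reduces error can be made concrete. Below is a minimal illustrative sketch - not from the book or any study it discusses, and using made-up numbers - of fixed-effect meta-analysis with inverse-variance weighting: each study's effect estimate is weighted by the inverse of its squared standard error, and the pooled estimate's standard error comes out smaller than that of even the most precise single study.

```python
# Illustrative fixed-effect meta-analysis via inverse-variance weighting.
# All effect sizes and standard errors here are hypothetical.

def pooled_estimate(effects, std_errors):
    """Combine per-study effect sizes, weighting each by 1/SE^2.

    Returns the pooled effect and its standard error, which is
    sqrt(1 / sum of weights) - always smaller than the smallest
    input SE, which is the sense in which pooling reduces error.
    """
    weights = [1.0 / se ** 2 for se in std_errors]
    total_w = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, effects)) / total_w
    pooled_se = (1.0 / total_w) ** 0.5
    return pooled, pooled_se

# Three hypothetical studies of the same effect:
effects = [0.30, 0.45, 0.20]
std_errors = [0.15, 0.20, 0.10]
est, se = pooled_estimate(effects, std_errors)
assert se < min(std_errors)  # the pooled error beats every single study
```

Real meta-analyses add refinements (random-effects models, heterogeneity tests), but the core point - that combining studies shrinks the uncertainty of the estimate - is just this weighted average.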

Jarrett notes, for example, that researchers have recently looked at the optimal time to learn new things before going to sleep, because it is clear that sleep is important for consolidation of material learnt during the preceding day. In 2012, sleep medicine specialist Johannes Holz and his colleagues proposed on the back of their research that 'declarative memories, such as vocabulary words, should be studied in the afternoon and motor skills, like playing soccer or piano, should be trained in the late evening'. But as Jarrett cautions, 'Before adjusting your weekly timetable, bear in mind this small study needs to be replicated.'

Jarrett believes that some of the myths are consciously propagated, as some in the media follow an ideological agenda. 'Mail Online, for example, one of the five most visited sites in the world, will look out for anything about differences between working mothers or gay men or women or unemployed people. It's almost like a regular column: online porn, video gaming, multitasking - you name it, it can "shrink your brain".

'It really amused me the other day to see a study showing that in chess grandmasters some brain areas were smaller than in players who didn't play as well. In this particular case, the headline would be: "Playing chess can shrink your brain!"' (If you want to know exactly which part of the chess champs' brains was smaller than that of those who weren't as good at chess, read the blog Jarrett wrote about it on the American site Wired.)

The story of how Jarrett got his gig on the highly influential Wired site is a case study of what can go wrong in this high-speed, supposedly accurate world. American journalist and writer Jonah Lehrer was until recently acknowledged as the leading neuroscience and psychology writer in the world. A staff writer at the New Yorker, he had written three best-selling books on the subject (Proust Was a Neuroscientist, How We Decide and Imagine: How Creativity Works) and he blogged on Wired. Jarrett admits that he looked across the Atlantic at Lehrer's reputation with a certain amount of envy.

But in 2012 came the fall. Lehrer was accused of recycling his own work and plagiarising the work of others. Two books were recalled, the New Yorker sacked him and he has largely disappeared from view. Jarrett got the Wired blog as a result of the scandal, and it turned out that much of Lehrer's plagiarism was from Jarrett's British Psychological Society digest. As someone who admits he wanted to hit the heights of best-sellerdom himself, was Jarrett flattered? He looks rather uncomfortable, and a little sad. 'It was quite weird, actually. I was a big fan of his writing - I still am.'

The fierce competition in this new field of neuroscience isn't just among the writers who try to popularise the field: the scientists themselves are pitted against each other to get the funding, attention and kudos that a great discovery could bring. And in this competitive area, there are goodies and baddies. Susan Greenfield, a senior research fellow at Lincoln College, Oxford, is notorious in the UK for what Jarrett calls her 'slippery approach' when criticised over her theory of 'mind change', the 21st-century response to technology. Specifically, she raises fears among parents of children with autism by linking internet use to the condition.

'I don't understand her agenda,' says Jarrett. 'She could make the point in a much more responsible way - she could do the research for one thing. But she doesn't.'

His take on internet use is that it's probably best to control our screen time, but that far-reaching claims - that the net is stealing our identity, rotting our brains and warping our values - are not proven. 'It's about how we use these devices - whether we turn them to our advantage or let ourselves fall victim to bad habits.'

Jarrett may be about to meet a few new challenges himself. This year, his wife - a paediatric clinical psychologist who works in the NHS - gave birth to their first children, twins Charlie and Rose, who turned six months old two days after we spoke. Now the proud father is starting to see where science can clash with anecdotal experience.

He stands by his debunking of Myth No 16 - that pregnant women lose their minds - but he found that his own wife struggled mightily with pregnancy: she had hyperthyroidism and serious sickness, which affected everything in her life. 'Problems with memory would have been the least of her worries,' he says.

He suspects this is one of his myths that further research may yet turn into fact, but he also believes that if scientists paid closer attention to animal research, they might discover that other abilities, best suited to nurturing and ensuring the safety of the young, are enhanced in the pregnant woman's brain. 'Any pregnancy-related impairments are likely a side effect of what is ultimately a maternal neuro-upgrade that boosts women's ability to care for their vulnerable offspring.'

Now, with a male and a female baby, he can see gender stereotypes being played out. 'My son is heavier and everyone responds to that - "What a bruiser!" - and my daughter is much smaller - "Isn't she pretty?" But then you notice how much quieter he is and how much more she cries for attention. Already I'm wondering: is it them, or are they responding to how they're being treated?'

Myth No 13 says men's and women's brains are wired differently. But calling that a myth doesn't mean the genders are the same. Jarrett writes that it is tempting to see the differences between average male brains and average female brains as the reason for behavioural differences such as 'men's usual superiority on mental rotation tasks and women's advantage with emotional processing'. But it's also long been a way of setting up self-fulfilling prophecies, pushing girls into pink and boys toward Lego, and so on into adulthood.

Still, it is simply too early to tell how much truth there is in many of the myths. Neuroscience is a very young discipline and there remains much to be discovered about what it means to us. As Jarrett says, we shouldn't close down any argument - we should always be looking to see what the evidence tells us, sifting, assessing. That's what the grey matter is for, after all. The rest is just neurobollocks.


Physical sensations such as warmth can affect our judgments about other people, notes Jarrett. One study of the links between bodily sensations and the way we think found that participants who had held a cup of hot coffee subsequently judged a stranger in 'warmer' terms - for example, describing him as good-natured and generous - compared with participants who described the stranger after holding an iced coffee.

The most dramatic demonstration of the deep connections between mind and body has to be the placebo effect, says Jarrett. 'My favourite example comes from the treatment of Parkinson's disease. In one trial testing the benefits of surgically implanting embryonic dopamine neurons, there was a group of control patients who, unbeknown to them, received all the injections but no new neurons. Amazingly, patients in both the treatment and control conditions showed benefits over the following year, presumably because the control group had good reason to believe they too were receiving the revolutionary new treatment.'


A paper published in 2013 surveyed 662 New Zealand undergraduates about their handedness and personality. Left- and right-handers did not differ on any personality factor, the researchers reported. However, there was a tendency for people with a weaker preference for either hand (that is, the mixed-handed) to be more introverted. What about IQ? One massive study found no link with handedness; another found a slight IQ advantage for right-handers (put both studies together and any intelligence/handedness link is negligible).

Other myths: studies show no increase in mortality among left-handers, and no systematic tendency to suffer from disorders of the immune system. Nor are they still persecuted: five of the last seven US presidents have been left-handed. One evolutionary account of why left-handedness has survived is that it confers a fighting advantage - left-handers tend to win more often on average than right-handers in sports like boxing and fencing.


In his 2008 effort Why Mars and Venus Collide, John Gray explains how men's brains use 'a specific part of a single hemisphere to accomplish a task' while women use 'both hemispheres for many tasks'. Gray extrapolates from this putative brain difference to explain the 'fact' that men tend to think about one thing at a time: 'He may be focusing on how to get that promotion, so he forgets to bring home the milk.'

The evidence tells a different story. John Gilmore and his team at the University of North Carolina scanned the brains of 74 newborns and found no evidence of the claimed smaller left hemispheres in male babies compared with females. A related idea, says Jarrett, is that women have a thicker corpus callosum than men - the bridge that connects the two brain hemispheres. But that, too, is a myth. 'A 2012 diffusion tensor imaging paper actually found stronger inter-hemispheric connectivity between the frontal lobes in males than in females.' There is modest evidence that women are better at processing emotions than men - for example, a 2010 study by researchers in Canada and Belgium found that women were better than men at categorising facial expressions of fear and disgust.

It's tempting, says Jarrett, to see brain differences between the sexes and think that they explain behavioural differences. In fact, in many cases, we simply don't know the implications of sex-related brain differences.

It's even possible that brain differences are responsible for behavioural similarities between the sexes. This is known as compensation theory, and it could explain why men's and women's performance on various tasks is similar even while they show different patterns of brain activity.

Of relevance here is a new brain-imaging study of girls and boys watching funny videos. The girls' brains showed a stronger response to humour, but their subjective appreciation of the videos was no different from the boys'.

It is important to look for sex differences, says Jarrett, because they might help cast much-needed light on conditions like autism and depression that tend to be found much more in men and women respectively. 'But the idea of using supposed brain differences to argue for distinct education practices for girls and boys is ridiculous and potentially harmful. Oh, and in case you're wondering, a massive meta-analysis published in 2014 of 184 studies found that the best-quality research provides no evidence that single-sex education brings academic advantages for boys or girls.'


'So important is sleep for learning that it's not worth students staying up late to revise for longer, if doing so means they miss out on quality sleep,' says Jarrett. 'Researchers at the University of California tested this in a study published in 2012, for which hundreds of students kept sleep and study diaries for two weeks. The results showed that extra studying at the expense of sleep led to academic problems the next day, including trouble understanding lessons and poorer test performance.'


Research has consistently failed to demonstrate that sugar rushes cause short-term hyperactive behaviour in children. This makes sense given that glucose levels in healthy children's (and adults') brains are kept at a steady level by natural regulatory mechanisms.


Amnesia is an incredibly popular plot device in fiction, says Jarrett, and there is evidence it is skewing public perception in a far-fetched way. 'A classic example is the assassin Jason Bourne, played by Matt Damon, who in the first part of the [Bourne] film franchise is pulled from the ocean with gunshot wounds and no memory of his own identity.' But, like many other fictional amnesiacs, 'Bourne has a preserved ability to lay down new long-term memories and is perfectly able to look after himself. In fact, in Bourne's case, he is more capable than most healthy people, showing supreme ingenuity, quick-wittedness and spy skills.' Jarrett points out that 'in her witty analysis of amnesia at the movies (published in the British Medical Journal), neuropsychologist Sallie Baxendale highlighted another ridiculous myth perpetuated by many films - the idea that a second knock on the head can have a curative effect'.


In 2013, researchers at the University of Oslo and University College London (UCL) conducted a meta-analysis combining the available 23 studies into working-memory training that had included a control group. The results from adults and children were absolutely clear: working-memory training leads to short-term gains in working-memory performance on tests that are the same as, or similar to, those used in the training. 'However,' the researchers wrote, 'there is no evidence that working-memory training produces generalisable gains to the other skills that have been investigated (verbal ability, word decoding, arithmetic), even when assessments take place immediately after training.'


'Beer Makes Brain Release Pleasure Chemical Dopamine,' announced a headline in the Huffington Post in 2013. Similar news articles appear almost weekly, says Jarrett. People's favourite music evokes the same feelings as good food or drugs, claimed the Guardian in 2011, because it releases 'the brain's reward chemical dopamine'. In reality, says Jarrett, 'dopamine is involved in many different brain pathways, only some of which are related to reward. Dopamine is released not just when we score a high from food, sex or money, but also when we fail to get a reward we were expecting. And there's research that found increased dopamine activity in so-called "reward pathways" when distressed bereaved people looked at pictures of a lost relative. This dopamine went hand in hand with participants' reports of sadness and yearning - certainly not pleasure in the usual sense of the word.

'A more accurate characterisation of dopamine is to say that it is important for motivation and finding salience in the world. Indeed, schizophrenia is associated with excess dopamine activity and one theory proposes that many of the illness's problematic symptoms, such as paranoia, are related to reading too much importance and meaning into largely random situations.'
