Bits of Books - Books by Title
How To Make Good Decisions
Panic over-reaction to 9/11. Osama bin Laden reckoned the attacks cost less than $500,000 to stage, but the lowest estimate of America's post-9/11 spending was $500 billion. So every dollar he spent cost America a million dollars.
If reason conflicts with a strong emotion, don't try to argue. Enlist a stronger conflicting emotion.
Guy tried to use stats to convince his wife that flying is still safer than driving long distance, but she was unswayed until he put it in terms of risking the lives of their children.
How many miles would you have to drive before the risk of dying is the same as on a long-distance flight? Answer is 12. So if you make it to the airport, the most dangerous part of the trip is already behind you.
IQ test: there is an old car which has to drive over a hill which is a mile up and a mile down. Because it is old, it can only drive the first mile - up the hill - at 15mph. How fast will it have to drive down the hill to average 30mph for the whole trip?
(Trick Q - to average 30mph over the 2-mile trip, the car must cover it in 4 minutes, but the uphill mile at 15mph has already taken 4 minutes, leaving no time at all for the descent. Clearer still if the hill were 15 miles up and 15 miles down: to average 30mph you would have to cover the 30 miles in 1 hour, but at 15mph it has already taken an hour just to reach the top.)
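The arithmetic behind the trick can be checked directly (a hypothetical sketch, not from the book):

```python
# The hill puzzle: 1 mile up at 15 mph, then 1 mile down.
# To average 30 mph over the 2-mile trip, total time must be 2/30 hours (4 minutes).
total_distance = 2.0                              # miles
target_average = 30.0                             # mph
time_budget = total_distance / target_average     # hours available: 4 minutes

uphill_time = 1.0 / 15.0                          # hours spent climbing: also 4 minutes

remaining = time_budget - uphill_time
print(remaining)
# The remaining time is zero: no finite downhill speed can rescue the average.
```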
Catheters cause 80,000 infections and 28,000 deaths in the US every year. These infections can be virtually eliminated if doctors follow a simple checklist:
1. wash hands with soap
2. clean patient's skin with antiseptic
3. put sterile sheets over patient
4. wear sterile mask, hat, gown and gloves
5. put a sterile dressing over the catheter once the line is in
If a new drug were found that was as effective at preventing these infections as the checklist, every ICU would have it, no matter what the cost. The checklist costs nothing, yet it still struggles to gain acceptance.
Don't ask doctor what they would recommend for you; ask them what they would choose for themselves.
Germans love christmas trees with real candles. Americans get very upset at the idea, because they fear fires. Yet only about 10 Germans die each year from flames associated with candles; about the same number of Americans die from xmas lights (children swallowing bulbs and adults electrocuting themselves).
A poll showed 40% of Americans believed either that they were already in the top 1%, or that they would get there one day. In fact upward mobility in the US is no better than it was before WW2. Yet large numbers of Americans support tax cuts for the rich, from which they will never benefit.
Predictions always work except when this year is not like last year.
Choking: expert players get worse the more they think about their actions. Beginners get better.
About 10 million American women have unnecessary Pap smears for cervical cancer - unnecessary because, having had complete hysterectomies, they no longer have a cervix. The tests waste millions of dollars, yet doctors keep doing them because they get a share of the fees.
Someone arrives at Emergency with a suspected stroke. He gets put in an MRI at a cost of about $1,000 a time. Alternatively, the doctor can do a HINTS exam - three tests taking one minute in total, performed at the bedside, immediately. In a study of 100 patients with suspected stroke, 75 had actually had one. The MRI missed 8 of those; the HINTS exam caught all 75 and missed none. It had 1 false positive, which is by far the less harmful error.
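Tallying the study's figures makes the comparison concrete (a sketch using the numbers above, nothing more):

```python
# 100 suspected-stroke patients: 75 actual strokes, 25 without a stroke.
strokes, non_strokes = 75, 25

# The MRI missed 8 of the 75 actual strokes.
mri_caught = strokes - 8
mri_sensitivity = mri_caught / strokes          # 67/75

# HINTS caught all 75 strokes, with 1 false positive among the 25 non-strokes.
hints_sensitivity = 75 / strokes
hints_false_positive_rate = 1 / non_strokes     # 1/25

print(f"MRI sensitivity:   {mri_sensitivity:.0%}")
print(f"HINTS sensitivity: {hints_sensitivity:.0%}")
```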
Prostate cancer: the mortality rate is much the same in Britain and the US, yet the 'survival rate' is much "better" in the US. What's happening is that in the UK, prostate cancer is diagnosed from symptoms. So it is detected in, say, a 67-year-old man, who then dies aged 70. In the US it is diagnosed by PSA test, so the same man is detected at 60 - but he still dies at 70. The 5-year survival rate (measured from detection) is zero in the UK and 100% in the US, but mortality stays the same (dead at 70 either way). There is no evidence that early detection and subsequent removal of prostate cancer prolongs or saves lives.
But the big problem is over-diagnosis. The PSA test cannot tell the difference between progressive, malignant cancer (which will kill you soon) and non-progressive, benign cancer (you'll die of something else before the cancer gets you).
So in Britain, with no screening, 1000 men develop progressive prostate cancer. They are all operated on, and after 5 years 440 are still alive - a 44% survival rate. In the US, screening finds 1000 men with progressive cancer, but it also finds 2000 men with non-progressive, benign cancer. They are all operated on, and the same number die. 440 of those with malignant cancer are still alive, plus all 2000 who didn't need the operation, because they would have been alive without it. But now the apparent survival rate is 81%, because it's counting men who would have survived anyway.
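Working through that arithmetic step by step (a sketch of the calculation, using the figures above):

```python
# UK, no screening: 1000 men with progressive cancer, 440 alive after 5 years.
uk_alive, uk_detected = 440, 1000
uk_survival = uk_alive / uk_detected             # 44%

# US, screening: the same 1000 progressive cases (440 survive),
# plus 2000 non-progressive cases who would all have survived anyway.
us_alive = 440 + 2000
us_detected = 1000 + 2000
us_survival = us_alive / us_detected             # 2440/3000, about 81%

print(f"UK 5-year survival: {uk_survival:.0%}")
print(f"US 5-year survival: {us_survival:.0%}")
# Deaths are identical in both groups (560); only the denominator changed.
```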
And now, of course, those 2000 men are walking around in diapers, and they can't have sex any more.
The same thing applies to breast cancer. Take two groups of women aged 50: one screened regularly, the other not screened at all. After 10 years, 4 out of a thousand in the screened group had died of breast cancer; in the unscreened group, 5. In other words, an absolute reduction of one in a thousand. (But this is usually presented, misleadingly, as a "20% reduction".)
Now consider the drawbacks. In the screened group, about 100 women got false alarms at one stage or another, with all the accompanying anxiety and unnecessary biopsies. And another 5 out of the 1000 had non-progressive cancer detected - diagnosed as such only after unnecessary mastectomies.
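The gap between the "20% reduction" headline and the absolute benefit is plain once the arithmetic is written out (a sketch using the figures above):

```python
# Breast-cancer deaths per 1000 women over 10 years.
deaths_unscreened = 5
deaths_screened = 4
n = 1000

# Relative reduction: 1 fewer death out of 5 -> the headline "20%".
relative_reduction = (deaths_unscreened - deaths_screened) / deaths_unscreened

# Absolute reduction: 1 fewer death out of 1000 women -> 0.1%.
absolute_reduction = (deaths_unscreened - deaths_screened) / n

print(f"Relative risk reduction: {relative_reduction:.0%}")
print(f"Absolute risk reduction: {absolute_reduction:.1%}")
```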
And that's without taking into account the women who may develop cancer from the X-rays themselves.
The risk of dying from radiation-induced cancer from a single full-body CT scan is higher than the risk of dying in a traffic accident.
A lot of cancer is due to lifestyle. Immigrants get the cancers of the country they move to, not those of their homeland. Japanese men have significantly more stomach cancers, but lose that excess when they move to Hawaii, where salted and pickled fish are much harder to come by. Instead they get more prostate and breast cancers.
Risk is integral with life. It ceases only when life ceases. If you manage risk badly, your exit may be sooner than you wish: a good reason for studying the subject carefully. Much of the literature of risk is excellent, and Gerd Gigerenzer - whose work Malcolm Gladwell drew on extensively in his best-selling book Blink: The Power of Thinking Without Thinking - is a leading contributor to this excellence. A very important paragraph is nearly the last in this book; it should have been the first. In it Gigerenzer explains why Risk Savvy is not an academic textbook: he wishes to give something back to the taxpayers who fund his research; he does not wish people to be further estranged from science; he seeks to enable readers to take more control of their lives by making more informed decisions. Would that these principles were compulsory throughout the academy.
Similarly, perhaps the book's most important chapter, 'Revolutionize School', is left to last. Gigerenzer advocates a risk literacy curriculum for schools, starting young. His curriculum has three topics: health literacy, financial literacy and digital risk competence, each subdivided into statistical thinking, rules of thumb and psychology of risk. This is a revolution that we should all actively support; it would equip children to deal with the real world. It would put touchy-feely dream-world teaching into context and, with luck, into oblivion.
Much of Risk Savvy focuses on the natures and forms of incoming risks, and the various corruptions, games and mechanisms adopted by those seeking to benefit from misleading you on the risks that they present. Gigerenzer is right to highlight the fundamental failings of bureaucratic countermeasures. They are complex, costly and ponderous, a waddling reset of the battlefield defences, no great impediment to an agile enemy who evades them with new corruptions. Your most reliable defence is you. Become risk-savvy. Understand risk. Understand the mechanisms of deception. Understand how your own mind may deceive you. Understand how to take evidence-based decisions.
Unfortunately, in the process you will need to grasp some statistics. Gigerenzer is very good at explaining their misuse, and in doing so examines two major pitfalls. The first relates to probabilities, which he illustrates with a cautionary tale of the 1995 UK Committee on Safety of Medicines. The committee warned that third-generation oral contraceptive pills doubled the risk of thrombosis. The risk associated with the second-generation Pill, 1 in 7,000, was increased to 2 in 7,000 in the new Pill. Distressed women stopped taking the Pill. Unwanted pregnancies and abortions - with all their associated risks - resulted. Although the relative risk of thrombosis did indeed double, the absolute risk, the real risk, increased by only 1 in 7,000. In added irony, the risk of thrombosis is greater with pregnancy or abortion than with the third-generation Pill. First statistics lesson: always ask, what is the increase in absolute risk?
Gigerenzer's second lesson is that, if figures are expressed not as probabilities but as natural frequencies, their significance is much more obvious. His example is of 1,000 women, 10 of whom have cancer and 990 of whom have not. Mammograms indicate cancer in 9 of the 10 who have it, and in 89 of the 990 who have not. Thus 98 of the 1,000 women test positive. All will be horrified. But how reliable is their test? What is their true probability of cancer? Expressed as a natural frequency the answer is clear, under 10 per cent, 9 (who have cancer) divided by 98 (the 9+89 who test positive). The same calculation using conditional probabilities (technically by using Bayes' rule) has baffled many doctors over the years, thereby greatly and unnecessarily distressing hundreds, perhaps thousands, of women. Second statistics lesson: reject conditional probabilities and insist on natural frequencies.
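Gigerenzer's natural-frequency calculation, and the equivalent Bayes'-rule version that baffles doctors, side by side (a sketch using the numbers above):

```python
# Natural frequencies: of 1000 women, 10 have cancer and 990 do not.
# Mammograms flag 9 of the 10 with cancer and 89 of the 990 without.
true_positives = 9
false_positives = 89

# Of the 98 women who test positive, only 9 actually have cancer.
p_cancer_given_positive = true_positives / (true_positives + false_positives)
print(f"P(cancer | positive test) = {p_cancer_given_positive:.1%}")   # under 10%

# The same answer via Bayes' rule with conditional probabilities:
prevalence = 10 / 1000
sensitivity = 9 / 10
false_positive_rate = 89 / 990
bayes = (sensitivity * prevalence) / (
    sensitivity * prevalence + false_positive_rate * (1 - prevalence)
)
# Same result, but far harder to see at a glance.
```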
In these examples, the supposed expertise of clinicians most certainly did not extend to statistics. Their unfortunate patients trusted them, unaware of their illiteracy in statistics. In courts of law the same ignorance in 'expert' witnesses, and in the presiding judges, can lead to dreadful miscarriages of justice. In Gigerenzer's view, even more culpable is the deliberate misuse of statistics by vested interests, notably the pharmaceutical industry. They do not compare like with like. They cherry-pick their sample groups and/or their statistical methods to 'prove' the efficacy of their product. Gigerenzer describes this with the musical term 'double-tonguing'. Whatever you call it, never forget that snake-oil salesmen speak with forked tongue. Your best defence is to become a well-informed manager of your own risks. To do this, you must grasp elementary statistics. Gigerenzer's clear explanations will be a great help to all. The daunted may prefer his earlier book, Reckoning With Risk: Learning to Live with Uncertainty, which covers statistics at greater length, and thereby with greater clarity, than Risk Savvy.
The subtitle of this book is How To Make Good Decisions. It rightly includes the mind itself as an important source of bad decisions. Your mind is always trying to make itself comfortable. First and foremost it likes quickly to make sense of things. To do this it gets up to all kinds of games, seeking to take rapid decisions that it then defends by means foul and fair, grasping for supporting information while firmly rejecting conflicting information. Risk Savvy explains some of these mechanisms: false certainty, the certainty illusion, and the turkey illusion, i.e. because the nice farmer has fed us every day, he always will. Gigerenzer draws valuable lessons: do not be impulsive in data-rich situations where you should be thoughtful; do not be thoughtful in data-poor situations where you should be intuitive. Remember the ancient Persians; they never took a decision when drunk that they did not review when sober, and vice versa. Cheers.
Although Risk Savvy is intended for a general readership, it remains a thoughtful treatment of a wide-ranging and demanding subject. Few readers seeking to absorb its contents will do so at first reading. They will wish to revisit tricky concepts and detailed arguments. Here the publisher should have helped. The book's notes do not state the pages to which they refer; in following a series of references the reader will soon run out of fingers. The chapter headings are jokey; this is probably not a good idea in a thoughtful book, and is most certainly a bad idea when not accompanied by brief chapter synopses. In contrast, Better Doctors, Better Patients, Better Decisions: Envisioning Health Care 2020, a collection of papers Gigerenzer co-edited with J. A. Muir Gray, is a model of clarity, each paper starting with an abstract and ending with a summary or conclusions.
We are now in a position to implement Gigerenzer's proposal for revolutionising the way we teach risk. An informed structure and curriculum of the decision-making process could readily be based on the existing literature. For example, the statistical aspects and quantitative presentation of risk are well covered by Gigerenzer and by David Spiegelhalter. Some of the workings of the mind are well understood (as the work of Jonathan Haidt and Daniel Kahneman shows), as are accidents and system behaviour (Charles Perrow, Cass R. Sunstein) and both strategy and risk itself (Lawrence Freedman, John Adams).
Meanwhile, we must all be more risk savvy. Do not assume that the gods will always protect you. Make your own luck. Take care, be thoughtful. Mistrust experts. Distrust vested interests. Do not ride tigers or motorcycles. Avoid gold-diggers. And have a nice day.