Bits of Books - Books by Title
How Innovation Works:
Serendipity, Energy and the Saving of Time
by Matt Ridley
Steam locomotives were unwanted until the Napoleonic Wars took away most of the horses, and the hay needed to feed them. It became uneconomic to cart coal more than eight miles, so George Stephenson saw the need to innovate both engine and rails. The first successful combination ran in 1825, but not until cheap finance became available in the 1840s did railways boom. Many investors were ruined in the subsequent bubble, but Britain was left with an established network.
The story of the internal combustion engine shows the usual features of innovation: a long prehistory of failure; a shorter period of improving affordability, marked by simultaneous patents and rivalry; and then a subsequent period of evolutionary trial and error.
Incandescent light bulbs reigned supreme for over a century, and only gave way to new technology because governments were persuaded to ban them in favour of compact fluorescent bulbs. But these took a while to warm up, didn't last as long as predicted, were hazardous to dispose of, and were much more expensive. The forced changeover is estimated to have wasted Britain over £2 billion.
Worse still, if governments had just waited a few more years, LEDs would have become cheap enough to provide all the advantages with none of the disadvantages. Within just six years, compact fluorescents were abandoned. LEDs use so little power that they can be run off solar panels, and they waste so little energy as heat that they can be used as grow lights.
The main reason the West stopped building nuclear power stations was not fear of accidents or the cost of disposing of nuclear waste, but the escalation of costs caused by excessive regulation.
Ironically, polio only became dangerous because of improvements in public health. Before water supplies were chlorinated, everyone encountered sewage in their drinking or swimming water, and so was immunized in childhood, before the virus could cause paralysis. Encountered later in life, the virus was much more virulent.
What do Marcellin Jobard, William Robert Grove, Frederick de Moleyns and Alexander Lodygin have in common with Thomas Edison? The answer is that they all have a credible claim to have invented the lightbulb. Then again, so do 16 other people.
When we think about inventors and invention we tend to picture the lone genius sitting in their laboratory - or under the apple tree, or in the bath. Suddenly, there is a moment of inspiration, or just a happy accident, and the world is changed for ever.
Take the story of penicillin. In 1928 Alexander Fleming goes on holiday for the summer, leaving a culture of a bacterium called Staphylococcus aureus in his London lab. A floating spore of Penicillium happens to drop on to the plate. When Fleming returns he finds a circular patch where the bacteria had been prevented from growing. Cue the invention of antibiotics and the saving of millions of lives.
Yet as Matt Ridley shows in How Innovation Works, such eureka moments are vanishingly rare - and even when they occur they don't matter that much. What really changes the world, he argues, is innovation rather than invention; the painstaking trial-and-error work of discovery, adaptation and improvement that slowly turns an idea into a fact.
In other words, this is a book that celebrates Carl Bosch rather than Fritz Haber; not the academic who worked out how to turn nitrogen into fertiliser, but the businessman who poured enormous resources into making the reaction happen in the factory rather than the laboratory, a process that required assembling the largest engineering team until the Manhattan Project 30 years later and trying more than 20,000 possible variations, in addition to solving a host of other formidable practical challenges.
Edison is a hero to Ridley not because he was one of many to send current through a filament, but because of the hard yards his team put in to test thousands of possible materials until they settled on carbonised bamboo. Even Fleming's 'mould juice' took more than a decade, and the work of another team of researchers, to become a product, treating its first patient, a 43-year-old policeman called Albert Alexander, in February 1941.
If this makes How Innovation Works sound joyless, nothing could be farther from the truth. Ridley - as readers of his Times columns will be aware - is a writer enraptured by progress, one of the New Optimists who make the unfashionable case that life really is getting better, give or take the occasional pandemic, and that we have capitalism and innovation to thank. It is, as he acknowledges here, in the blood; his family were among the first customers for Thomas Newcomen's steam engines, which fired the starting pistol for the industrial age.
The bulk of this fascinating book, then, is devoted to a gallop through the glories of the past as Ridley explains how we came up with everything from inoculation to U-bends to artificial intelligence. We learn that corrugated iron was once so fashionable that Prince Albert made a ballroom out of it at Balmoral; that the Smithsonian Institution tampered with a failed rival prototype to claim it had been the first plane capable of manned flight, in an underhand attempt to write the Wright brothers out of history; that close to ten million defecations take place in London every day; and that there really was someone called the Honourable Clotworthy Skeffington (the intended husband of the public health pioneer Mary Wortley Montagu).
Many of the people here deserve to be celebrated. Newcomen, for obvious reasons. Malcom McLean, whose introduction of containerisation slashed freight costs overnight in 1956 from $5.83 to $0.16 a ton, accidentally starting the Asian trade boom. Norman Borlaug, who doubled Mexico's grain harvest in three years in the 1950s and did the same for India. Charles Algernon Parsons, who demonstrated the superiority of his design for ships by gatecrashing Queen Victoria's diamond jubilee celebrations, weaving his nippy little steam craft Turbinia through the assembled battleships of the Royal Navy.
Yet as Ridley also shows, all of them drew hugely on others' work and ideas, and spent years refining their own. In virtually every case the concepts concerned were already on the technological slipway. If Otto Frederick Rohwedder hadn't worked out how to mass-produce sliced bread in the little town of Chillicothe, Missouri - the secret was in the packaging, apparently - someone else would have done it soon enough. If Watson and Crick (and Rosalind Franklin) hadn't discovered the structure of DNA, it would have been Astbury and Beighton, or Wilkins and Gosling. Indeed, Astbury and Beighton had the key evidence a year before their Cambridge rivals, but didn't realise what they were looking at.
Yet while Ridley's book is primarily about the past, his underlying thesis is very much about the present. "Innovation," he argues, "is the most important fact about the modern world, but one of the least well understood." It is the secret sauce of human progress.
In 1880 a minute's work would buy you, on average, four minutes of artificial light. By 1950 it was up to seven hours. By 2000 the figure was 17 times higher. Thanks to innovation, America's farms use 25 per cent less fertiliser and 22 per cent less water than at their peak, yet are more productive than ever.
So working out how innovation happens, and how to generate more of it, is crucially important. And that means the villains of this book are not so much the fakes, fraudsters and hype merchants - although regarding the latter there is a bravura demolition of the physics behind Elon Musk's proposed Hyperloop transit system - as the forces of reaction. Ridley is scornful of the government regulators who denied AT&T permission to set up a cellular phone service in 1947 because they couldn't see the need, and of the patent systems that do far more to retard innovation than reward it. (One startling statistic from the book: outside chemistry and pharmaceuticals, four times more is spent on patent litigation than is received by patent holders.)
Ridley is also deeply sceptical about the role of government. Rebutting the recent left-wing hypothesis that an entrepreneurial state is the secret to innovation, he cites a study by the Organisation for Economic Co-operation and Development showing that research and development spending by businesses drives growth, but spending by governments doesn't. Trying to create innovation by top-down fiat, he says, is "an essentially creationist approach to an essentially evolutionary phenomenon". Of Europe's 100 most valuable companies, he points out, not one was formed in the past 40 years, a period that coincided with the EU embracing a stultifying hostility to innovations ranging from GMO crops to bagless vacuum cleaners.
"Innovation happens," Ridley concludes, "when people are free to think, experiment and speculate. It happens when people can trade with each other. It happens when people are relatively prosperous, not desperate. It is somewhat contagious. It needs investment. It generally happens in cities." It is gradual, not sudden. It involves multiple wrong turns and an awful lot more perspiration than inspiration. And we'd all be better off if there were a great deal more of it.
Innovation, Matt Ridley tells us at the start of his new treatise on the subject, "is the most important fact about the modern world, but one of the least well understood." Even as it functions as a powerful engine of prosperity - the accelerant of human progress - innovation remains the “great puzzle” that baffles technologists, economists and social scientists alike. In many respects, Ridley is on to something. After decades of careful study, we're still not entirely sure about innovation's causes or how it can best be nurtured. Is innovation dependent on a lone genius, or is it more a product of grinding teamwork? Does it occur like a thunderclap, or does it take years or even decades to coalesce? Is it usually situated in cities, or in well-equipped labs in office parks?
We can't even agree on its definition. Generally speaking, an innovation is more than an idea and more than an invention. Yet beyond that, things get confusing. We live in a moment when we're barraged by new stuff every day - new phones, new foods, new surgical techniques. In the pandemic, we're confronted, too, with new medical tests and pharmaceutical treatments. But which of these are true innovations and which are novel variations on old products? And while we're at this game, is innovation limited to just technology, or might we include new additions to our culture, like a radical work of literature, art or film?
Unfortunately, no one happens to be policing the innovation space to say what it is and is not. Mostly we have to allow for judgment calls and an open mind. As an occasional writer on the subject, I tend to define innovation simply, but also flexibly: a new product or process that has both impact and scale. Usually, too, an innovation is something that helps us do something we already do, but in a way that’s better or cheaper. Artificial light is an excellent case study. Over time we've moved from candles, to whale oil and kerosene lamps, to incandescent and fluorescent bulbs, and now to LEDs. Or, as another example, we might look to one of the great accomplishments of the 20th century, the Haber-Bosch process to make synthetic fertilizer, as a leap that changed the potential of agricultural production. On the other hand, we can regard the Juicero press - a recent Silicon Valley-backed idea that promised to 'disrupt' the juice market and burned up more than $100 million in the process - as a fake or failed innovation. And still, this leaves us plenty of room for disagreement about what falls between these extremes and why.
Ridley enters into this messy arena with the intent of organizing the intellectual clutter. The first half of his book, How Innovation Works: And Why It Flourishes in Freedom, takes us on a tour through some highlights in the history of innovation. We visit with the early developers of the steam engine, witness the events leading to the Wright brothers' first flight at Kitty Hawk, N.C., and hear about the industrialization of the Haber-Bosch fertilizer process. There are likewise forays back to the early days of automobiles and computing, the development of smallpox vaccines and clean drinking water, and stories that trace the origins of the Green Revolution in agriculture, which alleviated famine for more than 1 billion people. For dedicated science readers, Ridley's lessons may have a glancing and derivative feel. He knits together stories many of us have probably heard before - say, through the renditions of writers like Steven Johnson, Charles Mann or Walter Isaacson - but somehow misses the opportunity to enliven these sketches with a sense of wonder and surprise. More seriously, he skirts the opportunity to footnote his summarizations, leaving only a skeletal guide to sources in his back pages.
What becomes clear, though, is that Ridley is focused less on exploring the pageant of history than on fashioning a new belief system. I don't necessarily mean this as a critique; in fact, the second half of his book - where he looks closely, chapter by chapter, at the factors that shaped the innovations he's spent his first 200 pages describing - is more polemical in its approach but often more engaging, even as one might disagree with a narrative direction that arises from what I would characterize as the libertarian right. Indeed, as his book progresses, Ridley makes it obvious that he is not presenting an academic treatment of scientific history. Mainly, he'd like to proffer an argument for the importance of free-market principles and why they're crucial to improving our world and our lives.
Ridley's most important chapters, and his book's most interesting, are where he calls attention to “surprisingly consistent patterns” that describe the process of making new things. Innovation, he tells us, is usually gradual, even though we tend to subscribe to the breakthrough myth. Or as he puts it, “There is no day when you can say: computers did not exist the day before and did the day after.” The innovative journey, as he shows us, goes back to Jacquard looms and the step-by-step advances of a number of early tinkerers. And at some indistinct point, new computing machines achieved functionality; then impact; then scale. He also illustrates how innovation can be a matter of the right people solving the right problem at the right time — and that it often involves exhaustive trial-and-error work, rather than egg-headed theoretical applications. This was typically the case with Thomas Edison, who, as Ridley notes, tried 6,000 different organic materials in the search for a filament for his electric light. Edison, he points out, “remained relentlessly focused on finding out what the world needed and then inventing ways of meeting the needs, rather than the other way around.”
One problem with cherry-picking the history of innovation, however, is that you tend to leave out examples that weaken your claims for universal principles. Innovations that involve academic or state funding are given short shrift by Ridley, leaving one to naively presume that whatever governments do by way of investment or regulation hinders rather than helps the cause of progress. Thus, you won't find a lot here about the development of the atomic bomb, which depended almost entirely on state largesse, or about the subsidization of renewable energy. Nor will you read much on the transistor, many early lasers or the photovoltaic solar cell, which were created under the auspices of Bell Labs, part of a government-authorized monopoly. There is no mention of the Massachusetts Institute of Technology's Rad Lab, which (thanks to the cavity magnetron, a British invention) helped develop radar. And in Ridley's story about the origins of Google, you will not see any indication that its founders were helped in their earliest days by a grant from the National Science Foundation.
Indeed, his book consistently plays down the influence of public funding in medicine, public health, personal technology, transportation and communications; it likewise minimizes - quite strenuously, and erroneously - the role of federal assistance in the development of natural gas fracking, which was kept alive by research investments from the Energy Department in the 1970s.
It may be the case that we increasingly prefer argument to evenhanded analysis. The world is too bewildering, and the field of innovation reflects the extreme complexity of our sciences, economics and politics. Therefore a skilled polemicist can help us cut through the confusion. Yet by the end of this book, it's hard not to ask whether the author has avoided difficult questions about his subject. If you were wondering how new technological capabilities - in biology, computing or material science - have substantively changed the nature and pace of innovation since the days of the steam engine, you won't find satisfying answers here. More crucially, you won’t come to any insights about whether some economic sectors, such as energy, follow different innovative patterns because of our political systems and our legacy investments in oil, gas and coal.
Instead, Ridley's final pages focus on esoteric debates that probably mean little to most readers - disputes about 'linear innovation,' for instance, which concern whether innovation moves in one direction, from a scientific idea to an engineered product - that were all the rage in academia decades ago but are now largely exhausted.
It is, in many respects, indicative of this book's inefficient approach to solving the puzzle that innovation presents. Indeed, at Ridley's conclusion, he can tell us only that innovation "is the child of freedom and the parent of prosperity" and "we abandon it at our peril." It is unclear who would actually advocate such an absurd position or why the human urge to move forward is now at risk of being abandoned. It seems more reasonable to believe that the pursuit of innovation will be just fine, as long as we keep encouraging and incentivizing men and women who are trying to solve important problems.
And we don't necessarily have to create an ideological schema to explain what may be happening. For instance, our smartest scientists and engineers are now working around-the-clock, and around the world, to fashion a vaccine for the novel coronavirus. They are approaching a big problem with lots of funding, lots of talent, lots of teamwork and lots of ambition. Isn't that how innovation works, too?
Innovation, according to Matt Ridley, “is the reason most people today live lives of prosperity and wisdom compared with their ancestors”. If this is true, then we should obviously all be keen to learn how to generate more of it.
Matt Ridley is one of the best non-fiction writers of his generation. He could be described as England’s Yuval Harari – minus the messianic vegetarianism, and the obsessions with religion and meditation. His latest book is a pleasure to read: he carries his considerable learning with an engagingly light touch. The book’s first seven chapters provide a series of vignettes of how various innovations happened – from agriculture to artificial intelligence. Few readers will fail to be surprised and delighted by at least some of these stories. The last five chapters – roughly the last third of the book – draw out the characteristics of innovation, how to promote it, and how it can go wrong.
The New Optimists
Ridley is a long-standing member of the New Optimists. This is a loosely connected group of people including Bill Gates, Steven Pinker, Hans Rosling, Richard Dawkins, Stephen Fry, and Andrew McAfee, who proclaim that the world is a better place today for humans than it has ever been, and that we have the ability to continue making it better – much better.
They do not claim that progress is inevitable, or consistent, and they certainly do not believe that we live in the best of all possible worlds. Indeed Ridley has pointed out elsewhere that the word "optimism" now means the opposite of what it started out meaning in the 18th century, which was the belief that life was already optimal, and therefore could not improve.
The New Optimists attribute this happy state of affairs to improvements in technology, which are fostered by free markets - with varying degrees of state involvement in the economy. Ridley, for instance, observes that during “the ten years from 2008, America’s economy grew by 15 per cent but its energy use fell by 2 per cent. … those who say growth is impossible without using more resources are simply wrong.”
The central argument of the book is that improvements in technology are not driven by inventors, but by innovators. Inventors make substantial discoveries which advance scientific knowledge. Innovators, by contrast, use trial and error to make thousands of little changes to a product or a process which may already be reasonably well established, in order to get it to work at scale, circumvent regulators, and beat competitors. “Samuel Morse did more to shrink the world than anybody before or after him. … Morse’s real achievement, like that of most innovators, was to battle his way through political and practical obstacles.”
Invention is often downstream of innovation: “techniques and processes are developed that work, but the understanding of them comes later.” This is clearly true of agriculture, vaccines, and powered flight, for example.
Innovation has interesting and sometimes surprising characteristics. One is that it is often unclear who deserves the credit. For instance, “who invented the motor car running on an internal-combustion engine? ... Ford made it ubiquitous and cheap; Maybach gave it all its familiar features; Levassor provided crucial changes; Daimler got it running properly; Benz made it run on petrol; Otto devised the engine’s cycle; Lenoir made the first crude version.” “Simultaneous invention marks the progress of technology as if there is something ripe about the moment. It does not necessarily imply plagiarism.” Apparently, twenty-one different people can lay claim to having designed or critically improved incandescent light bulbs by the end of the 1870s, mostly independently of each other.
More perspiration than inspiration
This does not mean that innovators deserve no praise – far from it. In 1909, Fritz Haber succeeded in synthesising ammonia from the nitrogen in the air, the first step in an invaluable process for agriculture. The German chemical company BASF assigned Carl Bosch the task of scaling up the process. “The Haber–Bosch story, like so many about innovation, is often told as one of brilliant insight by an academic (Haber) followed by inevitable application by a businessman (Bosch), but this is wrong. Far more ingenuity was needed during Bosch’s perspiration than during Haber’s inspiration.” Innovators are not great men and women because of solitary genius flashes of insight. They are great men and women because they win the race, often in a crowded field, to make a technology work in the real world.
Another casualty of Ridley’s book is the old saying that necessity is the mother of invention. In fact, innovation thrives in the midst of plenty. Post-war Silicon Valley is a blessed location, and the home of the early information revolution. A much earlier revolution, the introduction of agriculture, was also enabled by improvements rather than privation. “People took up farming in at least six different places wholly independent of each other: in the Near East, China, Africa, South America, North and Central America, and New Guinea.” This happened “almost as soon as the climate changed to warmer, wetter and more stable conditions, with higher carbon dioxide levels.”
Ridley stands further toward the libertarian wing of politics than the other New Optimists. (He is also a Brexiteer and a “lukewarmer” – a sceptic about the impacts of climate change. There have been negative anticipatory comments about the book from Remainers and Greens because of that, but it should not deter them from reading it. Apart from a swipe at Greenpeace for allegedly causing over ten million deaths by impeding the introduction of Golden Rice, he downplays these themes, perhaps to avoid shrinking his audience.)
Unsurprisingly, then, he insists that innovation works best in democratic countries where the state plays a minimal role. He criticises Mariana Mazzucato’s 2014 book, The Entrepreneurial State, “which argues that the main source of innovation has been government support of research and development” by pointing out that this is a recent phenomenon. “It would be strange to argue that innovation could happen without state direction in the nineteenth century, but only with it in the twentieth. … In the second half of the twentieth century, the state did become a sponsor of innovation on a large scale, but that is hardly surprising given that it went from spending 10 per cent of national income to 40 per cent.” “Trying to pretend that government is the main actor in this process … is an essentially creationist approach to an essentially evolutionary phenomenon.”
Innovation needs freedom
The book’s conclusion is summarised in these two sentences: “the secret sauce that leads to innovation is freedom. Freedom to exchange, experiment, imagine, invest and fail; freedom from expropriation or restriction by chiefs, priests and thieves; freedom on the part of consumers to reward the innovations they like and reject the ones they do not.”
This is an admirable rallying cry, but freedom is not a straightforward commodity. It can exist where you don’t expect it, and it can be compromised where you do expect it – for instance, by over-zealous application of the precautionary principle.
Ridley has some difficulty explaining the widespread expectation that “in the coming few decades China will innovate on a grander scale and faster than anywhere else.” He suggests it is because although China’s “politics is authoritarian and intolerant … a lot of that happens at a level above the entrepreneur, who is surprisingly free of petty bureaucratic rules and delays, so long as he or she does not annoy the Communist Party, and free to experiment.”
That does not capture the full story. Anyone who has done business in China knows that the party and the state are often closely involved in commerce. A factory in Shenzhen was expanding rapidly. When the owner was asked how he was raising the capital, he replied that the local authority liked him because he created a lot of jobs. They were re-zoning some of his industrial land as residential, which would yield a big profit. This meant the banks would lend freely to him. Innovation, it seems, can thrive in many different kinds of environments.
And there is another invaluable ingredient: competition. As another of the New Optimists, Andrew McAfee, put it in his recent book More From Less: “Technology gives us new ways to solve old problems, and capitalism provides the incentive for people to invent these new ways and to implement them once they have been invented.”
Ridley argues that innovation is more likely in small companies than in large ones, but in noting the rapid innovation in the supermarket sector, he acknowledges that the primary driver of innovation in the corporate world is not so much size, as the existence of genuine competition.
Despite being a paid-up member of the New Optimists, Ridley opines that “it is a cliché to say that innovation is speeding up every year. Like a lot of clichés it is wrong.” His main evidence for this is that vehicles – from cars to airplanes – go no faster now than they did 40 years ago. This is a curious argument. In the various stages of the industrial revolution, innovation focused on different technologies. First came the static steam engines, then canals, then railways. Later on came steel, heavy manufacturing, electricity, and so on. Progress was certainly not continuous across all fronts.
Today, although we are still in the later stages of the industrial revolution, we are also in the early stages of the information revolution. Innovation is proceeding at a furious pace at the sharp, pointy end of that revolution: artificial intelligence.
Fortunately, Ridley makes some rather perceptive remarks about AI.
Artificial intelligence and the two singularities
Ridley has a seat in Britain’s House of Lords, and he gives a gracious hat-tip to Lord Tim Clement-Jones, chair of the Lords’ AI Select Committee, which did a great job of taking a number of peers up a steep learning curve on the technology, culminating in an excellent report published in 2018.
It is currently fashionable to sneer at the idea of the technological singularity - the arrival of superintelligence. Ridley does not do this. He notes without sarcasm that “some people, such as James Lovelock in his recent book Novacene, think that [humans might dispense] with the organic component altogether, as the robots take over and we transfer our minds to their computers.”
Initially, it appears he is less open to the idea of the other singularity, the economic one: the arrival of technological unemployment, when machines can do pretty much everything that humans can do for money. “The idea that innovation destroys jobs comes around in every generation. So far it has proved wrong.”
Which is true, so far. But Ridley's evidence that this will remain the case is flawed. He quotes the zombie factoid that ATMs failed to automate away the jobs of bank tellers: “there are more tellers employed today than before cash machines were introduced, and their jobs are more interesting than just counting out cash.” In fact it was US banking deregulation that ignited the growth in bank branches, not the spread of ATMs. That growth was a temporary phenomenon, and limited to the USA. Bank tellers are now an endangered species across the West.
“For the moment, the safest bet is that artificial intelligence will augment rather than replace people, as automation has done for centuries. … but the day when I settle into the car, tell it where to take me and go to sleep at the wheel is–in my opinion–a fair way off.” We’ll see. Prior to Covid lockdown, Waymo was running a pilot taxi service in which a proportion of clients could do exactly that. It may be a few years before the technology is ready for general release, but not decades.
In fact, Ridley goes on to demonstrate a more imaginative approach to the economic singularity. “Imagine if robots could do literally everything you conceivably wanted done–including back rubs and grape peeling–and could do them all so cheaply you’ve no need to go out to work to earn anything. What exactly is the problem? You can summon up whatever goods or services you want at zero cost. … it’s a useful thought experiment. Work is not an end in itself.” Bravo! Ridley has posited the idea of abundance as the optimal solution to technological unemployment.