

The Innovators

How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution



Walter Isaacson



Milestone in 2011 - Apple and Google spent more on lawsuits and patent payments than they did on R&D for new products.

Effective management doesn't have to come from a single strong leader; a combination of talents is often better. Peter Drucker, in The Practice of Management, said you need an outside person, an inside person, and an action man. (At Intel, Noyce was the outside man, inspiring staff and selling the company to outsiders as it was starting. Gordon Moore was the inside man, a brilliant scientist who came up with the tech innovations. And Grove was the action man, the manager driving the business.)

Geek debate over why the Internet was built in the first place. The guys who built it hated the suggestion that it was meant to keep the US capable of communicating after a nuclear attack. The builders claimed that they were just making a resource-sharing network for DARPA researchers. But the managers further up pointed out that the only way they got funding for it was to stress its ability to protect communications.

The second half of 1969 produced three historic markers. NASA put a man on the moon. Silicon Valley engineers put a programmable computer on a chip. And ARPA created a network that would link distant computers.

So who 'invented' the computer and/or the Internet? Analogy of a cathedral: over the course of several hundred years, new people come along and each lays a brick on top of the old foundation, each saying "I built a cathedral." If you're not careful, you can con yourself into believing that you did the important part. But the reality is that everything has to follow on from the work that was done before.

At a May 1974 meeting, DEC president Ken Olsen declared, "I can't see any reason why anyone would want a computer of his own."

When Bill Gates turned 16, his parents gave him a shiny brand-new red Mustang. He still has it.

Bill Gates and Paul Allen wrote a BASIC interpreter for the Altair in 1975, and then got really upset when the Homebrew club distributed it free. Gates correctly claimed that most software wouldn't get written unless people were prepared to pay for it. In his angry letter to the pirates, he claimed that he and Allen had used more than $40,000 worth of computer time writing the program. Of course he failed to mention that they hadn't paid for any of that time, and that most of it was on taxpayer-funded Harvard equipment.

In 1971 Wozniak built a Blue Box which could mimic the tones that fooled the Bell phone system into giving them free long-distance calls. They tested it by calling the Vatican, pretending to be Henry Kissinger, and asking to talk to the Pope (the ruse didn't fool the Italians). Woz and Jobs built a hundred of them, which they sold at $150 each, until one sale ended with Jobs being ripped off at gunpoint.
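
The trick the box exploited was in-band signaling: the Bell network used a 2600 Hz tone to mark a trunk as idle, and then routed calls from pairs of audio tones (the published Bell MF frequency pairs). A rough Python sketch of the tone generation - purely illustrative, and the digit string here is a placeholder:

    import numpy as np

    SAMPLE_RATE = 8000  # telephone-grade audio

    # Bell MF signaling: each digit is the sum of two tones (standard pairs).
    MF_PAIRS = {
        "1": (700, 900), "2": (700, 1100), "3": (900, 1100),
        "4": (700, 1300), "5": (900, 1300), "6": (1100, 1300),
        "7": (700, 1500), "8": (900, 1500), "9": (1100, 1500),
        "0": (1300, 1500), "KP": (1100, 1700), "ST": (1500, 1700),
    }

    def tone(freqs, seconds):
        # Sum of sine waves at the given frequencies.
        t = np.arange(int(SAMPLE_RATE * seconds)) / SAMPLE_RATE
        return sum(np.sin(2 * np.pi * f * t) for f in freqs)

    # 2600 Hz told the far-end trunk the line was idle; the switch then
    # accepted MF digits as if an operator were dialing.
    seizure = tone([2600], 1.0)
    digits = np.concatenate([tone(MF_PAIRS[d], 0.075)
                             for d in ("KP", "5", "5", "5", "ST")])
    signal = np.concatenate([seizure, digits])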

When Gates was negotiating with IBM to provide software for its new PC, IBM wanted an OS, which Microsoft didn't have. Gates sent the IBM guys down to talk to Gary Kildall, who had CP/M (Control Program for Microcomputers). But Kildall couldn't be bothered talking to the suits, and went flying instead. So Gates paid $50,000 for QDOS (Quick and Dirty Operating System), written by a struggling Seattle company.

The graphical interface that captured Jobs's attention when he got a tour of the Xerox PARC labs was made possible by bitmapping. Bitmapping allowed every pixel to be controlled by the computer - turned on or off, or set to any color. That control made rich displays, fonts, and graphics possible.
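
The idea is simple enough to sketch in a few lines of Python. In a bitmapped display there is a memory cell for every pixel, so drawing anything - a font glyph, a window, an icon - is just writing memory (a toy illustration, not how any particular machine did it):

    # A tiny 1-bit framebuffer: one memory cell per pixel, 0 = off, 1 = on.
    WIDTH, HEIGHT = 16, 8
    framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]

    def set_pixel(x, y, on=1):
        framebuffer[y][x] = on

    # Any shape - a glyph, a window border, an icon - is just pixels.
    for x, y in [(2, 1), (3, 1), (2, 2), (3, 2)]:  # a 2x2 dot
        set_pixel(x, y)

    # Render the pixel memory the way display hardware scans it out.
    for row in framebuffer:
        print("".join("#" if p else "." for p in row))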

Apple perfected the Xerox innovation, proving that it is execution that matters - the implementation and marketing - rather than the original conception. Jobs was furious when Microsoft copied the idea, but the courts ruled that Apple could not get copyright protection for the idea of a GUI, nor could it protect the idea of a desktop metaphor.

Linus Torvalds released his Linux kernel under the General Public License, correctly believing that this would liberate an army of hackers who would collaborate to improve it. Linus's Law: "Given enough eyeballs, all bugs are shallow."

"It was a way to get people to trust me. When they trust you, they take your advice,: said Torvald.

This demonstrated that people are motivated by more than just money. Hackers wanted the esteem of their peers - they could enhance their reputation by making solid contributions.

"I don't like single-issue people. Nor do I think that people who turn the world into black and white are necessarily very nice or ultimatelyvery useful. The fact is, there aren't just two sides to any issue. There's almost always a range of responses, and 'it depends' is almost always the right answer in any big question." (Torvald again)

Al Gore's contribution: his father was a Senator who helped craft the bipartisan laws that made the interstate highways possible. Gore Jr took that as inspiration to promote what he called the Information Superhighway. He led hearings and sponsored two acts: the 1991 High Performance Computing Act and the 1992 Scientific and Advanced Technology Act, which freed commercial networks like AOL to connect to the previously closed research networks.

Gore never actually said that he 'invented' the Internet. He said in an interview that "During my service in Congress I took the initiative in creating the Internet." Even Republican Newt Gingrich defended that statement: "Gore is not the Father of the Internet, but in all fairness, Gore is the person who, in Congress, systematically worked to make sure that we got to an Internet."

Tim Berners-Lee, more than anyone else, drove the founding of the World Wide Web. But he had a blind spot. He didn't want images, just text. He specifically didn't want magazines, because he thought people would just passively consume, rather than interact with the text, editing and responding to ideas.

Ted Nelson, who had pioneered the concept of hypertext with his Xanadu project 25 years earlier, was also displeased by the path the Web took. He wanted links to be two-way, so that they would require the approval of both the person creating the link and the person whose page was being linked to. Such a system would have the side benefit of enabling micropayments to content providers.
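
The consent rule at the heart of that design is easy to mock up. A minimal Python sketch, with hypothetical names (an illustration of the idea, not Xanadu's actual mechanism):

    class TwoWayLink:
        """A link that exists only when both endpoints have consented."""
        def __init__(self, source, target):
            self.source, self.target = source, target
            self.approved = {source: False, target: False}

        def approve(self, page):
            self.approved[page] = True

        def is_live(self):
            # Unlike a one-way Web hyperlink, both sides must opt in.
            return all(self.approved.values())

    link = TwoWayLink("my-essay", "quoted-article")
    link.approve("my-essay")
    print(link.is_live())   # False - the linked-to page hasn't consented
    link.approve("quoted-article")
    print(link.is_live())   # True - and a royalty could now flow to its owner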

In 1997 Jorn Barger came up with the term weblog to describe the diarist notes and opinions people were sharing. A couple of years later Peter Merholz jokingly broke the word in two - 'we blog' - and the word blog stuck.

Traditional writers poured scorn on the bloggers, but Arianna Huffington was an early counterpoint voice. People got the chance to express their ideas and get feedback. This was a new opportunity for people who had previously just passively consumed whatever TV fed them.

Watson won at Jeopardy! by processing huge quantities of data - 200 million pages, 4 terabytes, of which the entire contents of Wikipedia was 0.2%. But it was never an intelligent machine. Rather, it was designed to simulate understanding. It is far superior to humans at storing data and performing calculations, but worse than four-year-olds at understanding and adapting.

The main lesson of AI is that the hard questions are easy and the easy questions are hard. (Steven Pinker)

But we are constantly looking at the list of things we know a machine can't do - play chess, translate a language, drive a car - and then checking them off the list when it turns out the computer can do them. And one day we will get to the end of the list.

(Bookforum review)

WALTER ISAACSON is America's leading chronicler of Great Men - or 'geniuses,' as his publisher describes them in a PR note accompanying his latest book, The Innovators. A former editor in chief of Time, which made its name by designating Great Men (and very occasionally Women) 'People of the Year,' Isaacson specializes in expansive, solidly researched middlebrow histories, the kind that appear under your father's Christmas tree or on a talk-show host's desk. Benjamin Franklin, Albert Einstein, Henry Kissinger, and Steve Jobs have all received the Isaacson treatment, with a handful of others coming in for group portraits. The Innovators belongs to this latter category; it's a story of what Isaacson calls 'collaborative creativity' and the origins of the Internet and digital eras. There are dozens of characters here, beginning with Ada Lovelace and Charles Babbage in the mid-nineteenth century and ending with Messrs. Gates, Jobs, Wales, Brin, and Page on this side of the twenty-first.

As a vast survey of the inventions and personalities that eventually brought us networked computers in every home, purse, and pocket, The Innovators succeeds surprisingly well. Isaacson has pored over the archives and interviewed many of the principals (or at least their surviving colleagues and descendants). He also has an assured grasp of how these various technologies work, including the recondite analog proto-computers invented in midwestern basements and dusty university labs during the interwar period. Although many of these machines didn't survive their creators, they informed subsequent, more celebrated inventions while also showing that, in this long period of tinkering and experimentation, cranks and hobbyists produced some important breakthroughs.

But Isaacson's book is more than a history of how we came to live digitally, and that is its chief problem. By presenting itself as an expert's guide to how innovation works - and by treating 'innovation' as a discrete, quantifiable thing, rather than, say, a hazy, ineffable process that has been condensed into a meaningless corporate buzzword - the book founders on the shoals of its self-proclaimed mission. The ship need not be scuppered entirely; there's much to be learned in these pages. And there is something worthwhile in this survey of the networked creation of ideas and inventions - for all the solitary, heroic connotations of the term genius, The Innovators is a chronicle of intensive group collaboration, as well as the often tangled evolution of technological change across the generations (and via more than a few patent disputes). Isaacson rightly points out that nearly all of the important inventions in the history of computing had many fathers, from the microprocessor to the Pentagon's proto-Internet operation, ARPANET. These academics, engineers, mathematicians, Defense Department bureaucrats, and wild-eyed tinkerers came together in sometimes uneasy joint enterprises, but the process was undeniably, and deeply, collaborative - and not just in the cross-disciplinary petri dishes of Bell Labs and Xerox PARC, where so many important technological breakthroughs were devised. These collaborations continued as their guiding conceptual insights found popular (and profitable) application as objects of consumer delight - a revolution that took place at the behest of a new breed of knowledge geek, people like Jobs and Nolan Bushnell, founder of Atari.

In such detailed stories of sudden inspiration; of late nights spent bug-hunting in university labs; of a young Bill Gates running circles around IBM executives, who, unlike him, had no idea that software, not hardware, would dictate the future of computing - here is where 'innovation' becomes something fascinating and specific, precisely because the word itself isn't used. These are simply tales of human drama, contextualized within the arc of recent American industrial history. Yet as the book's five hundred–plus pages unwind, Isaacson interrupts himself to present small bromides about what it means to innovate and what we might learn from these innovators, our presumed betters. 'Innovation requires articulation,' he tells us, after explaining how the main strength of Grace Hopper, a trailblazing computer scientist for the US Navy, was her ability to speak in the languages of mathematicians, engineers, programmers, and soldiers alike. 'One useful leadership talent is knowing when to push ahead against doubters and when to heed them,' he offers later.

The book is peppered with these kinds of passages, which often intrude on the narrative, depriving us of moments of real emotional power. Isaacson also tends to rush through some of The Innovators's finer passages, material that could transcend rote technological history and show us something personal. He offers a believable sketch of the often cloistered life of Alan Turing, the brilliant British mathematician, cryptographer, and, ultimately, martyr to a bitterly conservative society. Having played an important role both in the development of computing and in Britain's World War II code-breaking operations, Turing was arrested for 'gross indecency' - that is, being a homosexual - in 1952. Quiet about but content with his identity, he pleaded guilty all the same, perhaps to avoid stirring up more trouble. After a trial, the court offered him an awful choice: a year in prison or injections of synthetic estrogen - essentially, chemical castration. He chose the latter. Two years later, like a twentieth-century Socrates, he committed suicide by biting into an apple he had laced with cyanide.

Isaacson's recounting of Turing's end is hardly longer than my own, and that is unfortunate. It is among the most disturbing episodes in postwar British history and deserves more explication and indignation. Though one hesitates to make it stand for something too large, perhaps this veritable homicide at the heart of the Anglo-American quest for technological mastery does represent something. After all, as Isaacson writes, 'war mobilizes science,' and many of the great technological breakthroughs of the past century were subsidized, if not wholly steered, by the defense establishment. (To name one: The original microchips found homes in Minuteman ICBM guidance systems.) Isaacson gestures in this direction, making some references to the military-industrial complex and showing that while the men behind ARPANET - nearly all of them working for the Pentagon or under the sponsorship of Pentagon grants - thought they were building a distributed system for academic collaboration, senior Defense officials pushed the project because they thought it would provide a distributed command-and-control system that would survive a nuclear attack. Each side seems to have thought it was using the other.

The defense industry, then, becomes the subject of a submerged narrative in The Innovators, one that's never fully brought into the light. Perhaps it's uncomfortable to think that some of our most remarkable technologies not only owe themselves to the ever-churning American war machine but were in fact funded and designed on its behalf. Accepting this symbiosis - which some have, like Steve Blank, a tech-industry veteran and the author of a popular online lecture, 'Hidden in Plain Sight: The Secret History of Silicon Valley' - would also mean tamping down the perpetual hosannas heaped on the digital trailblazers and asking some harder questions. For example, do they, and their successors, bear any responsibility for the current state of the Internet, in which a tool designed for liberating people from the strictures of physical media and geography has been corrupted into a mass-surveillance apparatus collecting incredible amounts of personal data for corporations and intelligence agencies alike? Is Alan Turing's story an aberration, or perhaps a toll extracted by this devil's bargain?

Probing these questions would mean chucking aside all the jabber about innovation. It would mean that technological development is also a human story - one that involves politics, war, culture, discrimination, social upheaval, and a great deal of human exploitation thousands of miles down the production line, in Congo's coltan mines and Shenzhen's brutal factories. We hear little about these messier elements from Isaacson, except for an enjoyable tour through the Bay Area counterculture, in which Ken Kesey's acid tests become a kind of analog trip through the doors of perception for these would-be digital psychonauts. I think that says something both about Isaacson's worldview - moderate and apolitical in the thought-leader mode, as might be expected of the CEO of the prestigious Aspen Institute think tank - and about what has become the standard style of our big histories. They are supposed to record the past but not account for it. They tend toward the instructional: This innovation talk is really a thinly disguised form of self-help. Here is what makes a good research group. Here are the advantages of filing patents. Here are the types of people you need to start a company - a gifted, shy, freethinking engineer like Steve Wozniak and a sociopathic lapsed-hippie cutthroat like Steve Jobs. Sure, you can invent something and give it away, like free-software advocates Richard Stallman and Linus Torvalds. Or you can start calling people 'shithead,' steal their ideas, and make a billion dollars - a précis that, I was sad to discover, describes not only Jobs but several other figures of industry lore.
