The Innovators: How a Group of Hackers, Geniuses and Geeks Created the Digital Revolution by Walter Isaacson
What’s the most important story of our time? Global warming? The industrialisation of China and other parts of the world? Religious fundamentalism? Or the development of computer and communication technologies and their effect on our lives?
The latter must be a candidate, and is a neglected story compared to the others. Walter Isaacson steps in to put that right with an ambitious sweep across tech history, joining the dots between the nineteenth-century pioneers of the “analytical engine”, which borrowed ideas from the mechanisation of weaving, and the founding of Google.
It’s a lively mix of technical breakthroughs and profiles of key individuals, many of whom seem to have been inspired by childhoods in the Midwest spent tinkering with radio sets. The science is bravely explained for the general reader, though not always, for me at least, successfully: I’m still not quite sure what a transistor does.
The glue that sticks it all together is Isaacson’s account of, and reflections on, the way science, institutions and business worked together to make some ideas mainstream while others, however brilliant, led to nothing except a footnote in history – and that often only because of acrimonious legal disputes years later.
Isaacson is good on what makes an effective team in technology, the balance between giving space to mavericks and providing direction and purpose. And having previously studied Steve Jobs for his best-selling biography, he’s alert to what makes partnerships work, often with a combination of very different skills between two leaders – the personable and the introverted, the visionary and the practical, the scientist and the politician. Being able to project a “reality distortion field”, as was said of Jobs, is also a rare and key skill.
In the twentieth century, American leadership in computing was possible because of a successful blurring of boundaries between government, academia, commerce and the military, with key figures slipping easily between those different worlds. Isaacson’s account will disabuse anyone who believes private enterprise does best when left entirely to itself. He shows how Bill Gates, for instance, owes a debt to the military for the use of the computer that allowed him to win Microsoft’s first software contract. And he explains how Al Gore pushed through political changes that gave the USA a lead in the development of the internet, despite his much ridiculed slip of the tongue implying that he thought he’d invented it.
The only place where, for me, Isaacson’s thesis goes slightly off the rails is towards the end, when he explores the ideological battle between two visions of Artificial Intelligence: will it eventually surpass and perhaps even threaten human intelligence – as Singularity theorists believe – or will it remain a servant of humanity, always requiring human direction? Isaacson believes the latter, and makes a strong case for it, which is reassuring for anyone still worried about grey goo.
AI is an interesting issue, but, for my money, not the one to which this grand tale leads us. Isaacson has shown how today’s capabilities are the result of the intersection of personal computers with network technologies, which for more than a decade, because the internet was in government and academic hands, ran on separate, parallel tracks. He highlights the transition from technology owned and run by institutions, because it was so big and expensive, to devices designed for individuals.
The first transistor radio, from Texas Instruments, was originally marketed as a way to keep in touch after the Russians had dropped an atom bomb, but quickly sold out as a teenage fashion statement. Young consumer desire became the power behind much of the tech economy, which, it turned out, “could also empower individuality, personal freedom, creativity, and even a bit of a rebellious spirit.”
The first "tranny", the Regency, from Texas Instruments, came out in 1954 and cost the equivalent of $430 in today's money, roughly the same as an iPhone.
More pressing than AI, surely, is the question of how connected devices in the hands of billions of ordinary people have already changed our world: what is access to so much information, and the ability to communicate with limitless numbers of other users, doing to human society? What’s its impact on the nation state, elected government, local culture, and the continued advancement of science?
Isaacson has a riveting account of the wonder that is Wikipedia and how it came about, as well as the story of the Open Source movement. Surely those stories, along with other developments such as the liquidity of financial markets, the global reach of internet businesses, and the disappearance of middle-class jobs, point to huge changes that we have yet to fully understand. To that extent, it’s not so much the threat of uncontrollable AI in the future as the uncontrolled changes to economies and social structures today that deserve attention. Maybe that's for his next book?
Whatever my reservations, they're in the context of this being a wonderful, enlightening book that goes a long way to explaining how we find ourselves surrounded by - or drowning in? - technology.