It is more than 50 years since we stepped into the so-called Information Age, and we are still struggling to get to grips with it. Information is at the heart of neo-liberalised economies, of life and work and societies. Yet it is very difficult to pin down exactly what it is: is it simply new bits and bytes, or a substance like atoms by which the world is fashioned? Or is it just sensations that bombard our minds? I’ve been trying to make an edition of In Business on Radio 4 about this new (or is it new?) centrality of information to our lives, and I still can’t define what information is myself.
According to the very latest online definition of “information” from the Oxford English Dictionary, the term “Information Age” seems to have been coined in 1960 by businessman and later Pentagon official Richard Leghorn. American author James Gleick points out that it was intended as a temporary term for something newly apparent: Leghorn was talking about “present and anticipated spectacular informational achievements” but he did not expect his term to catch on. It would be replaced, he thought, by something more symbolic. That has not happened.
Gleick’s new book The Information: A History, a Theory, a Flood is anchored on Claude Shannon, the AT&T Bell Laboratories researcher who identified the information age before it was named. His 1948 paper in the Bell System Technical Journal, “A Mathematical Theory of Communication”, came up with the notion of the “bit”: a unit for measuring information. It sounds modest, but upon that concept grew the computer networks that are still transforming the way we live.
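Shannon’s “bit” can be made concrete in a few lines of Python. This is a sketch of his entropy formula, H = −Σ p·log₂(p), which measures how many bits of information an uncertain outcome carries; the function name is mine, not Shannon’s:

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin toss carries exactly one bit of information.
print(entropy_bits([0.5, 0.5]))   # 1.0

# A heavily biased coin (90/10) carries less than half a bit:
# its outcome is more predictable, so it tells us less.
print(entropy_bits([0.9, 0.1]))
```

The counter-intuitive point, and Shannon’s insight, is that the less predictable a message is, the more information it contains.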
Nine years ago, before the public share flotation that made it the biggest media company in the world, I went round the comparatively modest building that was then the Googleplex in Mountain View, California. This was before the search engine company moved to its current grand campus around the corner. The corridors were full of bicycles, there was a grand piano in the reception for lunchtime recreation, and lava lamps everywhere. But above the reception desk was something I had not seen before. On a big screen they projected live (albeit selected to stop the list from becoming a blur, and with sex omitted) the Google searches being undertaken by users from all over the world. My guide and I read them out one by one into my recorder, and then I stopped. “Hey,” I said, “No one has ever been able to do this before. We are looking at the mind of the world.”
But that was then; now the internet is many millionfold larger, and Google and other search engines are expanding at a similar pace to keep up with it. And the internet is enabling the creation of what rapidly become vast new media in their own right: Facebook, Twitter and many more to come. The Library of Congress is now archiving everything that appears on public Twitter feeds, every tit and tattle of it.
And out of this rush of bits and bytes, all this information, new economies are being formed. They are as radical as was the idea of mass production that shaped the 20th century. But because we do not understand all of this information, we do not readily understand the new economy yet. When financial markets are overwhelmed by frightening momentary crashes, it is a reminder that they are systems driven sometimes by human interaction with screen-based electronic information, and sometimes by trading performed with a computer. As electronic systems, they are subject to “howl round”, similar to what happens when a microphone picks up feedback from a speaker. The result is a temporary catastrophe when it occurs in the marketplace, where it may wipe out some companies. And some people.
The forces that are changing our lives are a melange of ever-increasing connectivity combined with Moore’s Law, the doubling of computer power on a silicon chip every two years or so, which has been going on now for more than 40 years. Wider pipes make possible ever-greater data flows, which increasing computer power enables people and organisations to process. Out of that emerge new functions and potentially new economies. Is Google really just a search engine? Is Amazon.com really just an e-retailer? Or are they accumulators of information about customers, individual and aggregated, which enables these companies to build and enrich their knowledge of how people in general – and me in particular – behave by analysing the data I provide every time I browse for something I may not even buy? At the moment, this information is used primarily to suggest other things we might like, but to what other uses could it be put when the database has matured in 20 years?
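The arithmetic behind that doubling is worth pausing on: compounded over 40 years, it is roughly a millionfold increase. A back-of-envelope sketch (the function name and two-year period are the assumptions stated above, not a precise law):

```python
def moores_law_factor(years, doubling_period=2):
    """Growth factor if capacity doubles every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# 40 years of doubling every two years is 20 doublings:
# 2**20 = 1,048,576 -- roughly a millionfold.
print(moores_law_factor(40))
```

Which is why the internet of today can plausibly be described as “many millionfold larger” than its early incarnations.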
My friend the veteran futurist and Silicon Valley watcher Paul Saffo divides the past 100-plus years into three distinct phases, which helps me to grapple with the idea of what information is. Saffo calls the first 50 years of the 20th century the Industrial Age of Mass Production. The industrial worker was its heart; the Fordist production line was its power. Its symbol was the time clock. Goods were scarce and the factory owner was in a prime position of power. Then, in the 1950s, after World War II, came the Consumer Age, with the customer in the driving seat and a proliferation of consumer choice. The symbol of the age was the credit card. Now we have entered what Saffo calls the Creator Economy (and he does not mean creative). It’s a new mesh of enterprises using the data generated by our internet activity: not so much the words we write, but the trails we leave behind. Computer analysis can create hugely detailed and potentially even predictive patterns out of vast amounts of random data. The defining symbol of this new world is the individual keystroke, billions of times over. A building block, a currency, an atom... whatever it is, it is information. If we can process and analyse all this information, it ought to increase our intelligence, or at least the sensitivity of our antennae.
Can we cope with this flood of stuff generated by us and aimed at us? Gleick tells me he thinks we can, observing that humans seem to have learned to cope with more and more daily data since time began – and that everything contemporary always appears overwhelming. (He is also the author of a book on this phenomenon entitled Faster: The Acceleration of Just About Everything.) Can we stop this digitally generated information being misused? Gleick refuses to be paranoid about it, but I wonder whether any internet user anywhere can really feel secure about any kind of privacy, even in the benign parts of the world, let alone those with intrusive and intolerant governments.
Great benefits and opportunities have emerged from the Information Age for which Claude Shannon paved the theoretical way in 1948, one year after I was born. But even clever people are now dazzled and distracted by the information flood – so says the London-based McKinsey consultancy partner Caroline Webb. With former colleague Derek Dean, she wrote an article for the latest McKinsey Quarterly on what she calls the dark side of the information revolution. The authors argue that “always-on, multitasking work environments are killing productivity, dampening creativity and making us unhappy.” It’s not rocket science, you might say – hardly an insight worth the big fees consultants charge by the hour. The significance of this is the place in which it appeared: the McKinsey Quarterly is aimed at a business elite. This is a rather homely, personal message compared with the firm’s other big ideas. When Webb, who specialises in leadership, says that bosses who pride themselves on multitasking are damaging themselves and their companies, people ought to pay attention.
Webb told me that this is more than mere grumbling round the water cooler. There is strong neuroscientific evidence that our brains cannot successfully perform two demanding tasks concurrently. Multitasking, she says, unequivocally damages productivity. No wonder so many organisations are so intensively but so indifferently managed. The McKinsey article sets out personal strategies not for coping with the information overload, but for reducing it. And if the management experts can’t cope with the information explosion they are supposed to be on top of, how about the rest of us? Television has already reduced the world to a shot-change every second and a half, and now we experience life in the same distracting way. Half the people walking on Hampstead Heath on a recent, glorious Saturday in spring seemed to be on their mobiles. Even when they were hand in hand, half their attention was elsewhere. The slogan with which we might regain some kind of grasp over the information flood is surely, “Not sent from my BlackBerry”.