In less than two weeks, The Two-Second Advantage by Vivek Ranadivé and Kevin Maney will make its debut on bookshelves across the U.S. Below is an excerpt from the first chapter, “Wayne Gretzky’s Brain in a Box.”
By guest authors Vivek Ranadivé and Kevin Maney
On March 28, 1955, Time magazine reported on a new generation of machinery called computers. The cover featured a drawing of IBM’s Thomas Watson, Jr. in front of a cartoonish robot, over a headline that read, “Clink. Clank. Think.” The story marveled at a computer built by IBM, working inside a Monsanto office building. “To IBM, it was the Model 702 Electronic Data Processing Machine,” the story reported. “To Monsanto and awed visitors, it was simply ‘the giant brain.’”
Technologists have long tried to build computers that can do brain-like things. They’ve worked on artificial intelligence, on robots, and on making computers that can beat grandmasters at chess. But those projects have had narrow success at best. The basic structure of computers works very differently from that of brains. Computers can do some things better than humans, like instantly calculating long equations or sorting through millions of documents looking for a few key words. But they can’t do some of the simplest things even a three-year-old can do – like knowing that a line drawing of a cow and a real cow are both cows. Computers definitely can’t match the brain’s higher-level processes, like putting disparate ideas together in a flash of inspiration. Building a computer that thinks like a person is a long way out, and perhaps a quixotic quest in the first place.
And yet, computer scientists are learning from human brain research and are building computer systems to operate in new ways borrowed from the human predictive model. These systems, in their own way, build memory chunks and generate behavior based on predictions. Sensors can feed information back to the computers to both build patterns and test predictions.
Forward-thinking companies are starting to use these new systems to operate more like talented humans than bureaucratic organizations. These companies can use technology to sense what’s happening in the market, constantly adjust and act just a little bit ahead of time – a two-second advantage.
As it turns out, getting just a little bit of the right information just ahead of when it’s needed is a lot more valuable than all the information in the world a month or a day later. Using a database to analyze piles of data after the fact would be like Wayne Gretzky poring over all his hockey memories to analyze why he didn’t score in the last game and to make a plan for the next one. While that might be valuable, it’s not enough anymore. Enterprises will want to anticipate like Gretzky, using an efficient “mental model” to get a little ahead of events and make instant judgments about what to do next. Companies will be able to anticipate customers’ needs. Stores will no longer carry too much or too little of a product. Law enforcement will be able to stop criminal acts before they happen.
A number of trends are coming together to facilitate two-second advantage technology.
Technologically speaking, we’ve lived in a database world for fifty years. Corporations and government agencies collect information from individual interactions (forms filled out, reservations made), transactions (at ATMs, on the Web, credit card purchases), and recorded events (baseball scores, hurricane readings from the Gulf of Mexico, airline departures from LAX). The information gets fed into a structured database, which can mix, match and analyze the data to make discoveries about things that happened.
The database might tell a retailer that it sells 50 percent more Pampers in August, suggesting the store should stock up in that month. Or a database could tell an airline that when it lowers prices by $20 on a particular route, it steals a chunk of market share from its competitors. The U.S. Census is an enormous database that can identify patterns in the nation’s population every ten years.
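To make the retail example concrete, here is a minimal sketch of the kind of backward-looking query a database answers well. The table, column names, and numbers are invented purely for illustration:

```python
import sqlite3

# Hypothetical sales table: one row per (product, month), invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, month TEXT, units INTEGER)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("Pampers", "2010-07", 400),
     ("Pampers", "2010-08", 600),   # 50 percent more than July
     ("Pampers", "2010-09", 410)],
)

# The classic backward-looking question: in which months did we sell the most?
for product, month, units in conn.execute(
    "SELECT product, month, units FROM sales ORDER BY units DESC"
):
    print(product, month, units)   # top row suggests stocking up in August
```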
Databases can help an executive make informed decisions about what to do next, based on outcomes in the past, and that’s valuable. And databases have gotten increasingly real-time. A couple of decades ago, an executive had to put in a request for information from a database and wait a day or a week to get the results. In 2011, databases can update information on the fly and instantly answer a query from an executive with a flood of information about what happened earlier that day.
Database technology is critical to the operation of nearly every enterprise of any size, everywhere on the planet. Yet databases have major handicaps in today’s world. They’re inherently focused on the past: they analyze what’s already happened rather than predicting what’s about to happen. And databases are about to be overwhelmed by crushing waves of data from a constantly expanding number of sources. Database technology won’t be able to keep up.
In 2010, more than 1,200 exabytes of digital information was created. A single exabyte is equal to about 1 trillion books. Every two years, the volume of data created quadruples. About 70 percent of it is created by individuals, including profile information on social networks, videos on YouTube, tweets on Twitter, music tastes on Pandora, and location check-ins on Foursquare. The rest comes from an ever-expanding universe of sensors. These include chips placed on buoys to keep tabs on a bay’s water, RFID tags on luggage that tell an airline every time a bag is loaded or unloaded, and the billions of cell phones in the world – each of which constantly tells the cell company where people are and how they move around.
At the same time, storage technology is improving so fast it will be possible to gather and store all of this data that comes roaring in. While having more data can certainly be valuable, too much can get overwhelming. If database technology has to sort through all the data to answer every query, it will bog down. Answers will come too slowly. Just as Gretzky can’t search every memory during a game, a business can’t search all its data each time it needs an answer.
Rapidly escalating stores of data might be less of a problem if computer processing power could increase fast enough to keep up. But that’s not likely. Since the 1970s, processing power has improved at a pace described by Moore’s Law: roughly twice as many transistors can be packed onto a microprocessor every 18 months. From the 1980s to the 2000s, computer systems got hundreds of times faster. But individual transistors have now gotten so small – their thinnest features just a few atoms thick – that they can’t get much smaller. The kind of technology that currently runs almost all computers can only get two or three times faster.
To handle the coming data onslaught, technologists are pursuing alternative kinds of computing. One path is to develop brain-like computers that use data to build models that can take in events and chunk patterns – learning from the streams of data but not relying on the entire database.
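As a rough illustration of the difference, here is a toy sketch – ours, not a description of any particular system – of a model that learns from a stream of events one at a time, keeping only a small running summary rather than the full history:

```python
class StreamingPredictor:
    """Toy 'mental model': a tiny running summary of a data stream.

    It never stores the stream itself - only an exponentially weighted
    average - yet it can offer a prediction for the next value at any time.
    """

    def __init__(self, alpha=0.3):
        self.alpha = alpha      # how quickly the model adapts to new events
        self.estimate = None    # the model's entire "memory"

    def observe(self, value):
        # Fold each new event into the summary, then discard the event.
        if self.estimate is None:
            self.estimate = value
        else:
            self.estimate = self.alpha * value + (1 - self.alpha) * self.estimate

    def predict(self):
        return self.estimate


model = StreamingPredictor()
for reading in [10, 12, 11, 13, 40, 14]:   # a stream of sensor events
    model.observe(reading)
print(round(model.predict(), 2))            # instant answer, no database scan
```

The point of the sketch: the model’s entire “memory” is a single number, yet it can answer a prediction query instantly – there is no pile of stored data to search.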
Meanwhile, in business, government and everyday life, the time available to react is shrinking. Competition is driving the world to work at an ever-faster pace. No one can afford to react too late, based on information that’s too old. The new competitive advantage will be the ability to anticipate events based on what’s happening right now.
Before the Internet, we were in an era we call enterprise 1.0. In a bank, for instance, customers would come in all day long – no ATMs! – and tellers would pile up pieces of paper tallying transactions. At the end of the day the branch manager would account for everything, then send the information to headquarters, where information from all the branches would be assembled and calculated. Getting a report on the state of affairs at the bank might take days or weeks. Reaction time to any single event could be measured with a calendar.
Computers and the Internet ushered in enterprise 2.0. Every transaction became a bit of digital data. As computers and networks got more powerful, that data could be calculated and analyzed faster and faster, to the point where the bank CEO could look at a computer screen and see the money flowing in and out of his banking company in almost real time. Reaction time to any single event could be measured with a stopwatch.
We’re entering enterprise 3.0. Now every event can become a bit of digital data. A transaction is one kind of event, but there are many others, too. Each time a customer logs onto the bank’s website, even if no transaction is completed, that’s an event. Cell phone signal analysis may tell a bank how many people walk by a branch every day – more events. Debit card purchases at far-flung retailers are events. A bank should be able to recognize patterns of events and anticipate what a customer might want next, proactively capturing that business. Reaction time to any single event will have to be measured with a time machine – because the idea is to act in anticipation.
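As a rough sketch of what recognizing a pattern of events might look like in the banking example, here is a toy event matcher; the event names, the one-hour window, and the triggering rule are all invented for illustration:

```python
from collections import deque
import time

# Hypothetical rule: if a customer views mortgage pages three times within
# an hour without transacting, flag them for a proactive offer.
WINDOW_SECONDS = 3600
THRESHOLD = 3

recent_views = {}  # customer_id -> deque of event timestamps


def on_event(customer_id, event_type, timestamp):
    if event_type != "mortgage_page_view":
        return
    views = recent_views.setdefault(customer_id, deque())
    views.append(timestamp)
    # Drop events that have fallen out of the one-hour window.
    while views and timestamp - views[0] > WINDOW_SECONDS:
        views.popleft()
    if len(views) >= THRESHOLD:
        print(f"Customer {customer_id}: offer a mortgage consultation now")
        views.clear()


now = time.time()
for t in (now, now + 600, now + 1200):            # three views in 20 minutes
    on_event("cust-42", "mortgage_page_view", t)  # rule fires on the third view
```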
In the era of enterprise 3.0, making decisions based on information even just a few seconds old could be disastrous. Trying to make decisions based on all the events coming in would be mind-bogglingly difficult. The new systems need to get the right information to the right place at the right time – and to anticipate what’s coming next.
The basic idea of using data to be predictive in business doesn’t come out of thin air. Companies have been deploying mathematics and software to try to foresee events for decades. Statistical analysis proved that events could be predicted within levels of probability – like the mean time before failure of mechanical equipment, or the likelihood that people within a given ZIP code will respond to a certain direct mail campaign. Big software systems in categories such as business process management (BPM) and customer relationship management (CRM) have tried to gather all of what’s going on in a corporation and help managers anticipate when, for instance, an assembly line will need a new shipment of parts, or a customer will be ready to buy an upgraded product.
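The mean-time-before-failure example reduces to simple probability arithmetic. Here’s a minimal sketch, assuming failures follow an exponential distribution – a common modeling convention, not something specified here – with invented numbers:

```python
import math

# Observed intervals (hours) between failures of a hypothetical machine.
intervals = [420, 380, 510, 450, 400]
mtbf = sum(intervals) / len(intervals)  # mean time before failure

# Under an exponential failure model, the chance of failing
# within the next t hours is 1 - exp(-t / MTBF).
t = 100
p_fail = 1 - math.exp(-t / mtbf)
print(f"MTBF: {mtbf:.0f} h; P(failure within {t} h) = {p_fail:.1%}")
```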
In more recent years, companies have employed analytics to understand trends and anticipate events. Analytics can look at a person’s pattern of spending and bill paying, compare it with patterns of millions of other consumers, and make a pretty accurate prediction about whether that person will default on a loan. Analytics help airlines predict demand so they can adjust schedules to make sure planes fly as close to 100 percent full as possible.
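As a minimal sketch of that kind of analytics – with invented data, and logistic regression standing in for whatever a real lender would use – here’s how a spending pattern might be turned into a default probability:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Invented training data: [spending as a fraction of income, late payments last year]
X = np.array([[0.3, 0], [0.5, 1], [0.9, 4], [0.4, 0], [0.8, 3], [0.95, 5]])
y = np.array([0, 0, 1, 0, 1, 1])  # 1 = defaulted on a loan

model = LogisticRegression().fit(X, y)

# Score a new applicant: spends 85% of income, had 2 late payments.
applicant = np.array([[0.85, 2]])
print(f"Estimated default probability: {model.predict_proba(applicant)[0, 1]:.0%}")
```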
We’re not claiming to have invented the idea of predictive technology, and we’re not saying anyone should throw out their BPM, CRM or analytics systems. There will always be great value in making projections that are days, months or years out – just as people need to make long-range plans, or coaches need to make game plans they think will work against an upcoming opponent.
But in today’s world, enterprises need something more. They need that instantaneous, proactive, predictive capability of a Gretzky. In the 24/7 ongoing rush of events, enterprises need to be able to put their mountains of data to the side, and act using small, efficient “mental” models that can spot a series of events, anticipate what’s about to happen, and initiate action in a split second.