
The Simulation Economy

January 6, 2013

In 1982, Steve Jobs first made the cover of Time magazine, where he was celebrated as the 26-year-old college-dropout wunderkind who created the personal computer industry and made a fortune in the process. It seemed like a new age had dawned.

Unfortunately, tangible results were frustratingly hard to find. By 1987, the economist Robert Solow complained that “You can see the computer age everywhere but in the productivity statistics,”  a phenomenon which came to be known as the productivity paradox.

Today, nobody questions that computers have fundamentally changed the way we create, deliver and capture value. Erik Brynjolfsson, who popularized the term “productivity paradox,” even has a new book out touting technology’s impressive contributions. What’s changed? I would argue that a big part of it is our ability to enjoy success while simulating failure.

A Universal Computer

Computers, in the broad sense, have been around a long time. The abacus was used as early as 2700 BC in Mesopotamia. Blaise Pascal invented the mechanical calculator in 1642 and Charles Babbage designed his analytical engine in 1837. Electromechanical machines were used during World War II to crack the supremely important German Enigma codes.

However, those machines were limited to specific tasks. The machines we know today as computers have their roots in Alan Turing’s legendary 1936 paper describing a universal computer that could be programmed to perform any computable task. He would later write in 1948:

We do not need to have an infinity of different machines doing different jobs.  A single one will suffice.  The engineering problem of producing various machines for various jobs is replaced by the office work of ‘programming’ the universal machine to do these jobs.

Claude Shannon, the father of information theory, then followed up by showing how Boolean logic could be represented electronically, incorporating not just calculations, but logical statements built from terms such as “and,” “if, then,” “not,” “or” and so on. These were engineered into logic gates that would allow machines to simulate thought.
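To make Shannon’s insight concrete, here’s a minimal sketch (in Python, purely for illustration, not anything Shannon himself built) of how a handful of primitive logic gates can be composed to do arithmetic:

```python
# Shannon's insight: Boolean terms like "and," "or" and "not" can be
# built as electrical circuits. Here they are as functions instead.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):
    # "a or b, but not both" -- composed from the primitives above
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    """Add two bits: the 'sum' and 'carry' of binary arithmetic."""
    return XOR(a, b), AND(a, b)

for a in (0, 1):
    for b in (0, 1):
        s, carry = half_adder(a, b)
        print(f"{a} + {b} -> sum {s}, carry {carry}")
```

Wire enough of these gates together and you get a machine that can add, compare and branch, which is all that “simulating thought” requires at the hardware level.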

This subtle shift, from specialized to universal machines, created enormous ripple effects which are still reverberating today.  As Turing himself would later write:

The survival of the fittest is a slow method for measuring advantages. The experimenter, by the exercise of intelligence, should be able to speed it up.

And that’s what Steve Jobs’ appearance on the cover of Time signified. The process of evolution was about to speed up, exponentially.

An Office in a Box

The early universal machines, such as the ENIAC, were unwieldy, both physically and financially, so they were used exclusively by large institutions. That’s why Thomas Watson of IBM reportedly thought that there was a world market for only five of them. In time, though, computers became small and cheap enough for personal use.

I remember those early personal computers. They were mostly for goofing around. Our parents bought them for us as high-priced toys that they hoped would be more educational than TV. In truth, they didn’t really know what we were doing on them, but there was a sense that computers were the future, so we were given free rein.

Things really changed when Dan Bricklin created VisiCalc, the first spreadsheet program. By the 1990s, computers made many basic tasks easy enough to do yourself, which put a lot of clerical staff out of work. Still, you can see why Solow was skeptical: automating office tasks did not, by itself, drive significant productivity gains.

The real value, looking back, is that we began to simulate.  We would rework documents endlessly before printing them out.  Finance and budgeting became an exercise in scenario planning.  With a universal machine at your fingertips, you could spot problems before they entered the real world and could do actual damage.
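In spirit, that kind of scenario planning looks something like this sketch (in Python rather than a spreadsheet, and with invented numbers), which stress-tests a budget against thousands of hypothetical futures before a single real dollar is at risk:

```python
import random

def simulate_year(revenue=1_000_000, cost=900_000):
    """One hypothetical scenario: next year's profit under uncertain growth.
    The 5% mean growth and 10% volatility are invented assumptions."""
    growth = random.gauss(0.05, 0.10)
    return revenue * (1 + growth) - cost

# Fail in the model, not in the real world: run 10,000 possible futures.
outcomes = [simulate_year() for _ in range(10_000)]
losing = sum(1 for profit in outcomes if profit < 0)
print(f"Chance of a losing year: {losing / len(outcomes):.1%}")
```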

Turing Goes Mobile

It is not just our computers that have become universal machines, but our phones as well. What used to be a highly specialized device, used only for communication, has become a universal one that handles a variety of tasks, from e-mail to navigation, even serving as a level for hanging pictures or a monitor for our vital signs.

A smartphone is just the kind of universal machine that Turing envisioned. Its function at any given time isn’t dependent on how it was engineered, but on what software we download for it. My smartphone is not only different from yours, but probably much different from what it will be in six months’ time, without altering a single molecule of hardware.

Yet for all the convenience of being able to stay in touch on the go, the real productivity boost we get from smartphones is their ability to simulate. Much as personal computers allowed us to test out documents and spreadsheets before they became final, our smartphones allow us to simulate the real world.

When we want to navigate through town, pick a restaurant or a movie, or do just about anything else, we can simulate the experience on our smartphones. If it looks crappy, we don’t do it, which saves time and money. As augmented reality navigates the hype cycle and eventually gives way to holographic technology, our power to simulate will expand.

The Third Industrial Revolution

In his new book, Makers, Wired editor turned entrepreneur Chris Anderson explains that we’re now in the midst of a third industrial revolution. The first began in the late 18th century with the invention of the steam engine, while the second got started in the early 20th century with the assembly line and the creation of the modern factory.

The seminal technology of this new industrial revolution is computer-aided design (CAD) and, again, simulation is at its core. Designers can experiment in the virtual world before trying things out in the real one. Then they can build rapid prototypes cheaply with 3D printers, CNC routers and laser cutters.
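Here’s a crude illustration of that workflow (a toy Python calculation with invented loads and dimensions, not real CAD software): sweep a design parameter in the virtual world and only send the survivors to the printer:

```python
# Toy design check: a plastic cantilever bracket holding a 10 N load.
# All numbers are invented for illustration. Reject any wall thickness
# whose tip deflection exceeds 1 mm *before* printing anything.
LOAD_N = 10.0            # assumed load on the bracket
LENGTH_M = 0.05          # 5 cm arm
WIDTH_M = 0.02           # 2 cm wide
E_PLASTIC = 2.0e9        # rough Young's modulus for ABS, in pascals

def tip_deflection(thickness_m):
    """Classic cantilever formula: delta = F * L^3 / (3 * E * I)."""
    I = WIDTH_M * thickness_m ** 3 / 12   # second moment of area
    return LOAD_N * LENGTH_M ** 3 / (3 * E_PLASTIC * I)

for mm in (2, 3, 4, 5, 6):
    d = tip_deflection(mm / 1000)
    verdict = "print it" if d < 0.001 else "back to the drawing board"
    print(f"{mm} mm wall: deflects {d * 1000:.2f} mm -> {verdict}")
```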

Much like desktop computers and smartphones, manufacturing technology is becoming universal. It can be programmed to make almost anything, including airplane parts. Even the assembly line is being replaced by a new breed of industrial robots that enables factories to retool in minutes rather than months, further reducing the cost of failure.

And it’s not just the product that can be simulated, but the market too. Anyone with an idea has a variety of crowdfunding options, such as Kickstarter and Indiegogo, where they can gauge demand as well as raise financing. If the campaign is successful, they get not only money for their venture, but a built-in market to sell to.

If it’s not successful, nothing gets built and little is lost.  That’s the beauty of simulation.  It can’t tell us that we’re surely right, but it can tell us when we’re wildly off the mark.

Finding 10,000 Things that Don’t Work

Thomas Edison was probably the most successful inventor in history, creating modern-day staples such as the electric light, sound recording and motion pictures. He also failed a lot, which didn’t bother him in the least. He said of his many false starts:

If I find 10,000 ways something won’t work, I haven’t failed. I am not discouraged, because every wrong attempt discarded is another step forward.

Therein lies the secret to the simulation economy and the dissolution of the productivity paradox.  While failing 10,000 times in the age of Edison required superhuman fortitude, today it’s relatively easy because we have the opportunity to fail in the virtual world as many times as we like at minimal cost in blood and treasure.
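A toy illustration of that brute-force elimination (in Python, with an invented test problem): run Edison’s 10,000 attempts virtually and only build what survives:

```python
import random

def works(temperature, mixture):
    """Toy stand-in for an expensive real-world experiment: the design
    only 'works' in a narrow, invented sweet spot of two parameters."""
    return 180 <= temperature <= 185 and 0.30 <= mixture <= 0.33

# 10,000 cheap virtual failures instead of 10,000 costly real ones.
candidates = [(random.uniform(100, 300), random.uniform(0.0, 1.0))
              for _ in range(10_000)]
survivors = [c for c in candidates if works(*c)]
print(f"{len(candidates) - len(survivors)} ways that won't work, "
      f"{len(survivors)} candidates worth building for real.")
```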

We can experiment with business models, tweak designs, rapidly prototype, present to investors and test the market, all during our morning coffee. As our technology advances further, these simulations will become more realistic through holographic technology and agent-based models.
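An agent-based model, at its simplest, means simulating many individual actors and letting the aggregate result emerge from their separate choices. A bare-bones sketch (in Python, with invented parameters, not any particular modeling framework):

```python
import random

def simulated_demand(price, n_agents=1_000):
    """Each agent gets its own private willingness to pay (an invented
    distribution); demand emerges from individual decisions rather
    than from a single top-down formula."""
    return sum(1 for _ in range(n_agents)
               if random.gauss(50, 15) >= price)

# Try candidate prices on the simulated market before the real one.
for price in (30, 50, 70):
    print(f"At ${price}: {simulated_demand(price)} of 1,000 agents buy")
```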

The more we continue to improve our ability to experiment in the virtual world, the more we will succeed in the real one.

– Greg

9 Responses
  1. Carlos permalink
    January 6, 2013

    I just found your site and I find your blogs excellent. I would like to contrast today’s blog, “The Simulation Economy,” with your blog “The Lesser Known Revolutions That Created the Modern World.”

    The point of the contrast is that today’s blog reflects the reduction, ad nauseam, of innovation to a singular business event – where culture, science, technological moment, necessity and overall world view are not taken into the equation. In fact, what some people call “timing” is condensing all of the above and the seizing of opportunity, which then creates the business event.

    The reductionist approach, from many a management or business guru’s point of view, is that it is only the market, stupid. Or further, it is only the “buzz” of the market and management zombies’ latest capture of a new trend. But true innovation really depends on much more than business processes – the convergence of such processes into the “office in a box,” in my view, consists only of a time-stamped advance in running things, rather than the stuff of true innovation.

    True innovation is when discovery and invention enter utility – not just improving the already known. That may be product innovation, but not true innovation. Example: solid-state physics, by utilizing quantum mechanics, created the path to semiconductors – this was the discovery phase. Then, when combined with the need to amplify signals and make faster switches, the transistor was invented. By the time the integrated circuit came into use, the innovation was fully accomplished and applications went everywhere, creating the modern world.

    So, some of your past blogs show a brilliant link to the substrates that lead to innovation, and in that sense, the problem today is not the productivity paradox but the paradox of productivity, as we redefine what it is in productivity that is so overwhelmingly a measure of GDP over quality of life, world culture and the human experience in all its dimensions.

  2. January 6, 2013

    Good points, Carlos. Thanks for sharing.

    – Greg

  3. January 7, 2013

    Greg, thank you for the great observations and your impressive writing. Best wishes from Kyiv during Orthodox Christmas time! Looking forward to more reading on your blog this year.

  4. January 8, 2013

    Thanks for your support Sergiy. Have a great 2013!

    – Greg

  5. Robert Lewis permalink
    January 14, 2013

    Well written and interesting article, but I strongly support Carlos’ response. There seems to be an underlying wish in the new collective psyche to become the slaves of technology rather than its masters. (Believe me, if ever computers can do everything for you, you will be the slave – the robot – not the computer. The always-on mobile is just a primitive example!)
    Also, I read in a Deloitte report last year that 65% of business computer simulations don’t work for the top companies that use them – at least not yet.

    Human life is complex – always has been and always will be. The achievement of happiness, the prime objective (otherwise, why bother to live?), is not a zero-sum game and does not rely on 100% logic, nor big bucks. If we can use technology to stimulate that 90% of the brain that the average human supposedly does not use, that would be a giant step in the right direction – rather than trying to pass thinking work on to robots. (Basic operational failures are already being transferred from people to machines, e.g. “the system’s down” / “there’s a power cut.” Somehow, if it’s ‘that silly machine again’, it’s no one’s fault.)

    I tentatively suggest that one result of this is the oft-commented, marked decline in the visibility, transparency and quality/imagination of senior management leadership over the last 20 years, across all types of industry and organisation (the “if the metrics look right, no one can blame us” syndrome; remember the lazy travel agent and “computer says no” in the Little Britain series?).

    To confuse speed and convenience with productivity is a fundamental flaw in thought – otherwise, playthings like Facebook, iPhones and tablets would really be contributing to serious thought. Too often in business, such ‘convenience’ merely enables not very intelligent people (at all levels) to make hasty decisions, under the delusion that the same technology can easily correct the negative results – usually by involving (and unnecessarily inconveniencing) more people or communities after the damage has been done.

    Data is not wisdom – it’s not even knowledge; it’s just input, and if your cranial OPS is f—d, no knowledge, much less wisdom (lessons from applied knowledge) can possibly emerge.

  6. January 14, 2013

    Thanks Robert. Some interesting points. However, you use simulations to eliminate possibilities. No one thinks they should replace real-world experience altogether.

    As for now, if you have gone to the movies, bought music from any major label, sent or received a package by a major overnight courier or, in fact, participated in modern life in any way, you’ve bought the product of a simulation (i.e. they all have industry-specific simulation software).

    – Greg

  7. January 14, 2013

    Greg,

    Very nice article, with a very strong understanding of today’s reality.

    Let me add some ideas. The phenomenon of the productivity paradox is the effect of the asymmetric digitization of the market (“demand and supply”). We already have digitized “supply” but still analog “demand,” based on old-fashioned marketing.

    To provide digitized “demand” we need a digitized “customer” and entirely new digitized marketing and management.

    The new digital market will operate with global E-companies (digital agents) and Digital Intelligence for conducting and processing deals and providing arbitrage.

  8. January 14, 2013

    Interesting ideas as always, Sergei. However, there is significant evidence that the productivity paradox no longer exists and hasn’t for a while. The problem now seems to be that robots (both the hardware and software varieties) are crowding out human employment.

    – Greg

  9. January 14, 2013

    Up to now the main accent in IT has been on computing. But according to the Digital Laws, computing has already exceeded customers’ expectations and is not as relevant as it was in the past. Robots will be designed within a new management cybernetics and will operate in an information ecosystem with an institutional (legal) basis.

    Today’s Internet is flat, with unstructured content turning into useless Big Data. Contextual, semantic compression of information and business utilization of the data become available within an ecosystem with several (4 or 5) levels of self-control: agent activity, community relationships, monitoring, arbitrage, etc. (In this case it can be interesting to look at Stafford Beer’s Viable System Model in “Brain of the Firm.”)

    Big Data and mobile computing need the semantic compression of information; Cloud Computing, the creation of homeostatic, self-developing ecosystems; Machine Intelligence, the management of socialized, man-sized systems (where man is the main part of the activity).

    So, a semantic, institutional and digitized economy will significantly increase employment in new digitized marketing and new management. Today’s corporate management model is hierarchical and very slow in comparison with global innovation processes.

    I heard an interesting idea that the main driver of the market is the Idea that generates big profit. The idea that the Internet can generate more profit within the new Digital Market should, it seems to me, become popular as a way out of the crisis…
