Where Original Ideas Come From
Revolutions are seldom solo efforts. Isaac Newton was the greatest scientist of his age and not one known for modesty, but even he had to admit, “If I have seen further it is by standing on the shoulders of giants.”
Thomas Kuhn made a related point in his classic, The Structure of Scientific Revolutions. He argued that precedence in science is somewhat arbitrary—a matter of perspective rather than fact—because new discoveries are rarely tied to the work of just one person or team.
Yet, while very few ideas are truly original, there are exceptions. Sometimes an important new idea seems to have no precursor or precedent, but springs forth whole from a single mind and completely alters our perception of how the world works. Although these are rare, they have a lot to teach us about how to become more creative ourselves.
The Idea That Launched Western Civilization
In the history of the world, very few ideas rival the impact of Aristotle’s logic. In terms of longevity, only Euclid’s geometry is in the same league. While there was healthy philosophical discussion before Aristotle, it was he who took it out of the realm of mysticism by creating a system for judging the internal consistency of particular statements.
At the core of Aristotelian logic is the syllogism, which is made up of propositions that each consist of two terms (a subject and a predicate). If the syllogism’s form is valid and its propositions are true, then its conclusion must be true as well. Much of our information technology today is based on Aristotle’s original idea.
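To make that concrete, here is a minimal sketch of the best-known valid syllogism, rendered in modern set terms rather than Aristotle’s own notation (the sets and names are mine, purely for illustration):

```python
# A minimal sketch (my framing, not Aristotle's notation): treating terms as
# sets, the classic "Socrates is mortal" syllogism reduces to set membership
# and set inclusion.
mortals = {"Socrates", "Plato", "a horse"}   # things that are mortal
men = {"Socrates", "Plato"}                  # things that are men

premise_1 = men <= mortals       # "All men are mortal" (men is a subset of mortals)
premise_2 = "Socrates" in men    # "Socrates is a man"

if premise_1 and premise_2:
    print("Socrates" in mortals)  # True: "Socrates is mortal" follows necessarily
```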
Amazingly, Aristotle’s logic survived fully intact for more than two thousand years, until the late 19th century, when flaws emerged having to do with a paradox in set theory. The effort to resolve those problems led to Gödel’s incompleteness theorems and eventually to the Turing machine that launched the computer age.
A Theory Of Information
During World War II, Claude Shannon spent his time developing and breaking codes for the military (and struck up a brief collaboration with Alan Turing). He was known to his colleagues as quirky, quiet and brilliant, but no one was quite prepared for his 1948 paper, A Mathematical Theory of Communication, which created the field of information theory.
The basic idea was that information is separate from content. Shannon proved that information can be broken down into quantifiable units he called binary digits (or bits for short), which represented two alternative possibilities, much like a coin toss. Add up the coin tosses and you arrive at the total amount of information required to communicate an idea or instruction.
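As a rough illustration of the coin-toss idea (the function and the numbers here are mine, not Shannon’s), the information needed to single out one of several equally likely alternatives is simply the base-2 logarithm of the number of alternatives:

```python
import math

# A rough illustration (my example, not Shannon's): with equally likely
# alternatives, the information needed to pick one of them is the number of
# yes/no "coin tosses" required, i.e. log2 of the number of alternatives.
def bits_needed(num_alternatives: int) -> float:
    return math.log2(num_alternatives)

print(bits_needed(2))    # 1.0  bit: one coin toss distinguishes two messages
print(bits_needed(26))   # ~4.7 bits to single out one letter of the alphabet
print(bits_needed(256))  # 8.0  bits (one byte) for 256 alternatives
```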
In retrospect, it seems like a relatively simple concept, but its impact has been positively enormous. It touches everything we do in the digital age, from storing files on a computer drive to talking on a mobile phone to compressing videos. Every time you watch a video on YouTube, you have Shannon to thank for it.
Engineering At Nano-Scale
When Richard Feynman stepped up to the podium to address the American Physical Society in 1959, he had already gained a reputation as both an accomplished scientist and an iconoclast (during his time on the Manhattan Project, he became famous for his safecracking and pranks).
Yet few could have predicted that, in less than an hour, he would create a completely new field—now known as nanotechnology—before their very eyes. Starting from a simple suggestion about shrinking an encyclopedia to fit on the head of a pin, he extrapolated to molecular machines and radical new medical therapies.
While today nanotechnology is a thriving, multibillion-dollar industry, back then even a very simple computer took up an entire room—and a large room at that. Feynman singlehandedly imagined not only the possibility of engineering on a molecular scale, but even some of the techniques to make it possible, many of which are still in use today.
Feynman soon went on to other things and played little part in the further development of the field he had conceived, but his little talk remains one of the most dazzling bursts of creative thought in recorded history.
The Common Thread
Thomas Kuhn, whom I mentioned above, became famous for his concept of paradigm shifts. He pointed out that even great scientists get stuck in a particular way of thinking about things, even when their theories no longer match established facts. That’s why it is usually an outsider—or a new generation—that tends to break new ground.
Truly original ideas rarely come from diligently working within one field, but rather from synthesizing across domains. And therein lies the secret to how groundbreaking new ideas like logic, information theory and nanotechnology come about. Aristotle, Shannon and Feynman were stars in their fields, but also ventured outside them.
Aristotle reportedly wrote over 200 works, across fields as diverse as biology, physics, ethics, politics and aesthetics. Outside of mathematics, Shannon was an inveterate tinkerer and successful investor, and he even developed systems to win at the gambling tables. Feynman was an early computing pioneer and published an important paper on virology.
All of this poses an important question for how we run our businesses: Why do we expect bright young graduates to enter a particular field, spend a few years learning to master it and continually repeat that experience over an entire career? Is groundbreaking innovation even possible if we spend our time perfecting our ability to do rote tasks?
In order to create new paths, we first must venture outside of those that we have already travelled.
– Greg
Greg, My book is almost ready to publish.
Do I have permission to quote three paragraphs from here please? The reason is that a lot of high flying economists think that only they can come up with radical solutions that may work.
I have quoted Digital Tonto, 24th August 2014, as the source.
What a great post! The issue of synthesis is so “obvious” yet rarely considered.
Sure. Go ahead.
– Greg
Good article as usual, but I’m afraid I am going to disagree to some small degree, both with some of the specific details you use to support your case and with the general conclusions. This disagreement is not unusual in our intermittent discourse 🙂
To start with the conclusion: “In order to create new paths, we first must venture outside of those that we have already travelled.”
Simply to travel outside the paths we have already traveled is not sufficient; our perceptions must already be sufficiently ‘educated’ and ‘open’ to alternative possibilities.
We know that some people are inherently or natively generalists – their brains function syncretically – whilst others are inherently or natively specialists – their brains mostly function synthetically. Despite first appearances those names you have cited were all of them generalists – syncretists who were highly educated in or receptive to more than one specialism – and also beyond specialisms to much wider generalisms.
As an old aphorism has it: chance favours the prepared mind. To ask a *good* question one must already know at least 90% of the answer. Those whom you cite were already prepared.
Sending bright young graduate specialists off to walk untrodden paths is utterly pointless if they don’t have the right type of mind and the preparation and openness of mind to benefit from such walks. They will simply carry their assumptions and blinkered mindsets with them.
We need more generalists: modern education from the first year of schooling is, in most industrial nations, wrongly focused on the production of ever more specialists, even when the individual’s mindset and neuro-psychology makes them inherently a generalist. We do little to detect and foster generalists.
Whilst synthesis is useful, it is not sufficient. Newton may have stood upon the shoulders of giants but his efforts were much more than merely synthetic. He was also a syncretist.
And now for something completely different: to two of the specific names you’ve cited.
Newton was not the greatest scientist of his age; he was [and is] merely the best known in his culture: there is a difference. Hooke and Leibniz have equal claim within European-derived societies and cultures. Hooke, as the father of microbiology [amongst many other things], saw further and deeper than Newton: he may ultimately prove to have been by far the better and greater scientist. Ironically, his speculations on the nature of “the small” in biology lead us rather directly to Feynman’s nano-tech in biology and medicine, although he would never have conceived that name.
Leibniz, meanwhile, if we care to follow the path carefully, leads us rather directly to Wikipedia and the Internet of Things via the invention of computing and the explosion of the Internet: his thinking on the subject of indexicality and placement directly echoes the fundamentals of Vannevar Bush, Thomas Nelson, Hans Reiser and Tim Berners-Lee: the content *IS* the address.
So who, then, is the greatest scientist of his age? I’m not convinced it’s Newton, although he does make a handy illustration for your general point.
It is of course only my opinion, but I think that in some degree you mischaracterise Shannon’s work. Information and content are indeed distinct, but the key issue within his paper is how we differentiate noise from data, and how we use data to build information and content and then *communicate* those things. The core of the paper is that “noise is the negative reciprocal of probability” – the signal-to-noise ratio. The noisier the channel, the less easily and successfully we communicate: and, strangely enough, through his associated hierarchy of wisdom and wealth [which defines what information *is* – and thus what knowledge is] that leads us directly back to innovation creation and new discoveries: each step up the hierarchy is done with purpose:
noise > data > information > knowledge > wisdom/wealth.
We select two or more data points from the noise with purpose to create data.
We select two or more data sets with purpose to create information.
We select two or more … et seq.
This is the deductive process – a synthesis – where we treat the hierarchy as a bottom up process. When we invert the hierarchy and use anecdotal wisdom and knowledge with induction we have syncretic processes. Chance once again favours the prepared mind.
We can’t simply push young graduate specialists out into the wild to fend for themselves; we have to do much more than that to prepare them for their journey. We have to open their eyes beforehand and ensure they have skills and capabilities suited to their mind and neuro-physiology and learning abilities. They need to be helped to see, and not merely to look.
Thanks Robert. I agree that becoming a generalist takes preparation. My point was that we prepare people to be specialists and there are actually disincentives to crossing domains, when we should be encouraging it.
I don’t think I mischaracterized Shannon’s work, but I did explain it narrowly. Thanks for filling in some gaps.
– Greg
Greg – let’s not forget the power of sex and genetics in evolution … sex is important in the evolution of new ideas.
It’s not productive to try having sex on your own, and it does nothing for evolution.
Likewise – genetic inbreeding leads to problems and does nothing for evolution.
Evolution benefits from chance and some randomness … as do new ideas 🙂
Thanks Martin. It’s always good to have a dose of sex in the morning :-)
Thank you for this intelligent and especially genteel exchange of ideas. It is so rare and welcome to read people who appreciate the other, yet want to clarify and add to the ideas of the other without being venomous. As a self-proclaimed generalist with absolutely no claims to innovative ideas, I’m grateful for your mutual appreciation(s) of the generalist mentality.