The Deployment Age

A couple of weeks ago James Gross, co-founder of Percolate, had me speak at their Transition conference. I talked about Carlota Perez, her theories, and the transition to the deployment period that we are currently undergoing. The talk, as I remember it, (plus some stuff I had to cut for time) is below. I’ve also added some additional material as sidenotes.

Perez’ theory describes the path a technological revolution, like the Industrial Revolution, takes and the social, economic and institutional changes that go along with it. The jury is still out on the theory, and there are plenty of reasons to doubt it. But if it successfully predicts what happens over the next ten years, it will have gone a long way toward proving its power.


I’ve been in the technology business for more than thirty years and for most of that time it’s felt like constant change. Is this the way innovation progresses, a never-ending stream of new things?

If you look at the history of technological innovation over the course of decades or centuries, not just years, it looks completely different. It looks like innovation comes in waves: great surges of technological development followed by quieter periods of adaptation.

The past 240 years have seen four of these great surges and the first half of a fifth.

We’re all familiar with the industrial revolution, with its mechanization of textile mills, the spread of water power and canals, and the massive increase in productivity. This was followed by the spread of the railways and steam power, the Age of Steel, and the Age of Oil, Autos, and Mass Production (from the beginning of the 20th century to the 1970s.) The wave that we are currently in, the Information and Communications Technology Revolution, started around 1971.

These cycles have eerie similarities. Each is characterized by

  • some critical factor of production suddenly becoming very cheap,
  • some new infrastructure being built,
  • a laissez-faire period of wrenching innovation followed by a bubble,
  • a post-bubble recession,
  • a re-assertion of institutional authority, and then
  • a period of consolidation and wide spread of the gains in productivity from using the new technology.

The repeated pattern in otherwise dissimilar eras seems like it has to be more than coincidence. What dynamic could cause it?


Another accounting of these waves is from Chris Freeman and Francisco Louçã’s As Time Goes By: From the Industrial Revolutions to the Information Revolution. If you find Perez’ theory interesting, you should read As Time Goes By. Perez’ book has the overwhelming strength of just coming right out and saying what it wants to say, but As Time Goes By is tighter, better argued, addresses alternative views, describes the waves with economic detail, has better citations, and shows its work.



Economist Carlota Perez in her 2002 book Technological Revolutions and Financial Capital puts forward a theory that addresses the causes of these successive cycles and tries to explain why each cycle has a similar trajectory of growth and crisis. Her answers lie not just in technological change, but in the social, institutional, and financial aspects of our society itself.

We like theory because it tells us why, but more than that, a good theory is predictive. If Perez’ theory is correct, it should allow us to predict what will happen next in the current technological cycle. I’m going to give you the outlines of her theory first, then use it to make some predictions about where we are now and what will change over the next ten years or so.

Each cycle is initiated by a group of technologies interacting to create a “Technological System.” Noah [Brier, the other co-founder of Percolate] mentioned [in his talk introducing the conference] Donella Meadows’ definition of a system: “an interconnected set of elements that is coherently organized in a way that achieves something.”1 A Technological System is a set of technologies that interconnect as a platform to allow other innovations. When this system allows a fundamental and basic innovation, the system can cause a technological revolution.

For example, here is a group of some of the technologies involved in the start of the Railroad revolution. These technologies weren’t all invented at the same time, or even commercialized at the same time, but together they formed a system that made railroads viable. This diverse group of technologies included the high-pressure steam engine, precision machine parts, and improved metallurgy to allow rails to be cast. There were also innovations that we often don’t think about as technologies–things like corporate limited liability and the stock market–that allowed the concentration of the massive amounts of money needed to build the railway infrastructure.

The railroad was invented to solve a problem: moving coal from the mines to the piers, where it could be loaded on ships or barges. But the railroad addressed a broader and more basic need: moving things around quickly and cheaply. It did more than move coal from mine to ship, it moved goods from city to city, it moved mail from city to city, it moved people from city to city, and those people started living away from their hometowns en masse for the first time. These uses then enabled many other industries to flourish. This is the difference between a technological revolution and a plain old technological system: a technological revolution changes the whole economy.

Another constellation of technologies started the ICT revolution: semiconductors, integrated circuits, computers, software, computer networking, and mobile phones are some of them. Some of these together allowed the creation of the microprocessor. Originally built to power portable calculators, it ended up addressing another broad and basic need and started a new technological revolution.


An interesting thing about technological systems is that they are not just a bunch of technologies in the same place at the same time, they are systems: their further development is linked together. When some of the technologies in a linked system progress faster than others, the laggards become the limiting factor in the system. Thomas Hughes called such a laggard the ‘reverse salient’2, with all that implies. The system cannot progress until the reverse salient is cured, so economic resources are directed to its improvement. The invisible hand detects what is holding progress back and redirects resources to cure it, so the system evolves faster than its individual technologies would on their own. If you spot a reverse salient, you have found a problem whose solution is far more valuable than it may first seem. Currently, the most obvious reverse salient is battery technology.
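The reverse-salient dynamic can be sketched in a few lines of code. This is my own illustration, not anything from the talk: the component names and capability numbers are hypothetical, chosen to echo the battery example.

```python
# Illustrative sketch of Hughes's "reverse salient": a technological
# system advances only as fast as its least-developed component, so
# resources directed at the laggard yield the most system-wide progress.
# Component names and capability scores below are hypothetical.

def system_capability(components):
    """The system as a whole is limited by its weakest component."""
    return min(components.values())

def best_investment(components):
    """The highest-leverage place to invest is the laggard itself."""
    return min(components, key=components.get)

# A hypothetical electric-vehicle system: motors and power electronics
# are mature; batteries lag and cap the whole system's capability.
ev = {"motors": 9.0, "power_electronics": 8.5, "batteries": 3.0}

print(system_capability(ev))  # 3.0 -- the batteries' score, not the average
print(best_investment(ev))    # 'batteries' -- the reverse salient
```

The point of the `min` in both functions is the whole argument: improving an already-strong component moves the system not at all, which is why the market keeps redirecting money at the laggard.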

When the new technological system starts to promise commercial opportunities, we have reached the ‘irruption’ phase of the cycle. At this point the economic logic of the new system is starting to become evident and there is a promise of a new ‘Techno-Economic Paradigm’ (TEP). The techno-economic paradigm shows a way of using the new technology to make businesses more efficient and profitable, and so more competitive than existing businesses. But existing businesses have a techno-economic paradigm of their own, the one associated with the previous technological revolution, and they have a lot invested in the old way of doing business. So adoption of the new technologies is slow at the beginning and incumbents resist it.

But as entrepreneurs and early adopters realize the potential of the new TEP, intrepid financiers start to show interest. The financiers reason that if the new TEP can really help create much more efficient businesses then massive disruption will occur and they should be able to make money investing in companies that produce the new technologies. Money starts to flow into the sector.

At this point technological development is primarily exploratory, so many of the investments the financiers make fail. But a few hit big, and other financiers notice. In the current ICT revolution, think about the investments in Apple Computer and Intel by the then new venture capital industry3.

Seeing the money to be made, other financiers pile into the new technologies. As the scale of the new opportunity becomes more evident with each successful investment, investment in the sector grows exponentially. This is the ‘frenzy’ phase of the cycle. For example, the small venture capital industry of the 1970s led to the much larger VC industry of the 1980s and then to the dot-com frenzy of the 1990s.


Venture capital, as we think of it, is primarily a feature of the current cycle. Other cycles had other funding mechanisms. E.g.:

The consumer boom of the 1920s was financed through a vast expansion of credit, with personal debt nearly doubling as a proportion of income. And since banks did not make consumer loans, new lending channels had to be created. These included installment sales finance companies (such as the General Motors Finance Company founded in 1919), retail installment lenders (particularly department stores), licensed consumer finance companies (such as the Beneficial Loan Company) and Morris Plan industrial banks4.

Many of the entrepreneurs of each cycle, though, were funded initially by wealthy patrons. Also see: Brunt, Liam. “Rediscovering Risk: Country Banks as Venture Capital Firms in the First Industrial Revolution.” The Journal of Economic History 66.01 (2006): 74-102.

At the same time, incumbent companies start to react to the new technologies both operationally and strategically. Operationally, they start to adopt the new technologies, although this adoption is often piecemeal and uneconomic; regardless, companies do not want to be left behind. Strategically, they try to respond to new companies and the threat of disruption by adopting some of the tactics financial capital is using, while firewalling them from the main thread of their business. In this cycle this has taken the form of corporate venture capital groups, skunkworks and innovation ‘divisions’ reporting to the CEO, ‘intrapreneurship’, mandates to innovate (though without taking real risk), and so on.

Meanwhile, the casino capitalism of the frenzy leads to overinvestment and a financial bubble. Bubbles, being bubbles, cannot last, so they don’t. When they pop there is a recession. (I want to reiterate that this is not just the history of the recent past but the common events that have occurred in all the cycles. So, after the dot-com bubble we had a recession. After the roaring ’20s, there was the Great Depression, etc.)

During the frenzy society’s institutions could not effectively assert themselves because there’s never a large constituency for fixing something that’s not broken. But during the recession there is a reassessment of what just happened. Rules are put in place to restrain the worst excesses of the recent bubble. After 1929 this meant the Securities Acts of 1933 and 1934 and the oversight of the SEC. After the dot-com bubble, this meant Sarbanes-Oxley, among other things (and then, after the real-estate bubble–Perez considers both bubbles aspects of the same frenzy–Dodd-Frank.) Financiers also learn a new humility, not least a recurring fear that a new bubble might be forming at any time, and a constant watchfulness for it.

The zeitgeist changes from creative destruction to creative construction. Financial capital pulls back and production capital takes over the funding of innovation.

The distinction between ‘financial capital’ and ‘production capital’ is key to Perez’ explanation, but it took me a while to figure out exactly what she meant by it. My long-ago operations research textbook had a cartoon showing one MBA talking to another: “Things? I didn’t come here to learn how to make things, I came here to learn how to make money.” This is the view of financial capital. The view of production capital is exemplified by Peter Drucker: “Securities analysts believe that companies make money. Companies make shoes.”

Financial capital invests in innovation because it thinks it can earn a good return. It is primarily controlled by agents that are external to the means of production, financiers. Production capital invests in innovation to add to and make more efficient its production resources. It is controlled by the management of companies. Financial capital has a casino-like mentality: nine losing bets and one that pays 12x is a good year. Production capital has a planning mentality: a failed investment is a failure of management, almost a moral failing.

When financial capital pulls back, both because of decreased opportunities for massive exits because of the new regulation and because of the new conservatism that fear teaches, production capital takes over. Production capital is not looking to create entirely new markets and disrupt incumbents–they are the incumbents–it is looking to improve the means of production through innovation. Production capital funds predictable innovation: classic sustaining innovation, not the riskier exploratory innovation. This period, when production capital starts to take control, is a period of synergy, with less technological volatility, fewer business failures, more (and longer-lasting) employment, and less income inequality. It also heralds a decrease in corporate dynamism (for example, the fifty largest companies going into the 1950s were pretty much the same as those coming out of the ’50s.)

But because production capital is not looking for radical innovations, the kind that further innovations are built on, the opportunity space of the technological revolution starts to run dry. Stagnation sets in (think U.S. manufacturing in the 1970s.) As returns to investing in large companies become small, savvy financiers start to look for something new. When they find a new technological system and a new techno-economic paradigm that promise a new technological revolution, they start to move their money there, and the next cycle begins.

The process is driven by the nature of change in three spheres. I’ve talked about two, the technological and the economic. But the third is just as important, the institutional. At the beginning of a technological revolution social and institutional ideas reinforce the previous paradigm and resist the new one. But as the cycle progresses, first social norms and then political and institutional norms coalesce into a new framework that reinforces the dominance of the new paradigm. The technology changes the world in a way that entrenches the technology, and the old paradigm loses all power. While some older people argue whether we would be better off without some of these technologies, the vast majority of the younger generations consider only how to make them better.


An example of the entrenchment of a TEP from a previous cycle:

The real reason for the decline of the English canals is, however, to be sought in the fact that English internal commerce had largely reconstructed itself and that the railway transport had come to suit it far better than water transport. English agriculturalists, for instance, had changed from selling wheat to selling dairy produce, and the water-ways were too slow for the transport of milk and butter, whatever they had been for cereals. The coal merchant was unwilling to provide large warehouses for coal; he preferred to have it in the railway trucks and get it as he wanted it; he could then work with smaller capital5.

In 1953 the establishment believed that what was good for GM was good for America; it’s not a stretch to say that over the next 20 years the establishment will begin to believe, and act as if, what’s good for Google, Facebook, and Apple is good for America.

Perez’ theory divides each cycle into two main parts: the installation period and the deployment period. Installation is from irruption to the crisis, and deployment is after the crisis. These are the yin and the yang of the cycle. Some of the differences between the two periods we’ve already mentioned–creative destruction vs. creative construction, financial capital vs. production capital, the battle of the new paradigm with the old vs. acceptance of the new TEP, etc.

A few other characteristics worth mentioning:

  1. In the installation period, much of the capital goes to building the infrastructure the technological revolution needs. The bubble causes it to be overbuilt, so as the deployment period begins there is excess infrastructure and low cost to use it. This happened with canals, railways, and most recently with the telecom infrastructure. The deployment period gets to take advantage of the low cost glut of infrastructure.
  2. In the installation period, much of the new technology needs to be ‘pushed’ to market: customers don’t necessarily understand the benefits and need to be sold on each new thing. In the deployment period, technology is ‘pulled’: customers demand the new technology and are often ahead of the technology companies on what they need and how they will use it. The sources of innovation shift from exploratory companies to forward-thinking customers.
  3. Also, in the deployment period, companies move from creating entirely new markets–as they must with entirely new technologies–to expanding and consolidating their existing markets. To appeal to more customers in a given market they must make their products both cheaper and easier to use.
  4. And last, during the deployment period the technology becomes familiar, easier to implement and repair. This means the economic benefits of tech knowledge that in the installation period accrued to a small, well-educated elite start to spread out across the population, lessening economic inequality.


So what does the deployment period look like? Let’s look at the 1950s, the first half of the deployment period for the last wave, The Age of Oil, Autos, and Mass Production.

Alexander Field, in his landmark study of the history of productivity growth in the US, A Great Leap Forward, notes that the years before World War II–the installation period, according to Perez–were notable for rapid innovation in manufacturing. Additionally, manufacturing was the largest contributor to productivity growth in that period.

But after the war, this changed.

The view of the quarter century following 1948 as one of more moderate innovative advance is consistent with the enumerations of basic innovations by Kleinknecht, Schmookler, and Mensch. All of their series show peaks in the 1930s, particularly in the second half. Kleinknecht’s analysis, which runs through 1969, shows a big peak in total and product innovations in the 1930s, although process innovations peak in the 1950s. Schmookler’s data, which run through 1959, show a peak of forty-eight basic innovations in the 1935-1939 period, dwindling to zero in 1955-1959. Mensch’s series on basic innovations also peaks in 1935-1939 (at 13) before declining to zero in 1955-19596.

The disappearance of product innovations while some process innovation continues makes sense if innovators no longer seek new markets but continue to try to expand existing markets by making products cheaper.

Field also notes that during the 1950s and after, productivity growth moved from being dominated by manufacturing to being dominated by distribution7. The spread of a new infrastructure–the interstate highway system–was part of this, but the biggest factor was containerization, which allowed the older infrastructure (ships and railways) to utilize the newer technologies and infrastructure (trucks and highways) in a way that made them both far more productive. Productivity growth had moved from the new technologies themselves (manufacturing of trucks) to their use in society (distribution of goods) while encompassing the pre-existing fixed assets (the railroads and ships.)

It’s important to note that there wasn’t a conscious turning away from innovation in the 1950s. Many firms of the era continued to improve their products and use the improvements as a selling point. Manufacturers of electronic goods, for instance, were quick to tout improved products. But these innovations were decided on and funded by production capital through a careful (at least, relative to financial capital) planning process.

Even the larger innovations of the time were funded by production capital: large company research labs and government defense spending. Companies that spun out of this spending tended to also be funded by, or quickly taken over by, production capital. Shockley Semiconductor (funded by and a division of Beckman Instruments) and the Eckert-Mauchly Computer Corporation, discussed below, are good examples. Even when the traitorous eight left Shockley on their own entrepreneurial quest, forming Fairchild Semiconductor, they chose to be under the umbrella of an existing company rather than completely on their own.

Innovation was inside, or controlled by, incumbents. And incumbents, having been through a shakeout over the course of the Great Depression, were less anxious about being disrupted by contenders. J.K. Galbraith, who was to deployment what Schumpeter was to installation, wrote of the 1950s and 1960s:

Through marketing and planned obsolescence, the disruptive force of technological change–what Joseph Schumpeter called creative destruction–had largely been domesticated, at least for a time. Whereas large corporations had funded research leading to a number of important innovations during the 1930s, many critics now argued that these behemoths had become obstacles to transformative innovation, too concerned about the prospect of devaluing rent-yielding income streams from existing technologies. Disruptions to the rank-order of the largest U.S. industrial corporations during this quarter century were remarkably few8.

This change was not obvious at the time, even to the people who were directing it to happen. Companies and businesspeople continued to think of themselves as daring innovators even as the business climate shifted away from entrepreneurism. Galbraith again, writing in 1964:

Until recent times, senior officials of the mature corporation were inclined to assume the public mantle of the entrepreneur. They pictured themselves as self-reliant men, individualistic, with a trace of justifiable arrogance, fiercely competitive and with a desire to live dangerously. Individualism is the note that “sounds through the business creed like the pitch in a Byzantine Choir.” “They’re bred to race. It’s the same with people. It’s something that’s born into you.” “Business is tough–it’s no kissing game.” These characteristics are not readily reconciled with the requirements of the technostructure. Not indifference but sensitivity to others, not individualism but accommodation to organization, not competition but intimate and continuing cooperation are the prime requirements for group action…

To a surprising degree, American businessmen and writers about business have [stopped] interpreting our cooperative society as individualistic and [have stopped] concealing our quest for security in phrases like competition…Interdependence is recognized…Executive life, so far from being competitive and dangerous, is highly secure9.

The turn away from innovation was not because innovation was less revered–it was revered so highly that corporate executives continued to tell themselves they were fiercely competitive innovators long after it ceased to be true–it was because in a corporate world safe from the threat of upstarts, transformative innovation was economically unsound.

Where are we in the cycle? Perez, in a 2013 paper10, says we are now in the deployment period. This has big consequences for how you run your business.


Perez here makes a shocking claim I didn’t think the Transition audience would be interested in: financial capital’s job is done.

Financial capital in this cycle is none other than yours truly, the venture capitalist. Is our job done? If you look at the current VC pace of investing, we certainly don’t seem to think so. And the championing of Perez by some of the smartest VCs out there, like Fred Wilson, Chris Dixon, and Marc Andreessen, seems…odd. Why champion a theory that says you are now irrelevant?

To put a finer point on it, think about innovation funding in the 1950s: corporate development, corporate research labs like Bell Labs, and defense spending. You might, if you’re a student of VC history, also remember ARD, the proto-venture capital firm that funded DEC in the late 1950s.

But consider that ARD was, on the whole, a failure. Its investment in DEC was one of its few bright spots. But their success was on the back of founder Ken Olsen. Olsen, “The Ultimate Entrepreneur”, sold 70% of his company right off the bat to ARD for $70,000 (ARD also lent him an additional $30,000 which should be considered de facto equity) in 1957. When ARD liquidated in 1972 DEC was worth $400 million, giving ARD an IRR of some 55% per annum but leaving Olsen worth, probably, less than $40 million ($230 million in today’s dollars.) Not a bad haul, but not what people today would consider a fair deal for the ultimate entrepreneur. ARD could get this deal because Olsen had nowhere else to go for financial capital in the 1950s.

Even more starkly, the Eckert-Mauchly Computer Corporation, founded in 1947 by the designers of the ENIAC to sell computers commercially, had to sell itself to Remington Rand in 1950 because it could not raise money to continue as a separate business. This was not for lack of recognition of the promise of the technology, it was for lack of financial capital willing to fund innovative technology.

Is this what third-party funding for innovation will look like over the next ten years: ICT funding moving entirely to production capital and next-wave technological funding almost impossible to procure? Hard to imagine, but it’s what Perez’ theory predicts.

I’m going to make two overarching points about what the theory implies about what the next ten years will look like:

  1. Information and communications technology becomes ubiquitous but invisible, and
  2. Innovation becomes ubiquitous but small.

First point. In keeping with the theme of this part of the day: software eats the world and everybody ignores it.

There was a time when people would pitch me ‘internet companies.’ But unless you’re actually selling internet service, you’re no more an internet company than a company using electricity is an electric company. Over the next ten years there won’t be online stores, there will just be stores; there won’t be mobile-enabled taxi dispatchers, there will just be taxis. The expectation of sensible use of ICT will be total.

At the same time, because companies will need to grow by expanding markets, not creating new ones, there will be an economic mandate to make the technology cheaper and easier to use so it can be used in more places. These things together mean that ICT will be everywhere, but so integrated into products that it will be invisible.

How do you adjust your strategy for this?

Stop considering the technology a feature. Using the technology where it fits is no longer a feature, it’s a requirement. Connecting a thermostat to the Internet wirelessly is awesome, but calling it an Internet-enabled thermostat will start to be like calling a vacuum cleaner an electricity-enabled broom. And if your thermostat does not connect to the Internet, it will be bought only by retro-chic hipsters.

But if you use ICT in your product, it needs to be seamless. Your users shouldn’t need an instruction manual. Don’t scrimp on user interface and user experience design. Many startups have already gone down this road.

Venture capitalists have also been pushing the idea of the ‘full-stack startup’, a startup that doesn’t focus solely on its innovation, but builds the pieces that the innovation needs to function as a product. Full-stack makes sense when depending on others’ products would cause a poorer user experience, when the interconnection itself is a weak point. Apple has been a proponent of this design approach for some time, of course, but now every company will have to either build out their stack or more closely partner with other providers to create a seamless experience.

And, of course, as companies are forced to drive down the cost of both technologies and their adoption to grow markets, the technologies will be integrated into more and more products (perhaps even over-integrated, a la the electric can opener.)

Second point. Not only does the technology become ubiquitous, as noted, but innovation itself becomes ubiquitous. The deployment age is not an age of exploration, it’s an age of extending the paradigm into all parts of society. What people want to do with the technology is pretty clear, and the technology’s improvement trajectory is pretty clear. And with financial capital’s funding of disruptive innovation out of the picture, larger companies enjoy relative stability to innovate in more conservative, planned ways. It’s the end of the ICT frontier, no more wild west11.

This means that companies can, and have to, make innovation part of their normal business processes. And by normal I mean the everyday, production-oriented business processes used by all employees, not just the high-level corporate strategy and planning process. Innovation is no longer something special that needs to be walled off from the rest of the company, it needs to be everywhere. This is a big change for people who learned management over the last thirty years.

Corporate venture capital, whose point is not to make money12 but to provide a view into the direction and speed of entrepreneurial innovation, will become passé. When innovation is easier to predict, companies won’t need this view.

Intrapreneurship–the word was coined in the 1970s, when this technological revolution was new–assumes that certain people must be allowed to buck the established business processes in order to catalyze radical innovations. If businesses no longer need to look for radical innovations, the threat intrapreneurship poses to regular order will outweigh the benefit.

Skunkworks and innovation divisions had their place when companies needed a rapid response to a looming threat and established structures and dynamics sabotaged efforts to build innovations that threatened the core business line (think about how the IBM PC was developed, for instance.) As these sorts of outside radical innovations become rarer, the need for ad-hoc groups outside of normal process diminishes.

All of these ideas, and any idea that stems from the belief that innovation is special and that large companies can’t innovate because they are large and slow, lose their power when economically worthwhile innovations themselves become smaller and happen less quickly. Corporate venture capital is replaced by M&A, because financial capital and production capital valuations converge when uncertainty is low: Eckert-Mauchly is, essentially, acqui-hired by Remington Rand; DEC is majority owned by ARD; etc. Intrapreneurship and skunkworks are replaced by internal innovation processes which, while ineffective at producing radical innovations, allow controllable and measurable sustaining innovation. Money that would have been spent financing external innovation is redirected back to corporate development and, perhaps, even corporate controlled research labs13.

These sorts of controllable and measurable innovation processes are already taking hold, both inside and outside the corporate world. It’s no coincidence that the buzzwords in innovation over the last few years have been ‘lean’ and ‘customer development.’ While both claim to be new discoveries, they are actually old practices that fell out of favor during the installation period because they aren’t suited to radical, fast-moving innovation; they only work when innovation is slower and more predictable. Steve Jobs could not have used customer development to create the Apple computer, and Henry Ford quipped that if he had asked his customers what they wanted they would have said “a faster horse”; both are acknowledgements of this. The hallmark of a new technological revolution is that the innovation trajectory is unknown14: lean doesn’t work on early adopters because they will use anything novel (the Altair as an MVP was pretty well useless in predicting what mainstream customers would want in a personal computer), and customer development doesn’t work when you’re developing a general-purpose technology. In general, you can’t iterate your way to radical innovations, almost by definition.

But these tools are perfect for companies during the deployment period. IBM was a master of customer development in the 1950s: it knew exactly what problems its customers were facing because it had people sitting in its customers’ offices. Lean—the idea of build, test, learn, iterate—was advocated by the RAND Corporation back in the 1950s for use in building bombers and ICBMs. These ideas have been out of vogue for 40 years. Now they’re back, and you should be using them.


In the 1950s, elements in the Defense Department advocated planning the development of complex projects by creating an optimal development plan.

Stated in its purest form, the fundamental idea of concurrent engineering is that a final product (such as a bomber or a missile) can be so well specified in terms of its performance that all aspects of its engineering, design, and production can be pursued essentially in parallel rather than in series15.

Others in the defense establishment pushed back. Regarding the Atlas ICBM, for example: “too much time was being devoted by the present Atlas management complex toward optimizing the intercontinental ballistic missile rather than getting into design and production on a system which could give an early operational capability16.” This ignited a debate with the RAND Corporation, the government-funded think tank that was the foremost proponent of the systems approach to management, the success of which had led to the original belief that concurrent engineering would work.

The research that resulted, papers like William Meckling’s (unpublished) “Are We Overplanning Aircraft Development?”, showed that technological change necessitated a series of shorter-term milestones, with the ability to flexibly choose direction after each one, rather than rigorous, top-down planning. The RAND economists pushed the idea that the Air Force could adopt some aspects of the innovation process that had previously been confined to entrepreneurial companies (like Boeing) without losing the assurance that the project would, in fact, be completed; that the Air Force should adopt an innovation strategy–what we would today call lean–rather than innovation planning. This build, learn, iterate process is the inevitable best way to search an unknown fitness landscape17.
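The contrast between committing to one optimal plan up front and iterating milestone by milestone can be sketched as a toy search on an unknown, noisy fitness landscape (the landscape, step sizes, and scoring here are invented for illustration, not drawn from the RAND work):

```python
import random

def fitness(x):
    # An "unknown" landscape: the searcher can evaluate points
    # (build and test) but has no closed-form map of the terrain.
    return -(x - 7.3) ** 2 + 2 * random.random()

def plan_upfront(target):
    # Concurrent-engineering style: fully specify the design at the
    # start and evaluate only the finished product.
    return fitness(target)

def build_learn_iterate(start, steps=50, step_size=1.0):
    # Lean style: after each short milestone, keep whichever variant
    # tested better and choose the next direction from there.
    x, best = start, fitness(start)
    for _ in range(steps):
        candidate = x + random.choice([-1, 1]) * step_size
        score = fitness(candidate)
        if score > best:
            x, best = candidate, score
    return best

random.seed(0)
# A plan fixed at x=0 (a wrong guess about the landscape) versus
# iterating from the same starting point.
planned = plan_upfront(0)
iterated = build_learn_iterate(0)
print(planned < iterated)  # iteration climbs to higher ground
```

The upfront plan is only as good as the initial guess; the iterative search recovers from a bad one, which is the substance of the Meckling argument.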


The Deployment Age.013

These are a couple of things I think the theory predicts will be major themes of the next ten years, and I don’t think I’m really going out on a limb with them because you can see them happening already. But there’s one overarching theme I want to emphasize: the economy is not static.

If you took Economics 101, you learned things like supply and demand, Y = C + I + G + (X – M), and the like. These concepts assume that if things are changing, it’s because they are moving towards equilibrium. And once they reach equilibrium, they will stay that way until there’s some shock from outside the economy. But if you put together everything you learned in economics, you still wouldn’t be able to predict the economy, because that’s not how the economy works. There is no equilibrium. The economy is constantly changing; it evolves. This doesn’t mean you can’t make predictions, it just means that you have to predict change.

People have always had a penchant for thinking that now is the end of history, whenever now was. Whatever just happened will continue happening; whatever ways we’ve learned to deal with problems are deep underlying truths, not just contingent responses. But if things are always changing, then there is never an end to history, and many of the things you’ve learned as deep underlying truths are actually subject to being overturned at any time. Everything you’ve learned in your career has to be re-examined every once in a while to see if it will be as true in the future as it was in the past.

Some things we’ve learned over the past 30 years–that novelty is more important than quality; that if you’re not disrupting yourself someone else will disrupt you; that entering new markets is more important than expanding existing ones; that technology has to be evangelized, not asked for by your customers–may no longer be true. Almost every company will continue to be managed as if these things were true, probably right up until it manages itself out of business. There’s an old saying that generals are always fighting the last war. It’s not just generals; it’s everyone’s natural inclination.

But you, you now know, at least, that economic history has this ever-changing, cyclical nature; that the rules change with the times and that strategies have to change with them. It’s always easier to fall back on the tried-and-true, but right now, during the transition into the deployment age, is the very worst time to do that.

Thank you.


Susan Lawler read early drafts of this talk and provided critical feedback. Thank you Susan!

  1. Meadows, Donella, Thinking in Systems: A Primer, first paragraph of chapter one. 

  2. In his Networks of Power, but he talks about the concept in much of his work. My favorite of his books is American Genesis. 

  3. Yes, I know venture capital started well before then, but calling it an ‘industry’ before the end of the 1960s would be a stretch. 

  4. Bhide, Amar V., The Demise of US Dynamism Is Vastly Exaggerated – But Not All Is Well (January 26, 2015), p.10. Available at SSRN: http://ssrn.com/abstract=2557154 

  5. Knowles, L.C.A. The Industrial and Commercial Revolutions in Great Britain During the Nineteenth Century. Routledge, 1937, p. 251. 

  6. Field, p. 108 

  7. Field, pp. 110-115. 

  8. Quoted in Field, p. 108 

  9. Galbraith, J.K., The New Industrial State, 6th Printing, Houghton Mifflin Company, Boston: 1967, pp. 92–94. 

  10. Perez, C. (2013). Unleashing a golden age after the financial collapse: Drawing lessons from history. Environmental Innovation and Societal Transitions, 6, 9–23. http://doi.org/10.1016/j.eist.2012.12.004 

  11. I didn’t think this trope really needed more explanation, although I think when I started talking I did lob in an inadequate explanation. Anyway, it’s a trope, cf. http://observer.com/2008/11/tarnation-experts-agree-internet-like-wild-west-since-at-least-1994/ 

  12. How much money could Google Ventures possibly make, for instance, and what would it matter to a company that has after-tax earnings of $15 billion a year? The small relative size and, more importantly, one-time nature of GV’s earnings probably adds about nothing to Google’s share price. So GV can’t be about the money. This is true of every venture fund attached to a company. 

  13. I think this was not driven by the financial-capital/production-capital dynamic in the 1950s but was still an artefact of the deployment age as entrenched companies tried to prove their worth to society through cutting-edge research, a la Bell Labs. 

  14. Christensen’s The Innovator’s Dilemma has a decent discussion of this, but the real place to look is either Utterback’s writing on dominant design or Kim Clark’s 1985 paper, “The interaction of design hierarchies and market concepts in technological evolution.” 

  15. Hounshell, David, “The Medium Is the Message, or How Context Matters: The Rand Corporation Builds an Economics of Innovation, 1946–1962”, Chapter 9 in Agatha and Thomas Hughes’ Systems, Experts, and Computers: The Systems Approach in Management and Engineering, World War II and After

  16. Ibid. 

  17. Not that the RAND Corp. researchers used this language, that came later, primarily out of the complexity research at the Santa Fe Institute. 


  1. Jerry,
    This is fantastic. Last week I spoke about trends for an ICT audience (among other topics), and the most fascinating thing to see from the tech perspective was the assumption that the technology was *just there.* Connections to the Internet, to other machines, to data — all is just expected. Technology professionals are increasingly focusing on UX (user experience) and end-to-end user needs. And it’s about time. Thank you again for this — I’ll be digesting it for a while!

  2. This article is excellent and in complete accord with the physics of evolutionary organization of flow (movement) in all nature, bio, non bio, social and technological. This universal tendency (phenomenon) is underpinned by the Constructal law of physics.

    A broad view of how to predict this universal phenomenon is in this book:

    A. Bejan and J.P. Zane, DESIGN IN NATURE. How the Constructal Law Governs Evolution in Biology, Physics, Technology and Social Organization (Doubleday, New York, 2012).

    In particular, the S-curve history and future of all spreading and collecting movement (flow) is a manifestation of the Constructal law, and is predictable. Here is a brief news outline from the National Science Foundation:


    The oldest and most powerful “energy technology” that has spread to maturity, and in accord with the Constructal law, was the adoption of fire. Here is a brief article from Nature Scientific Reports:

    “Why humans build fires shaped the same way”

    With best wishes,

    Adrian Bejan
    Duke University

  3. I think it is simplistic to say the current ICT revolution is in the deployment phase; that would imply that the boom time was in the ’90s. Even Perez seems to think the dot-com era was the true installation, forgetting that the real promise of ICT is the mobile phone and mobile broadband internet.

    The best analogy between the age of autos and oil and current ICT is that what the model T was to the age in 1908, the iPhone of 2007 is to the ICT age. Hard to see the PC introduced in 1981 having the same impact the iPhone is having. If you include Asia into the mix, we are still in the installation phase of the ICT revolution.

    1. You disagree that we’ve transitioned to deployment (in her theory)? Are you anticipating another crisis, or do you not believe her theory at all?

      1. I think even if you agree with her theory, it seems the timing is off. As an estimate the investment in broadband during the dotcom days in the ’90s was about $100 billion. In mobile just the 4G investment in the next five years is forecasted to be about $1.4 Trillion [1]. Hard to make the case that the boom phase was in the ’90s as opposed to today. The PC/broadband portion of the ICT phase might be 10x smaller than the mobile/broadband installation phase we are in right now.

        This doesn’t include things like IoT, big data, AI which are poised for their own boom phase (how much of the economy has been rewired with these technologies even though everyone talks about them).

        I actually don’t know what to make of her theory as applied to ICT, since there are really three distinct phases (with investments, applications etc.) – PC, mobile, IoT and capital flows between all of them. Plus ICT impacts previous technologies like cars, oil etc as well. There is a much larger combinatorial aspect to ICT applications which I think the previous technologies lacked so maybe the theory is not as straightforward.

        [1] http://www.gsmamobileeconomy.com/GSMA_Global_Mobile_Economy_Report_2015.pdf

        1. I think the key differentiator between the periods is who is making the investment, not how much. During installation, it’s financial (speculative) capital and during deployment it’s production (investment) capital, for some conservative definition of speculative/investment. I think this fits the investment in network capacity. Dark fiber (Level Three, et al.) in the 90s was primarily speculative. 4G today is well-planned, low-risk investment.

          I agree that even under this definition, there are some questions: why is there so much venture (speculative) investment today? But how you answer that probably goes more towards whether you believe the theory or not than towards whether we are in deployment or not. (You might also ask whether VC today is primarily ‘speculative’ or not as well, which is the question that I’ve thought about most: cf., http://reactionwheel.net/2015/01/80s-vc.html)

          1. It might indeed be too simplistic to pick one set of technologies and/or systems and explain a full cycle with them. Indeed, there may be multiple overlapping cycles happening in parallel, one slightly earlier/later than the other.
            I agree, cables have been much more speculative than 4G today, but what about 3D printing, machine learning and AI etc.? Those technologies are way earlier in the cycle and they are not really considered by the big guys (yet). But each single one of those technologies has the power to change the world significantly (one famous TED talker recently said 3D printing will dwarf the entire industrial revolution)

          2. There might be, but that’s not what her theory says, and there’s some supporting evidence. The book referenced in the text (As Time Goes By: From the Industrial Revolutions to the Information Revolution) talks in depth about the controversy over whether these long waves exist and the evidence for and against them. That doesn’t mean it’s true; it just means you’re going to need more than opinions to justify your position.

          1. Totally fair. Yes, the underlying data for that one has been around for quite a while, and used frequently in the venture industry, although everyone picks a slightly different set of technologies to include. We also borrowed the underlying data for the automotive industry growth chart from work that Stefan had done while at McKinsey (it is also in his “Resource Revolutions” book). The three others that are third-party derived are all referenced directly on the chart.

          2. The data “has been around for quite a while, and used frequently in the venture industry.” So…it’s bogus?

            Just to come full circle: it bugs me you don’t cite your sources, it’s knowledge unfriendly.

  4. Jerry, thanks for a refreshing piece of perspective.

    What strikes me most is the little overlap between Surges of Development in the first figure.

    But, this seems contradictory. In the “Number of firms participating in the auto industry” graph, the installation phase starts in 1908, which is the crisis period of the “Age of Steel and Heavy Engineering”. If the installation phase of the “Number of firms…” is the precursor of the “Age of Oil, Autos and Mass Production” then the n+1 installation phase starts during the crisis period of the n phase. Is this right?

    Since we are in the ICT Revolution crisis period:
    – If the former is true then the we will have to wait 15-30 years for the next installation period.
    – If the latter is true, then the next installation period is happening right now.

    Can you help to clarify?

    1. It’s a good question. Perez’ theory is a bit schematic so I can’t tell you what she would answer about the overlap of the maturity of the Steel and Heavy Engineering cycle and the irruption of the next. What she does say is that the next cycle irrupts when money is moved from financing the existing cycle to speculating on the next.

      The Steel and Heavy Engineering cycle was not (like most of the other cycles) predominantly in a single country. The railroad revolution took place in Britain, and the technology then diffused to other places. The age of oil and autos and mass production was based in the US. The steel and heavy engineering cycle was in the US, in Britain and in Germany.

      So, some hypotheses:
      1) The production capital financing steel and heavy engineering was disjoint from the financial capital financing autos and oil
      2) In conjunction, autos were low-capital in the early years and oil was self-financing, so less capital was needed
      3) Alternatively, the S-curve for autos and mass production was much less steep in the 1910s than in other cycles, steepening in the 1920s as more capital was made available (Perez dates the start of each cycle from the introduction of a representative technology, the Model-T in this case; this is a somewhat arbitrary way to date them…why should ICT be dated from the introduction of the microprocessor, not the semiconductor chip? In reality, S-curves extend further back and there’s no exact date the technological revolution starts to impact the macroeconomy, just a date it starts to be noticed in the popular imagination.)

      How long do we have to wait until the next irruption? Perez doesn’t say, really. But an installation phase is marked by a large amount of capital being deployed into it. Not just into startups, but into R&D (and, later, into infrastructure.) Right now logic says that the large amounts of capital being deployed into the existing paradigm preclude large amounts being spent on the next wave, whatever it is.

      That doesn’t mean the next wave isn’t coming, or isn’t being funded…but technological systems require more funding than just a company here or there, because each of the technologies needs to keep improving or it holds back the whole system. I think you can see this when it is happening…innovation building on and reinforcing innovation, one after another in what looks like a continuous stream. I remember this about the computer industry in the 1970s and literature shows it in both the 1910s in the auto industry and the early 19th century in railroads (less so about the age of steel, but that may just be a deficiency in my reading.) I don’t see it happening now in any of the promising technologies that might be the next wave.

  5. I found this article very informative, well written, and very thought provoking. I really enjoyed reading it, but as I’ve thought more about it over the past week I’ve found that I take issue with a few points that you make.

    I’ve written up my thoughts on my own blog at http://sagevoice.com/thoughts-on-the-deployment-age/

    I would love it if you would take the time to read my post, and let me know what you think. If there’s anything that you feel I got wrong or should change I’d be happy to consider it.

    Thanks for the great article.

    By the way, I see from your responses to other comments that you are a stickler for sources. I haven’t had as much time as I would like to write my post, so I don’t have any footnotes, but I have tried to link to relevant information where I could, though I clearly could do more.

    1. You mistake my explanation of Perez’ theory for a “passionate plea” for it. My exploration of Perez is not an advocacy of her theory. I do not believe that theories need advocates; they need data that refutes them, or does not. And anyway, if I were to cheerlead for a theory, it would definitely not be Perez’. Perez predicts the end of my world. I have spent most of my waking life since I was a child immersed in the computer revolution; it’s all I know well and most of what I care about. The idea that innovation might become more stable is my worst professional fear. I hope to God that Perez is wrong, and if she’s not, my most passionate plea would be to find a way to change the dynamic.

      But I think there is substantial historical and current evidence that her theory should be taken seriously. The historical evidence, the kind I cite in the post, is convincing (of course, because that evidence is what the theory was built to explain). And the current evidence does not refute Perez, in my opinion. You describe as three separate computer revolutions what she sees (and I do too) as three aspects of the same revolution, so saying that there has been no crisis in two of them is moot. Perez also does not predict that the crisis (the bubble popping) is the end of the revolution; as you can see in the second slide above, she predicts that it is a crisis of adjustment, somewhere in the middle of the revolution’s cycle. The revolution continues, it just takes a different cast and is financed by a different set of institutions with different goals. If you think about it this way, most of your criticism is answered.

      I disagree, though, that the technologies you cite as interesting are potentially the seeds of technological systems that will change the macroeconomy. AI assistants would be an interesting use of technologies already developed, if they ever really work, but they will not change the macroeconomy; they might make life slightly more efficient. IoT is different. I can imagine a scenario where IoT changes the economy in large ways and is one of the primary drivers of productivity growth over the next 20 years. But I would liken this more to the example I gave in the text of containerization driving productivity growth in the second half of the last cycle. That is: it is part of the current deployment phase and its innovations driven by production capital, not financial capital. For example, I am willing to bet that the agricultural IoT will be dominated by Monsanto, John Deere and the rest of the agricultural industry incumbents because the technology is just not that radical. If there is more than one IoT startup that becomes a major company over the next ten years I will agree that Perez was wrong.

      The economy is extremely large and so very hard to change. No one company or group of companies can change it. That’s why technological revolutions, which are defined as events that change the macroeconomy, have to be driven by technological systems, defined as a large group of interrelated technologies. Advancing a system costs orders of magnitude more than advancing a single innovation, so the system has to be economy-changing to make it appealing to divert a very large part of the money and time spent on innovation. This part of her theory is, I think, inarguable.

      1. Obviously I was unclear in my conclusion. I’ll go back and fix that. The person making a passionate plea was Maciej Cegłowski, who wants the web to stop moving so fast, and is trying to argue that this is where we are at.

        I saw you not as a proponent of Perez, but as an explainer and instructor in something you very much considered a theory. I’m sorry I didn’t make that clear. I’ll try to fix that too.

        Thank you for the long response, because you have clarified a few points that I misunderstood about your original post.

        I’m working on a post that tries to explain why I think that the digital assistant is going to be revolutionary, and I’ll comment here when that’s finally ready.

  6. Great talk. Am curious how does the notion of ‘human capital’ fit into this, and is it at all synonymous with ‘production capital’, as juxtaposed against ‘financial capital’? The notion that: “financial capital’s job is done” connects with a short post we wrote arguing that today financial capital is less critical in producing innovation than human capital. http://blog.craft.co/post/125335502788/money-is-cheap. I’m not saying financial capital isn’t needed (workers need to be paid a salary), but rather financial capital is so abundant today that it’s not the deciding factor, whereas, perhaps, Talent is. How many people each given $10m in VC could create something worth more than $10m with it? You mention UI/UX, which is a great example, because you can’t just throw money at that problem (look at the UI on most big company apps), you actually have to find the rare bird with the (artistic?) talent to do it.
    Loosely related: you say that “the most obvious reverse salient is battery technology.” Is Allocation of Talent another? i.e. if/when we see another step-change in how efficiently talent finds work, as the railroads created, could this drive a new wave?

    1. That may be true strategically (i.e. the bottleneck right now is human capital not financial capital) but I don’t think it’s true that money’s less important. If that were true, we wouldn’t have companies like Uber raising so much money.

      I disagree, though, that the talent shortage means that the processes Perez describes are no longer valid. Good designers were probably pretty hard to hire back in the 1950s too (someone had to design better tailfins for the cars…sorry, seriously: all those consumer products were also designed by someone, and there were arguably just as many new consumer products from 1947-1967 as there will be from 2008-2028), good engineers were hard to find during the railway boom, etc. so this isn’t a new phenomenon.

  7. Jerry, this is brilliant! You’re right up there beside Peter Drucker in my mind – but far more transparent. It’s a lot to digest and I’ll be rereading and rethinking for quite a while. I’ll also be sharing this with thoughtful friends to get their feedback. Stunning scholarship. We need to do coffee so I can learn how you’re applying these insights. Profound thanks to Fred Wilson for bringing this to my attention. I had observed the shift in technological innovation from infrastructure to applications and have been puzzled by the insane valuations on unicorns like Uber that seem to be primarily outsourcing resources to minimize labor and other fixed costs. I have also observed a change in kind in early-stage companies: it seems to me lots are “me too” and others are improved ways of doing things we already do. Doesn’t the latter sound like the deployment phase is here? A thousand thanks for the time and effort that went into this.

  8. tl;dr Historical analogies need not apply

    I’m not a hard-core singularitarian, but these historical parallels seem to me to be missing a glaring point: each of the developmental surges cited effectively made people a bit stronger or faster. The combination of exponential data processing and biotech is in the process of not just making people better, but of making *better people*, and even steering asymptotically toward replacing them—a difference not just of magnitude, but of kind, from any previous historical process.

    Even without a steep AGI takeoff in the foreseeable future, expecting a protracted period of consolidation seems bizarre. I think it’s much more likely that any “pauses” will follow the general exponential trend of tighter product/innovation cycles and become shorter and shorter until the “flip-book” of history assumes the appearance of continuous motion.

    Arguments against this appear to reduce to a “God of the gaps” approach; “But computers/robots can’t…” is a losing proposition. It’s (robot) turtles all the way down, forever.

    1. I’ve heard that argument, that innovation is continually speeding up, several times, but I’ve never seen any evidence to support it. And it should certainly be visible in the data if true. I’d be curious to see any empirical studies or whatever.

      1. That’s precisely the problem—in two respects. If you look at a graph of the advancement of powered human flight, you will see a flat line parked at zero, stretching back to the dawn of time. Until it no longer is. The maximum distance traveled by a human in a lifetime? Ka-boom.

        I recognize and even share your desire for more rigorous empiricism, but it’s a logical impossibility. We have never had effective mental amplification in the history of the world (books being the closest analog, and see how that turned out?) Since, as the prospectuses say, “past performance is not an indication of future performance”, we are all relegated to argument by analogy, regardless of how you may wish to dress it up under the fig leaf of empiricism. I respect your decision as it applies to yourself, but I think it’s extremely short-sighted.

        As for “never seen any evidence”, I’d be interested in hearing how you explain the exponential computational explosion of the past 100 years. “No evidence?” Or is your criterion for evidence not actual capacity but a 1:1 correspondence to GDP? (Hard to defend when the price per unit is so deflationary…) So let’s look at cars: 100 years of slightly faster & safer & heavier. Do you honestly believe there won’t be a sea change marked by the 1st DARPA challenge and ending with 100% automation (possibly even mandated) in the very near future—and that this won’t have dramatic and wide-ranging effects? Do you not think that this will be much, much faster than the original historical motorization? Sounds accelerated to me. Our understanding of the biochemical underpinnings of life? Clearly exponential. And the implications are just starting to fruit.

        You argue that there is a lack of evidence. I believe you may be suffering from a lack of insight and would ask, “What use is 3/4 of a baby?” (let’s say the *bottom* three-fourths.) Now view AI & crispr/cas9 and all the other things I could list, and make a compelling argument why none of them will lead to the kinds of dramatic changes you seem to be denying. I reiterate: The fact that you haven’t ever seen it before *is the point* — nobody has.

        1. Certainly technologies undergo exponential growth in a characteristic, until they flatten out. Airspeed for planes grew very quickly, as you note, then flattened out…not because faster speeds weren’t possible, but because the cost-benefit was no longer there (read The Simple Science of Flight by Tennekes for a great discussion of this.) Computational speeds also increased exponentially, until about 5-8 years ago when the growth rate started to decline.

          But as I understood your original argument, it wasn’t about the speed of increase of a technology, it was about a decreasing time between introductions of new technological revolutions. I’m not downplaying this argument, I’ve heard it from some very smart people who are experts at systems dynamics. They say it’s an inevitable result of systems math (Geoffrey West has done some interesting work on this, http://www.santafe.edu/about/people/profile/Geoffrey%20West).

          But a compelling mechanism is not enough. If this dynamic takes place, we should be able to observe it. Not the specific future instantiation of it–that hasn’t taken place yet, as you note–but the dynamic itself. Yes computers make us better thinkers (in theory) so we should be able to think more things. Yes the internet makes us better communicators (in theory) so we should have more of an innovation network effect. But just because the internet and computers are new doesn’t mean the same dynamics at a different scale weren’t observed with the printing press and the university. If that is so, then why hasn’t each of the cycles been shorter than the previous? (Maybe they have been, depending on how you define them, but that would be the sort of evidence I would find convincing.)

          The reason the theory says that the technologies you mention won’t change the economy in the next ten years is that changing the economy is expensive. It requires capital far in excess of the paltry amount VCs can bring to bear. The theory is a theory of where and why capital backs certain systems at certain times. You may believe this is untrue, but there is some evidence for it and you should give weight to that. On the other hand, it’s a vague theory and difficult to refute with evidence, since social sciences evidence is often somewhat contradictory, so there is clearly room to refute it.

  9. Hi Jerry,
    As ever, a great post. Really thoughtful.
    If we are in a deployment age, I’m interested in how the incumbents can do this. If, just as you say, software eats the world and is ignored, then innovation has to spread to everyone rather than be top-down, resulting in less need for corporate venture capital (CVC), skunkworks, etc.
    It might be this happens longer-term, but I foresee a considerable period where the CVC/external innovation function grows increasingly important and more integrated into an overall innovation strategy (M&A, joint ventures, R&D, etc.) until a change in internal organisation culture and management has happened.
    It is as noteworthy to me that even relatively new businesses, such as Airware and DJI, are setting up venture funds(1) to help make sure they stay ahead of competition and build out a market.
    Structurally, companies are rethinking how they are organised to better deploy innovation and ICT(2), and whether open innovation/CVC should be siloed or fully integrated into each team will be a discussion point, just as whether their goals sit more toward the financial or the strategic end of a continuum.
    I’m unsure, therefore, if, as you say: “Corporate venture capital, whose point is not to make money but to provide a view into the direction and speed of entrepreneurial innovation, will become passe. When innovation is easier to predict, companies won’t need this view.”
    As you also say, change is the constant: the 1950s became the 1970s and then the 1990s, and new shocks emerged, quite apart from the desire for a toolkit that helps identify the partners and innovations that might help. The desire for predictability will increase if we are in an age of deployment, and part of the likely improvement in outcomes will come from iteration through the crowdsourcing of ideas and testing that a good open innovation/CVC function can bring.
    In this vein, Carlota Perez and Mariana Mazzucato wrote last year(3) about how the venture function should effectively broaden through government direction.
    CVC might become passe in the long term, but only when it is fully incorporated and enacted by all groups, large and small. Our research at Global Corporate Venturing(4) indicates that only about half of the Fortune 500 have even started their programmes.
    Thanks, Jim

    (1) http://techcrunch.com/2015/05/27/drone-fund/#.mihse6:u2f7
    (2) http://www.theatlantic.com/magazine/archive/2015/10/are-bosses-necessary/403216/
    (3) http://marianamazzucato.com/wp-content/uploads/2014/07/SPRU-WP-Mazzucato-Perez.pdf
    (4) http://www.globalcorporateventuring.com/pages/cvworld2015.html

    1. Jim,

      Sure, change is a constant over the course of centuries, but it isn’t, necessarily, over the course of a decade. How much real change was there in the 1950s? Not discoveries or inventions, but real, economy-changing change? If you exclude the massive national security state funding (and that wasn’t a part of that cycle that seems like it will repeat; ISIS isn’t ‘competing’ with us the way the Soviet Union was), was there any? So I’m not thinking about whether there will be CVC in 2030, I’m thinking about the next ten years.

      Most CVC programs, as you know, have a muddled strategic aim, and I think that dooms most of them even in the best of times. I think Perez' theory also dooms two of the most common reasons for CVC: a window on fast-changing technology, and profit. I think it leaves open, at least, a less-used reason: having outside professional investors involved in the management of a startup increases its chance of success. But it's the rare corporate executive who believes they are not, or cannot hire, the expert in any possible thing. The fact that the only way to learn the tacit knowledge needed to pick and guide radically innovative companies is by working with radically innovative companies, not established companies, is taboo thinking to an F500 CEO, because that would mean acknowledging that they are not radically innovative.

      That said, Perez says that each cycle has its own dominant form of capital formation. Venture capital was this cycle’s. She does not say that the form disappears in deployment. History seems to show (on a superficial reading, I haven’t really investigated this) that the form survives but is bent towards production capital. The only way I can conceptualize venture capital adapting to production capital would be the internalization of CVC: corporations adopting the venture capital approach, instead of top-down planning, to decide on internal projects. This would certainly fit into this cycle’s emphasis on network-style management instead of hierarchy. You may be able to think of other models.
