Saturday, April 15, 2017

Differing without Dividing

Variety is great for innovation. Consider the case of Seymour Cray, the “father of the supercomputer.” In the 1970s, Cray left Control Data to start Cray Research, a company devoted to building the world’s fastest computer. Cray attacked the problem with a revolutionary architecture, so-called “vector processing.” In 1976 he and his team introduced the Cray 1, and Cray Research came to be seen as the Mecca of high-speed computing. John Rollwagen became company president in 1977, bringing business leadership alongside Cray’s technological prowess.


In 1979, Rollwagen brought in another technology genius, Steve Chen, to lead the design of a completely different approach to supercomputing. So as Seymour Cray’s team worked on the Cray 2, Chen’s team worked on the Cray X-MP. Chen’s design built on Cray’s initial innovation, but did so using a revolutionary architecture featuring multiple processors operating in parallel. Released in 1982, the X-MP set a new standard for supercomputer performance, and significantly raised the bar for the team working on the Cray 2.


When we do not know what the future holds, variety helps our organization to discover what is possible. This truth is one reason why we often hear people saying that they want to increase the diversity of their employees. Just like the biosphere, organizations evolve better if they sustain variety.

Yet examples like Cray and Chen’s are rare. One reason is that sustaining variety is expensive: running multiple projects aimed at the same goal looks inefficient. But a bigger problem is that sustaining variety threatens to divide a company. People object to having others in their firm working at cross purposes. How can we encourage differences without being divisive?

One way is to live by the adage “disagree and commit.” Here in Silicon Valley, people attribute the saying to Intel. The idea is to encourage disagreement during the decision-making process, in order to improve the quality of decisions; once a decision is made, everybody commits fully to its implementation. Unfortunately, in practice the saying is often used to silence those who see things differently. Managers say “disagree and commit,” but they really mean “disagree and shut up.”


I prefer “switch and commit.” The goal is still to end up committing at the end of the process, but during the decision I want the participants to switch roles. The person disagreeing with you needs to take your position and argue it well. Similarly, you must argue the other’s view well. You can think of the approach as devil’s advocacy taken seriously by both sides.

I first tried “switch and commit” when teaching a controversial topic here at Stanford. For the first assignment, the students had to state their position on the topic. For the second, big assignment, they had to write an essay taking the opposite view. (They did not hear about the second assignment until after they handed in the first.) The end results were some fantastic essays, because the authors were legitimately skeptical.

Since then, I have tried “switch and commit” when facilitating hard-hitting business meetings among top managers. The results have been mixed. Many people cannot get their head around a different perspective. But now and then you find an exceptional leader who appreciates the value of differing without dividing.


A readable review of related academic work is Scott Page’s book The Difference.

Thursday, March 30, 2017

On VCs, Clairvoyants, and Magicians

The guy at the next table looks out at the amber sunset, puffs up with gravitas, and announces to his wide-eyed friend “soon all things will be connected seamlessly to the ubiquitous network.” Time to change tables. I grab my drink and set off to find an area outside of Houdini’s vocal range. The bar at the Rosewood is cursed by its reputation as the place where VCs from Sand Hill Road meet, so it attracts posers playing the prophet like LA attracts actors.

Coincidentally, on my way over to the Rosewood, I saw along El Camino Real a hand-painted sign saying “clairvoyant conference,” with an arrow pointing to a hotel. Imagine a conference for clairvoyants! You would not need sessions on “future trends,” since the attendees would already know them. And for that matter, why would they need a sign giving directions?

Since time began, it seems, people have wanted to believe that some of us hold special knowledge of what is to come. So we are vulnerable to those who claim such knowledge - though different people fall for different images of the prophet. You might mock the trappings of the village shaman, the tarot reader, and the astrologer - but I bet you’d clear your calendar to hear the latest word from the valley’s richest VCs.

I’m not just waxing cynical. It turns out that venture capitalists are typically bad at telling the future. As an industry, VCs do not perform that well financially. To find VCs who outperform the market, you have to selectively sample only the most successful ones - but that is true for slot machines, too. True, a minority of VCs do appear repeatedly on the winning side, so perhaps they are the real Houdinis. Maybe. Or maybe, having been successful, they get preferential access to the things that continue to make them successful. (I appreciate that advantage, working at Stanford.)
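The slot-machine point can be made concrete with a small simulation. The numbers below are illustrative assumptions, not data on actual funds: if many funds succeed or fail purely by chance, a few will still look like serial winners.

```python
import random

random.seed(7)  # fixed seed for reproducibility; the model is a toy

N_FUNDS, N_BETS, HIT_RATE = 500, 10, 0.10  # assumed, not empirical

# Each "fund" makes N_BETS investments; each investment hits by luck alone.
hits = [sum(random.random() < HIT_RATE for _ in range(N_BETS))
        for _ in range(N_FUNDS)]

# Funds with 4+ hits look like repeat winners -- yet skill played no role here.
repeat_winners = sum(h >= 4 for h in hits)
print(f"{repeat_winners} of {N_FUNDS} pure-luck funds look like repeat winners")
```

With enough funds at the table, a handful of "Houdinis" emerges from noise alone, which is why repeated success by itself does not prove foresight.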

Whatever the reason, the fact that at least some VCs are repeatedly successful has created the mystique that there does exist, somewhere along Sand Hill Road, somebody who does know what’s next. Hence all the puffery at the Rosewood. How are we to know that he’s not really a visionary?

Professor Elizabeth Pontikes and I took a careful look at this question. We collected data on thousands of firms in the software business and looked at their fates over time – including both successes and failures in the data. We found that these firms herd into “hot” markets that have been blessed by the VCs. But we also found that the VCs herd into markets too, following each other in financing frenzies. The resulting hype cycles lead to bad outcomes for companies; firms getting funded in these waves are the least likely to ultimately succeed by going public. So much for Houdini.

Perhaps even more interesting, the VCs themselves seem to be aware of this problem. While VCs herd into hot markets, at the same time they try to avoid investing in firms that do so. The VCs prefer instead to invest in those who pioneered what is now a hot market (and survived). As a kid I had a precocious classmate who would jump to the front when he saw a trend, pointing resolutely forward in Napoleonic fashion, proclaiming “follow me!” I’ll have to check; perhaps he grew up to be a VC.

Remember, the most disruptive changes are not predicted by our experts. They are pioneered by those foolish enough to ignore the consensus. Be skeptical of those who claim to know what's next.


For the research behind this article, see my paper with Elizabeth Pontikes.

Wednesday, March 15, 2017

Define the Game

"Change the game" you'll hear people say. In fact, we often define the game we play when we choose how to play it. But only some of us realize this fact. I was reminded of this lesson in Moscow by the French chess master, Joel Lautier.

Joel and the rest of the team had to solve the problem before morning. Vladimir Kramnik would sleep, of course. He would need to be fully rested, and even then it would be a long shot to beat Garry Kasparov. The dominant world chess champion was unlikely to lose. But the team had an idea. Even if it worked just once - and there were many games in the match - it might open up the smallest of chances.



The team were all whiz kids, each a chess master in his own right. No one person could fully prepare alone for the many possibilities of a match at this level. So each of the grandmasters had his own team of trainers, assigned to work out the best solution to a particular situation that might arise. By morning, each team member needed to have solved his problem and prepared his analysis for presentation to Kramnik himself, who would scan it and commit it to his marvelous mind. Few words were needed.

Joel Lautier was on the team because he is one of the few ever to beat Kasparov. Lautier’s job was key: formulate Kramnik’s best “black” opening. Many readers will know the most famous chess openings. But there are many more possible openings than the common ones, and by its nature chess allows the invention of new openings to this day. In such an old and storied game, though, the chances of finding an effective new opening are slim. That was Lautier’s task.


What made Lautier’s job so important was that Kasparov was especially good at the attack. The key for Kramnik would be to improve his chances in the games where he started as “black” - moving second - since those games would favor Kasparov’s attacking ability. And he would need to be able to repeat the opening over several games as black, a tough job once the opening was revealed.

Working into the night, Lautier formulated a risky approach for Kramnik. The opening was not ideal, but it might work given Kasparov’s strengths relative to Kramnik’s. It involved an odd series of moves, quickly leading to a trade of queens. The strategy - a revival of the long-neglected line known as the “Berlin” defense - left Kasparov, as white, with a slight positional advantage (his pieces in slightly better places on the board). But the opening hurt Kasparov too, by skipping the complicated middlegame where Kasparov famously had an advantage. The strategy worked. The opening shifted the odds enough for Kramnik to draw games that he would otherwise have lost.

Several years later, when I was lecturing in Moscow, Joel Lautier was in the audience and he asked me this question: “There may be many possible winning business strategies. How do we know which is best?” The answer comes from Mr. Lautier's example. The best strategy plays to your strengths, and away from the other’s. Don't just play the game as defined; define the game you play.


Research on "metacompetition," defining the game you play, appears in my book on Red Queen competition.

Tuesday, February 28, 2017

Why You Should Turn Down That Well-Paying Job

I remember being young and broke, going to an interview for an internal auditor job at a bank. The bankers who interviewed me were enthusiastic; they were authentic bankers. I did my best to pose as a banker, but as often happens to posers I was found out. The bankers asked me for a “writing sample.” I showed them my poetry. It was not to be.

Some people become accountants because they want a secure job. There is much to be said for pragmatism; better to be an employed accountant than a wanna-be actor. But is that the right comparison? Here is the issue: Somewhere tonight, maybe around 3 AM, some guy will be lying awake thinking about accounting. He lives and breathes accounting; it occupies his thinking even in his spare time. If things get competitive for accountants, he is going to dominate. His rivals are acting like accountants; he is the real thing.

Competitive advantage goes to the authentic. Their job is their avocation; they would do it, if need be, without pay. The authentic persist through the tough parts of their vocation. They ponder it during the quiet times, so the magic of insight makes their work more creative. The gardener out on a cold morning; the writer typing away when she should be sleeping. Such people will take their vocation as far as it can be taken. Those who merely pose will not. As the old adage goes, “you cannot coach passion”; no amount of posturing can outdo authentic dedication.

The practical reader is objecting at this point, noting that there is a big economic difference between being an authentic accountant and an authentic writer. This contrast brings to mind a corollary adage: “Every person has a special gift.” As each of us grew up, we searched for that gift – the activity that seemed authentic to us. Our parents hoped that it might also be an activity that pays. How fortunate is the authentic accountant! His passion lines up well with economic gain. Meanwhile there goes the authentic musician, waiting on the accountant as he dines. Don’t get me wrong; I love the arts and admire the authentic artists. But though all of us may have a gift, some gifts pay better than others.


Does this mean that only some of us can follow our calling? That depends on how long we search. My failed attempt to be a banker left me broke still, but the upside was that I kept searching. Life is a “sequential search” process. We search, one by one, trying to match our gifts with the opportunities of the world. When we settle on an occupation, we also stop our search. If we stop the search at the first pragmatic job, then we are posing - and will surely be out-competed by the authentic. But if we keep searching, we increase the chances of matching our gifts with opportunity. Of course not all jobs pay the same. But better to keep searching for a way to remain authentic, than to settle early for mediocrity. Search enables authenticity.
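The logic of sequential search can be sketched in a toy simulation. This is a caricature, not the Levinthal and March formalization: "fit" is an assumed uniform random score, and the only point is that examining more opportunities raises the quality of the best match you find.

```python
import random

random.seed(42)  # reproducibility; the model itself is a deliberate toy

def best_match(n_offers):
    """Fit of the best opportunity after examining n_offers of them.

    Fit is drawn uniformly from [0, 1) -- a stand-in for how well an
    opportunity matches your gifts. This distribution is an assumption.
    """
    return max(random.random() for _ in range(n_offers))

TRIALS = 10_000
settle_early = sum(best_match(1) for _ in range(TRIALS)) / TRIALS
keep_searching = sum(best_match(10) for _ in range(TRIALS)) / TRIALS

print(f"settle at first offer: avg fit {settle_early:.2f}")
print(f"search ten offers:     avg fit {keep_searching:.2f}")
```

Settling at the first pragmatic offer yields an average fit near the middle of the distribution; continuing the search pushes the expected best match much higher - the statistical version of "search enables authenticity."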

The lesson: Ask “what do you do well?” and then search to see how that ability fits the opportunities of the world. You will have failures along the way if your search is thorough. But the upside of each failure is that you'll be required to keep searching, again increasing your chances of finding a match between your passion and the opportunities of the world.

More dangerous than failure is that you might, early on, score a well-paying job for which you are not authentic. Turn it down. Search enables authenticity.


For an academic treatment of the sequential search strategy, see Levinthal and March’s paper.

Wednesday, February 15, 2017

Learning without Logic

After Napster was shut down in 2001, the brand was reborn in 2003 as a subscription online-music service run by Roxio’s Chris Gorog. Chris and his team quickly amassed a large catalog of songs, enabled radio streaming, established partnerships with online platforms like Yahoo, built an entrepreneurial organization, and expanded internationally. As record stores became history, Apple’s iTunes, illegal music downloads, and a few subscription services like Napster offered competing visions of the future. But by 2005 the verdict was in. Illegal downloads continued apace, iTunes was a clear success, and subscription services were not. As one Washington Post writer put it in 2005, Napster’s subscription model was not a viable alternative to music ownership: “When music is good, you want to know that it can’t be taken away from you.” The final nail was Steve Jobs’ declaration: “Nobody wants to rent their music.” The experiment had been run, and the ownership model had beaten subscription.

But wait. With the explosive growth of services like Pandora and Spotify, the pundits are now saying that subscription models are the future. Even Apple has launched such a service. What about the lesson we learned from the failures of just a few years ago?

The problem here is that a failure is a datum, not a logical argument. Data do not speak for themselves. Failures can have many causes, so it takes logical reasoning to explain why they happen. Perhaps the early subscription services were ahead of their time - limited bandwidth may have made them less attractive than they are today. Or maybe the smartphone is a necessary complement to such services. Whatever the diagnosis, logic is required to sort out why firms succeed and fail.

Unfortunately, most observers skip the logic part. It is mentally easier to jump to the “obvious” conclusion: If the business failed, the business model must be wrong. Full stop. You can easily tell when this skip happens: the person will name an example as if it were a reason. Is online grocery delivery a viable model? No: Webvan. Is internet search a viable business? No: AltaVista. These examples are data, not logical reasoning. But it is hard to rebut those who argue by citing examples, because you look the fool trying to say that a failure somehow might have made sense. As in Gerald Grow’s cartoon, we replace reasoning with dueling examples: I shout “iTunes!”; you reply “Spotify!”


The result? We often “learn” without logic, and so we often walk away from great ideas. The Apple Newton failed, leading many to say there was no market for smart handheld devices - yet now we all own them. Early attempts at remote alarm systems failed, leading many to conclude that such services could not be profitable; now they are commonplace. Even internet search, possibly the most lucrative business in history, was initially panned after a spate of failures among early movers - Lycos, AltaVista, Excite, and others. Firms fail often. But that does not mean, logically, that we should abandon their business models entirely.

To diagnose well, we need to systematically contrast failures with successes - as is done in good academic research. The popular maxim “fail fast and cheap,” A/B testing, agile development, root-cause analysis, and similar approaches are designed to show us successes and failures without destroying the firm. These techniques are routinely used in Silicon Valley these days, and are making their way into the global business lexicon. Sometimes they are very effective for learning. But keep in mind that they simply provide us with data. It is up to us to explain the data, and that requires logic.
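To see why such techniques yield data but not explanations, consider the arithmetic behind a typical A/B readout. The sketch below (conversion counts are invented, and the helper is my own, built only on the standard library) computes a two-proportion z-test: it can tell you the difference between variants is unlikely to be chance, but it cannot tell you why B beat A - that still takes logic.

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: variant A converts 120/1000, variant B 90/1000.
z, p = two_proportion_z(120, 1000, 90, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

A significant p-value here is a datum, nothing more; whether A won because of the design change, the season, or the sample is exactly the diagnosis the statistic leaves open.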


The academic research on this topic can be found in the research of Jerker Denrell.

Monday, January 30, 2017

The Truth about Hiding from the Truth

If you find an old-timer at the Dulzura Cafe, ask him about Bulldozer man and his fence. He was old back when I was young, and like many in this rural California outpost near the Mexican border, he used his acreage as he saw fit. Many of us shot skeet; some just left the sagebrush alone and enjoyed the isolation. Bulldozer man owned a big old Caterpillar bulldozer, and he spent his time moving mounds of dirt hither and yon.

Now, about the fence. True story. Happened in 1975, just outside of Dulzura. One fine spring day Bulldozer man visited his neighbor, an affable, transplanted New Yorker who had gone native, complete with horses, boots, and plenty of Coors. Bulldozer man's proposition was that the two share the cost of a fence he was willing to build. Indeed, he had already begun digging large post holes along the property line. But the affable Coors drinker saw no need to break up the beauty of the countryside with a fence. Bulldozer man was enraged, especially since he'd already started digging. He stormed off, shouting something about how the fence would be all his. Soon the measure of the man was public for all to see: Bulldozer man decided to make the fence "his." He set it back a full 10 yards, so that it was clearly and completely on his side of the property line, effectively ceding hundreds of square yards of real estate to our affable Coors drinker. Many a Coors has been raised since in thanks to this dimwitted neighbor.

So it is that often when we look out for #1, we end up doing more harm to ourselves than good. Same goes for public policies meant to protect domestic jobs and economic vitality. Truth is, when our companies have to compete, it does them good. You don't get good at anything by hiding away. (Think of how you shop for schools for your kids. You certainly don't look for a place where they can perform as poorly as possible and get away with it. You probably look for the best school, and do everything you can to encourage them to compete.)

Same with companies. Faced with competition from other countries, domestic companies either improve their performance or fail. There is plenty of evidence to back up this claim. Especially notable is a recent paper by Stanford economist Nick Bloom and his colleagues. They found that when Chinese imports increased as a result of China entering the WTO, the impact on firms in other countries was profound. Those firms picked up their game, often innovating much more in order to compete. The firms that did not pick up their game lost business, of course. But in the end, having to deal with competition from places like China turns out to be a big reason we have vibrant firms in today's economy.

Tough talk may sound good, but it does not make you a winner. And, for folks like Bulldozer man, bluster provides cover for downright stupid, self-destructive actions. When all the tough talk is done, you become competitive by competing. Hide from that truth if you wish, but the person you're hurting is yourself.


Read the research on this by Bloom, Draca and Van Reenen.

Sunday, January 15, 2017

Why You Don't Understand "Disruption"

Been to the "Disrupt" conference? Self-proclaimed "disruptors" gather to reach consensus on which ideas out there are non-consensus.

Bigwigs holding a conference on disruption is like the Czar creating a bureau of revolutionary thinking. Really want to see disruption? Don't go to a conference. Go to where people are breaking the rules.

If you just smiled, then you are probably from a small startup (or wish you were), and you know that disruptions come from startups who break the rules of the game.

For example, consider this idea from a small team of rule breakers: Provide a way to instantly share digital photographs with others anywhere on earth - but only with those you want to see the photo.

You are thinking Instagram, the tiny company acquired in 2012 by Facebook for $1 billion.

Wrong.

I'm describing a project launched in 1996 - that's right, 1996 - by a group at Kodak's Brazil headquarters in São Paulo. (Yes, Kodak - everybody's favorite example of a company that failed by being too slow to innovate.) Kodak's country head for Brazil, Jarbas Mendes, and his team were trying to find innovative ways to help customers share their digital photographs. The team understood that the internet - brand new at the time - could enable such sharing. So they designed a system where one could upload photographs to a server in the cloud (though nobody yet used the term "cloud") and send a code to another person, who could then view the photographs. "The technological possibility of having an online way to view pictures was the idea. There was a lot of work by the team on this approach to sharing," recalls Joao Ciaco, who was in a marketing role on the team at the time.

What we now call Instagram was invented by Kodak in 1996 - 16 years before Instagram would be acquired for a billion.

How can this be? After all, we often hear that big, established firms are slow to innovate, and so they get disrupted by new technologies. As the story goes, success at a well-honed strategy leaves companies blind to the value of new technologies until it is too late. If this is how you understand disruption, you believe in the slow-incumbent myth.

It turns out Kodak is not a strange exception. Big, established firms often do a great job of rapidly adopting new technologies. With success, leaders are often more willing to innovate - even when such innovations are out of step with their traditional organizations. And therein lies the problem: "success bias." We misread our success at one game, and so readily launch into another - whether our organization is suited to that business or not.

Looking again at Kodak: it was the first mover in digital cameras, and it held an early lead in that market. (See the new paper on the digital camera revolution by Jesper Sørensen and Mi Feng.) Kodak even made the digital cameras sold by other firms trying to be in that market. The problem was not Kodak's ability to innovate. At work was the poor fit of its organization to the logic of the digital business. If anything, Kodak was too willing to innovate given its organization.

Same with the minicomputer firms like DEC, which are often criticized for resisting a disruption. We know that the personal computer cut the legs out from under the market for minicomputers (powerful mid-range computers and servers) starting in the 1980s. At that time, the cutting edge of the computer industry - the real "hackers" - were the minicomputer makers like Data General and DEC that flourished from the 1960s through the 1980s. They were scrappy, imaginative rebels compared to the monoliths of the mainframe business. The secret to their success was imaginative design, since they relied on the architecture of the entire system for performance. And, as Tracy Kidder romanticized in his book The Soul of a New Machine, they were passionate about getting products out into the market. That book told the tale of the cult-like Data General and its creation of the Eclipse MV/8000 minicomputer, launched in 1980.

Technology writers, decades later, would describe these innovative firms as unable to change. The slow-incumbent myth: These successful, established firms did not see the microcomputer coming, since they were wed to the technologies and designs of the old market they knew well.

Not true.

The real story is that the most successful minicomputer companies made the transition to the personal computer very quickly – but once there they were ill-suited organizationally. Success bias was at work yet again. For instance, Data General released its first microcomputer in 1981 – the same year as IBM. And DEC – another legendary champion of the minicomputer era – entered with the “Rainbow” in 1982. These fast-moving firms had no problem innovating. They could and did. Their problem was that everything else about their organizations was well tuned to their traditional market. They innovated in the PC market very quickly, and then they failed there at a very high rate.

We want to believe in the slow-incumbent myth, so we dismiss the early moves by incumbents as half-hearted. But look again at the evidence. Successful incumbents are often very innovative - too innovative for their own good. What is going on in these cases is success bias. When business leaders win, they infer from victory an exaggerated sense of their own ability to win. So they are overly eager to enter new competitions - even ones they are not well suited to play. Their very success in the earlier business is evidence that they are honed to an earlier strategy - yet it is that earlier success that makes them especially willing to move into the new competition.

The lesson for leaders? Disruption is not just about technology changing; it is about changing the logic of a business. Success with a new technology requires organizing for a new logic, and organizing in new ways requires that you forget the successes of your past.


The theory behind success bias among managers is in this paper by Jerker Denrell, and evidence linking success bias with failure is in my paper with Elizabeth Pontikes.

Saturday, December 31, 2016

Leading Truth or Denying Reality?

How we talk about a fact shapes its meaning. Was that new product a failure, or did we just come down the learning curve? Did your career just take a hit, or are you pivoting into a promising new future? Facts are ambiguous, and so how we describe facts helps make sense out of them. One person's rebel is another's traitor, and a well-told story (think Hamilton) will tip the balance.

Of course we all know that effective leaders know how to use narratives - stories that give meaning to facts. But the use of narratives can be either very good - or very bad - for the future. It all depends on whether a narrative leads, or denies, the truth.

In some cases, narratives are used to lead the truth. While the major carmakers were spinning narratives against California's zero-emission vehicle mandate back in 2003, Elon Musk and his colleagues created Tesla, a counter-narrative that has changed the truth about electric cars. Because multiple futures are possible, those who shape what we regard to be possible change our efforts to make those possibilities real. So it is that a narrative can lead the truth.

Leading the truth is not just "spin." Spin is about putting a good face on bad facts. In contrast, leading the truth creates reality by helping us see what is possible. The greatest leaders in history are important not for what they created, but for what they helped others to see as possible, and so create. Once created, what was once considered impossible is then seen, rightly, as the truth. In this way, great leaders lead the truth.

Alternatively, other leaders use narratives to deny realities that we wish were not true. How convenient it would be if man-made climate change were not real. Psychologists tell us that we are prone to believe what we wish were true, even if this requires denying reality - as we see with science deniers confronted by the looming reality of climate change.

Time corrects such wishful thinking, of course, as the facts come to be undeniable at some point. But in the meantime, nefarious leaders take advantage of our desire to deny reality by spinning narratives that play to our weaknesses. When I was young, I recall hearing leaders spin narratives to deny reality: cigarettes are not really bad for you; the US was winning in Vietnam; climate change is a hoax.

Time will tell, of course. In time, we'll look back and know that some leaders were visionary - they used narratives to lead the truth. Others will be shown to have been reality deniers, and history will judge them severely. The problem, of course, is the damage they do in the meantime.


To dive into the large academic literature on narratives and counter-narratives, you might start with the work of Michael Bamberg and his colleagues.

Thursday, December 15, 2016

Leading by Design

In 1993 the software startup FITS moved from Toulouse, France to Scotts Valley, California (a town at the edge of Silicon Valley). Its founder, Bruno Delean, had invented a radically new approach to editing photographic images - an important breakthrough given the hardware limitations of the time. Delean's team worked tirelessly, aiming to get the software in shape for a big demo at Silicon Graphics, at that time a powerhouse in computer-aided design. The team worked day and night, stopping now and then just to eat. They were not paid that well, nor were they working for a noble cause; it was just software. They worked because they wanted to do a good job for Delean, who was a legend among French software developers. Delean himself had done most of the coding, and was there working side by side with the team, always available when a tough call had to be made. He led in the most compelling, direct, and personal way possible: by example.


Just up the street on the other side of town, Steve Luczo was re-creating what was to become one of the most innovative companies on earth: Seagate Technology. Steve had taken the helm at Seagate and begun to turn the perennially late-to-market disk drive company into the industry's unrivaled leader. He changed the organization's structures, routines, and culture dramatically, creating within Seagate a global development platform that brought improved technologies to customers sooner and more profitably than any firm in the industry's history. His people worked tirelessly and mastered the science of rapid product development. Innovation at Seagate became routine, and the company transformed storage technology as we know it. Steve Luczo led this company, but not like Bruno Delean. Steve Luczo led by design.


Leading by example shows the way, but leading by design creates a system that discovers the way. Those who lead by example are authentic, because they put their words into action – like Delean. But they are limited to what they know, and what they can do. Delean’s firm ultimately was limited to what he could imagine, and so no longer exists today. Those who lead by design do not invent, nor are they involved in the specific decisions to get the job done. Like Luczo, their names are not on patents. Instead, they build the culture, routines, and structures within which others can flourish. Done well, such leadership creates an organization that takes us places we never imagined. Seagate’s innovations were not foretold by Luczo, but they were created by the organization that he put in place. When you lead by design your job is not to know the future, but to create an organization that discovers the future.

Leading by design is especially effective in changing times, because when times are changing it is difficult for any one person to know what is next. In fact, our successful leaders typically do not have a very good track record when it comes to predicting the future. For starters, their very own success likely came about in ways that they, themselves, did not expect when they were starting out. (That is true for Google, Facebook, and Apple, for instance.) And if you look back on the predictions made (and not made) by our luminaries at any point in time, the track record is unimpressive. In 1992, for instance, virtually no leaders in the technology world were predicting that the World Wide Web would soon explode onto the scene. Like a clairvoyant caught in an avalanche, somehow our technology leaders failed to see the Web coming. Look back at what the experts were saying before many of our most profound innovations, from the microcomputer to wireless telecommunications, and you’ll find they were typically off the mark. But when we lead by design, we do not pretend to know what is next. Instead, we create an organization designed to discover possibilities that we never dreamed of.


The classic academic treatment of these ideas is in Selznick's book on leadership.

Wednesday, November 30, 2016

Bake Your Own Pie

Recently I was lecturing a group of high-level Chinese executives, when one asked me: “What do you think of plagiaristic innovation?” Before I could answer, he went on to explain that for China to “catch up,” he felt it needs to have innovation of any kind – even what he called "plagiaristic" innovation.

Don’t worry. I’m not about to rehearse the well-worn arguments about the protection of intellectual property: incentives for continued innovation, just rewards for investors who back authentic creativity, quality guarantees for consumers of branded products, and the like. Nor am I going down the “information must be free” path – indignantly advocating “free as in free speech (not free beer),” “stick it to the man” (the artist is not getting the payments anyway), or the “hackers’ code of ethics.” No, here I’m talking about something else.

My point here is about what “innovation” means. Debates about intellectual property, stealing, and plagiarism are all about who owns the pie. That question is very important, and is obscured when patent “trolls” flood the system with complaints, or when plagiarists masquerade as innovators. But another important point often gets lost in the fray:

Innovation is not about fighting over the pie; it is about baking a new pie.

For example, hybrid vehicles hit the worldwide market starting in 1999 and 2000, and within a few years an echo of patent litigation followed – escalating in 2003. The big car makers battled over who invented what, sometimes with each other and sometimes with small firms, everyone claiming a piece of the pie. Meanwhile, also in 2003 but with far less fanfare, Elon Musk and his team of co-founders created Tesla, the forward-looking innovator that has changed the game in the automobile industry. The noisy pie fights in 2003 were over hybrids; the profound innovations of 2003 were quietly happening at Tesla.

Pie fights extend to all walks of business life, not just battles over intellectual property. For instance, the so-called “browser wars” between Netscape and Microsoft were at their peak in 1998, following Microsoft’s integration of its Internet Explorer browser into its ubiquitous operating system. Advocates of competition howled, and defenders of Microsoft replied with talk of “seamless technology” and “complementarities”. Also in 1998, but unknown to most people at the time, PhD students Larry Page and Sergey Brin created Google – the company that would change the game so thoroughly that we would soon forget about those early browser wars. The noisy pie fights of 1998 were the browser wars; the great innovation of 1998 was quietly taking shape at the newborn Google.

"Wait," you are saying. "After innovating, innovators need to defend their creation." Of course. Take QualComm, for example. That company has an unparalleled track record of continued innovation in wireless technology. As a result, its intellectual property has turned out to be extremely valuable. It has of course defended that property against plagiarists; it owes that to its shareholders. But QualComm transformed its industry by innovating - never mistaking defending IP for creating new, valuable technologies.

All around us, we see real innovators at the cutting edge of knowledge. Have a conversation with my son Burton Barnett, pictured here doing science, and you won't hear about pie fights; you'll hear about amazing new developments in immunology. And similar developments are happening worldwide - in China, Europe, India, the Americas - everywhere forward-looking people are creating new knowledge. This process of innovation is key to our collective future, and it has little to do with plagiarism or pie fighting.

The lesson to innovators: Pie fights are important; we all deserve our piece of the pie. And of course even true innovators often must fight off plagiarists. But being good at pie fighting does not make you good at innovating. Innovation means baking a new pie. 

The lesson to plagiarists: Want to create something useful? Leave the other guy’s pie alone and learn to bake.


Research on the uniqueness of innovators appears in the work of Lee Fleming and Olav Sorenson, among others.

Tuesday, November 15, 2016

The Time-to-Market Strategy

In many industries, products are profitable only during a limited window of time. We see time-sensitive products, of course, in consumer electronics, where new models of phones, computers, and home entertainment products come (and go) frequently. We also see this pattern in fashion markets. Zara, for instance, introduces new clothing products twice weekly across nearly 2,000 stores – amounting to the introduction (and retirement) of over 10,000 designs a year. Timing is key, as well, in the introduction and short shelf life of cultural products, such as popular music and films. The market for video games follows a similar pattern of rapid new-product introduction and short product lives, including flash-in-the-pan hits like Pokemon Go as well as the more cadenced, but still time-sensitive, annual replacements of the various EA Sports games. Innovative pharmaceutical products also compete in a time-sensitive way, since the first to patent enjoys a profit premium that ends abruptly with patent expiration. Consequently, drug companies are in a race against time, first to file the patent, and then to bring the drug to market. Even some durable products, such as automobiles, are introduced and retired on a time-sensitive, if cadenced, basis.

Time-sensitive markets can be identified by the presence of two underlying features. First, these markets place a premium on introducing a product early – or at least on a reliable schedule where being late is punished. For instance, releasing a new version of a gaming console like Xbox or PlayStation late – after the holiday season – would be unthinkably costly for Microsoft or Sony. Second, the products in these markets also have a limited shelf life. A new clothing style or a new hit song may be wildly popular today, but within weeks (or maybe days) it will be yesterday’s news.

In such time-sensitive markets, success depends on a distinct logic: the time-to-market strategy. To understand this strategy, consider a firm that introduces a successful new product, “Product 1,” as in the plot below. Initially the product takes off slowly, and then it catches on, finally reaching a maximum market size. This “S-shaped” diffusion curve is typical of successful products.
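
The shape of this diffusion curve can be sketched with a simple logistic model. The market size, midpoint, and steepness below are hypothetical, chosen only to illustrate the slow takeoff, rapid middle, and eventual saturation:

```python
import math

def adoption(t, market_size=1_000_000, midpoint=12, steepness=0.5):
    """Cumulative adopters at month t under logistic ("S-shaped") diffusion."""
    return market_size / (1 + math.exp(-steepness * (t - midpoint)))

launch = adoption(0)     # slow takeoff: well under 1% of the market
takeoff = adoption(12)   # the curve's midpoint: half the market
mature = adoption(36)    # saturation: nearly the whole market
```

Plotting adoption(t) over the product’s life reproduces the familiar S shape: growth accelerates early on, then decelerates as the market saturates.
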
The story does not end there, however, because this company faces a market where the shelf life of a product is limited. Consequently, it is preparing to introduce a second product, “Product 2,” that will compete directly with its own Product 1. Note that if the firm does not introduce Product 2, somebody else will. That is what compels the company to proceed with a second product even though it will cannibalize the first, as shown here.
Focusing now on Product 1, we can see that there is a limited window of time during which the firm can make money selling the product. It is interesting to think about the price that can be charged for Product 1 over the life of the product. Typically, that price will be much higher earlier in the product’s life, for two reasons. First, the earliest buyers of the newly introduced product will be those with a greater willingness to pay. For instance, if Intel comes to market with a new chip that is extremely valuable to cloud service providers running state-of-the-art server farms, these eager buyers will be among the first to buy the new chip, and they have a high willingness to pay. Less enthusiastic buyers will also buy at some point, but only if the price falls. Second, the price will begin to fall over time as other competitors come out with products that directly rival Product 1. For instance, perhaps AMD will introduce a competitor to Intel’s new chip. As the price falls, more and more chips are sold as buyers with lower willingness to pay come into the market. Ultimately, once Product 2 is on the market, the price for Product 1 falls away completely. This pricing dynamic is pictured below.
At this point, the logic of a time-to-market strategy is clear. If we introduce Product 1 as shown, our firm will make a great deal of revenue. To see this, look at both the price and sales volume curves in the plot above. Revenue from Product 1 is found by multiplying these two curves, and total revenue over the life of the product is just the revenue accumulated over time. But what if we are another firm, and we release a competitor to Product 1 late in its life? A competitor entering near the end of Product 1’s life may sell at high volumes, but only for a short time, and only during the period when the price of the product is very low. This firm will have introduced a product that, over its life, earns very little cumulative revenue. So the earlier a firm enters this competition, the more cumulative revenue it makes. In fact, entering earlier increases total cumulative revenue at an increasing rate.
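
This timing logic can be made concrete with a toy model: multiply a declining price path by a hump-shaped monthly sales path to get each month’s revenue, then sum from a firm’s entry month onward. All numbers below are hypothetical, chosen only to illustrate the logic:

```python
WINDOW = 24  # months during which Product 1 can be sold

def price(t):
    """Price erodes from $400 toward $50 as rivals enter and Product 2 nears."""
    return 400 - (350 / WINDOW) * t

def volume(t):
    """Monthly unit sales: ramps up, peaks mid-window, then tails off."""
    return max(0, 10_000 - 40 * (t - WINDOW // 2) ** 2)

def cumulative_revenue(entry_month):
    """Total revenue for a firm that starts selling in entry_month."""
    return sum(price(t) * volume(t) for t in range(entry_month, WINDOW))

early_entrant = cumulative_revenue(0)
late_entrant = cumulative_revenue(18)  # enters when prices are already low
```

The late entrant may still sell plenty of units, but only at depressed prices near the end of the window, so its cumulative revenue is a small fraction of the early entrant’s.
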

By this logic, it is obvious that we should release Product 1 at the soonest possible date. However, this may not be profitable. To see this, consider the costs of developing Product 1. If we could take all the time we want, we could carefully research and develop Product 1 and, when it is ready, we would have run up total costs equal to C1. But by taking our time, we might end up introducing the product late, and this will hurt our revenue. So instead of paying C1 over a long period of development – say, two years – what if we accelerate development and pay the same amount all in one year, or even in six months?

Well, here is the bad news. It turns out that compressing development costs into a short period does not give you the same result. This problem, famously dubbed the “mythical man-month” by Frederick Brooks, occurs for two reasons. First, compressing the development schedule requires many people to work simultaneously and in parallel, which results in coordination diseconomies. Second, there is the problem sometimes called “gestation”: development is inherently sequential and cannot be made to happen all at once. Concretely, gestation requires time because answers obtained at one point in time decide the questions asked at the next point in time. Doing all this development at once leaves us asking many questions we will never need answered – and failing to answer questions we wish we had researched.
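
Brooks’ coordination-diseconomy point is often illustrated by counting pairwise communication channels: n people working in parallel need n(n-1)/2 channels, so coordination load grows much faster than headcount. A minimal sketch:

```python
def channels(n):
    """Pairwise communication channels among n team members."""
    return n * (n - 1) // 2

small_team = channels(5)     # 10 channels
doubled_team = channels(10)  # 45 channels: more than 4x the coordination load
```

Doubling the team thus more than doubles the coordination burden, which is one reason spending C1 twice as fast does not deliver the product twice as soon.
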

Consequently, to speed development, it is not enough to concentrate C1 into a more compressed period of time. Instead, we will have to pay more than C1, and the more we compress the development process, the more this additional amount escalates – until, at some point, we could spend an infinite amount and not improve our release date at all. So development costs increase at an increasing rate as time-to-market shortens, as shown below.
As the figure shows, the time-to-market strategy confronts the firm with a dynamic cost-benefit comparison. To be profitable, the firm needs to introduce the product early enough to benefit from higher revenue over a longer time, but not so early that its development costs skyrocket.
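
A toy cost-benefit model makes this balance concrete. Suppose lifetime revenue falls the later we release, while development cost grows with the square of schedule compression (a stand-in for the man-month effect). The numbers are hypothetical; the point is that profit peaks at an interior release date:

```python
BASE_COST = 10.0   # $M: cost C1 on an unhurried 24-month schedule
UNHURRIED = 24     # months

def dev_cost(release_month):
    """Cost rises at an increasing rate as the schedule is compressed."""
    compression = UNHURRIED / release_month  # 1.0 means no compression
    return BASE_COST * compression ** 2

def lifetime_revenue(release_month):
    """Cumulative revenue shrinks the later we enter the product window."""
    return max(0.0, 60.0 - 2.0 * release_month)  # $M

def profit(release_month):
    return lifetime_revenue(release_month) - dev_cost(release_month)

best_month = max(range(6, UNHURRIED + 1), key=profit)
```

With these particular numbers the optimum lands strictly in the interior: releasing on the unhurried schedule sacrifices too much revenue, while compressing aggressively lets development costs skyrocket.
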

How is a firm to achieve this balance? The answer comes down to organization. Firms that are good at the time-to-market strategy are organized in a way that minimizes development costs while maximizing the reliability of product introduction. The key factor to be managed by these firms is uncertainty. These firms typically design around customer-defined time milestones, track progress toward meeting those release dates, and hold employees accountable for staying on time. As uncertainties arise, routines such as phase-review processes are used to update release dates or revisit go/no-go decisions. And wherever possible, the firm contracts with other firms in order to solve “weak link” problems that are slowing its ability to deliver on time.



Time-based competition is discussed in my book on the Red Queen. 

Sunday, October 30, 2016

Leading Amidst Change: Why Strategy Matters

You’ve probably heard people compare the Fortune 500 from a few decades ago with today, noting how fallen greats like Sears, American Motors, and Zenith Electronics have been eclipsed by innovators. The reason seems clear enough. We’ve seen considerable technological and regulatory change over the past 50 years, so the rules of the game keep changing in business and there appears to be no end in sight. In fact, globalization may even be accelerating the rate of change we see around us.

For business leaders, these changes can be daunting. Companies routinely take their leadership teams off-site to discuss the challenges and opportunities implied by the latest innovations. How will augmented reality affect the gaming market? How will mesh computing affect the financial technology space? How will the internet of things affect, well, everything? The idea seems to be that if we can forecast future technology, we will be more likely to survive or even prosper from the changes to come.

Not so. It turns out that often the very technologies that seem to have upended the great firms of the past were well understood, and sometimes even created, by those very firms: the Swiss invented quartz watches; Kodak invented the digital camera; Sears was a pioneer in inventory and brand management; and the list goes on. A number of different theories have been proposed to explain this puzzle. This note briefly summarizes a few of the major theories, and offers a way that business leaders can constructively think about leading amidst change.


Discontinuous Innovation Theory

As early as the 1960s, writers on technology management distinguished between continuous and discontinuous technological changes (albeit using various terms for the same idea). Continuous, incremental advances happen all the time, gradually improving the state of a given technology. Such changes are typically straightforward for existing firms to accommodate. Discontinuous changes, by contrast, represent a radically different approach to a technology, and often bring about an order-of-magnitude improvement in performance. For instance, adding functionality to an old-style “feature phone” would be a continuous innovation, while the invention of the smart phone could be seen as a discontinuous innovation.

This idea has appeared in hundreds of published academic articles since the 1960s, but its implications for business leadership were best explained in a classic article by Professors Michael Tushman and Philip Anderson.[1] The breakthrough in Tushman and Anderson’s study was to note a key distinction between different types of discontinuous innovations. Some of these changes, while quite significant in technological terms, build nicely on the capabilities of existing firms. Such “competence enhancing” discontinuities entrench the status quo, giving incumbents even more of an advantage over new, upstart organizations. By contrast, other discontinuities render irrelevant, or even counter-productive, the capabilities of existing firms. These “competence destroying” discontinuities are difficult for industry incumbents to employ profitably, since by doing so they harm their existing operations. For that reason, such changes are more likely to be brought to market by new entrants to a business.

Using discontinuous innovation theory as her lens, a business leader will pay attention to how new technologies either enhance or destroy the value of her firm’s existing capabilities. For example, consider the iconic case of the Bank of America confronted by new innovations in electronic funds transfer technology. This technology would be at the heart of many new innovations in retail banking, including the spread of distributed networks of ATMs. Prior to this discontinuous change, the Bank of America was the unparalleled leader in geographically distributed retail banking, operating a massive brick-and-mortar branch system spanning hundreds of cities along California’s 900-mile length. Decades of experience honed the bank’s capabilities as it cleared millions of paper transactions daily, regardless of where in the state they originated. With over 2,500 branches, the bank had a branch in every town in the state, and so attracted customers wanting convenient access to their accounts. The spread of ATMs dramatically reduced the unique value of this system, making widespread geographic access to bank accounts commonplace. Although the Bank of America embraced ATM technology, its competitive standing clearly was set back in the new era. By the late 1980s, the bank saw record losses, in part due to its outdated brick-and-mortar system – the very system that had been key to its competitive advantage – and so it laid off tens of thousands of employees, closed branches, and consolidated functions into regional centers. In this way, electronic funds transfer technology can be understood as a competence-destroying discontinuity from the perspective of incumbent banks in the era of brick-and-mortar branching.


Structural Inertia Theory

Large, well-established organizations are notoriously difficult to change. The leading theory to explain this fact is known as “structural inertia theory,” pioneered by sociologists Michael T. Hannan and John Freeman in their seminal 1984 article[2], and since then supported and developed by a large body of research.[3] In this theory, the authors assume that modern society favors the creation and development of organizations that perform reliably and that can account for their actions rationally. Further, they assume that accountability and reliability are greater for organizations that have reproducible structures – meaning routinized, bureaucratic structures. Finally, they assume that such structures are more difficult to quickly change, that is, more inert. These premises lead them to the essence of their theory: Modern society favors the development of inert organizations.

Hannan and Freeman further argue that when inert organizations do change, these changes are hazardous to their performance and their life chances. Especially as they age, grow, and become complex, increasing inertia renders attempts to change difficult and fraught with risk. If the organization survives the change, it may be better off – at least as long as the change keeps pace with the demands of the environment. Stated in terms of a large-scale strategic change, this basic prediction is illustrated below:


Structural Inertia Theory:
The Effect of Change at Time T1 on Organizational Performance

In this illustration, an organization makes a large scale strategic and organizational change as of time T1, and as a result suffers a dramatic decrease in performance. (In fact, Hannan and Freeman state this prediction in terms of the organization’s likelihood of outright failure.) But over time, if the organization does survive, its performance improves. And, if the organization’s new strategy is better aligned to its situation, then its performance may even be better than at the start of the ordeal.

The evidence suggests that two distinct effects can be usefully separated when looking at this theory empirically.[4] The initial decrease in performance has been widely documented, and can be thought of as the “process effect” of change. This effect captures the various difficulties that come up as large-system changes cascade through an organization, creating various misalignments along the way. As these difficulties are worked out, eventually the organization reaps the “content effect” of the change – the improvement in performance due to the new strategy being better aligned with the environment.

Structural inertia theory highlights a problem inherent to strategic leadership. If leadership does a good job aligning the different aspects of organization and strategy, then changing that complex, aligned system will be especially difficult. And even if leadership does manage to change that system, the consequential drop in performance is likely to be extreme. Looking again at the Bank of America example, many pundits criticized the company for taking so long to shift away from the earlier strategy and structure centered on brick-and-mortar branches, skilled loan officers in every location, and decentralization to allow for extreme localization. Only after nearly failing did the bank’s leadership manage to break with that strategy and structure. Yet, when seen through the lens of structural inertia theory, this resistance to change is precisely what one would expect from such a well-managed organization. Years of fine-tuning had created a complex organization with various well-aligned features that made the bank particularly good at geographic localization. In this way, the difficulties involved with changing Bank of America were more a measure of its excellence than an indictment – albeit excellence that had become outdated. And this example suggests a caution to business leaders: Your good work aligning your strategy and organization today is precisely what will make changing that strategy so difficult and hazardous tomorrow.


Disruption Theory

By far the most popular theory of strategic and organizational change these days is disruption theory, developed by Professor Clay Christensen and his colleagues beginning with its introduction in 1995.[5] The theory restricts its attention to change involving technologies and innovations, and assumes a market context with differentiation between a low-end of the market, a mainstream market, and a high-end. The product performance, expectations of customers for product performance, willingness to pay for performance, and potential for profitability are all assumed to be low at the low end, middling for the mainstream, and high at the high end. Incumbent firms are assumed to use a well-established technology and focus on the mainstream market. Over time, they are assumed to improve their technologies gradually in an attempt to serve the high-end of the market, where profitability is greatest. In so doing, they are assumed to “overshoot” the needs of the low end and middle of the market. Furthermore, focusing on profitability, they are assumed to largely ignore the low-end of the market.

Under these assumptions, new entrants can enter with completely different technologies. By assumption, these technologies will not initially perform as well as the established technologies used by incumbents, but they are good enough to give the new entrant a foothold market: either serving the underserved low end of the market, or serving previously unserved “non-consumers” in a new-market foothold. Incumbent firms, meanwhile, pay no attention to the new entrant, because it is serving different customers. As time passes, however, the new entrant’s technology improves greatly, ultimately moving the firm into the mainstream and high end of the market. At that point the new firm gains an advantage over the incumbent, often because the new firm’s technologies are part of a business model that is incompatible with, and superior to, that of the incumbent.

Disruption theory is extremely popular, and this popularity leads to some problems for the theory’s use. For many business leaders, especially in information technology businesses, the pattern described by Christensen and his colleagues reflects their context well. For these leaders the theory is useful – at least if they can spot potential disruptors while those disruptors are still in the “foothold” stage. (A theory is not helpful if you must wait until you see a success to then identify it.) In many other instances, however, the term “disruption” is widely used to mean any change that has a big effect on business, regardless of whether the change resembles that addressed by the theory. In these instances, the theory will not apply and in fact could be misleading. Christensen and his colleagues have voiced concern about this. In a recent article, they explain that many big, important business changes are not disruption as the theory defines it.[6] Uber, they note, is not a disruptor (based on their understanding of Uber).

For business leaders, the lens of disruption theory directs attention to those start-ups just getting a foothold either in a new market segment or in an underserved low end of an existing market. And the theory helps to evaluate these new entrants not on the basis of how well their technology performs when it starts out, but rather in terms of its potential to become disruptive as it improves over time. To some extent, the theory also directs attention to the new business models that often accompany such start-ups. 


Leading Amidst Change

For you as a business leader, all three of these theoretical perspectives direct attention, first and foremost, to what makes your organization compete well. Discontinuous innovation theory highlights the importance of knowing the capabilities that give your organization a strategic advantage, and how a given technological change affects the value of those capabilities. Structural inertia theory directs your attention to the complexity and intricacy of your organization in order to understand the cascade of misalignments that moving to a new strategy will trigger – even if only temporarily. And, for some, disruption theory directs your attention to potential threats that, while benign in their infancy, have the chance to mature into something strategically significant.

One way to sum up these implications is to note that they draw attention to the importance of strategy, and in particular to the logic of your firm’s strategy. Big changes that leave the logic of your strategy intact are not a threat. Discontinuous innovation theory calls such changes “competence enhancing.” Structural inertia theory notes that such changes do not require fundamental alterations to your organization. And disruption theory would say that such changes are not likely to be problematic for you as an incumbent. But when a change threatens the logic of your firm’s strategy, it is time to act. The changes you set in motion will be difficult and costly for your organization, especially if you have done a good job aligning your strategy and organization around a clear and compelling logic. But the alternative is to become yet another example in the failure lexicon of business school professors.

What’s more, looking to see how changes affect your strategy’s logic gives you a way to identify and diagnose what matters before your performance suffers. As a leader, you do not have the luxury of waiting until changes play themselves out. You must make the call early in the game, while there is still a chance to affect the outcome. Going back to the Bank of America, the logic of its branch strategy was crystal clear, even as technological changes were afoot that would threaten that logic. Had its leadership honestly reviewed how those changes were likely to affect the bank’s strategic logic, steps could have been taken long before red ink ultimately drove action. This lesson applies to every business leader. The logic of your strategy can be identified today, as can the implications of the changes you see around you for that logic. Use your strategy’s logic as the lens through which you understand and manage change.




[1] Tushman, Michael L. and Philip Anderson. 1986. “Technological Discontinuities and Organizational Environments.” Administrative Science Quarterly, 31: 439-465. 
[2] Hannan, Michael T. and John Freeman. 1984. “Structural Inertia and Organizational Change.” American Sociological Review, 49: 149-164.
[3] Carroll, Glenn R. and Michael T. Hannan. 2000. The Demography of Corporations and Industry. Princeton: Princeton University Press.
[4] Barnett, William P. and Glenn R. Carroll. 1995. “Modeling Internal Organizational Change.” Annual Review of Sociology, 21:217-236.
[5] Bower, Joseph L. and Clayton M. Christensen. 1995. “Disruptive Technologies: Catching the Wave.” Harvard Business Review, Jan-Feb.
[6] Christensen, Clayton M., Michael E. Raynor, and Rory McDonald. 2015. “What Is Disruptive Innovation?” Harvard Business Review, December.