Monday, May 15, 2017

On Behalf of Sprezzatura

It was June of 1991 - my first day of work at Stanford. My wife dropped me off, since I did not have a car, and I had my golf clubs with me, hoping to try out the Stanford Golf Course after work. As I walked up the parking lot toward the Business School, it hit me how bad this looked. The new guy shows up carrying his golf clubs. What kind of work ethic is that?

What to do? Looking about, I noticed some handy bushes where I could stash the clubs. They'd be fine until I got off work. But it was too late. Up ahead, the dean was approaching! Mike Spence, the famous economist and our dean at the time (who would go on to win the Nobel prize), was walking out of the school, straight toward me.

I put my head down and kept walking, figuring that I looked young; maybe he'd just think I was a student. But it was not to be. "Bill!" he said, looking straight at me. "This must be your first day at work! Welcome!"

I tried to look as professional as possible, but I was carrying a set of golf clubs. Mr. Busy? Not! "Yes, Mike. Great to be here."

"And you brought your clubs!" he observed. "What a great day. I'm heading out myself!" He gestured to the windsurfing board clipped to his car's roof rack.

Ah, sprezzatura. The internet says the term was coined by Baldesar Castiglione in Il Cortegiano, where it refers to an air of measured nonchalance among accomplished courtiers. Become dean. Win the Nobel prize. Do a little windsurfing. No problem. In the Stanford parking lot that day, I had my first lesson in the culture that then characterized Stanford: Sprezzatura. You may be paddling furiously under the water line, but above the surface you're a calm and graceful swan.

But that was 1991. Decades later, I can report that the days of sprezzatura appear to be behind us here in Silicon Valley. Today, a day in the life seems to be all rush, running from one thing to the next, face aimed unwaveringly at the phone. But I wonder about all this busyness. Are these hyper-busy people really all that productive? Or are they just looking busy to seem important?

Being busy to look important is nothing new. Chaucer wrote in The Canterbury Tales (circa 1387) of the important lawyer: "Nowhere a man so busy of his class, and yet he seemed much busier than he was." Spend some time people watching on University Avenue, here in Palo Alto (perhaps over a glass of wine), and you'll count dozens of head-down hurriers - but no sprezzatura to be seen anywhere. Even the others at their tables will be busy staring at you-know-what. Perhaps they are trying to fool others, since important people are thought to be busy. (In some contexts, busyness even serves to signal high status.) What's worse, perhaps they are fooling themselves. A full calendar, appointment pings, deadline checklists, "screens up" at meetings, all the trappings of a productive life may convince us that we really are productive.

But what about stopping to think? One advantage of sprezzatura is that it includes having long conversations with others - conversations without an agenda. Sprezzatura means we breathe between sentences, allow silence in the conversation, and listen long enough to hear. Do that enough, and you think - not just of what you must consider, but also of what you may have never considered. We could use more of that in the valley these days.

Remedy? Well, you could spend some time in Croatia. No, seriously. Croatians have a wonderful habit of spending hours on end relaxing and talking at coffee houses. I love that the Croatian phrase ajmo na kavu - "let's go for coffee" - means taking time to talk and think with others. Talking and thinking can lead to many creative things - maybe even some productive things. So what looks on the surface like sprezzatura may be more important than you realize, and it cannot be had if you're driving through Starbucks.


In your spare time, read the book in which the term sprezzatura was coined.

Sunday, April 30, 2017

The Partnership Fallacy

Remember doing group projects in school? Typically your teacher would partner you up with students who had different skills. Maybe a good writer would be matched with someone who knew math, for instance. In theory, you would each teach the other; the group project would get done, and everyone would learn. But what really happened? Pressed for time, at some point everyone in the group realized that the job needed to get done. And if you wanted to do well, this meant dividing labor. You broke the project down into parts, as in the well-known "divide and conquer" approach, and each of you took on the part that you understood best. In the end, the project was a success – but the math person did even less writing, and the writer again avoided math. So goes the partnership fallacy: We partner up hoping to improve our weaknesses, only to divide labor and so make our weaknesses even worse.
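For readers curious about the computing reference: "divide and conquer" is a standard algorithmic pattern. A minimal sketch (a classic merge sort, offered purely as an illustration - it is not from the original post) splits the job, solves each piece separately, and combines the results:

```python
def merge_sort(items):
    """Divide and conquer: split the list, sort each half, merge the results."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])   # each half is solved on its own
    right = merge_sort(items[mid:])
    # Combine the separately solved parts into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```

Note the parallel to the group project: each recursive call handles only its own piece, and the pieces meet only at the merge step - efficient for getting the job done, but no half ever learns what the other half did.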


Business leaders often fall victim to the partnership fallacy. For example, it may be hard to believe today, but in the early 1990s Apple Computer (as it was then called) was widely known to be bad at making small devices. The company’s earliest attempts to make small computers had flopped. Meanwhile, Sony stood out as the worldwide leader in making cool, miniaturized electronics. So Apple’s leaders decided to form a partnership with Sony. The idea was for Apple to learn about miniaturization while producing a laptop computer. History says that the alliance succeeded, because it created the first “PowerBook” computers. But in terms of learning, the alliance was a failure. Case studies at the time reported that deadlines kicked in and pushed the two companies to each stay focused on what they did best. In particular, Sony took care of miniaturization, with very little day-to-day contact between Sony and Apple engineers. In the end, they got the job done – but Apple emerged without learning Sony’s miniaturization magic.



What’s more, partnerships often fail even to accomplish their stated goal, never mind the learning. If you have much experience, you can probably name an example or two of failed partnerships. Business leaders typically allow these arrangements to fizzle out without much fanfare. Combining firms through mergers and acquisitions suffers this problem, too. We tell a good story about “synergy” and “learning” and “1+1>2”, but the evidence shows that combining firms typically increases only the variance in performance (not the level of performance promised by “synergies”).

The lesson: Partnerships are not a substitute for learning by doing. Partnerships sound great during a planning session. After all, who can argue with the idea of bringing in somebody who already knows what you need to learn? But ask yourself, how did they come to know?  Odds are, they learned the old-fashioned way: by doing. Effective leaders understand how organizations learn, so they avoid the partnership fallacy. Partner if you must, but don’t fool yourself that this is a great way for your organization to learn.


For a thorough academic study highlighting the value of organic learning and growth, read my book on competition.

Saturday, April 15, 2017

Differing without Dividing

Variety is great for innovation. For instance, consider the case of Seymour Cray, the “father of the supercomputer.” In the 1970s, Cray left Control Data to start Cray Research, a company devoted to creating the world’s fastest computer. Cray approached the problem with a revolutionary architecture, so-called “vector processing.” By 1976 he and his team had introduced the Cray 1, and Cray Research was seen as the Mecca of high-speed computing. John Rollwagen became the company’s president in 1977, bringing business leadership alongside Cray’s technological prowess.


In 1979, Rollwagen brought in another technology genius, Steve Chen, to lead the design of a completely different approach to supercomputing. So as Seymour Cray’s team worked on the Cray 2, Chen’s team worked on the Cray X-MP. Chen’s design built on Cray’s initial innovation, but did so using a revolutionary architecture featuring multiple processors operating in parallel. Released in 1982, the X-MP set a new standard for supercomputer performance, and significantly raised the bar for the team working on the Cray 2.


When we do not know what the future holds, variety helps our organization discover what is possible. This truth is one reason we so often hear people say that they want to increase the diversity of their employees. Just like the biosphere, organizations evolve better if they sustain variety.

Yet examples like Cray’s and Chen’s are rare. One reason is that sustaining variety is expensive: how inefficient to run multiple projects all trying to do the same thing. But another, bigger problem is that sustaining variety threatens to divide a company. People object to having others in their company working at cross purposes. How can we encourage differences without being divisive?

One way is to live by the adage “disagree and commit.” Here in Silicon Valley people attribute the saying to Intel. The idea is that you should encourage disagreement during the decision-making process, in order to improve the quality of your decisions. But once a decision is made, everybody needs to fully commit to its implementation. Unfortunately, in practice this saying often is used to silence those who see things differently. Often managers say “disagree and commit,” but they are really saying “disagree and shut up.”


I prefer “switch and commit.” The goal is still to end up committing at the end of the process, but during the decision I want the participants to switch roles. The person disagreeing with you needs to take your position and argue it well. Similarly, you must argue the other’s view well. You can think of the approach as devil’s advocacy taken seriously by both sides.

I first tried “switch and commit” when teaching a controversial topic here at Stanford. For the first assignment, the students had to state their position on the topic. For the second, bigger assignment, they had to write an essay taking the opposite view. (They did not hear about the second assignment until after they had handed in the first.) The end results were some fantastic essays, because the authors were genuinely skeptical.

Since then, I have tried “switch and commit” when facilitating hard-hitting business meetings among top managers. The results have been mixed. Many people cannot get their head around a different perspective. But now and then you find an exceptional leader who appreciates the value of differing without dividing.


A readable review of related academic work is Scott Page’s book The Difference.

Thursday, March 30, 2017

On VCs, Clairvoyants, and Magicians

The guy at the next table looks out at the amber sunset, puffs up with gravitas, and announces to his wide-eyed friend “soon all things will be connected seamlessly to the ubiquitous network." Time to change tables. I grab my drink and set off to find an area outside of Houdini’s vocal range. The bar at the Rosewood is cursed by its reputation as the place where VCs from Sand Hill Road meet, so it attracts posers playing the prophet like LA attracts actors.

Coincidentally, on my way over to the Rosewood, I saw along El Camino Real a hand-painted sign saying “clairvoyant conference” with an arrow pointing to a hotel. Imagine a conference for clairvoyants! You would not need sessions on “future trends,” since the attendees would already know. And why would they need a sign giving directions, for that matter?

Since time began, it seems, people have wanted to believe that some of us have a special knowledge of what is to come. So we’re vulnerable to those who claim such knowledge - though different people fall for different images of the prophet. You might mock the trappings of the village shaman, the tarot reader, and the astrologer – but I bet you’d clear your calendar to hear the latest word from the valley’s richest VCs.

I’m not just waxing cynical. It turns out that venture capitalists typically are bad at telling the future. As an industry, VCs don’t perform that well financially. To find VCs who outperform the market, you have to selectively sample only the most successful ones - but that is true for slot machines, too. OK, a minority of VCs do tend to appear repeatedly on the winning side, so perhaps they are the real Houdinis. Maybe. Or maybe having been successful they end up getting preferential access to things that continue to make them successful. (I appreciate that advantage, working at Stanford.)

Whatever the reason, the fact that at least some VCs are repeatedly successful has created the mystique that there does exist, somewhere along Sand Hill Road, somebody who does know what’s next. Hence all the puffery at the Rosewood. How are we to know that he’s not really a visionary?

Professor Elizabeth Pontikes and I took a careful look at this question. We collected data on thousands of firms in the software business and looked at their fates over time – including both successes and failures in the data. We found that these firms herd into “hot” markets that have been blessed by the VCs. But we also found that the VCs herd into markets too, following each other in financing frenzies. The resulting hype cycles lead to bad outcomes for companies; firms getting funded in these waves are the least likely to ultimately succeed by going public. So much for Houdini.

Perhaps even more interesting, the VCs themselves seem to be aware of this problem. While VCs herd into hot markets, at the same time they try to avoid investing in firms that do so. The VCs prefer instead to invest in those who pioneered what is now a hot market (and survived). As a kid I had a precocious classmate who would jump to the front when he saw a trend, pointing resolutely forward in Napoleonic fashion, proclaiming “follow me!” I’ll have to check; perhaps he grew up to be a VC.

Remember, the most disruptive changes are not predicted by our experts. They are pioneered by those foolish enough to ignore the consensus. Be skeptical of those who claim to know what's next.


For the research behind this article, see my paper with Elizabeth Pontikes.

Wednesday, March 15, 2017

Define the Game

"Change the game," you'll hear people say. In fact, we often define the game we play when we choose how to play it. But only some of us realize this fact. I was reminded of this lesson in Moscow by the French chess grandmaster Joel Lautier.

Joel and the rest of the team had to solve the problem before morning. Vladimir Kramnik would sleep, of course. He would need to be rested fully, and even then it would be a long shot to beat Garry Kasparov. The dominant world chess champion was unlikely to lose. But the team had an idea. Even if it worked just once - there were many games in the match - it might open up the smallest of chances.



The team were all whiz kids, each a chess master in his own right. No one person could fully prepare alone for the many possibilities of a match at this level. So each member of Kramnik’s team of trainers was assigned to work out the best solution to a particular situation that might arise. By morning, each team member needed to have solved his problem and prepared the result of his analysis for presentation to Kramnik himself, who would scan the analysis and commit it to his marvelous mind. Few words were needed.

Joel Lautier was on the team because he is one of the few ever to beat Kasparov. Lautier’s job was key: formulate Kramnik’s best “black” opening. Many readers will know the most famous chess openings. But in fact there are many more possible openings than the common ones, and by its nature chess allows for the invention of new openings to this day. Yet in such an old and storied game, the chances of inventing an effective new opening are not great. That was Lautier’s task.


What made Lautier’s job especially important was that Kasparov was exceptionally good on the attack. Key for Kramnik would be to find a way to improve his chances in the games where he started as “black” - moving second - since those games would favor Kasparov’s attacking ability. And he would need to be able to repeat the opening over several games as black, a tough job once the opening was revealed.

Working into the night, Lautier formulated a risky approach for Kramnik. The opening was not ideal, but it might work given Kasparov’s strengths relative to Kramnik’s. It involved an odd series of moves, quickly leading to a trade of queens. When complete, the strategy - a revival of the old "Berlin" defense - left Kasparov, as white, with a slight positional advantage (his pieces in slightly better places on the board). But the opening hurt Kasparov too, by skipping the complicated “middlegame” where Kasparov famously had an advantage. The strategy worked. The opening shifted the edge enough for Kramnik to draw games that he would otherwise have lost.

Several years later, when I was lecturing in Moscow, Joel Lautier was in the audience and he asked me this question: “There may be many possible winning business strategies. How do we know which is best?” The answer comes from Mr. Lautier's example. The best strategy plays to your strengths, and away from the other’s. Don't just play the game as defined; define the game you play.


Research on "metacompetition," defining the game you play, appears in my book on Red Queen competition.

Tuesday, February 28, 2017

Why You Should Turn Down That Well-Paying Job

I remember being young and broke, going to an interview for an internal auditor job at a bank. The bankers who interviewed me were enthusiastic; they were authentic bankers. I did my best to pose as a banker, but as often happens to posers I was found out. The bankers asked me for a “writing sample.” I showed them my poetry. It was not to be.

Some people become accountants because they want a secure job. There is much to be said for pragmatism; better to be an employed accountant than a wanna-be actor. But is that the right comparison? Here is the issue: Somewhere tonight, maybe around 3 AM, some guy will be lying awake thinking about accounting. He lives and breathes accounting; it occupies his thinking even in his spare time. If things get competitive for accountants, he is going to dominate. His rivals are acting like accountants; he is the real thing.

Competitive advantage goes to the authentic. Their vocation is also their avocation. They would do it, if need be, without pay. The authentic persist through the tough parts of their vocation. They ponder it during the quiet times, so the magic of insight makes their work more creative. The gardener out on a cold morning; the writer typing away when she should be sleeping. Such people will take their vocation as far as it can be taken. By contrast, those who merely pose will not. As the old adage goes, “you cannot coach passion”; no amount of posturing can outdo authentic dedication.

The practical reader is objecting at this point, noting that there is a big economic difference between being an authentic accountant and an authentic writer. This contrast brings to mind a corollary adage: “Every person has a special gift.” As each of us grew up, we searched for that gift – the activity that seemed authentic to us. Our parents hoped that it might also be an activity that pays. How fortunate is the authentic accountant! His passion lines up well with economic gain. Meanwhile there goes the authentic musician, waiting on the accountant as he dines. Don’t get me wrong; I love the arts and admire the authentic artists. But though all of us may have a gift, some gifts pay better than others.


Does this mean that only some of us can follow our calling? That depends on how long we search. My failed attempt to be a banker left me still broke, but the upside was that I kept searching. Life is a “sequential search” process. We search, one by one, trying to match our gifts with the opportunities of the world. When we settle on an occupation, we also stop our search. If we stop the search at the first pragmatic job, then we are posing - and will surely be out-competed by the authentic. But if we keep searching, we increase the chances of matching our gifts with opportunity. Of course, not all jobs pay the same. But better to keep searching for a way to remain authentic than to settle early for mediocrity. Search enables authenticity.
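The sequential-search idea can be made concrete with a toy simulation. In this sketch (the thresholds, number of opportunities, and uniform match qualities are all my own illustrative assumptions, not from the post), a searcher examines opportunities one at a time and settles on the first whose match quality clears a personal threshold. The more demanding searcher searches longer but ends up with a better match on average:

```python
import random

def search(threshold, n_opportunities=50, rng=random):
    """Examine opportunities one by one; settle on the first match whose
    quality meets the threshold (or the last one examined)."""
    match = 0.0
    for _ in range(n_opportunities):
        match = rng.random()  # match quality drawn uniformly from [0, 1)
        if match >= threshold:
            break  # settling on an occupation stops the search
    return match

rng = random.Random(7)  # fixed seed so the comparison is reproducible
trials = 5000
settle_early = sum(search(0.5, rng=rng) for _ in range(trials)) / trials
keep_looking = sum(search(0.9, rng=rng) for _ in range(trials)) / trials
print(settle_early, keep_looking)
```

With a threshold of 0.5, the average match lands near 0.75; holding out for 0.9 pushes it near 0.95. The demanding searcher endures more "failures" along the way, but each failure just means the search continues.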

The lesson: Ask “what do you do well?” and then search to see how that ability fits the opportunities of the world. You will have failures along the way if your search is thorough. But the upside of each failure is that you'll be required to keep searching, again increasing your chances of finding a match between your passion and the opportunities of the world.

More dangerous than failure is that you might, early on, score a well-paying job for which you are not authentic. Turn it down. Search enables authenticity.


For an academic treatment of the sequential search strategy, see Levinthal and March’s paper.

Wednesday, February 15, 2017

Learning without Logic

After Napster was shut down in 2001, the brand was reborn in 2003 as a subscription online-music service run by Roxio’s Chris Gorog. Chris and his team quickly amassed a large catalog of songs, enabled radio streaming, established partnerships with online platforms like Yahoo, built an entrepreneurial organization, and expanded internationally. As record stores became history, Apple’s iTunes, illegal music downloads, and a few subscription services like Napster offered different visions of the future. But by 2005 the verdict was in. Illegal downloads continued apace, iTunes was a clear success, and subscription services were not. As one Washington Post writer put it in 2005, Napster’s subscription model was not a viable alternative to music ownership: “When music is good, you want to know that it can’t be taken away from you.” The final nail was Steve Jobs' declaration: "Nobody wants to rent their music." The experiment had been run, and the music ownership model had beaten subscription services.

But wait. With the explosive growth of services like Pandora and Spotify, the pundits are now saying that subscription models are the future. Even Apple has launched such a service. What about the lesson we learned from the failures of just a few years ago?

The problem here is that a failure is a datum, not a logical argument. Data do not speak for themselves. Failures can have various causes, and so it takes logical reasoning to explain why failures happen. Perhaps the early subscription services were ahead of their time, such that limited bandwidth might have made them less attractive than they are today. Or maybe the smartphone is a necessary complement to such services.  Whatever the diagnosis, logic is required to sort out why firms succeed and fail.

Unfortunately, most observers skip the logic part. It is mentally easier to jump to the “obvious” conclusion: If the business failed, the business model must be wrong. Full stop. You can easily tell when this skip happens: the person will name an example as if it were a reason. Is online grocery delivery a viable model? No: Webvan. Is internet search a viable business? No: AltaVista. These examples are data, not logical reasoning. But it is hard to rebut those who argue by citing examples, because you look the fool trying to say that a failure somehow might have made sense. As in Gerald Grow’s cartoon, we replace reasoning with dueling examples: I shout “iTunes!” and you reply “Spotify!”


The result? We often “learn” without logic, and so we often walk away from great ideas. The Apple Newton failed, leading many to say that there was no market for smart handheld devices - yet now we all own them. Early attempts at remote alarm systems failed, leading many to conclude that such services could not be profitable; now they are commonplace. Even internet search, possibly the most lucrative business in history, was initially panned after a spate of failures among early movers – Lycos, AltaVista, Excite, and others. Firms often fail. But that may not mean, logically, that we should abandon their business models entirely.

To diagnose well, we need to systematically contrast failures with successes - as is done in good academic research. The popular maxim “fail fast and cheap,” A/B testing, agile development, root-cause analysis, and similar approaches are designed to show us successes and failures without destroying the firm. These techniques are routinely used in Silicon Valley firms these days, and are making their way into the global business lexicon. Sometimes such techniques are very effective for learning. But keep in mind that they simply provide us with data. It is up to us to explain the data, and that requires logic.


The academic research on this topic can be found in the research of Jerker Denrell.

Monday, January 30, 2017

The Truth about Hiding from the Truth

If you find an old-timer at the Dulzura Cafe, ask him about Bulldozer man and his fence. He was old back when I was young, and like many in this rural California outpost near the Mexican border, he used his acreage as he saw fit. Many of us shot skeet; some just left the sagebrush alone and enjoyed the isolation. Bulldozer man owned a big old Caterpillar bulldozer, and he spent his time moving mounds of dirt hither and yon.

Now, about the fence. True story. It happened in 1975, just outside of Dulzura. One fine spring day, Bulldozer man visited his neighbor, an affable, transplanted New Yorker who had gone native, complete with horses, boots, and plenty of Coors. Bulldozer man's proposition was that the two of them share the cost of a fence that he was willing to build. Indeed, he had already begun digging large post holes along the property line. But the affable Coors drinker saw no need to break up the beauty of the countryside with a fence. Bulldozer man was enraged, especially since he had already started digging. He stormed off, shouting something about how the fence would be all his. Soon the measure of the man became public for all to see: Bulldozer man decided to make the fence "his." He set it back a full 10 yards, so that it was clearly and completely on his own side of the property line, effectively giving up hundreds of square yards of real estate to our affable Coors drinker. Many a Coors has been raised in thanks to this dimwitted neighbor in the years since.

So it is that often when we look out for #1, we end up doing more harm to ourselves than good. Same goes for public policies meant to protect domestic jobs and economic vitality. Truth is, when our companies have to compete, it does them good. You don't get good at anything by hiding away. (Think of how you shop for schools for your kids. You certainly don't look for a place where they can perform as poorly as possible and get away with it. You probably look for the best school, and do everything you can to encourage them to compete.)

Same with companies. Faced with competition from other countries, domestic companies either improve their performance or fail. There is plenty of evidence to back up this claim. Especially notable is a recent paper by Stanford economist Nick Bloom and his colleagues. They found that when Chinese imports increased as a result of that country's entry into the WTO, the impact on firms in other countries was profound. Those firms picked up their game, often innovating much more in order to compete. The firms that did not pick up their game lost business, of course. But in the end, having to deal with competition from places like China turns out to be a big reason we have vibrant firms in today's economy.

Tough talk may sound good, but it does not make you a winner. And, for folks like Bulldozer man, bluster provides cover for downright stupid, self-destructive actions. When all the tough talk is done, you become competitive by competing. Hide from that truth if you wish, but the person you're hurting is yourself.


Read the research on this by Bloom, Draca and Van Reenen.

Sunday, January 15, 2017

Why You Don't Understand "Disruption"

Been to the "Disrupt" conference? Self-proclaimed "disruptors" gather to reach consensus about the non-consensus ideas out there.

Bigwigs holding a conference on disruption is like the Czar creating a bureau on revolutionary thinking. Really want to see disruption? Don't go to a conference. Go to where people are breaking the rules.

If you just smiled, then you are probably from a small startup (or wish you were), and you know that disruptions come from startups that break the rules of the game.

For example, consider this idea from a small team of rule breakers: Provide a way to instantly share digital photographs with others anywhere on earth - but only with those who you want to see the photo.

You are thinking Instagram, the tiny company acquired in 2012 by Facebook for $1 billion.

Wrong.

I'm describing a project launched in 1996 - that's right, 1996 - by a group at Kodak's Brazil headquarters in São Paulo. (Yes, Kodak - everybody's favorite example of a company that failed by being too slow to innovate.) Kodak's country head for Brazil, Jarbas Mendes, and his team were trying to find innovative ways to help customers share their digital photographs. The team understood that the internet - brand new at the time - could enable such sharing. So they designed a system where one could upload photographs to a server in the cloud (though nobody yet used the term "cloud"), and send a code to another person, who could then view the photographs. "The technological possibility of having an online way to view pictures was the idea. There was a lot of work by the team on this approach to sharing," recalls Joao Ciaco, who was in a marketing role on the team at the time.

What we now call Instagram was invented by Kodak in 1996 - 16 years before Instagram would be acquired for a billion.

How can this be? After all, we often hear that big, established firms are slow to innovate, and so they get disrupted by new technologies. As the story goes, success at a well-honed strategy leaves companies blind to the value of new technologies until it is too late. If this is how you understand disruption, you believe in the slow-incumbent myth.

It turns out, Kodak is not a strange exception. Often big, established firms do a great job of rapidly adopting new technologies. With success, leaders are often more willing to innovate – even when such innovations are out-of-step with their traditional organizations. And therein lies the problem: “success bias”.  We misread our success at one game, and so readily launch into another – whether our organization is suited for that business or not.

Looking again at Kodak, it was the first mover in digital cameras, and it held an early lead in that market. (See the new paper on the digital camera revolution by Jesper Sørensen and Mi Feng.) Kodak even made the digital cameras sold by other firms trying to be in that market. The problem was not Kodak’s ability to innovate. At work was the poor fit of its organization to the logic of the digital business.  If anything, Kodak was too willing to innovate given its organization.

Same with minicomputer firms like DEC. They are often criticized for resisting disruption. We know that the personal computer cut the legs out from under the market for minicomputers (powerful mid-range computers and servers) starting in the 1980s. At that time, the cutting edge of the computer industry – the real “hackers” – were minicomputer manufacturers like Data General and DEC, which flourished from the 1960s through the 1980s. They were scrappy, imaginative rebels compared to the monoliths of the mainframe computer business. The secret to their success was imaginative design, since they relied on the architecture of the entire system for performance. And, as Tracy Kidder romanticized in his book The Soul of a New Machine, they were passionate about getting products out into the market. That book documented the tale of the cult-like Data General and its creation of the Eclipse MV/8000 minicomputer, which launched in 1980.

Technology writers, decades later, would describe these innovative firms as unable to change.  The slow-incumbent myth: These successful, established firms did not see the microcomputer coming, since they were wed to the technologies and designs of the old market that they knew well.

Not true.

The real story is that the most successful minicomputer companies made the transition to the personal computer very quickly – but once there they were ill-suited organizationally. Success bias was at work yet again. For instance, Data General released its first microcomputer in 1981 – the same year as IBM. And DEC – another legendary champion of the minicomputer era – entered with the “Rainbow” in 1982. These fast-moving firms had no problem innovating. They could and did. Their problem was that everything else about their organizations was well tuned to their traditional market. They innovated in the PC market very quickly, and then they failed there at a very high rate.

We want to believe in the slow-incumbent myth, so we dismiss the early moves by incumbents as half-hearted. But look again at the evidence. Successful incumbents are often very innovative – too innovative for their own good. What is going on in these cases is success bias. When business leaders win, they infer from victory an exaggerated sense of their own ability to win.  So they are overly eager to enter into new competitions – even ones where they are not well suited to play. Their very success in the earlier business is evidence that they are well-honed to an earlier strategy - yet it is that earlier success that makes them especially willing to move into the new competition.

The lesson for leaders? Disruption is not just about technology changing; it is about changing the logic of a business. Success with a new technology requires organizing for a new logic, and organizing in new ways requires that you forget the successes of your past.


The theory behind success bias among managers is in this paper by Jerker Denrell, and evidence linking success bias with failure is in my paper with Elizabeth Pontikes.

Saturday, December 31, 2016

Leading Truth or Denying Reality?

How we talk about a fact shapes its meaning. Was that new product a failure, or did we just come down the learning curve? Did your career just take a hit, or are you pivoting into a promising new future? Facts are ambiguous, and so how we describe facts helps make sense out of them. One person's rebel is another's traitor, and a well-told story (think Hamilton) will tip the balance.

Of course we all know that effective leaders know how to use narratives - stories that give meaning to facts. But the use of narratives can be either very good - or very bad - for the future. It all depends on whether a narrative leads, or denies, the truth.

In some cases, narratives are used to lead the truth. While the major carmakers were spinning narratives against California's zero-emission vehicle mandate back in 2003, Elon Musk and his colleagues created Tesla, a counter-narrative that has changed the truth about electric cars. Because multiple futures are possible, those who shape what we regard to be possible change our efforts to make those possibilities real. So it is that a narrative can lead the truth.

Leading the truth is not just "spin." Spin is about putting a good face on bad facts. In contrast, leading the truth creates reality by helping us see what is possible. The greatest leaders in history are important not for what they created, but for what they helped others to see as possible, and so create. Once created, what was once considered impossible is then seen, rightly, as the truth. In this way, great leaders lead the truth.

Alternatively, other leaders use narratives to deny realities that we wish were not true. How convenient it would be if man-made climate change were not real. Psychologists tell us that we are prone to believe what we wish were true, even if this requires denying reality - as we see with science deniers confronted by the looming reality of climate change.

Time corrects such wishful thinking, of course, as the facts come to be undeniable at some point. But in the meantime, nefarious leaders take advantage of our desire to deny reality by spinning narratives that play to our weaknesses. When I was young, I recall hearing leaders spin narratives to deny reality: cigarettes are not really bad for you; the US was winning in Vietnam; climate change is a hoax.

Time will tell, of course. In time, we'll look back and know that some leaders were visionary - they used narratives to lead the truth. Others will be shown to have been reality deniers, and history will judge them severely. The problem, of course, is the damage they do in the meantime.


To dive into the large academic literature on narratives and counter-narratives, you might start with the work of Michael Bamberg and his colleagues.

Thursday, December 15, 2016

Leading by Design

In 1993 the software startup FITS moved from Toulouse, France, to Scotts Valley, California (a town at the edge of Silicon Valley). Its founder, Bruno Delean, had invented a radically new approach to editing photographic images – an important breakthrough given the hardware limitations of the time. Delean’s team worked tirelessly, aiming to get the software in shape for a big demo at Silicon Graphics, at that time a powerhouse in computer-aided design. The team worked day and night, stopping now and then just to eat. They were not paid that well, nor were they working for a noble cause; it was just software. They worked because they wanted to do a good job for Delean, who was a legend among French software developers. Delean himself had done most of the coding, and was there working side-by-side with the team, always available when a tough call had to be made. He led in the most compelling, direct, and personal way possible: by example.


Just up the street on the other side of town, Steve Luczo was re-creating what was to become one of the most innovative companies on earth: Seagate Technology. Steve had taken over the helm at Seagate, and began to turn the perennially late-to-market disk drive company into the industry’s unrivaled leader. He changed the organization’s structures, routines, and culture dramatically, creating within Seagate a global development platform that brought improved technologies to customers sooner and more profitably than any firm in the industry’s history. His people worked tirelessly, and mastered the science of rapid product development. Innovation at Seagate became routine, and the company transformed storage technology as we know it. Steve Luczo led this company, but not like Bruno Delean. Steve Luczo led by design.


Leading by example shows the way, but leading by design creates a system that discovers the way. Those who lead by example are authentic, because they put their words into action – like Delean. But they are limited to what they know, and what they can do. Delean’s firm ultimately was limited to what he could imagine, and so no longer exists today. Those who lead by design do not invent, nor are they involved in the specific decisions to get the job done. Like Luczo, their names are not on patents. Instead, they build the culture, routines, and structures within which others can flourish. Done well, such leadership creates an organization that takes us places we never imagined. Seagate’s innovations were not foretold by Luczo, but they were created by the organization that he put in place. When you lead by design your job is not to know the future, but to create an organization that discovers the future.

Leading by design is especially effective in changing times, because when times are changing it is difficult for any one person to know what is next. In fact, our successful leaders typically do not have a very good track record when it comes to predicting the future. For starters, their very own success likely came about in ways that they, themselves, did not expect when they were starting out. (That is true for Google, Facebook, and Apple, for instance.) And if you look back on the predictions made (and not made) by our luminaries at any point in time, the track record is unimpressive. In 1992, for instance, virtually no leaders in the technology world were predicting that the World Wide Web would soon explode onto the scene. Like a clairvoyant caught in an avalanche, somehow our technology leaders failed to see the World Wide Web coming. Look back at what the experts were saying before many of our most profound innovations, from the microcomputer to wireless telecommunications, and you’ll find they were typically off the mark. But when we lead by design, we do not pretend to know what is next. Instead, we create an organization designed to discover possibilities that we never dreamed of.


The classic academic treatment of these ideas is in Selznick's book on leadership.

Wednesday, November 30, 2016

Bake Your Own Pie

Recently I was lecturing a group of high-level Chinese executives, when one asked me: “What do you think of plagiaristic innovation?” Before I could answer, he went on to explain that for China to “catch up,” he felt it needs to have innovation of any kind – even what he called "plagiaristic" innovation.

Don’t worry. I’m not about to rehearse the well-worn arguments about the protection of intellectual property: incentives for continued innovation, just rewards for investors who back authentic creativity, quality guarantees for consumers of branded products, and the like. Nor am I going down the “information must be free” path – indignantly advocating “free as in free speech (not free beer),” “stick it to the man” (the artist is not getting the payments anyway), or the “hackers’ code of ethics.” No, here I’m talking about something else.

My point here is about what “innovation” means. Debates about intellectual property, stealing, and plagiarism are all about who owns the pie. That question is very important, and is obscured when patent “trolls” flood the system with complaints, or when plagiarists masquerade as innovators. But another important point often gets lost in the fray:

Innovation is not about fighting over the pie; it is about baking a new pie.

For example, hybrid vehicles hit the worldwide market starting in 1999 and 2000, and within a few years an echo of patent litigation followed – escalating in 2003. The big car makers battled over who invented what, sometimes with each other and sometimes with small firms, everyone claiming a piece of the pie. Meanwhile, also in 2003 but with far less fanfare, Elon Musk and his team of co-founders created Tesla, the forward-looking innovator that has changed the game in the automobile industry. The noisy pie fights in 2003 were over hybrids; the profound innovations of 2003 were quietly happening at Tesla.

Pie fights extend to all walks of business life, not just battles over intellectual property. For instance, the so-called “browser wars” between Netscape and Microsoft were at their peak in 1998, following Microsoft’s integration of its Internet Explorer browser into its ubiquitous operating system. Advocates of competition howled, and defenders of Microsoft replied with talk of “seamless technology” and “complementarities”. Also in 1998, but unknown to most people at the time, PhD students Larry Page and Sergey Brin created Google – the company that would change the game so thoroughly that we would soon forget about those early browser wars. The noisy pie fights of 1998 were the browser wars; the great innovation of 1998 was quietly taking shape at the newborn Google.

"Wait," you are saying. "After innovating, innovators need to defend their creation." Of course. Take Qualcomm, for example. That company has an unparalleled track record of continued innovation in wireless technology. As a result, its intellectual property has turned out to be extremely valuable. It has of course defended that property against plagiarists; it owes that to its shareholders. But Qualcomm transformed its industry by innovating - never mistaking defending IP for creating new, valuable technologies.

All around us, we see real innovators at the cutting edge of knowledge. Have a conversation with my son Burton Barnett, pictured here doing science, and you won't hear about pie fights; you'll hear about amazing new developments in immunology. And similar developments are happening worldwide - in China, Europe, India, the Americas - everywhere forward-looking people are creating new knowledge. This process of innovation is key to our collective future, and it has little to do with plagiarism or pie fighting.

The lesson to innovators: Pie fights are important; we all deserve our piece of the pie. And of course even true innovators often must fight off plagiarists. But being good at pie fighting does not make you good at innovating. Innovation means baking a new pie. 

The lesson to plagiarists: Want to create something useful? Leave the other guy’s pie alone and learn to bake.


Research on the uniqueness of innovators appears in the work of Lee Fleming and Olav Sorenson, among others.

Tuesday, November 15, 2016

The Time-to-Market Strategy

In many industries, products are profitable only during a limited window of time. We see time-sensitive products, of course, in consumer electronics, where new models of phones, computers, and home entertainment products come (and go) frequently. We also see this pattern in fashion markets. Zara, for instance, introduces new clothing products twice weekly across nearly 2,000 stores – amounting to the introduction (and retirement) of over 10,000 designs a year. Timing is key, as well, in the introduction and short shelf life of cultural products, such as popular music and films. The market for video games follows a similar pattern of rapid new-product introduction and short product lives, including flash-in-the-pan hits like Pokemon Go as well as the more cadenced, but still time-sensitive, annual replacements of the various EA Sports games. Innovative pharmaceutical products also compete in a time-sensitive way, since the first to patent enjoys a profit premium that ends abruptly with patent expiration. Consequently, drug companies are in a race against time, first to file the patent, and then to bring the drug to market. Even some durable products, such as automobiles, are introduced and retired on a time-sensitive, if cadenced, basis.

Time-sensitive markets can be identified by the presence of two underlying features. First, these markets place a premium on introducing a product early – or at least on a reliable schedule where being late is punished. For instance, releasing a new version of a gaming console like Xbox or PlayStation late – after the holiday season – would be unthinkably costly for Microsoft or Sony. Second, the products in these markets also have a limited shelf life. A new clothing style or a new hit song may be wildly popular today, but within weeks (or maybe days) it will be yesterday’s news.

In such time-sensitive markets, success depends on a distinct logic: the time-to-market strategy. To understand this strategy, consider a firm that introduces a successful new product, “Product 1,” as in the plot below. Initially the product takes off slowly, and then it catches on, finally reaching a maximum market size. This “S-shaped” diffusion curve is typical of successful products.
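
That S-shaped diffusion is commonly modeled with a logistic function. Here is a minimal sketch in Python; the market size, growth rate, and takeoff midpoint are illustrative assumptions, not data from any particular product:

```python
import math

def adoption(t, market_size=1000.0, growth=1.5, midpoint=4.0):
    """Cumulative adopters of Product 1 at time t (an S-shaped logistic curve)."""
    return market_size / (1.0 + math.exp(-growth * (t - midpoint)))

# Slow takeoff, rapid growth around the midpoint, then saturation
# as the product approaches its maximum market size.
for t in range(0, 9, 2):
    print(t, round(adoption(t)))
```
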

The story does not end there, however, because this company is facing a market where the shelf life of a product is limited in time. Consequently, it is preparing to introduce a second product, “Product 2,” that will compete directly with its own Product 1. Note that if the firm does not introduce Product 2, somebody else will. That is what compels the company to continue with the introduction of another product even though it will cannibalize its first product, as shown here.

Now focusing on Product 1, we can see that there is a limited window of time during which the firm can make money selling the product. It is interesting to think about the price that can be charged for Product 1 over the life of the product. Typically, the price that can be charged will be much higher earlier in the product’s life, for two reasons. First, the earliest buyers of the newly introduced product will be those with a greater willingness to pay for the product. For instance, if Intel comes to market with a new chip that is extremely valuable to cloud service providers running state-of-the-art server farms, these eager buyers will be among the first to buy this new chip, and they have a high willingness to pay. Less enthusiastic buyers will also buy at some point, but only if the price falls. Second, the price will begin to fall over time as other competitors come out with a product that is a direct rival to Product 1. For instance, perhaps AMD will introduce a competitor to Intel’s new chip. As the price falls, more and more chips are sold as buyers with lower willingness to pay come into the market. Ultimately, once Product 2 is on the market, the price for Product 1 falls away completely. This pricing dynamic is pictured below.

At this point, the logic of a time-to-market strategy is clear. If we introduce Product 1 as shown, our firm will make a great deal of revenue. To see this, look at both the price and sales volume curves in the plot above. Revenue from Product 1 is found by multiplying these two curves, and total revenue over the life of the product is just the cumulative revenue over time. However, what if we are another firm and we release a competitor to Product 1 – but we do so late? A competitor entering near the end of Product 1’s life may sell at high volumes, but only for a short time and only during the period when the price of the product is very low. This firm will have introduced a product that, over its life, will make very little cumulative revenue. So the earlier a firm enters into this competition, the more it makes in cumulative revenue. In fact, entering earlier increases total cumulative revenue at an increasing rate.
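
This revenue arithmetic can be sketched in a few lines of Python: per-period revenue is price times volume, and cumulative revenue sums those products from a firm's entry until the end of the product's life. The specific curves below – a linear price decline, a logistic volume ramp, an eight-period life – are illustrative assumptions only:

```python
import math

LIFE = 8  # assumed shelf life of Product 1, in periods

def price(t):
    """Illustrative price path: starts high, declines to zero by end of life."""
    return max(0.0, 100.0 * (1.0 - t / LIFE))

def volume(t):
    """Illustrative S-shaped sales volume for the product."""
    return 1000.0 / (1.0 + math.exp(-1.5 * (t - 3.0)))

def cumulative_revenue(entry_period):
    """Per-period revenue (price times volume), summed from entry to end of life."""
    return sum(price(t) * volume(t) for t in range(entry_period, LIFE))

early_entrant = cumulative_revenue(0)
late_entrant = cumulative_revenue(6)  # sells at high volumes, but only at low prices
```

With these made-up curves, the early entrant earns several times the cumulative revenue of the late one, even though the late entrant sells into peak volumes – the late-entry penalty described above.
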

By this logic, it is obvious that we should release Product 1 at the soonest possible date. However, this may not be profitable. To see this, consider now the costs of developing Product 1. If we could take all the time we wanted, we could carefully research and develop Product 1 and, when it was ready, we would have run up total costs equal to C1. But by taking our time, we might end up introducing the product late, and this would hurt our revenue. So instead of paying C1 over a long period of development, say, 2 years, what if we accelerate development and pay the same amount but all in one year – or even in six months?

Well, here is the bad news. It turns out that compressing development costs into a short period does not give you the same result. This problem, famously dubbed the “mythical man-month” by Frederick Brooks, occurs for two reasons. First, compressing the amount spent on development forces many people to work simultaneously and in parallel, which results in coordination diseconomies. Second, there is the problem sometimes called “gestation,” where development is inherently sequential and cannot be made to happen all at once. Concretely, gestation requires time in development because answers obtained at one point in time determine the questions asked at the next point in time. Doing all this development at once leaves us asking many questions that we will never need to know the answer to – and failing to answer questions that we wish we had researched.

Consequently, to speed development, it is not enough to concentrate C1 into a more compressed period of time. Instead, we will have to pay more than C1, and the more we compress the development process, the more this additional amount escalates – until, at some point, we could spend an infinite amount and not improve our release date any further. So development costs increase at an increasing rate as time-to-market shortens, as shown below.

As the figure shows, the time-to-market strategy confronts the firm with a dynamic cost-benefit comparison. To be profitable, the firm needs to introduce the product early enough to benefit from higher revenue over a longer time, but not so early that its development costs skyrocket.
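
The cost side of that comparison can be captured by a simple convex cost curve: the unhurried budget C1 is a floor, and total development cost rises at an increasing rate as the schedule is compressed toward a minimum feasible gestation time. The functional form and all parameters here are illustrative assumptions:

```python
def development_cost(time_to_market, c1=10.0, relaxed_time=2.0, min_time=0.25):
    """Total cost of developing Product 1 on a given schedule (in years).

    Taking the relaxed schedule (or longer) costs just c1. Compressing the
    schedule raises total cost at an increasing rate, and cost grows without
    bound as time_to_market approaches the minimum feasible gestation time.
    """
    if time_to_market >= relaxed_time:
        return c1
    return c1 * (relaxed_time - min_time) / (time_to_market - min_time)

# Each further compression of the schedule costs more than the last one did.
for t in (2.0, 1.5, 1.0, 0.5):
    print(t, round(development_cost(t), 1))
```

A profit-maximizing release date then balances this escalating cost against the revenue lost by entering later.
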

How is a firm to achieve this balance? The answer comes down to organization. Firms that are good at the time-to-market strategy are organized in a way that minimizes development costs while maximizing the reliability of product introduction. The key factor to be managed by these firms is uncertainty. These firms typically design around customer-defined time milestones, track progress toward meeting those release dates, and hold employees accountable for staying on time. As uncertainties arise, routines such as phase-review processes are used to update release dates or revisit go/no-go decisions. And wherever possible, the firm contracts with other firms in order to solve “weak link” problems that are slowing its ability to deliver on time.



Time-based competition is discussed in my book on the Red Queen.