Saturday, December 31, 2016

Leading Truth or Denying Reality?

How we talk about a fact shapes its meaning. Was that new product a failure, or did we just come down the learning curve? Did your career just take a hit, or are you pivoting into a promising new future? Facts are ambiguous, and so how we describe facts helps make sense out of them. One person's rebel is another's traitor, and a well-told story (think Hamilton) will tip the balance.

Of course we all know that effective leaders use narratives - stories that give meaning to facts. But narratives can be either very good - or very bad - for the future. It all depends on whether a narrative leads, or denies, the truth.

In some cases, narratives are used to lead the truth. While the major carmakers were spinning narratives against California's zero-emission vehicle mandate back in 2003, Elon Musk and his colleagues created Tesla, a counter-narrative that has changed the truth about electric cars. Because multiple futures are possible, those who shape what we regard to be possible change our efforts to make those possibilities real. So it is that a narrative can lead the truth.

Leading the truth is not just "spin." Spin is about putting a good face on bad facts. In contrast, leading the truth creates reality by helping us see what is possible. The greatest leaders in history are important not for what they created, but for what they helped others to see as possible, and so create. Once created, what was once considered impossible is then seen, rightly, as the truth. In this way, great leaders lead the truth.

Alternatively, other leaders use narratives to deny realities that we wish were not true. How convenient it would be if man-made climate change were not real. Psychologists tell us that we are prone to believe what we wish were true, even if this requires denying reality - as we see with science deniers confronted by the looming reality of climate change.

Time corrects such wishful thinking, of course, as the facts eventually become undeniable. But in the meantime, nefarious leaders take advantage of our desire to deny reality by spinning narratives that play to our weaknesses. When I was young, I recall hearing leaders spin narratives to deny reality: cigarettes are not really bad for you; the US was winning in Vietnam; climate change is a hoax.

Time will tell, of course. In time, we'll look back and know that some leaders were visionary - they used narratives to lead the truth. Others will be shown to have been reality deniers, and history will judge them severely. The problem, of course, is the damage they do in the meantime.


To dive into the large academic literature on narratives and counter-narratives, you might start with the work of Michael Bamberg and his colleagues.

Thursday, December 15, 2016

Leading by Design

In 1993 the software startup FITS moved from Toulouse, France to Scotts Valley, California (a town at the edge of Silicon Valley). Its founder, Bruno Delean, had invented a radically new approach to editing photographic images – an important breakthrough given the hardware limitations of the time. Delean’s team worked tirelessly, aiming to get the software in shape for a big demo at Silicon Graphics, at that time a powerhouse in computer-aided design. The team worked day and night, stopping now and then just to eat. They were not paid that well, nor were they working for a noble cause; it was just software. They worked because they wanted to do a good job for Delean, who was a legend among French software developers. Delean himself had done most of the coding, and was there working side-by-side with the team, always available when a tough call had to be made. He led in the most compelling, direct, and personal way possible: by example.


Just up the street on the other side of town, Steve Luczo was re-creating what was to become one of the most innovative companies on earth: Seagate Technology. Steve had taken the helm at Seagate, and began to turn the perennially late-to-market disk drive company into the industry’s unrivaled leader. He changed the organization’s structures, routines, and culture dramatically, creating within Seagate a global development platform that brought improved technologies to customers sooner and more profitably than any firm in the industry’s history. His people worked tirelessly, and mastered the science of rapid product development. Innovation at Seagate became routine, and the company transformed storage technology as we know it. Steve Luczo led this company, but not like Bruno Delean. Steve Luczo led by design.

Leading by example shows the way, but leading by design creates a system that discovers the way. Those who lead by example are authentic, because they put their words into action – like Delean. But they are limited to what they know, and what they can do. Delean’s firm ultimately was limited to what he could imagine, and so no longer exists today. Those who lead by design do not invent, nor are they involved in the specific decisions to get the job done. Like Luczo, their names are not on patents. Instead, they build the culture, routines, and structures within which others can flourish. Done well, such leadership creates an organization that takes us places we never imagined. Seagate’s innovations were not foretold by Luczo, but they were created by the organization that he put in place. When you lead by design your job is not to know the future, but to create an organization that discovers the future.

Leading by design is especially effective in changing times, because when times are changing it is difficult for any one person to know what is next. In fact, our successful leaders typically do not have a very good track record when it comes to predicting the future. For starters, their very own success likely came about in ways that they, themselves, did not expect when they were starting out. (That is true for Google, Facebook, and Apple, for instance.) And if you look back on the predictions made (and not made) by our luminaries at any point in time, the track record is unimpressive. In 1992, for instance, virtually no leaders in the technology world were predicting that the World Wide Web would soon explode onto the scene. Like a clairvoyant caught in an avalanche, somehow our technology leaders failed to see the web coming. Look back at what the experts were saying before many of our most profound innovations, from the microcomputer to wireless telecommunications, and you’ll find they were typically off the mark. But when we lead by design, we do not pretend to know what is next. Instead, we create an organization designed to discover possibilities that we never dreamed of.


The classic academic treatment of these ideas is in Selznick's book on leadership.

Wednesday, November 30, 2016

Bake Your Own Pie

Recently I was lecturing to a group of high-level Chinese executives when one asked me: “What do you think of plagiaristic innovation?” Before I could answer, he went on to explain that for China to “catch up,” he felt it needed innovation of any kind – even what he called "plagiaristic" innovation.

Don’t worry. I’m not about to rehearse the well-worn arguments about the protection of intellectual property: incentives for continued innovation, just rewards for investors who back authentic creativity, quality guarantees for consumers of branded products, and the like. Nor am I going down the “information must be free” path – indignantly advocating “free as in free speech (not free beer),” “stick it to the man” (the artist is not getting the payments anyway), or the “hackers’ code of ethics.” No, here I’m talking about something else.

My point here is about what “innovation” means. Debates about intellectual property, stealing, and plagiarism are all about who owns the pie. That question is very important, and is obscured when patent “trolls” flood the system with complaints, or when plagiarists masquerade as innovators. But another important point often gets lost in the fray:

Innovation is not about fighting over the pie; it is about baking a new pie.

For example, hybrid vehicles hit the worldwide market starting in 1999 and 2000, and within a few years a wave of patent litigation followed – escalating in 2003. The big carmakers battled over who invented what, sometimes with each other and sometimes with small firms, everyone claiming a piece of the pie. Meanwhile, also in 2003 but with far less fanfare, Elon Musk and his team of co-founders created Tesla, the forward-looking innovator that has changed the game in the automobile industry. The noisy pie fights of 2003 were over hybrids; the profound innovations of 2003 were quietly happening at Tesla.

Pie fights extend to all walks of business life, not just battles over intellectual property. For instance, the so-called “browser wars” between Netscape and Microsoft were at their peak in 1998, following Microsoft’s integration of its Internet Explorer browser into its ubiquitous operating system. Advocates of competition howled, and defenders of Microsoft replied with talk of “seamless technology” and “complementarities”. Also in 1998, but unknown to most people at the time, PhD students Larry Page and Sergey Brin created Google – the company that would change the game so thoroughly that we would soon forget about those early browser wars. The noisy pie fights of 1998 were the browser wars; the great innovation of 1998 was quietly taking shape at the newborn Google.

"Wait," you are saying. "After innovating, innovators need to defend their creation." Of course. Take QualComm, for example. That company has an unparalleled track record of continued innovation in wireless technology. As a result, its intellectual property has turned out to be extremely valuable. It has of course defended that property against plagiarists; it owes that to its shareholders. But QualComm transformed its industry by innovating - never mistaking defending IP for creating new, valuable technologies.

All around us, we see real innovators at the cutting edge of knowledge. Have a conversation with my son Burton Barnett, pictured here doing science, and you won't hear about pie fights; you'll hear about amazing new developments in immunology. And similar developments are happening worldwide - in China, Europe, India, the Americas - everywhere forward-looking people are creating new knowledge. This process of innovation is key to our collective future, and it has little to do with plagiarism or pie fighting.

The lesson to innovators: Pie fights are important; we all deserve our piece of the pie. And of course even true innovators often must fight off plagiarists. But being good at pie fighting does not make you good at innovating. Innovation means baking a new pie. 

The lesson to plagiarists: Want to create something useful? Leave the other guy’s pie alone and learn to bake.


Research on the uniqueness of innovators appears in the work of Lee Fleming and Olav Sorenson, among others.

Tuesday, November 15, 2016

The Time-to-Market Strategy

In many industries, products are profitable only during a limited window of time. We see time-sensitive products, of course, in consumer electronics, where new models of phones, computers, and home entertainment products come (and go) frequently. We also see this pattern in fashion markets. Zara, for instance, introduces new clothing products twice weekly across nearly 2,000 stores – amounting to the introduction (and retirement) of over 10,000 designs a year. Timing is key, as well, in the introduction and short shelf life of cultural products, such as popular music and films. The market for video games follows a similar pattern of rapid new-product introduction and short product lives, including flash-in-the-pan hits like Pokémon Go as well as the more cadenced, but still time-sensitive, annual replacements of the various EA Sports games. Innovative pharmaceutical products also compete in a time-sensitive way, since the first to patent enjoys a profit premium that ends abruptly with patent expiration. Consequently, drug companies are in a race against time, first to file the patent, and then to bring the drug to market. Even some durable products, such as automobiles, are introduced and retired on a time-sensitive, if cadenced, basis.

Time-sensitive markets can be identified by the presence of two underlying features. First, these markets place a premium on introducing a product early – or at least on a reliable schedule where being late is punished. For instance, releasing a new version of a gaming console like Xbox or PlayStation late – after the holiday season – would be unthinkably costly for Microsoft or Sony. Second, the products in these markets also have a limited shelf life. A new clothing style or a new hit song may be wildly popular today, but within weeks (or maybe days) they will be yesterday’s news.

In such time-sensitive markets, success depends on a distinct logic: the time-to-market strategy. To understand this strategy, consider a firm that introduces a successful new product, “Product 1,” as in the plot below. Initially the product takes off slowly, then it catches on, finally reaching a maximum market size. This “S-shaped” diffusion curve is typical of successful products.

The story does not end there, however, because this company faces a market where the shelf life of a product is limited. Consequently, it is preparing to introduce a second product, “Product 2,” that will compete directly with its own Product 1. Note that if the firm does not introduce Product 2, somebody else will. That is what compels the company to introduce another product even though it will cannibalize its first, as shown here.

Now focusing on Product 1, we can see that there is a limited window of time during which the firm can make money selling the product. Consider the price that can be charged for Product 1 over the life of the product. Typically, that price will be much higher early in the product’s life, for two reasons. First, the earliest buyers of the newly introduced product will be those with a greater willingness to pay. For instance, if Intel comes to market with a new chip that is extremely valuable to cloud service providers running state-of-the-art server farms, these eager buyers will be among the first to buy the new chip, and they have a high willingness to pay. Less enthusiastic buyers will also buy at some point, but only if the price falls. Second, the price will fall over time as competitors come out with direct rivals to Product 1 – perhaps AMD will introduce a competitor to Intel’s new chip. As the price falls, more and more chips are sold to buyers with lower willingness to pay. Ultimately, once Product 2 is on the market, the price for Product 1 falls away completely.

At this point, the logic of a time-to-market strategy is clear. If we introduce Product 1 early, our firm will make a great deal of revenue. To see this, consider the cumulative revenue from Product 1, pictured above. Cumulative revenue falls off rapidly as the time of introduction moves later, due to the fall in price as well as the product’s limited lifetime. In short, entering earlier increases total cumulative revenue at an increasing rate.
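
To make the revenue side concrete, here is a minimal sketch in Python. Every number in it – the size of the market, the shape of the diffusion curve, the rate of price decay, the month when Product 2 closes the window – is an invented assumption, not data from any real product:

    import math

    WINDOW_END = 24        # month when Product 2 arrives and ends Product 1's life (assumed)
    MARKET_SIZE = 1000.0   # eventual number of adopters (assumed)

    def adopters(t, midpoint=8.0, steepness=0.6):
        """Cumulative adopters t months after introduction: an S-shaped (logistic) curve."""
        return MARKET_SIZE / (1.0 + math.exp(-steepness * (t - midpoint)))

    def price(t, initial=100.0, decay=0.15):
        """Price erodes with calendar time as rivals arrive (assumed exponential decay)."""
        return initial * math.exp(-decay * t)

    def cumulative_revenue(entry_month):
        """Each month's unit sales times the going price, summed until the window closes."""
        total, previous = 0.0, 0.0
        for t in range(entry_month, WINDOW_END + 1):
            cum = adopters(t - entry_month)   # the diffusion clock starts at entry
            total += (cum - previous) * price(t)
            previous = cum
        return total

    for entry in (0, 4, 8, 12):
        print(f"enter at month {entry:2d}: cumulative revenue ~ {cumulative_revenue(entry):10,.0f}")

Run it and cumulative revenue drops steeply as entry moves later: a later entrant faces lower prices and a shorter selling window at the same time.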

By this logic, it is obvious that we should release Product 1 at the soonest possible date. However, this may not be profitable. To see this, consider now the costs of developing Product 1. If we could take all the time we want, we could carefully research and develop Product 1 and, when it is ready, we would have run up total costs equal to C1. But by taking our time, we might end up introducing the product late, and this will hurt our revenue. So instead of paying C1 over a long period of development, say, two years, what if we accelerate development and pay the same amount but all in one year – or even in six months?

Well, here is the bad news. It turns out that compressing development costs into a short period does not give you the same result. This problem, famously dubbed the “mythical man-month” by Frederick Brooks, occurs for two reasons. First, compressing the amount spent on development requires many people to work simultaneously and in parallel, which results in coordination diseconomies. Second, there is the problem sometimes called “gestation,” where development is inherently sequential and cannot be made to happen all at once. Gestation requires time because the answers obtained at one point in development determine the questions asked at the next. Doing all the development at once leaves us asking many questions whose answers we will never need – and failing to answer questions that we wish we had researched.

Consequently, to speed development it is not enough to concentrate C1 into a more compressed period of time. Instead, we will have to pay more than C1, and the more we compress the development process, the more this additional amount escalates – until, at some point, we could spend an infinite amount and not improve our release date any further. So development costs increase at an increasing rate as time-to-market shortens, as shown below.
As the figure shows, the time-to-market strategy confronts the firm with a dynamic cost-benefit comparison. To be profitable, the firm needs to introduce the product early enough to benefit from higher revenue over a longer time, but not so early that its development costs skyrocket.
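
Continuing the same invented example, we can set a stylized cost curve against the revenue function above. The functional form below is my assumption, chosen only so that costs rise at an increasing rate as the schedule is compressed, exploding as the release date approaches a minimum gestation period:

    def development_cost(release_month, c1=5_000.0, relaxed=12, gestation=2):
        """Total development cost by release date (stylized). At a relaxed pace
        (12 months or more) the cost is C1; compressing the schedule inflates it,
        and no amount of spending beats the gestation floor."""
        if release_month <= gestation:
            return float("inf")   # cannot release this early at any price
        if release_month >= relaxed:
            return c1             # no compression penalty
        return c1 * (relaxed - gestation) / (release_month - gestation)

    # Profit-maximizing release date under these assumptions, reusing
    # cumulative_revenue() and WINDOW_END from the sketch above.
    best = max(range(3, WINDOW_END + 1),
               key=lambda m: cumulative_revenue(m) - development_cost(m))
    for m in (3, 4, 6, 8, 12):
        print(f"release at month {m:2d}: profit ~ {cumulative_revenue(m) - development_cost(m):10,.0f}")
    print("most profitable release month under these assumptions:", best)

The specific numbers mean nothing; the trade-off is the point. Release too late and the revenue window has closed; release too early and compression costs swallow the gains.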

How is a firm to achieve this balance? The answer comes down to organization. Firms that are good at the time-to-market strategy are organized in a way that minimizes development costs while maximizing the reliability of product introduction. The key factor to be managed by these firms is uncertainty. These firms typically design around customer-defined time milestones, track progress toward meeting those release dates, and hold employees accountable for staying on time. As uncertainties arise, routines such as phase-review processes are used to update release dates or revisit go/no-go decisions. And wherever possible, the firm contracts with other firms to solve “weak link” problems that slow its ability to deliver on time.



Time-based competition is discussed in my book on the Red Queen. 

Sunday, October 30, 2016

Leading Amidst Change: Why Strategy Matters

You’ve probably heard people compare the Fortune 500 of a few decades ago with today’s, noting how fallen greats like Sears, American Motors, and Zenith Electronics have been eclipsed by innovators. The reason seems clear enough. We’ve seen considerable technological and regulatory change over the past 50 years, the rules of the game in business keep changing, and there appears to be no end in sight. In fact, globalization may even be accelerating the rate of change we see around us.

For business leaders, these changes can be daunting. Companies routinely take their leadership teams off-site to discuss the challenges and opportunities implied by the latest innovations. How will augmented reality affect the gaming market? How will mesh computing affect the financial technology space? How will the internet of things affect, well, everything? The idea seems to be that if we can forecast future technology, we will be more likely to survive or even prosper from the changes to come.

Not so. It turns out that often the very technologies that seem to have upended the great firms of the past were well understood, and sometimes even created, by those same firms: the Swiss invented quartz watches; Kodak invented the digital camera; Sears was a pioneer in inventory and brand management; and the list goes on. A number of different theories have been proposed to explain this puzzle. This note briefly summarizes a few of the major theories, and offers a way that business leaders can constructively think about leading amidst change.


Discontinuous Innovation Theory

As early as the 1960s, writers on technology management distinguished between continuous and discontinuous technological changes (albeit using various terms for the same idea). Continuous, incremental advances happen all the time, gradually improving the state of a given technology. Such changes are typically straightforward for existing firms to accommodate. Discontinuous changes, by contrast, represent a radically different approach to a technology, and often bring about an order-of-magnitude improvement in performance. For instance, adding functionality to an old-style “feature phone” would be a continuous innovation, while the invention of the smartphone could be seen as a discontinuous innovation.

This idea has appeared in hundreds of published academic articles since the 1960s, but its implications for business leadership were best explained in a classic article by Professors Michael Tushman and Philip Anderson.[1] The breakthrough in Tushman and Anderson’s study was to note a key distinction between different types of discontinuous innovations. Some of these changes, while quite significant in technological terms, build nicely on the capabilities of existing firms. Such “competence enhancing” discontinuities entrench the status quo, giving incumbents even more of an advantage over new, upstart organizations. By contrast, other discontinuities render irrelevant, or even counter-productive, the capabilities of existing firms. These “competence destroying” discontinuities are difficult for industry incumbents to employ profitably, since by doing so they harm their existing operations. For that reason, such changes are more likely to be brought to market by new entrants to a business.

Using discontinuous innovation theory as her lens, a business leader will pay attention to how new technologies either enhance or destroy the value of her firm’s existing capabilities. For example, consider the iconic case of the Bank of America confronted by new innovations in electronic funds transfer technology. This technology would be at the heart of many new innovations in retail banking, including the spread of distributed networks of ATMs. Prior to this discontinuous change, the Bank of America was the unparalleled leader in geographically distributed retail banking, operating a massive brick-and-mortar branch system across California’s 900-mile length and hundreds of cities. Decades of experience honed the bank’s capabilities, as it daily cleared millions of paper transactions into its system regardless of where they originated across the state. With over 2,500 branches, the bank had a branch in every town in the state, and so attracted customers wanting convenient access to their accounts. The spread of ATMs dramatically reduced the unique value of this system, making widespread geographic access to bank accounts commonplace. Although Bank of America embraced ATM technology, its competitive standing clearly was set back in the new era. By the late 1980s, the bank saw record losses in part due to its outdated brick-and-mortar system – the very system that had been key to its competitive advantage – and so it laid off tens of thousands of employees, closed branches, and consolidated functions into regional centers. In this way, electronic funds transfer technology can be understood as a competence destroying discontinuity from the perspective of incumbent banks in the era of brick-and-mortar branching.


Structural Inertia Theory

Large, well-established organizations are notoriously difficult to change. The leading theory to explain this fact is known as “structural inertia theory,” pioneered by sociologists Michael T. Hannan and John Freeman in their seminal 1984 article[2], and since then supported and developed by a large body of research.[3] In this theory, the authors assume that modern society favors the creation and development of organizations that perform reliably and that can account for their actions rationally. Further, they assume that accountability and reliability are greater for organizations that have reproducible structures – meaning routinized, bureaucratic structures. Finally, they assume that such structures are more difficult to quickly change, that is, more inert. These premises lead them to the essence of their theory: Modern society favors the development of inert organizations.

Hannan and Freeman further argue that when inert organizations do change, these changes are hazardous to their performance and their life chances. Especially as they age, grow, and become complex, increasing inertia renders attempts to change difficult and fraught with risk. If the organization survives the change, it may be better off – at least as long as the change keeps pace with the demands of the environment. Stated in terms of a large-scale strategic change, this basic prediction is illustrated below:


Structural-Inertia Theory:
The Effect of Change at time T1 on Organizational Performance

In this illustration, an organization makes a large-scale strategic and organizational change as of time T1, and as a result suffers a dramatic decrease in performance. (In fact, Hannan and Freeman state this prediction in terms of the organization’s likelihood of outright failure.) But over time, if the organization does survive, its performance improves. And, if the organization’s new strategy is better aligned to its situation, then its performance may even end up better than at the start of the ordeal.

The evidence suggests that two distinct effects can be usefully separated when looking at this theory empirically.[4] The initial decrease in performance has been widely documented, and can be thought of as the “process effect” of change. This effect captures the various difficulties that come up as large-system changes cascade through an organization, creating various misalignments along the way. As these difficulties are worked out, eventually the organization reaps the “content effect” of the change – the improvement in performance due to the new strategy being better aligned with the environment.

Structural inertia theory highlights a problem inherent to strategic leadership. If leadership does a good job aligning the different aspects of organization and strategy, then changing that complex, aligned system will be especially difficult. And even if leadership does manage to change that system, the consequent drop in performance is likely to be extreme. Looking again at the Bank of America example, many pundits criticized the company for taking so long to shift away from the earlier strategy and structure centered on brick-and-mortar branches, skilled loan officers in every location, and decentralization to allow for extreme localization. Only after nearly failing did the bank’s leadership manage to break with that strategy and structure. Yet, when seen through the lens of structural inertia theory, this resistance to change is precisely what one would expect from such a well-managed organization. Years of fine-tuning had created a complex organization with various well-aligned features that made the bank particularly good at geographic localization. In this way, the difficulties involved with changing Bank of America were more a measure of its excellence than an indictment – albeit excellence that had become outdated. And this example suggests a caution to business leaders: Your good work aligning your strategy and organization today is precisely what will make changing that strategy so difficult and hazardous tomorrow.


Disruption Theory

By far the most popular theory of strategic and organizational change these days is disruption theory, developed by Professor Clay Christensen and his colleagues beginning with its introduction in 1995.[5] The theory restricts its attention to change involving technologies and innovations, and assumes a market context differentiated between a low end, a mainstream, and a high end. Product performance, customers’ expectations for product performance, willingness to pay for performance, and potential for profitability are all assumed to be low at the low end, middling for the mainstream, and high at the high end. Incumbent firms are assumed to use a well-established technology and focus on the mainstream market. Over time, they are assumed to improve their technologies gradually in an attempt to serve the high end of the market, where profitability is greatest. In so doing, they are assumed to “overshoot” the needs of the low end and middle of the market. Furthermore, focusing on profitability, they are assumed to largely ignore the low end of the market.

Under these assumptions, new entrants can enter with completely different technologies. These technologies, by assumption, will not initially perform as well as the established technologies used by incumbents, but they are good enough to give the new entrant a foothold market: either serving the underserved low end of the market, or serving previously unserved “non-consumers” in a new-market foothold. Incumbent firms, meanwhile, pay no attention to the new entrant, because it is serving different customers. As time passes, however, the new entrant’s technology improves greatly, ultimately moving the firm into the mainstream and high end of the market. At that point the new firm gains an advantage over the incumbent, often because the new firm’s technologies are part of a business model that is incompatible with, and superior to, that of the incumbent.

Disruption theory is extremely popular, and this popularity creates some problems for the theory’s use. For many business leaders, especially in information technology businesses, the pattern described by Christensen and his colleagues reflects their context well. For these leaders the theory is useful – at least if they can spot potential disruptors while still in the “foothold” stage. (A theory is not helpful if you must wait until you see a success to then identify it.) In many other instances, however, the term “disruption” is widely used to mean any change that has a big effect on business, regardless of whether the change resembles that addressed by the theory. In these instances, the theory will not apply and in fact could be misleading. Christensen and his colleagues have voiced concern about this; in a recent article, they explain that many big, important business changes are not disruption as the theory defines it.[6] Uber, they note, is not a disruptor (based on their understanding of Uber).

For business leaders, the lens of disruption theory directs attention to those start-ups just getting a foothold either in a new market segment or in an underserved low end of an existing market. And the theory helps to evaluate these new entrants not on the basis of how well their technology performs when it starts out, but rather in terms of its potential to become disruptive as it improves over time. To some extent, the theory also directs attention to the new business models that often accompany such start-ups. 


Leading Amidst Change

For you as a business leader, all three of these theoretical perspectives direct attention, first and foremost, to what makes your organization compete well. Discontinuous innovation theory highlights the importance of knowing the capabilities that give your organization a strategic advantage, and how a given technological change affects the value of those capabilities. Structural inertia theory directs your attention to the complexity and intricacy of your organization, in order to understand the cascade of misalignments that moving to a new strategy will trigger – even if temporarily. And, for some, disruption theory directs your attention to potential threats that, while benign in their infancy, have the chance to mature into something strategically significant.

One way to sum up these implications is to note that they draw attention to the importance of strategy, and in particular to the logic of your firm’s strategy. Big changes that leave the logic of your strategy intact are not a threat. Discontinuous innovation theory calls such changes “competence enhancing.” Structural inertia theory notes that such changes do not require fundamental alterations to your organization. And disruption theory would say that such changes are not likely to be problematic for you as an incumbent. But when a change threatens the logic of your firm’s strategy, it is time to act. The changes you set in motion will be difficult and costly for your organization, especially if you have done a good job aligning your strategy and organization around a clear and compelling logic. But the alternative is to become yet another example in the failure lexicon of business school professors.

What’s more, looking to see how changes affect your strategy’s logic gives you a way to identify and diagnose what matters before your performance suffers. As a leader, you do not have the luxury of waiting until changes play themselves out. You must make the call early in the game, while there is still a chance to affect the outcome. Going back to the Bank of America: the logic of its branch strategy was crystal clear, even as technological changes were afoot that would threaten that logic. Had its leadership honestly reviewed how those changes were likely to affect the bank’s strategic logic, steps could have been taken long before red ink ultimately drove action. This lesson applies to every business leader. The logic of your strategy can be identified today, as can the implications for that logic of the changes you see around you. Use your strategy’s logic as the lens through which you understand and manage change.




[1] Tushman, Michael L. and Philip Anderson. 1986. “Technological Discontinuities and Organizational Environments.” Administrative Science Quarterly, 31: 439-465. 
[2] Hannan, Michael T. and John Freeman. 1984. “Structural Inertia and Organizational Change.” American Sociological Review, 49: 149-164.
[3] Carroll, Glenn R. and Michael T. Hannan. 2000. The Demography of Corporations and Industry. Princeton: Princeton University Press.
[4] Barnett, William P. and Glenn R. Carroll. 1995. “Modeling Internal Organizational Change.” Annual Review of Sociology, 21:217-236.
[5] Bower, Joseph L. and Clayton M. Christensen. 1995. “Disruptive Technologies: Catching the Wave.” Harvard Business Review, Jan-Feb.
[6] Christensen, Clayton M., Michael E. Raynor, and Rory McDonald. 2015. “What Is Disruptive Innovation?” Harvard Business Review, December.

Saturday, October 15, 2016

Differing Without Dividing

Variety is great for innovation. For instance, consider the case of Seymour Cray, the “father of the supercomputer.” In the 1970s, Cray left Control Data to start Cray Research, a company devoted to creating the world’s fastest computer. Cray approached the problem with a revolutionary architecture, so-called “vector processing.” By 1976 he and his team had introduced the Cray 1, and Cray Research was seen as the Mecca of high-speed computing. John Rollwagen became the company’s president in 1977, bringing business leadership alongside Cray’s technological prowess.


In 1979, Rollwagen brought in another technology genius, Steve Chen, to lead the design of a completely different approach to supercomputing. So as Seymour Cray’s team worked on the Cray 2, Chen’s team worked on the Cray X-MP. Chen’s design built on Cray’s initial innovation, but did so using a revolutionary architecture featuring multiple processors operating in parallel. Released in 1982, the X-MP set a new standard for supercomputer performance, and significantly raised the bar for the team working on the Cray 2.


When we do not know what the future holds, variety helps our organization to discover what is possible. This truth is one reason why we often hear people saying that they want to increase the diversity of their employees. Just like the biosphere, organizations evolve better if they sustain variety.

Yet examples like Cray and Chen’s are rare. One reason is that sustaining variety is expensive. How inefficient to run multiple projects that are trying to do the same thing. But another, bigger problem is that sustaining variety threatens to divide a company. People object to having others in their company working at cross purposes. How can we encourage differences without being divisive?

One way is to live by the adage “disagree and commit.” Here in Silicon Valley people attribute the saying to Intel. The idea is that you should encourage disagreement during the decision-making process, in order to improve the quality of your decisions. But once a decision is made, everybody needs to fully commit to its implementation. Unfortunately, in practice this saying often is used to silence those who see things differently. Often managers say “disagree and commit,” but they are really saying “disagree and shut up.”


I prefer “switch and commit.” The goal is still to end up committing at the end of the process, but during the decision I want the participants to switch roles. The person disagreeing with you needs to take your position and argue it well. Similarly, you must argue the other’s view well. You can think of the approach as devil’s advocacy taken seriously by both sides.

I first tried “switch and commit” when teaching a controversial topic here at Stanford. For the first assignment, the students had to state their position on the topic. For the second, bigger assignment, they had to write an essay taking the opposite view. (They did not hear about the second assignment until after they handed in the first.) The end results were some fantastic essays, because the authors were genuinely skeptical of the positions they argued.

Since then, I have tried “switch and commit” when facilitating hard-hitting business meetings among top managers. The results have been mixed. Many people cannot get their heads around a different perspective. But now and then you find an exceptional leader who appreciates the value of differing without dividing.


A readable review of related academic work is Scott Page’s book The Difference.

Friday, September 30, 2016

Leading Growth: Why Strategy Matters

“I have copied Facebook entirely in Spanish,” the student said to Zuckerberg. “You are operating only in English. You should be growing Facebook in other countries and languages.”

“Well,” Zuckerberg replied. “You have a cool accent. Why don’t you do it?”

And so it was that Javier Olivan would join Facebook in 2007, not long after receiving his MBA. By 2008, his team was testing a version of Facebook that let users translate the site themselves, starting in France. They released the product at the end of the day, and when they returned to work the next morning it appeared that there was a bug. The program’s tracking statistics indicated that tens of thousands of phrases had been translated, and that users were already flocking to the French version of Facebook. In fact, the translation program had gone viral, and would prove successful beyond expectations. Within a month, Facebook would launch in over 40 countries. Looking back at Facebook’s growth over time, Javier’s arrival at the company coincides with a clear kink in its growth trajectory – the point when it transitioned to become a global giant.
 
Javier Olivan’s story is the archetypal success path for a budding leader: To join a company and spur tremendous growth. Some people even list growth claims next to each job on their resume. But what about those who try to grow and fail? In fact, research shows that many attempts to accelerate organizational growth go awry. Often these attempts fail to produce growth at all. Worse yet, some succeed in triggering growth initially, but then send the firm into a death spiral not long thereafter.

Why are some growth initiatives a great success, while others end in failure? The difference comes down to one word: Strategy.


When Does Growth Succeed?

Growth succeeds when it builds on, and reinforces, a company’s source of advantage. For instance, MercadoLibre has grown since the late 1990s to become Latin America’s leading platform for internet commerce. While it started out looking very similar to eBay in its strategy, it soon evolved into an online marketplace for business-to-consumer traffic, with a variety of features that have proven especially attractive to customers in Brazil, Argentina, Mexico, and elsewhere in Latin America. The company’s strategy was unique to its context, including its own payment system and an order-fulfillment system, both designed to deal with the special circumstances found in Latin America. These features soon paid off. For instance, the company experienced a surge of growth as customers responded very positively to the “MercadoPago” payment processing system, including many customers who previously had difficulty engaging in e-commerce.

MercadoLibre’s steady, organic growth stands in stark contrast to DeRemate, an early rival. Initially, DeRemate grew much faster by acquiring other startups in an attempt to build a large user base. But lacking any clear source of advantage, DeRemate soon saw its growth rate stall. Ultimately the company would fall by the wayside as MercadoLibre steadily maintained its growth trajectory.

The difference here is more than just the pattern of success vs. failure. The difference is that MercadoLibre’s growth has been guided by a clear understanding of its source of advantage. Each move made by the company’s leadership was designed to build on and reinforce that advantage. By contrast, DeRemate pursued growth, but did so without identifying and building on a clearly defined source of advantage. The lesson: Growth without strategy cannot be sustained.


Growing a Cost Advantage

For companies pursuing a low-cost advantage in their markets, growth is especially attractive. With growth, a company’s fixed costs are covered by a larger and larger revenue base, driving down average costs. Examples of such growth abound, but Walmart is particularly instructive. That company invested in sizeable assets, including information technology to control inventory and its own fleet of trucks linking an elaborate warehouse and logistics network. As the number of Walmart stores grew, these investments were then shared by many more establishments, driving down costs. These lower costs, in turn, made Walmart more competitive, further expanding the organization in a classic virtuous circle.

If you are seeking to grow a cost advantage, pay attention to organization. Your deployment of people, organizational structures, and work routines all must reinforce the low-cost strategy. Your organizational culture, too, should also reflect a primary concern with cost minimization. In short, your strategy should guide how you organize growth.


Growing a Quality Advantage

In contrast, some firms grow by building on a high-perceived-quality competitive advantage. For instance, high-end hotel chains such as the Four Seasons or the Ritz-Carlton charge a much higher price than budget hotels and deliver a luxury experience, and they do so at very large scale. Attention to detail, spacious and opulent facilities, and extremely responsive service set these hotels apart. Growing such a high-quality strategy is a tricky proposition, because increasing scale does not automatically reinforce the strategy as it does in the low-cost case. If anything, increasing scale threatens to dilute quality, replacing distinctiveness with the “cookie cutter” output typical of mass production.

For this reason, growing a high-quality firm requires constant attention to replicating quality. In the luxury hotel example, leadership paces growth so that each new property maintains quality. New properties are given unique designs, or in some cases are classic hotels brought in by acquisition. New employees are trained first in existing, high-performance properties before being deployed into new locations. And system-wide standard operating procedures make sure that service at every hotel is at the highest level. To scale high quality successfully, your strategy must be your guide.


When Growth Fails

It may seem obvious that growth should build on your competitive advantage, but very often companies fail to abide by this rule. The reason is that growth often changes a company’s circumstances in ways that leadership does not understand. Worse yet, increased scale and complexity make it harder for leadership to diagnose their situation. So it helps to review a few of the common ways that companies fail as they try to grow.


Unclear Advantage

Sometimes leadership does not know whether it has a clear source of advantage, and perhaps has not even thought about the matter. Nonetheless, armed with the desire to grow they set out to scale up. When this happens, growth occurs without guidance and so a company ends up scaling without a strategy to sustain success. At DeRemate, leadership thought that growth was their strategy, and so they went on an acquisition spree without understanding why or how they would create value – an invitation to failure. The lesson: Growth is only a strategy if it is based on a clear source of advantage.

Premature Growth

In some cases, a company does develop a source of advantage, but too late to shape its growth. The pioneering online retailer Webvan failed for precisely this reason. A darling of the dot-com boom, Webvan and its founder Louis Borders (of Borders Books fame) attracted premier venture capitalists and went public at record valuations. Using this capital, they built out an elaborate delivery system and infrastructure, and immediately scaled to various regions around the United States. As they grew, each of their regions then started down the learning curve, coming to understand the economics of online grocery delivery. Finally, their first distribution center, in the San Francisco Bay Area, went cash flow positive – the very day the company filed for bankruptcy. By growing before it understood how it created value, the company ended up a failure. (Note: Many of the trucks driven by Amazon’s delivery system today are repurposed Webvan trucks!) The lesson: Before you grow, understand your source of advantage.

Rapid Growth

Oftentimes a company’s leadership understands its source of advantage well, but by growing rapidly may lose that advantage. Such was the case for the Mendocino Brewing Company. Founded in the small town of Hopland in Northern California, Mendocino was a ramshackle facility attached to a brewpub. The brewery, one of the original craft brewers of the modern era, was known for producing some distinctive beers, such as “Red Tail Ale.” But when they tried to expand operations, they did so very rapidly after an investment by a major international firm. Rapid growth meant relying on production from shiny new “turnkey” plants operated using standard industry practice. These plants turned out a beer that was much like the original Red Tail, but one that had lost much of its distinctiveness. Today Red Tail sits on more retail shelves than ever, but it differs little from the many beers around it. The lesson: Beware of rapid growth, lest you standardize and lose what made you unique.

Growing Beyond Your Market

Professors Dan Levinthal and James March coined a term worth knowing: the “competency trap.”[1] We all know about the problem of fighting today’s war with yesterday’s army. This error happens not because an organization is incompetent; rather, it is competent at the wrong things. So it goes with companies. Having succeeded, leadership is sure that they can now scale. But by scaling, they encounter different customers, new geographies, unfamiliar rivals, and various challenges for which they are not prepared. The archetypal case is the expansion of a retail business beyond its first location, as when a restaurant becomes a chain. This initial move is fraught with danger, and ends the life of many businesses as they discover that what made them succeed in one location no longer serves as a source of advantage in other locations. The lesson: Know why you win in your current market – and what it will take to win as you grow.


The Two Roads to Growth

Research reveals two distinct paths to scaling up: marketing-led growth and “viral” growth.[2] Marketing-led growth comes from actions taken by your organization to influence the purchasing behavior of customers, such as advertising and merchandising. Viral growth occurs when your existing customers influence others to become customers. Most growing companies experience both forms of growth, in proportions that depend on their industry. Each path has its distinct challenges, but for both the key to success remains the same: Strategy.


Marketing-led Growth

These days many Silicon Valley firms have no “VP of Marketing”; instead, they have a “VP of Growth.” And the change is more than just a name. Marketing can drive growth, illustrated nicely if you browse the internet for the “growth hacks” dreamt up by clever marketeers. But take caution: Quick wins are short-lived when it comes to growth marketing.

Easy-come, easy-go has long been understood in the advertising business. Research shows that brand building takes a long time and consistent messaging. Jack Welch, former CEO of GE, used to say that a good marketing message needs to be “relentless and boring.” By that he meant that long after you are tired of your own message, it still needs to stay on theme, because shaping brand identity takes time. To see for yourself, look up some memorable Super Bowl ads of the past. Now look at the companies represented. Some are household names, of course, but some will be companies you’ve never heard from since.

Drop into a “marketing strategy” meeting in many companies today and you’ll hear a lot of talk about LTV/CAC: comparing the “lifetime value of customers” to “customer acquisition costs.” Companies are focused on these measures because internet advertising allows them to make reasonably precise calculations of each. If you turn up the “spend” on acquiring customers, you’ll get growth, but are those new customers worth the price? A good deal of time is being spent these days calculating LTV/CAC in order to answer that question. Unfortunately, this question misses the point.
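
To see what is being calculated in those meetings, here is a minimal sketch of the arithmetic, using invented numbers for a hypothetical subscription product:

    monthly_revenue_per_customer = 30.0   # assumed ARPU
    gross_margin = 0.70                   # assumed
    monthly_churn = 0.05                  # 5% of customers leave each month (assumed)

    # With constant churn, the expected customer lifetime is 1/churn months,
    # so lifetime value is margin-adjusted revenue over that lifetime.
    ltv = monthly_revenue_per_customer * gross_margin / monthly_churn

    acquisition_spend = 120_000.0         # assumed monthly marketing spend
    new_customers = 1_000                 # customers that spend acquires (assumed)
    cac = acquisition_spend / new_customers

    print(f"LTV ~ ${ltv:,.0f}, CAC ~ ${cac:,.0f}, LTV/CAC ~ {ltv / cac:.1f}")

The arithmetic is easy; what it misses is that LTV is not a given. As the next paragraph argues, LTV is produced by strategy.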

Good brand marketing reinforces the company’s source of competitive advantage. That, in turn, translates into high “lifetime value of customers.” Shimano means quality to the bicycle enthusiast. Walmart means low prices to their shoppers. Ritz-Carlton means luxury and service to high-end travelers. Marketing strategy is about aligning your message to your source of advantage, and that gives you high LTV. The tactical question of how much to spend acquiring customers is secondary, since it takes LTV – and thus strategy – as a given. Whatever the “spend,” marketing-led growth works to your advantage when it is guided by your strategy.


Viral Growth

Unless you have been lost in a cave for the past decade, you have heard about viral growth. Viral growth occurs when existing customers influence others to become new customers. The subject is an old one, based on the statistics of epidemiology, but it became all the rage in business with the rise of the app economy of iOS and Android. Whether you’re a developer or just a consumer, you know about the importance of viral growth.

Experts on viral growth in business typically focus on what has become known as “growth accounting,” by which they mean the arithmetic of viral growth. It is worth thinking through the basics of growth accounting in order to see why strategy is essential to viral growth. Here I break down the discussion into three elements: contagiousness, susceptibility, and retention.[3]

Contagiousness. A customer who introduces many other people to your product rates high on contagiousness. Contagious customers are the dream of every business, which is why so many companies calculate the “net promoter scores” of their customers. High-NPS customers are out there doing your marketing for you.

Susceptibility. A potential customer who is especially likely to become an actual customer is highly susceptible. A primary job of marketing is to identify susceptible parts of the market; hence all the attention paid to books like Geoffrey Moore’s Crossing the Chasm. Such frameworks help you to identify which people (or companies) are more likely to become customers.

The virality you experience depends on both contagiousness and susceptibility. If you measure contagiousness by N, the number of potential customers introduced to your product by your average existing customer, and if you measure susceptibility by p, the probability that any one of those potential customers becomes an actual customer, then your “virality coefficient” v is just the product v = N×p.

Viral growth depends on the size of v. If it is greater than 1, then your company will grow explosively. But most companies have a virality coefficient well below 1. Nonetheless, since you are probably also generating new customers directly through other marketing, any virality will amplify the effectiveness of your other marketing activities.
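
For readers who like to see the arithmetic, here is a minimal sketch of that amplification effect, using the definitions above; the particular values of N, p, and the direct-acquisition count are invented for illustration:

    N = 2.0     # potential customers introduced per existing customer (contagiousness, assumed)
    p = 0.25    # probability each introduction converts (susceptibility, assumed)
    v = N * p   # virality coefficient, here 0.5

    direct = 1_000   # customers acquired directly by marketing (assumed)

    # Each direct customer brings in v more, who bring in v*v more, and so on.
    # For v < 1 this geometric series converges to direct / (1 - v).
    total = direct / (1 - v)
    print(f"v = {v:.2f}: {direct} marketing-acquired customers become ~{total:,.0f} in all")

At v = 0.50 every marketing-acquired customer is effectively doubled; as v approaches 1 the multiplier 1/(1 - v) grows without bound – the explosive case described above.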

Many leaders think that their main job is to increase v, ideally until it is greater than 1. The “growth hacks” you may read about online typically aim to do just that. Some techniques focus on increasing contagiousness; others focus on increasing susceptibility. Of course, since v is the product of these two factors, you’ll need to make sure neither is too low if you want to increase v. Yet most efforts meant to increase v miss the point entirely.

The important point is that when you increase v, you do so in a way that builds on your company’s competitive advantage. Otherwise, your “growth hack” will generate nothing more than a temporary flood of interest, followed by a downturn. For instance, you could give current users an extra incentive to spread the word, and they will become more contagious. Or, you could steeply discount your product so that more potential customers become actual customers, making them more susceptible. But perhaps neither of these techniques is based on your company’s competitive advantage. Consequently, you’ll get a short-run boost to size, but not for the right reasons. None of your (now more numerous) users will be excited by your product, and so they will soon leave. In short, you should not force virality. Rather, you should design measures that increase virality by building on your company’s source of advantage.

Looking again at Facebook’s internationalization, the user-generated translation platform allowed users to tailor Facebook to the subtleties of language and culture in each country, which in turn increased the company’s virality across many very different contexts.

Retention. Perhaps the most neglected aspect of viral growth is the problem of retention, the rate at which customers remain engaged after they become customers. Think of your favorite flash-in-the-pan success: the short-lived cellphone game or the over-hyped software startup. Such products attract a great deal of attention and experience extremely rapid viral growth. But in no time they vanish because their customers move on.

In the virality model, the retention rate is akin to the plug in a bathtub. Fill the tub as fast as you want, but if the drain is open you’ll soon be empty. When leaders focus exclusively on increasing virality, they enjoy the short-run gains of rapid growth, but without continued engagement they fail to retain customers and soon are history.
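
To see how unforgiving that arithmetic is, here is a minimal simulation in which the inflow of new customers is held fixed and only the monthly retention rate varies; all numbers are invented:

    def active_customers(monthly_retention, new_per_month=1_000, months=36):
        """Bathtub model: each month some customers drain away and new ones flow in."""
        active = 0.0
        for _ in range(months):
            active = active * monthly_retention + new_per_month
        return active

    for retention in (0.95, 0.60):
        print(f"{retention:.0%} retention: ~{active_customers(retention):,.0f} active customers after 3 years")

With 95% monthly retention the base is still compounding toward its steady state of 1,000/(1 - 0.95) = 20,000 customers; with 60% retention the same inflow levels off near 2,500. Filling the tub faster cannot fix an open drain.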

Companies trying to increase retention do so by focusing on engagement. Exactly how you engage your customers depends on your source of advantage. So, in the end, the secret to viral growth – like all forms of growth – is for leadership to stay focused on its source of advantage. The lesson: Growth hacking may feel good in the short run, but in the long run there is no substitute for strategy.



[1] Levinthal, Daniel A. and James G. March. 1993. “The Myopia of Learning.” Strategic Management Journal, 14: 95-112.
[2] More fundamentally, models of “diffusion” distinguish between broadcast and contact-dependent processes. See Bartholomew, David J. 1982. Stochastic Models for Social Processes. New York: Wiley.
[3] The fundamental work distinguishing contagiousness and susceptibility in a diffusion model is found in Strang, David and Nancy Brandon Tuma, 1993. “Spatial and Temporal Heterogeneity in Diffusion,” American Journal of Sociology, 99: 614-639.