Saturday, December 31, 2016

Leading Truth or Denying Reality?

How we talk about a fact shapes its meaning. Was that new product a failure, or did we just come down the learning curve? Did your career just take a hit, or are you pivoting into a promising new future? Facts are ambiguous, and so how we describe facts helps make sense out of them. One person's rebel is another's traitor, and a well-told story (think Hamilton) will tip the balance.

Of course we all know that effective leaders know how to use narratives - stories that give meaning to facts. But the use of narratives can be either very good - or very bad - for the future. It all depends on whether a narrative leads, or denies, the truth.

In some cases, narratives are used to lead the truth. While the major carmakers were spinning narratives against California's zero-emission vehicle mandate back in 2003, Elon Musk and his colleagues created Tesla, a counter-narrative that has changed the truth about electric cars. Because multiple futures are possible, those who shape what we regard to be possible change our efforts to make those possibilities real. So it is that a narrative can lead the truth.

Leading the truth is not just "spin." Spin is about putting a good face on bad facts. In contrast, leading the truth creates reality by helping us see what is possible. The greatest leaders in history are important not for what they created, but for what they helped others to see as possible, and so create. Once created, what was once considered impossible is then seen, rightly, as the truth. In this way, great leaders lead the truth.

Alternatively, other leaders use narratives to deny realities that we wish were not true. How convenient it would be if man-made climate change were not real. Psychologists tell us that we are prone to believe what we wish were true, even if this requires denying reality - as we see with science deniers confronted by the looming reality of climate change.

Time corrects such wishful thinking, of course, as the facts come to be undeniable at some point. But in the meantime, nefarious leaders take advantage of our desire to deny reality by spinning narratives that play to our weaknesses. When I was young, I recall hearing leaders spin narratives to deny reality: cigarettes are not really bad for you; the US was winning in Vietnam; climate change is a hoax.

Time will tell, of course. In time, we'll look back and know that some leaders were visionary - they used narratives to lead the truth. Others will be shown to have been reality deniers, and history will judge them severely. The problem, of course, is the damage they do in the meantime.

To dive into the large academic literature on narratives and counter-narratives, you might start with the work of Michael Bamberg and his colleagues.

Thursday, December 15, 2016

Leading by Design

In 1993 the software startup FITS moved from Toulouse, France to Scotts Valley, California (a town at the edge of Silicon Valley). Its founder, Bruno Delean, had invented a radically new approach to editing photographic images – an important breakthrough given the hardware limitations of the time. Delean’s team worked tirelessly, aiming to get the software in shape for a big demo at Silicon Graphics, at that time a powerhouse in computer-aided design. The team worked day and night, stopping now and then just to eat. They were not paid that well, nor were they working for a noble cause; it was just software. They worked because they wanted to do a good job for Delean, who was a legend among French software developers. Delean himself had done most of the coding, and was there working side-by-side with the team, always available when a tough call had to be made. He led in the most compelling, direct, and personal way possible: by example.

Just up the street on the other side of town, Steve Luczo was re-creating what was to become one of the most innovative companies on earth: Seagate Technology. Steve had taken over the helm at Seagate, and began to turn the perennially late-to-market disk drive company into the industry’s unrivaled leader. He changed the organization’s structures, routines, and culture dramatically, creating within Seagate a global development platform that brought improved technologies to customers sooner and more profitably than any firm in the industry’s history. His people worked tirelessly, and mastered the science of rapid product development. Innovation at Seagate became routine, and the company transformed storage technology as we know it. Steve Luczo led this company, but not like Bruno Delean. Steve Luczo led by design.

Leading by example shows the way, but leading by design creates a system that discovers the way. Those who lead by example are authentic, because they put their words into action – like Delean. But they are limited to what they know, and what they can do. Delean’s firm ultimately was limited to what he could imagine, and so no longer exists today. Those who lead by design do not invent, nor are they involved in the specific decisions to get the job done. Like Luczo, their names are not on patents. Instead, they build the culture, routines, and structures within which others can flourish. Done well, such leadership creates an organization that takes us places we never imagined. Seagate’s innovations were not foretold by Luczo, but they were created by the organization that he put in place. When you lead by design your job is not to know the future, but to create an organization that discovers the future.

Leading by design is especially effective in changing times, because when times are changing it is difficult for any one person to know what is next. In fact, our successful leaders typically do not have a very good track record when it comes to predicting the future. For starters, their very own success likely came about in ways that they, themselves, did not expect when they were starting out. (That is true for Google, Facebook, and Apple, for instance.) And if you look back on the predictions made (and not made) by our luminaries at any point in time, the track record is unimpressive.  In 1992, for instance, virtually no leaders in the technology world were predicting that the worldwide web would soon explode onto the scene. Like a clairvoyant caught in an avalanche, somehow our technology leaders failed to see the worldwide web coming. Look back at what the experts were saying before many of our most profound innovations, from the microcomputer to wireless telecommunications, and you’ll find they were typically off the mark. But when we lead by design, we do not pretend to know what is next. Instead, we create an organization designed to discover possibilities that we never dreamed of.

The classic academic treatment of these ideas is in Selznick's book on leadership.

Wednesday, November 30, 2016

Bake Your Own Pie

Recently I was lecturing a group of high-level Chinese executives, when one asked me: “What do you think of plagiaristic innovation?” Before I could answer, he went on to explain that for China to “catch up,” he felt it needs to have innovation of any kind – even what he called "plagiaristic" innovation.

Don’t worry. I’m not about to rehearse the well-worn arguments about the protection of intellectual property: incentives for continued innovation, just rewards for investors who back authentic creativity, quality guarantees for consumers of branded products, and the like. Nor am I going down the “information must be free” path – indignantly advocating “free as in free speech (not free beer),” “stick it to the man” (the artist is not getting the payments anyway), or the “hackers’ code of ethics.” No, here I’m talking about something else.

My point here is about what “innovation” means. Debates about intellectual property, stealing, and plagiarism are all about who owns the pie. That question is very important, and is obscured when patent “trolls” flood the system with complaints, or when plagiarists masquerade as innovators. But another important point often gets lost in the fray:

Innovation is not about fighting over the pie; it is about baking a new pie.

For example, hybrid vehicles hit the worldwide market starting in 1999 and 2000, and within a few years an echo of patent litigation followed – escalating in 2003. The big car makers battled over who invented what, sometimes with each other and sometimes with small firms, everyone claiming a piece of the pie. Meanwhile, also in 2003 but with far less fanfare, Elon Musk and his team of co-founders created Tesla, the forward-looking innovator that has changed the game in the automobile industry. The noisy pie fights in 2003 were over hybrids; the profound innovations of 2003 were quietly happening at Tesla.

Pie fights extend to all walks of business life, not just battles over intellectual property. For instance, the so-called “browser wars” between Netscape and Microsoft were at their peak in 1998, following Microsoft’s integration of its Internet Explorer browser into its ubiquitous operating system. Advocates of competition howled, and defenders of Microsoft replied with talk of “seamless technology” and “complementarities”. Also in 1998, but unknown to most people at the time, PhD students Larry Page and Sergey Brin created Google – the company that would change the game so thoroughly that we would soon forget about those early browser wars. The noisy pie fights of 1998 were the browser wars; the great innovation of 1998 was quietly taking shape at the newborn Google.

"Wait," you are saying. "After innovating, innovators need to defend their creation." Of course. Take QualComm, for example. That company has an unparalleled track record of continued innovation in wireless technology. As a result, its intellectual property has turned out to be extremely valuable. It has of course defended that property against plagiarists; it owes that to its shareholders. But QualComm transformed its industry by innovating - never mistaking defending IP for creating new, valuable technologies.

All around us, we see real innovators at the cutting edge of knowledge. Have a conversation with my son Burton Barnett, pictured here doing science, and you won't hear about pie fights; you'll hear about amazing new developments in immunology. And similar developments are happening worldwide - in China, Europe, India, the Americas - everywhere forward-looking people are creating new knowledge. This process of innovation is key to our collective future, and it has little to do with plagiarism or pie fighting.

The lesson to innovators: Pie fights are important; we all deserve our piece of the pie. And of course even true innovators often must fight off plagiarists. But being good at pie fighting does not make you good at innovating. Innovation means baking a new pie. 

The lesson to plagiarists: Want to create something useful? Leave the other guy’s pie alone and learn to bake.

Research on the uniqueness of innovators appears in the work of Lee Fleming and Olav Sorenson, among others.

Tuesday, November 15, 2016

The Time-to-Market Strategy

In many industries, products are profitable only during a limited window of time. We see time-sensitive products, of course, in consumer electronics, where new models of phones, computers, and home entertainment products come (and go) frequently. We also see this pattern in fashion markets. Zara, for instance, introduces new clothing products twice weekly across nearly 2,000 stores – amounting to the introduction (and retirement) of over 10,000 designs a year. Timing is key, as well, in the introduction and short shelf life of cultural products, such as popular music and films. The market for video games follows a similar pattern of rapid new-product introduction and short product lives, including flash-in-the-pan hits like Pokemon Go as well as the more cadenced, but still time-sensitive, annual replacements of the various EA Sports games. Innovative pharmaceutical products also compete in a time-sensitive way, since the first to patent enjoys a profit premium that ends abruptly with patent expiration. Consequently, drug companies are in a race against time, first to file the patent, and then to bring the drug to market. Even some durable products, such as automobiles, are introduced and retired on a time-sensitive, if cadenced, basis.

Time-sensitive markets can be identified by the presence of two underlying features. First, these markets place a premium on introducing a product early – or at least on a reliable schedule where being late is punished. For instance, releasing a new version of a gaming console like Xbox or PlayStation late – after the holiday season – would be unthinkably costly for Microsoft or Sony. Second, the products in these markets also have a limited shelf life. A new clothing style or a new hit song may be wildly popular today, but within weeks (or maybe days) it will be yesterday’s news.

In such time-sensitive markets, success depends on a distinct logic: the time-to-market strategy. To understand this strategy, consider a firm that introduces a successful new product, “Product 1,” as in the plot below. Initially the product takes off slowly, and then it catches on, finally reaching a maximum market size. This “S-shaped” diffusion curve is typical of successful products.
The story does not end there, however, because this company is facing a market where the shelf life of a product is limited in time. Consequently, it is preparing to introduce a second product, “Product 2,” that will compete directly with its own Product 1. Note that if the firm does not introduce Product 2, somebody else will. That is what is compelling the company to continue with the introduction of another product even though it will cannibalize its first product, as shown here.
Focusing now on Product 1, we can see that there is a limited window of time during which the firm can make money selling the product. It is interesting to think about the price that can be charged for Product 1 over the life of the product. Typically, the price that can be charged will be much higher earlier in the product’s life, for two reasons. First, the earliest buyers of the newly introduced product will be those with a greater willingness to pay for the product. For instance, if Intel comes to market with a new chip that is extremely valuable to cloud service providers running state-of-the-art server farms, these eager buyers will be among the first to buy the new chip, and they have a high willingness to pay. Less enthusiastic buyers will also buy at some point, but only if the price falls. Second, the price will begin to fall over time as other competitors come out with a product that is a direct rival to Product 1. For instance, perhaps AMD will introduce a competitor to Intel’s new chip. As the price falls, more and more chips are sold as other buyers come into the market with lower levels of willingness to pay. Ultimately, once Product 2 is on the market, the price for Product 1 falls away completely. This pricing dynamic is pictured below.
At this point, the logic of a time-to-market strategy is clear. If we introduce Product 1 as shown, our firm will make a great deal of revenue. To see this, look at both the price and sales volume curves in the plot above. Revenue from Product 1 is found by multiplying these two curves, and total revenue over the life of the product is just this revenue accumulated over time. But what if we are another firm releasing a competitor to Product 1 – and we do so late? A competitor entering near the end of Product 1’s life may sell at high volumes, but only for a short time and only during the period when the price of the product is very low. This firm will have introduced a product that, over its life, will make very little cumulative revenue. So the earlier a firm enters this competition, the more it makes in cumulative revenue. In fact, entering earlier increases total cumulative revenue at an increasing rate.
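The revenue logic above can be sketched numerically: price falls over the product's life, volume ramps up, and a firm entering late captures only the low-price tail. All the numbers below are hypothetical, chosen only to make the shape of the argument visible:

```python
def price(t, life=8.0):
    """Unit price declines over the product's life, reaching zero at end of life."""
    return max(0.0, 100.0 * (1 - t / life))

def volume(t, peak=5.0):
    """Unit sales ramp up early in the product's life, then plateau."""
    return 1000.0 * min(1.0, t / peak)

def cumulative_revenue(entry_time, life=8.0, dt=0.01):
    """Total revenue for a firm that sells from entry_time until end of life.

    Revenue is price times volume, accumulated over the selling window.
    """
    total, t = 0.0, entry_time
    while t < life:
        total += price(t) * volume(t) * dt
        t += dt
    return total

# An early entrant collects revenue while prices are high;
# a late entrant sells only into the cheap tail of the product's life.
print(cumulative_revenue(0.0), cumulative_revenue(6.0))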

By this logic, it is obvious that we should release Product 1 at the soonest possible date. However, this may not be profitable. To see this, consider now the costs of developing Product 1. If we could take all the time we want, we could carefully research and develop Product 1 and, when it is ready, we will have run up total costs equal to C1. But by taking our time, we might end up introducing the product late, and this will hurt our revenue. So instead of paying C1 over a long period of development, say, 2 years, what if we accelerate development and pay the same amount but all in one year – or even in six months?

Well, here is the bad news. It turns out that compressing development costs into a short period does not give you the same result. This problem, famously dubbed the “mythical man-month” by Frederick Brooks, occurs for two reasons. First, compressing the amount spent on development requires many people to work simultaneously and in parallel, which results in coordination diseconomies. Second, there is the problem sometimes called “gestation,” where development is inherently sequential and cannot be made to happen all at once. Gestation requires time because the answers obtained at one point in development determine the questions asked at the next. Doing all the development at once leaves us asking many questions whose answers we will never need – and failing to answer questions that we wish we had researched.

Consequently, to speed development, it is not enough to concentrate C1 into a more compressed period of time. Instead, we will have to pay more than C1, and the more we compress the development process, the more this additional amount escalates – until, at some point, we could spend an infinite amount and not improve our release date any further. So development costs increase at an increasing rate as time-to-market shortens, as shown below.
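A hypothetical cost curve makes this convexity concrete. In the spirit of the "mythical man-month," halving the schedule more than doubles the cost, and costs explode as the schedule approaches some minimum feasible development time. The constants and functional form below are illustrative assumptions, not from the post:

```python
C1 = 10.0      # cost (say, in $M) of development at an unhurried pace
T_FULL = 24.0  # unhurried schedule, in months
T_MIN = 4.0    # gestation floor: no budget gets development below this

def development_cost(months):
    """Total cost rises at an increasing rate as time-to-market shortens.

    Coordination diseconomies grow as the schedule nears the gestation
    floor; at the floor itself, no amount of spending helps.
    """
    if months <= T_MIN:
        return float("inf")  # release date cannot be improved at any price
    return C1 * (T_FULL - T_MIN) / (months - T_MIN)

# Compressing from 24 to 12 months more than doubles cost;
# each further month of compression costs more than the last.
for m in (24, 12, 8, 6, 5):
    print(m, round(development_cost(m), 1))
```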
As the figure shows, the time-to-market strategy confronts the firm with a dynamic cost-benefit comparison. To be profitable, the firm needs to introduce the product early enough to benefit from higher revenue over a longer time, but not so early that its development costs skyrocket.

How is a firm to achieve this balance? The answer comes down to organization. Firms that are good at the time-to-market strategy are organized in a way that minimizes development costs while maximizing the reliability of product introduction. The key factor to be managed by these firms is uncertainty. These firms typically design around customer-defined time milestones, track progress toward meeting those release dates, and hold employees accountable for staying on time. As uncertainties arise, routines such as phase-review processes are used to update release dates or revisit go/no-go decisions. And wherever possible, the firm contracts with other firms in order to solve “weak link” problems that are slowing its ability to deliver on time.

Time-based competition is discussed in my book on the Red Queen. 

Sunday, October 30, 2016

Leading Amidst Change: Why Strategy Matters

You’ve probably heard people compare the Fortune 500 from a few decades ago with today, noting how fallen greats like Sears, American Motors, and Zenith Electronics have been eclipsed by innovators. The reason seems clear enough. We’ve seen considerable technological and regulatory change over the past 50 years, so the rules of the game keep changing in business and there appears to be no end in sight. In fact, globalization may even be accelerating the rate of change we see around us.

For business leaders, these changes can be daunting. Companies routinely take their leadership teams off-site to discuss the challenges and opportunities implied by the latest innovations. How will augmented reality affect the gaming market? How will mesh computing affect the financial technology space? How will the internet of things affect, well, everything? The idea seems to be that if we can forecast future technology, we will be more likely to survive or even prosper from the changes to come.

Not so. It turns out that often the very technologies that seem to have upended the great firms of the past were well understood, and sometimes even created, by those very firms: the Swiss invented quartz watches; Kodak invented the digital camera; Sears was a pioneer in inventory and brand management; and the list goes on. A number of different theories have been proposed to explain this puzzle. This note briefly summarizes a few of the major theories, and offers a way that business leaders can constructively think about leading amidst change.

Discontinuous Innovation Theory

As early as the 1960s, writers on technology management distinguished between continuous and discontinuous technological changes (albeit using various terms for the same idea). Continuous, incremental advances happen all the time, gradually improving the state of a given technology. Such changes are typically straightforward for existing firms to accommodate. Discontinuous changes, by contrast, represent a radically different approach to a technology, and often bring about an order-of-magnitude improvement in performance. For instance, adding functionality to an old-style “feature phone” would be a continuous innovation, while the invention of the smart phone could be seen as a discontinuous innovation.

This idea has appeared in hundreds of published academic articles since the 1960s, but its implications for business leadership were best explained in a classic article by Professors Michael Tushman and Philip Anderson.[1] The breakthrough in Tushman and Anderson’s study was to note a key distinction between different types of discontinuous innovations. Some of these changes, while quite significant in technological terms, build nicely on the capabilities of existing firms. Such “competence enhancing” discontinuities entrench the status quo, giving incumbents even more of an advantage over new, upstart organizations. By contrast, other discontinuities render irrelevant, or even counter-productive, the capabilities of existing firms. These “competence destroying” discontinuities are difficult for industry incumbents to employ profitably, since by doing so they harm their existing operations. For that reason, such changes are more likely to be brought to market by new entrants to a business.

Using discontinuous innovation theory as her lens, a business leader will pay attention to how new technologies either enhance or destroy the value of her firm’s existing capabilities. For example, consider the iconic case of the Bank of America confronted by new innovations in electronic funds transfer technology. This technology would be at the heart of many new innovations in retail banking, including the spread of distributed networks of ATMs. Prior to this discontinuous change, the Bank of America was the unparalleled leader in geographically distributed retail banking, operating a massive brick-and-mortar branch system across California’s 900 miles and hundreds of cities. Decades of experience honed the bank’s capabilities, as it daily cleared millions of paper transactions into its system regardless of where they originated across the state. With over 2,500 branches, the bank had a branch in every town in the state, and so attracted customers wanting convenient access to their accounts. The spread of ATMs dramatically reduced the unique value of this system, making widespread geographic access to bank accounts commonplace. Although Bank of America embraced ATM technology, its competitive standing clearly was set back in the new era. By the late 1980s, the bank saw record losses in part due to its outdated brick-and-mortar system – the very system that had been key to its competitive advantage – and so it laid off tens of thousands of employees, closed branches, and consolidated functions into regional centers. In this way, electronic funds transfer technology can be understood as a competence destroying discontinuity from the perspective of incumbent banks in the era of brick-and-mortar branching.

Structural Inertia Theory

Large, well-established organizations are notoriously difficult to change. The leading theory to explain this fact is known as “structural inertia theory,” pioneered by sociologists Michael T. Hannan and John Freeman in their seminal 1984 article[2], and since then supported and developed by a large body of research.[3] In this theory, the authors assume that modern society favors the creation and development of organizations that perform reliably and that can account for their actions rationally. Further, they assume that accountability and reliability are greater for organizations that have reproducible structures – meaning routinized, bureaucratic structures. Finally, they assume that such structures are more difficult to quickly change, that is, more inert. These premises lead them to the essence of their theory: Modern society favors the development of inert organizations.

Hannan and Freeman further argue that when inert organizations do change, these changes are hazardous to their performance and their life chances. Especially as they age, grow, and become complex, increasing inertia renders attempts to change difficult and fraught with risk. If the organization survives the change, it may be better off – at least as long as the change keeps pace with the demands of the environment. Stated in terms of a large-scale strategic change, this basic prediction is illustrated below:

Structural-Inertia Theory:
The Effect of Change at time T1 on Organizational Performance

In this illustration, an organization makes a large-scale strategic and organizational change as of time T1, and as a result suffers a dramatic decrease in performance. (In fact, Hannan and Freeman state this prediction in terms of the organization’s likelihood of outright failure.) But over time, if the organization does survive, its performance improves. And, if the organization’s new strategy is better aligned to its situation, then its performance may even be better than at the start of the ordeal.

The evidence suggests that two distinct effects can be usefully separated when looking at this theory empirically.[4] The initial decrease in performance has been widely documented, and can be thought of as the “process effect” of change. This effect captures the various difficulties that come up as large-system changes cascade through an organization, creating various misalignments along the way. As these difficulties are worked out, eventually the organization reaps the “content effect” of the change – the improvement in performance due to the new strategy being better aligned with the environment.

Structural inertia theory highlights a problem inherent to strategic leadership. If leadership does a good job aligning the different aspects of organization and strategy, then changing that complex, aligned system will be especially difficult. And even if leadership does manage to change that system, the consequential drop in performance is likely to be extreme. Looking again at the Bank of America example, many pundits criticized the company for taking so long to shift away from the earlier strategy and structure centered on brick-and-mortar branches, skilled loan officers in every location, and decentralization to allow for extreme localization. Only after nearly failing did the bank’s leadership manage to break with that strategy and structure. Yet, when seen through the lens of structural inertia theory, this resistance to change is precisely what one would expect from such a well-managed organization. Years of fine tuning had created a complex organization including various, well-aligned features that made the bank particularly good at geographic localization. In this way, the difficulties involved with changing Bank of America were more a measure of its excellence than an indictment – albeit excellence that had become outdated. And this example suggests a caution to business leaders: Your good work aligning your strategy and organization today is precisely what will make changing that strategy so difficult and hazardous tomorrow.

Disruption Theory

By far the most popular theory of strategic and organizational change these days is disruption theory, developed by Professor Clay Christensen and his colleagues beginning with its introduction in 1995.[5] The theory restricts its attention to change involving technologies and innovations, and assumes a market context with differentiation between a low-end of the market, a mainstream market, and a high-end. The product performance, expectations of customers for product performance, willingness to pay for performance, and potential for profitability are all assumed to be low at the low end, middling for the mainstream, and high at the high end. Incumbent firms are assumed to use a well-established technology and focus on the mainstream market. Over time, they are assumed to improve their technologies gradually in an attempt to serve the high-end of the market, where profitability is greatest. In so doing, they are assumed to “overshoot” the needs of the low end and middle of the market. Furthermore, focusing on profitability, they are assumed to largely ignore the low-end of the market.

Under these assumptions, new entrants can enter with completely different technologies. These technologies, by assumption, will not initially perform as well as the established technologies used by incumbents, but they are good enough to give the new entrant a foothold market: either serving the underserved low end of the market, or serving previously unserved “non-consumers” in a new-market foothold. Incumbent firms, meanwhile, pay no attention to the new entrant, because they are serving different customers. As time passes, however, the new entrant’s technology improves greatly, ultimately moving the firm into the mainstream and high end of the market. At that point the new firm gains an advantage over the incumbent, often because the new firm’s technologies are part of a business model that is incompatible with, and superior to, that of the incumbent.

Disruption theory is extremely popular, and this popularity leads to some problems for the theory’s use. For many business leaders, especially in information technology businesses, the pattern described by Christensen and his colleagues reflects their context well. For these leaders the theory is useful – at least if they can spot potential disruptors while still in the “foothold” stage. (A theory is not helpful if you must wait until you see a success to then identify it.) In many other instances, however, the term “disruption” is widely used to mean any change that has a big effect on business regardless of whether the change resembles that addressed by the theory. In these instances, the theory will not apply and in fact could be misleading. Christensen and his colleagues have recently voiced concern about this. In a recent article, they explain that many big, important business changes are not disruption as the theory defines it.[6] Uber, they note, is not a disruptor (based on their understanding of Uber).

For business leaders, the lens of disruption theory directs attention to those start-ups just getting a foothold either in a new market segment or in an underserved low end of an existing market. And the theory helps to evaluate these new entrants not on the basis of how well their technology performs when it starts out, but rather in terms of its potential to become disruptive as it improves over time. To some extent, the theory also directs attention to the new business models that often accompany such start-ups. 

Leading Amidst Change

For you as a business leader, all three of these theoretical perspectives direct attention, first and foremost, to what makes your organization compete well. Discontinuous innovation theory highlights the importance of knowing the capabilities that give your organization a strategic advantage, and how a given technological change affects the value of those capabilities. Structural inertia theory directs your attention to the complexity and intricacy of your organization in order to understand the cascade of misalignments that moving to a new strategy will trigger – if even temporarily. And, for some, disruption theory directs your attention to potential threats that, while benign in their infancy, have the chance to mature into something strategically significant.

One way to sum up these implications is to note that they draw attention to the importance of strategy, and in particular to the logic of your firm’s strategy. Big changes that leave the logic of your strategy intact are not a threat. Discontinuous innovation theory calls such changes “enabling.” Structural inertia theory notes that such changes do not require fundamental alterations to your organization. And disruption theory would say that such changes are not likely to be problematic for you as an incumbent. But when a change threatens the logic of your firm’s strategy, it is time to act. The changes you set in motion will be difficult and costly for your organization, especially if you have done a good job aligning your strategy and organization around a clear and compelling logic. But the alternative is to become yet another example in the failure lexicon of business school professors.

What’s more, looking to see how changes affect your strategy’s logic gives you a way to identify and diagnose what matters before your performance suffers. As a leader, you do not have the luxury of waiting until changes play themselves out. You must make the call early in the game, while there is still a chance to affect the outcome. Going back to the Bank of America, the logic of its branch strategy was crystal clear, even as technological changes were afoot that would threaten that logic. Had the bank’s leadership honestly reviewed how those changes were likely to affect its strategic logic, steps could have been taken long before red ink ultimately drove action. This lesson applies to every business leader. The logic of your strategy can be identified today, as can the implications of the changes you see around you for that logic. Use your strategy’s logic as the lens through which you understand and manage change.

[1] Tushman, Michael L. and Philip Anderson. 1986. “Technological Discontinuities and Organizational Environments.” Administrative Science Quarterly, 31: 439-465. 
[2] Hannan, Michael T. and John Freeman. 1984. “Structural Inertia and Organizational Change.” American Sociological Review, 49: 149-164.
[3] Carroll, Glenn R. and Michael T. Hannan. 2000. The Demography of Corporations and Industry. Princeton: Princeton University Press.
[4] Barnett, William P. and Glenn R. Carroll. 1995. “Modeling Internal Organizational Change.” Annual Review of Sociology, 21:217-236.
[5] Bower, Joseph L. and Clayton M. Christensen. 1995. “Disruptive Technologies: Catching the Wave.” Harvard Business Review, Jan-Feb.
[6] Christensen, Clayton M., Michael E. Raynor, and Rory McDonald. 2015. “What Is Disruptive Innovation?” Harvard Business Review, December.

Saturday, October 15, 2016

Differing Without Dividing

Variety is great for innovation. For instance, consider the case of Seymour Cray, the “father of the supercomputer.” In the 1970s, Cray left Control Data to start Cray Research, a company devoted to creating the world’s fastest computer. Cray approached the problem with a revolutionary architecture, so-called “vector processing.” By 1976 he and his team introduced the Cray 1, and Cray Research was seen as the Mecca of high-speed computing. John Rollwagen became company President in 1977, bringing business leadership alongside Cray’s technological prowess.

In 1979, Rollwagen brought in another technology genius, Steve Chen, to lead the design of a completely different approach to supercomputing. So as Seymour Cray’s team worked on the Cray 2, Chen’s team worked on the Cray X-MP. Chen’s design built on Cray’s initial innovation, but did so using a revolutionary architecture featuring multiple processors operating in parallel. Released in 1982, the X-MP set a new standard for supercomputer performance, and significantly raised the bar for the team working on the Cray 2.

When we do not know what the future holds, variety helps our organization to discover what is possible. This truth is one reason why we often hear people saying that they want to increase the diversity of their employees. Just like the biosphere, organizations evolve better if they sustain variety.

Yet examples like Cray and Chen’s are rare. One reason is that sustaining variety is expensive. How inefficient to run multiple projects that are trying to do the same thing. But another, bigger problem is that sustaining variety threatens to divide a company. People object to having others in their company working at cross purposes. How can we encourage differences without being divisive?

One way is to live by the adage “disagree and commit.” Here in Silicon Valley people attribute the saying to Intel. The idea is that you should encourage disagreement during the decision-making process, in order to improve the quality of your decisions. But once a decision is made, everybody needs to fully commit to its implementation. Unfortunately, in practice this saying often is used to silence those who see things differently. Managers say “disagree and commit,” but what they really mean is “disagree and shut up.”

I prefer “switch and commit.” The goal is still to end up committing at the end of the process, but during the decision I want the participants to switch roles. The person disagreeing with you needs to take your position and argue it well. Similarly, you must argue the other’s view well. You can think of the approach as devil’s advocacy taken seriously by both sides.

I first tried “switch and commit” when teaching a controversial topic here at Stanford. For the first assignment, the students had to state their position on the topic. For the second, larger assignment, they had to write an essay taking the opposite view. (They did not hear about the second assignment until after they handed in the first.) The results were some fantastic essays, because the authors were legitimately skeptical.

Since then, I have tried “switch and commit” when facilitating hard-hitting business meetings among top managers. The results have been mixed. Many people cannot get their head around a different perspective. But now and then you find an exceptional leader who appreciates the value of differing without dividing.

A readable review of related academic work is Scott Page’s book The Difference.