MAIN SOURCES

Tuesday, September 2, 2014

Mastering the Three Worlds of Information Technology

November 2006


In the information era, the best of times are the worst of times. Computer hardware keeps getting faster, cheaper, and more portable; new technologies such as mashups, blogs, wikis, and business analytic systems have captured the imagination; and corporate IT spending has bounced back from the plunge it took in 2001. In 1987, U.S. corporations’ investment in IT per employee averaged $1,500. By 2004, the latest year for which government data are available, that amount had more than tripled to $5,100 per employee. In fact, American companies spend as much on IT each year as they do on offices, warehouses, and factories put together.
However, as IT’s drumbeat grows louder, it threatens to overwhelm general managers. One of the biggest problems companies face is coping with the abundance of technologies in the marketplace. It’s hard for executives to figure out what all those systems, applications, and acronyms do, let alone decide which ones they should purchase and how to successfully adopt them. Most managers feel ill equipped to navigate the constantly changing technology landscape and thus involve themselves less and less with IT.
Adding to executives’ diffidence, corporate IT projects have often delivered underwhelming results or been outright failures. Catastrophes—such as the one at American pharmaceutical distributor FoxMeyer Drug, which went into Chapter 11 and was sold in 1997 when a $100 million IT project failed—may be less frequent today than in the past, but frustration, delay, and disappointment are all too common. In 2005, when IT consultancy CSC and the Financial Executives Research Foundation conducted a survey of 782 American executives responsible for IT, 50% of the respondents admitted that “aligning business and IT strategy” was a major problem. The researchers found that 51% of large-scale IT efforts finished later than expected and ran over budget. Only 10% of companies believed they were getting high returns from IT investments; 47% felt that returns were low, negative, or unknown.
Not surprisingly, any fresh IT proposal sparks fiery debates in boardrooms. Some boards say, “Why should we bother? IT isn’t strategic, so it doesn’t matter in a competitive sense. We should be minimizing our technology expenditures.” Others argue, “Whether IT matters or not, we shouldn’t be doing it ourselves. Companies are becoming virtual, and software is becoming rentable, so why do IT the old-fashioned way?” Thus, executives try to delegate, outsource, rent, rationalize, minimize, and generally remove IT from their already long list of concerns.
But managers who distance themselves from IT abdicate a critical responsibility. Having studied IT for the past 12 years, I believe that executives have three roles to play in managing IT: They must help select technologies, nurture their adoption, and ensure their exploitation. However, managers needn’t do all those things each time they buy a new technology. Different types of IT result in different kinds of organizational change when they are implemented, so executives must tailor their roles to the technologies they’re using. What’s critical, though, is that executives stop looking at IT projects as technology installations and start looking at them as periods of organizational change that they have a responsibility to manage.
Building an Effective IT Model
Everyone who has studied companies’ frustrations with IT argues that technology projects are increasingly becoming managerial challenges rather than technical ones. What’s more, a well-run IT department isn’t enough; line managers have important responsibilities in implementing these projects. An insightful CIO once told me, “I can make a project fail, but I can’t make it succeed. For that, I need my [non-IT] business colleagues.” Managers I’ve worked with admit privately that success with IT requires their commitment, but they’re not clear where, when, and how they should get involved.
That’s partly because executives usually operate without a comprehensive model of what IT does for companies, how it can affect organizations, and what managers must do to ensure that IT initiatives succeed. As HBS professor Clayton M. Christensen and Boston University professor Paul R. Carlile point out in their working paper “The Cycles of Theory Building in Management Research” (Harvard Business School, February 2005), a good model or theory does two things: It groups important phenomena into categories, and, within categories, it makes statements of cause and effect. Yet even state-of-the-art models of IT’s impact consist only of statements about individual technologies, such as “CRM lets you get closer to customers” and “SCM enables you to reduce inventory.” Such declarations don’t help executives; they’re more akin to sales pitches than statements of fact. These assertions are also silent about why technologies will deliver to companies the benefits they have promised. Why will customers start confessing their deepest desires to your customer relationship management system? Why will suppliers start delivering just in time when you set up a supply chain management system? Existing models don’t help executives choose among technologies, either. Every business wants both to be closer to customers and to keep inventory levels low—but is it better to first invest in CRM or SCM improvements?
One way to build a comprehensive model is to place IT in a historical context. Economists and business historians agree that IT is the latest in a series of general-purpose technologies (GPTs), innovations so important that they cause jumps in an economy’s normal march of progress. Electric power, the transistor, and the laser are examples of GPTs that came about in the nineteenth and twentieth centuries. Companies can incorporate some general-purpose technologies, like transistors, into products, and others, like electricity, into processes, but all of them share specific characteristics. The performance of such technologies improves dramatically over time. As people become more familiar with GPTs and let go of their old ways of thinking, they find a great many uses for these innovations. Crucially, general-purpose technologies deliver greater benefits as people invent or develop complements that multiply their power, impact, and uses. For instance, in 1970, fiber-optic cables made it practical to use lasers, which had already been in use for a decade, for data transmission.
Andrew McAfee (amcafee@hbs.edu) is an associate professor at Harvard Business School in Boston. Visit his blog at blog.hbs.edu/faculty/amcafee.
