Measuring and Sustaining the New Economy: Software, Growth, and the Future of the U.S. Economy - Report of a Symposium

Panel I: The Role of Software—What Does Software Do?

INTRODUCTION

Dale W. Jorgenson
Harvard University

Calling Dr. Raduchel’s presentation a “brilliant overview” of the terrain to be covered during the day’s workshop, Dr. Jorgenson explained that the two morning sessions would be devoted to technology and that the afternoon sessions would focus on economics, in particular on measurement problems and public-policy questions relating to software. He then introduced Tony Scott, the Chief Information Technology Officer of General Motors.

Anthony Scott
General Motors

Dr. Scott responded to the question framing the session—“What does software do?”—by saying that, in the modern corporation, software does “everything.” Seconding Dr. Raduchel’s characterization of software as the reduction and institutionalization of the whole of a corporation’s knowledge into business
processes and methods, he stated: “Virtually everything we do at General Motors has been reduced in some fashion or another to software.” Many of GM’s products have become reliant on software to the point that they could not be sold, used, or serviced without it. By way of illustration, Dr. Scott noted that today’s typical high-end automobile may have 65 microprocessors controlling everything from the way fuel is used to the operation of airbags and other safety systems. Reflecting how much the automobile already depends on software, internal GM estimates put the cost of the electronics, including software, well above that of the material components—such as steel, aluminum, and glass—that go into a car. This trend is expected to continue.

As a second example he cited what he called “one of [GM’s] fastest-growing areas,” OnStar, a product that, at the push of a button, provides drivers with a number of services, among them a safety and security service. If a car crashes and its airbags deploy, an OnStar center is notified automatically and dispatches emergency personnel to the car’s location. “That is entirely enabled by software,” he said. “We couldn’t deliver that service without it.”

To further underscore the importance of software to the automobile business, Dr. Scott turned to the economics of leasing. When a car’s lease is up, the car is generally returned to the dealer, who must then make some disposition of it. In the past, the car would have been shipped to an auction yard, where dealers and others interested in off-lease vehicles could bid on it; from there, it might be shipped two or three more times before landing with a new owner.
But under “Smart Auction,” which has replaced this inefficient procedure, a leased car returned to the dealer is photographed; the photo is posted, along with the car’s vital statistics, on an auction Web site resembling a specialized eBay; and the car, sold directly to a dealer who wants it, is shipped only once, saving an average of around $500 per car in transportation costs. “The effects on the auto industry have been enormous,” Dr. Scott said. “It’s not just the $500 savings; it’s also the ability of a particular dealer to fine-tune on a continuous basis the exact inventory he or she wants on the lot without having to wait for a physical auction to take place.” This is just one of many examples, he stressed, of software’s enabling business-process changes and increased productivity across every aspect of his industry.

Downside of Software’s Impact

But Dr. Scott acknowledged that there is also a “bad-news side” to software’s impact and, to explain it, turned to the history of Silicon Valley’s famed Winchester Mystery House, which he called his “favorite analogy to the software business.” Sarah Winchester, who at a fairly young age inherited the Winchester rifle fortune, came to believe that she must keep adding to her home constantly: as long as it was under construction, she would not die. Over a period of 40 or 50 years she continually built onto the house, which has hundreds of rooms, stairways that go nowhere, fireplaces without chimneys, and doors that open into blank walls. It is very
finely built by the best craftsmen that money could buy, and she paid good architects to design its various pieces. But, taken as a whole, the house is unlivable. “It has made a fairly good tourist attraction over the years, and I urge you to go visit it if you ever visit Silicon Valley, because it is a marvel,” Dr. Scott said. “But it’s unworkable and unsustainable as a living residence.” The process by which corporations build software is “somewhat analogous to the Winchester Mystery House,” he declared, citing an “ ‘add-onto effect’: I have a bunch of systems in a corporation and I’m going to add a few things on here, a few things on there, I’m going to build a story there, a stairway, a connector here, a connector there.” Over time, he said, harking back to Dr. Raduchel’s remarks, the aggregation of software that emerges is so complex that the thought of changing it is daunting.

Sourcing and Costing Information Technology

For a case in point, Dr. Scott turned to the history of General Motors, which in the 1980s purchased EDS, then one of the major computer-services outsourcing companies. EDS played the role of GM’s internal information technology (IT) organization, while continuing to provide outsourcing services to other companies, for a dozen or so years ending in 1996; GM then spun EDS off as a separate company again, at the same time entering into a 10-year agreement under which EDS continued as its primary outsourcer. Owing to the lack of a suitable governance structure—which Dr. Scott called “the only flaw in the model”—GM managers with budget authority were, during the period when EDS was under GM’s roof, basically allowed to buy whatever IT they needed or wanted for their division or department.
The view within GM, as he characterized it, was: “ ‘Well, that’s o.k., because all the profits stay in the company, so it’s sort of funny money—not real money—that’s being spent.’ ” This behavior resulted in tremendous overlap and waste, which were discovered only after GM had spun EDS off and formed a separate organization to manage its own IT assets. Having developed “no economies of scale, no ability to leverage our buying power or standardize our business processes across the corporation,” Dr. Scott recalled, GM had ended up with “one of everything that had ever been produced by the IT industry.” One of the corporation’s major efforts of the past 7 years had been taking cost out.

To illuminate the experience—which he doubted was unique to GM—Dr. Scott offered some data. GM started in 1996 with 7,000 applications it considered “business critical”: applications whose failure or prolonged disruption would be of concern from a financial or an audit perspective. The corporation set objectives aimed at greatly reducing the number of systems in use “by going more common and more global with those systems” across the company. By early 2004, GM had reduced its systems by more than half, to a little over 2,000, and had in the process driven reliability up dramatically.
GM’s IT Spending: From Industry’s Highest to Lowest

But the “critical factor,” according to Dr. Scott, is that GM’s annual spending on information technology had dropped from over $4 billion in 1996 to a projected $2.8 billion in 2004—even though, in the interim, its sales had increased every year. This change in cost, he noted, could be measured very accurately because GM is 100 percent outsourced in IT, employing no internal staff to develop code, support systems, maintain systems, operate data centers, run networks, or perform any other IT function. As a result, GM, which in 1996 had the highest percentage of IT spending in the automotive business, today has arguably the lowest, with improved functionality and higher reliability. This “huge swing,” which he described as underscoring “some of the opportunity” that exists to lower IT costs, had given “a little over $1 billion a year straight back to the company to invest in other things—like developing new automobiles with more software in them.”

Addressing the issues of software quality and complexity, Dr. Scott endorsed Dr. Raduchel’s depiction of the latter, commenting that the “incredible” complexity involved in so simple a task as delivering a Web page to a computer screen has been overcome by raising the level of abstraction at which code can be written. Because of the high number of microprocessors in the modern vehicle, the automotive sector is also struggling with complexity, but there is a further dimension: since the industry must support the electronics in its products for a long time, the quality of its software takes on added importance. “Cars typically are on the road 10 years or more,” he noted.
“Now, go try to find support for 10-year-old software—or last year’s software—for your PC.”

Seeking a Yardstick for Software

Complicating the production of software good enough to be supported over time is the fact that, just as economic measures of software are lacking, so are such qualitative tools as standards and metrics. “There are lots of different ways of measuring a piece of steel or aluminum or glass: for quality, for cost, and all the rest of it,” Dr. Scott said. “But what is the yardstick for software?” There are no adequate measures for either cost or quality, the two most important things for industry to measure. Dr. Scott then invited questions.

DISCUSSION

Dr. Jorgenson, admonishing members of the audience to keep their comments brief and to make sure they ended with question marks, called for questions to either Dr. Scott or Dr. Raduchel.
The Utility and Relevance of the CMM

Asked about the utility of the Capability Maturity Model (CMM) developed at Carnegie Mellon’s Software Engineering Institute (SEI), Dr. Scott acknowledged the value of quality processes such as CMM and Six Sigma but noted that they have “not been reduced to a level that everyday folks can use.” Recalling Dr. Raduchel’s reference to the Sarbanes-Oxley Act, he said that much of what concerns corporations lies in questions about the adequacy of controls: Does the billing system accurately reflect the business activity that took place? Do the engineering systems accurately provide warning of issues that might be coming up in the design of products or services? Registering his own surprise that anyone at all is able to answer some of these questions, he said that he personally “would be very uncomfortable signing Sarbanes-Oxley statements in those areas.”

Dr. Raduchel identified the stack issue as one of the challenges for CMM, observing that even if a software module is written according to the world’s best engineering discipline, it must then go out into the real world. He sketched the dilemma facing an IT officer who must quickly decide whether to apply a critical security patch—one that, obviously, has never been tested against the company’s application—to an operating system that has been performing well. On one side is the risk that leaving the system untouched, to keep it running, leaves it vulnerable to a security breach; on the other, the risk that applying the patch, to protect the system, brings it down. “There’s no Carnegie Mellon methodology for how you integrate a hundred-million-line application system that’s composed of 30 modules built over 20 years by different managers in different countries,” he commented.
The limits of the modular approach are often highlighted among the cognoscenti by comparing the construction of software to that of cathedrals: one can learn to build a particular nave, and to build it perfectly, but its perfection is no guarantee that the overall structure of which it is a part will stand.

Functioning in the Real World

To emphasize the point, Dr. Raduchel described an incident that occurred on the day the U.S. Distant Early Warning (DEW) Line went into operation in the late 1950s. Almost immediately, the system warned that the Soviet Union had launched a missile attack on the United States. Confronted with the warning, a major at NORAD in Colorado Springs wondered to himself: “ ‘The Soviets knew we were building this. They knew it was going live today. Why would they pick today to attack?’ ” He aborted the system’s response mechanism to allow verification of the warning, which turned out to have been triggered by the rising of the moon over Siberia. “The fact was, the software didn’t malfunction—it worked perfectly,” Dr. Raduchel said. “But if the specs are wrong, the software’s going to be wrong; if the stack in which it works has a bug, then it may not work right.”
In the real world, the performance of a given application is a minor challenge compared to the operation of the full software stack. In Dr. Raduchel’s experience of implementing major software projects, the biggest challenge arose in the actual operating environment, which could not be simulated in advance: “You’d put it together and were running it at scale—and suddenly, for the first time, you had 100 million transactions an hour against it. Just a whole different world.”

In the Aftermath of Y2K

Bill Long of Business Performance Research Associates then asked what, if anything, had been learned from the Y2K experience, which refers to the widespread software fixes needed because many older computer systems could not handle dates correctly after the turn of the millennium. Responding, Dr. Raduchel called Y2K an “overblown event” and asserted that fear of legal liability, which he blamed the press for stoking, had resulted in billions of dollars of unnecessary spending. The measures taken to forestall problems were in many cases not well engineered, a cause for concern about the future of the systems they affect. Although Y2K had the positive result of awakening many to how dependent they are on software systems, it also raised the ire of CEOs, who saw the money that went to IT as wasted. “Most of the CIOs [chief information officers] who presided over the Year 2000 were in fact fired,” he observed, and in “a very short period of time.” As to the specifics of whether anything useful might have been learned regarding software investment, reinvestment, and replacement, Dr. Raduchel demurred, saying he had never seen any data on the matter. His general judgment, however, was that the environment was too “panicked” to make for a “good learning experience.” Dr. Scott, while agreeing with Dr.
Raduchel’s description of the negatives involved, saw the overall experience in a more positive light. Although a small number of system breakdowns did occur, the “whole ecosystem” proved itself sufficiently robust to give the lie to doomsday scenarios. In his personal experience of pre-Y2K testing, at the company where he worked prior to GM, some potential date-related problems were identified; they were corrected, but the consensus was that, even had they gone undetected, recovering from them would have been “fairly easy.” The testing had a side benefit: it also led to the discovery of problems unconnected to Y2K. As for the downside, he noted that the CEOs’ wrath and the CIOs’ loss of credibility had been factors in the ensuing downturn in the IT economy.

How Standards Come into Play

The speakers were then asked to state their views on the economics of standards with respect to software, and in particular on how standards relate to the
scalability of systems; on their impact on productivity at the macro level; and on their role in software interoperability. Speaking as an economist, Dr. Raduchel lauded standards for opening up and expanding markets, declaring that, in fact, “everything about them is good.” But he cautioned that unless a software standard is accompanied by what a computer scientist would call a “working reference implementation,” it is incomplete and of such limited value that, in the end, the volume leader defines the standard. There is no way, he stressed, to capture a piece of software completely in a written specification; what the specification leaves out is discovered only in writing the code. In many cases where standards do exist, “the way Microsoft implements it is all that matters”—and should Microsoft choose not to follow the standard at all, any software product built to be interoperable with its products must replicate Microsoft’s bugs and whatever deliberate decisions it made not to be compatible. Foreshadowing Hal Varian’s talk on open-source software, he noted that Linux provides one of the first instances in which both a powerful standard and a working reference implementation have appeared at the same time, and he credited that for Linux’s emerging influence.

Dr. Jorgenson thanked Dr. Scott for conveying how software problems play out in the real world and lauded both speakers for providing a very stimulating discussion.