Panel I: Performance Measurement and Current Trends
Pages 8-26



From page 8...
... Triplett said, was to build a bridge between technologists and economists -- professions that "don't talk very much to each other" but had been brought together for the symposium because both are interested in performance measures for computers and their components. The opening of his presentation, directed in particular to technologists, would describe in more detail than had Dr.
From page 9...
...                        (1955)
Number produced            3               10
Price, each                $400K           $600K
Current price output       $1.2 million    $6.0 million
Performance index          1.0             1.5
Computer inflation         1.00            1.11
(with estimated performance premium = 1 + 0.7(0.5)
From page 10...
...                        (1955)
Number produced            3               10
Price, each                $400K           $600K
Current price output       $1.2 million    $6.0 million
Performance index          1.0             1.8
Computer inflation         1.00            0.96
(with estimated performance premium = 1 + 0.7(0.8)
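
The quality-adjustment arithmetic behind the two tables above can be reconstructed as follows. This is only a sketch of the calculation the tables appear to illustrate: the prices, performance indexes, and the 0.7 coefficient come from the tables themselves, while the function name and everything else is added for illustration.

    # Sketch of the quality-adjusted computer-inflation arithmetic that the
    # tables on pages 9 and 10 appear to illustrate. Prices, performance
    # indexes, and the 0.7 coefficient come from the tables; the rest is
    # illustrative.
    def computer_inflation(price_old, price_new, perf_old, perf_new, coeff=0.7):
        """Quality-adjusted price relative between two computer models."""
        raw_price_relative = price_new / price_old             # 600K / 400K = 1.5
        performance_premium = 1 + coeff * (perf_new - perf_old)
        return raw_price_relative / performance_premium

    # Page 9: performance index rises from 1.0 to 1.5
    print(round(computer_inflation(400, 600, 1.0, 1.5), 2))    # 1.11 -> 11% inflation
    # Page 10: performance index rises from 1.0 to 1.8
    print(round(computer_inflation(400, 600, 1.0, 1.8), 2))    # 0.96 -> 4% deflation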
From page 11...
... BEA's "chained price index" shows, however, that computer prices, as measured in the national accounts, fell consistently, and quite rapidly, over the entire period. The change in actual investment, or current-price shipments, divided by the change in the price index yields the deflated output measure for computers -- "billions of chained 1996 dollars," in BEA parlance -- which he described as growing "much, much, much, much faster" than expenditures on computers.
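
As a minimal illustration of the deflation step described here (with invented figures, since the excerpt reports no specific numbers), real chained-dollar output is current-price shipments divided by the price index:

    # Deflation sketch: real output in chained dollars = nominal shipments
    # divided by the price index. All figures below are invented solely to
    # show why real output grows much faster than spending when the price
    # index falls rapidly.
    nominal_shipments = {1996: 80.0, 1999: 90.0, 2002: 95.0}   # $ billions, hypothetical
    price_index       = {1996: 1.00, 1999: 0.45, 2002: 0.20}   # 1996 = 1.00, hypothetical

    real_output = {year: round(nominal_shipments[year] / price_index[year], 1)
                   for year in nominal_shipments}
    print(real_output)   # {1996: 80.0, 1999: 200.0, 2002: 475.0}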
From page 12...
... Noting that a dotted line representing the price index for PCs has been falling more rapidly in recent years than the index for mainframes, he stated: "That's why we've got so many PCs." He then reminded the audience that all the foregoing numbers "depend crucially on having a measure of computer performance." Next, Dr. Triplett took up the question of which computer performance measures economists have actually used.
From page 13...
... The coefficients in this second, "more realistic" hedonic function are those employed by the Bureau of Labor Statistics (BLS), which now produces the price indexes that go into the BEA national accounts, to adjust for changes in computer performance.
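
A hedged sketch of how such hedonic coefficients are used to quality-adjust an observed price change is given below. The characteristics, coefficients, and price ratio are hypothetical placeholders, not the actual BLS values referred to in the text.

    import math

    # Hypothetical hedonic quality adjustment: the coefficients are elasticities
    # of price with respect to each performance characteristic. These numbers
    # are placeholders, not the BLS coefficients mentioned in the text.
    COEFFS = {"mhz": 0.45, "ram_mb": 0.30, "disk_gb": 0.15}

    def predicted_log_price(chars):
        """ln(price) predicted by the hedonic function from log characteristics."""
        return sum(COEFFS[k] * math.log(chars[k]) for k in COEFFS)

    old_model = {"mhz": 500, "ram_mb": 128, "disk_gb": 10}
    new_model = {"mhz": 1000, "ram_mb": 256, "disk_gb": 20}

    # Share of the observed price change attributable to improved performance:
    quality_ratio = math.exp(predicted_log_price(new_model) - predicted_log_price(old_model))

    observed_price_ratio = 1.10                      # assumed list-price ratio
    quality_adjusted_ratio = observed_price_ratio / quality_ratio
    print(round(quality_ratio, 2), round(quality_adjusted_ratio, 2))   # 1.87 0.59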
From page 14...
... 5. See Michael Holdway, 2001, "Quality-Adjusting Computer Prices in the Producer Price Index: An Overview," Bureau of Labor Statistics, October 16.
6. See Annex B, "Computer Hedonic Functions, Other Hardware Features," in Dr.
From page 15...
... McQueeney International Business Machines Dr. McQueeney defined the challenge before the symposium as figuring out how to measure value all the way across information technology's "quite complex food chain of value-added." Information technology insiders and economists who study the industry have both focused on the transistor, which he put near the bottom of that food chain, for a variety of reasons: It was an obvious place to start; it was easy to deal with; and it demonstrated constant, rapid progress that was clearly beneficial.
From page 16...
... "The raw capabilities of the technology have in some cases gotten to a point where either the economics of how you sell them and how you ascribe value to them is changing," he explained, "or you are forced to look elsewhere in the system performance stack to get real improvements." Dr. McQueeney then displayed a chart comparing the growth in transistor switching speed projected in 1995 with growth achieved (see Figure 5)
From page 17...
... FIGURE 5 Growth in transistor switching speed (GHz) versus manufacturing ramp start year (1990-2005), comparing the 1995 semiconductor outlook with the 2000 performance outlook (accelerated lithography, SOI devices, new device structures, high-mobility materials).
From page 18...
... McQueeney explained, "we have to start thinking about using atoms themselves to try to do some of the computations or to build devices on an atomic scale." Already, information technology and the life sciences have begun to converge: "The only manufacturing technology we know of in the entire scien
FIGURE 6 Image of the smallest field-effect transistors made to date (silicon bulk field-effect transistor (FET); oxide thickness is approaching a few atomic layers).
From page 19...
... Several other machines, which he considers "not fully general purpose" because their hardware is designed for a unique problem, describe a somewhat steeper curve: Deep Blue, the chess computer that beat Garry Kasparov in 1997; RIKEN's MDM, a Japanese molecular dynamics machine; and Blue Gene, another special-purpose machine IBM expects to build by 2005. Rounding out the chart was a comparison of animal brain speed with computer speed, which showed the 11-teraflop ASCI machine built for Lawrence Livermore National Laboratory to be on the performance level of a mouse and the fastest desktop computer on that of a lizard.
From page 20...
... McQueeney called "efficiency of deployment." Looking at use of computing capacity averaged over a 24-hour day, he offered the following "typical" figures: for a PC, well under 5 percent and perhaps as low as 1 percent; for a UNIX workstation acting as an applications server, between 50 percent at the bottom and 80 percent at the very top; and for a mainframe, somewhere in the 90 percent
From page 21...
... For example, to be sure it had Web hosting capacity sufficient to handle the increased interest it expected in reaction to a Super Bowl advertising spot, a company might have to provision the front end of its site to accommodate a spike reaching 50 or even 100 times average demand. In 1998 IBM began to investigate developing a more efficient delivery model for the middle to low end of IT resources based on aggregating different customers' demand for the front end of Web service, simple computation, and other services that can be handled by generic applications.
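
The efficiency argument for aggregation can be illustrated with a small simulation. This is not IBM's analysis; the demand model and parameters below are invented, and the point is only that a shared pool sized for the aggregate peak needs far less capacity than the sum of every customer's individual peak.

    import random

    # Invented demand model: each customer has a low baseline load with rare,
    # large spikes (e.g., a Super Bowl ad). Compare the capacity needed when
    # each customer provisions for its own peak versus one shared pool.
    random.seed(0)
    N_CUSTOMERS, HOURS = 50, 24 * 365

    def hourly_demand():
        return [random.expovariate(1.0) * (50 if random.random() < 0.001 else 1)
                for _ in range(HOURS)]

    loads = [hourly_demand() for _ in range(N_CUSTOMERS)]

    separate_capacity = sum(max(load) for load in loads)        # everyone peaks alone
    aggregate = [sum(hour) for hour in zip(*loads)]             # pooled hourly load
    shared_capacity = max(aggregate)

    print(f"pooling needs {separate_capacity / shared_capacity:.1f}x less capacity")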
From page 22...
... McQueeney turned to a change in the business environment for information technology that IBM had observed in the previous year or two: Buyers had begun demanding that the supplier assume a larger share of the responsibility for the return on investment from purchases of IT goods and services. Looking back to the early days of computing, he said customers took it upon themselves to integrate components they bought, to build applications for them, and to add the business value.
From page 23...
... Dr. McQueeney listed and commented on some key technical elements of this offering:
· it provides customers, who IBM believes will insist on open systems, choices in components;
· it delivers higher levels of integration, relieving customers of handling that integration themselves;
· it is virtualized, so that a customer can blur boundaries between companies -- for instance, running a business process out to a key supplier and back;
· it is autonomic, because system complexity has driven up total cost of ownership, as so much expertise is needed to handle management and maintenance.
From page 24...
... Triplett's tables of features used in computer hedonic functions, that it dealt largely with individual devices and their characteristics to the exclusion of "big picture" qualities such as integration, reliability, and downtime. He raised the question of how researchers might move beyond the characteristics in existing models to measure systemic performance characteristics.
From page 25...
... Dr. McQueeney said he has observed that customers' interest in improving their business processes has not waned but that they have been taking a different approach designed to avoid net cash outflow.
From page 26...
... When someone goes to an IBM and pays one fee for the whole service utility, they really are capturing all that in an investment in IT business value; when they buy chips and boards and assemble them into boxes and you only measure the cost of the chips and boards, all that other investment looks like overhead." Dr. Bregman singled out the appropriate placement of an aggregation point that moves very rapidly as one of the main challenges in looking at the information technology industry over a period of 30-50 years.

