
Panel III: Peripherals: Current Technology Trends
Pages 48-70

The Chapter Skim interface presents what we've algorithmically identified as the most significant single chunk of text within every page in the chapter.


From page 48...
... McQueeney's point that, to see where productivity gains are coming from, it is necessary to look above the hardware level. Introducing his company, Veritas Software, he said that with about $1.5 billion in annual revenue it is the number-one player in the storage software business.
From page 49...
... Bregman recounted, he had begun with Moore's Law, then turned to the communication between the elements that do the computing as reflected in Gilder's Law: Bandwidth grows three times faster than computing power. This latter formulation, he commented, "really just says that bandwidth is growing dramatically -- getting cheaper, more available -- and that that's driving something." He then cited an as-yet-unnamed law stating that storage achieves 100 percent growth in density annually, saying that this translated into cost improvements at a dramatic rate. "The improvement at the storage element level is happening faster than it has been in the microprocessor level over the last several years," he asserted.
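A back-of-the-envelope sketch makes the compounding concrete. The 18-month doubling period for compute and the reading of Gilder's Law as "triple the exponential rate" are assumptions for illustration, not figures from the talk:

```python
# Illustrative compounding of the three growth "laws" cited above.
# The doubling period and the reading of Gilder's Law are assumptions.

years = 5

compute_doubling_years = 1.5                        # Moore's Law, assumed ~18 months
compute_growth = 2 ** (1 / compute_doubling_years)  # ~1.59x per year

bandwidth_growth = compute_growth ** 3              # "three times faster": 4x per year

storage_growth = 2.0                                # 100 percent density growth per year

for name, g in [("compute", compute_growth),
                ("bandwidth", bandwidth_growth),
                ("storage density", storage_growth)]:
    print(f"{name:15s} {g:.2f}x/year -> {g ** years:7.1f}x after {years} years")
```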
From page 50...
... The CIO, whose responsibility was more and more frequently extending beyond technology to the business itself, faced two conflicting, or apparently conflicting, demands. The first demand was to increase service levels, driven by both employees' growing need for online access to applications from inside the enterprise and -- even more important -- customers' growing need for access from outside.
From page 51...
... He stated, however, that high-quality software could in reality optimize efficiency, particularly in storage systems: Intelligent software can manage storage and improve performance efficiency, and "you can do things like drive the storage subsystem to look ahead and fetch information from a disk -- which is intrinsically slower than DRAM memory -- and put it into a cache." As an example of the dramatic difference software could make, he pointed to a Veritas file system designed to run on top of UNIX operating systems that, when substituted for the UNIX file system that Sun Microsystems shipped with its Solaris operating system, provided a 15-fold performance improvement in "real-world, transactional applications" without additional hardware. He further pointed out that, although software is certainly not free, its price is far more negotiable than that of hardware because there is no intrinsic manufacturing cost, which he called "a real leverage point" and "something that has been overlooked in the debate about IT productivity metrics." Finally, software could provide value by managing complexity, achieving lower cost both through labor efficiency -- better tools, so that a storage administrator
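A minimal sketch of the look-ahead idea described above, assuming a simple block-device interface: on a miss, the cache fetches the requested block plus a window of sequential blocks into fast memory. The class, its parameters, and the eviction policy are illustrative inventions, not Veritas's design.

```python
from collections import OrderedDict

class ReadAheadCache:
    """Toy block cache that prefetches sequential blocks on a miss."""

    def __init__(self, read_block, capacity=1024, readahead=8):
        self.read_block = read_block   # callable: block number -> data
        self.capacity = capacity       # maximum number of cached blocks
        self.readahead = readahead     # extra sequential blocks to fetch
        self.cache = OrderedDict()     # block number -> data, in LRU order

    def read(self, block):
        if block in self.cache:
            self.cache.move_to_end(block)          # refresh LRU position
            return self.cache[block]
        # Miss: fetch the requested block plus a look-ahead window, betting
        # that access is sequential, so cheap DRAM hides the slower disk.
        for b in range(block, block + 1 + self.readahead):
            if b not in self.cache:
                self.cache[b] = self.read_block(b)
            while len(self.cache) > self.capacity:
                self.cache.popitem(last=False)     # evict least recently used
        return self.cache[block]

# Example with a fake "slow" disk:
cache = ReadAheadCache(lambda b: f"data-{b}")
first = cache.read(0)    # one miss fetches blocks 0-8
second = cache.read(1)   # served from cache, no disk access
```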
From page 52...
... Typical storage utilization in enterprises was only about 40 percent of capacity; firms accumulated very large amounts of storage by buying another server for each new application, but that storage remained in islands. Through technology whereby Veritas had put high-density storage together with high-performance communication, the entire storage capacity could be connected into a single storage network accessible from all servers, so that actual storage needs could be met across the network.
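The arithmetic behind pooling is easy to illustrate; the island counts, sizes, and headroom figure below are invented for the example.

```python
# Ten application "islands," each overprovisioned for its own peak load.
islands = [{"capacity_tb": 10, "used_tb": 4}] * 10

used = sum(i["used_tb"] for i in islands)
capacity = sum(i["capacity_tb"] for i in islands)
print(f"island utilization: {used / capacity:.0%}")   # 40%

# Pooled into one storage network, headroom is shared, so the same
# demand can be met with far less raw capacity (assume 25% headroom).
pooled = used * 1.25
print(f"pooled: {pooled:.0f} TB needed vs. {capacity} TB installed")
```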
From page 53...
... All of these elements needed to be connected or managed through three categories of storage software: data protection software, the traditional backup or data-replication software; high-availability software, which manages the failover between systems and the dynamic workload provisioning between systems; and storage management software, which handles the virtualization of that storage pool, the management of those file systems, and the performance of the storage system. He called these three areas "critical to being able to unlock the value that people have invested in by buying all this high-performance hardware." Focusing on the storage capacity represented at the bottom of the graphic, Dr.
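Of the three categories, failover is the easiest to sketch. The loop below promotes a standby system only after repeated failed health checks on the primary; the function names, threshold, and interval are assumptions for illustration.

```python
import time

def failover_monitor(primary_alive, promote_standby,
                     check_interval=5.0, max_misses=3):
    """Toy high-availability loop: fail over only after several
    consecutive missed health checks, to avoid needless flapping."""
    misses = 0
    while True:
        misses = 0 if primary_alive() else misses + 1
        if misses >= max_misses:
            promote_standby()   # redirect the workload to the standby
            return
        time.sleep(check_interval)
```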
From page 54...
... NVIDIA, he said, took responsibility for manufacturing quality without managing the manufacturing process directly. The risk of outsourcing was "mitigated somewhat when you keep your expertise in the area in which you've outsourced," he commented, adding that "when you take ownership of that, you're not losing a skill, you're just not investing in the infrastructure of that process." Although NVIDIA itself was marketing a card it made for the workstation market, in the PC market it was selling integrated circuits that went onto add-on cards rather than selling the actual add-ons.
From page 55...
... Bregman had discussed, "we're nothing." He displayed a chart tracking the rapid increase in power and complexity through four generations, each separated from the next by about 12-18 months, of one of NVIDIA's product lines: the GPU, or graphics processing unit, in which independent logic elements predominate over memories and caches. A second chart documented product performance improvements between the second half of 1997 and the first half of 2003, which ran at an annualized rate of 215-229 percent.
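For readers unused to annualized figures, the computation behind such a rate looks like the sketch below; the benchmark scores are invented, and only the method is the point.

```python
# Compound annual growth rate from two benchmark scores taken
# ~5.5 years apart (mid-1997 to early 2003). Scores are made up.
perf_start, perf_end = 1.0, 500.0
years = 5.5

cagr = (perf_end / perf_start) ** (1 / years) - 1
print(f"annualized performance growth: {cagr:.0%}")   # ~210% per year
```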
From page 56...
... It also owned $45 million worth of emulation equipment, a technology used to map a design onto programmable hardware elements that then act like a chip. "It's slower," he conceded, "but it's operating on hardware and it gives you the ability to plug it into a real system." According to Mr.
From page 57...
... The graphics processor had now become a programmable unit that could mimic the capabilities of proprietary software, and the added capability arising from the software had opened the door to increased creativity, he noted. "We still haven't re-created reality," Mr.
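A crude way to picture the shift from fixed-function to programmable graphics hardware: instead of hard-wired per-pixel operations, the application supplies a small function that runs at every pixel. This toy software renderer is purely illustrative and stands in for what real GPUs do in silicon.

```python
def render(width, height, shader):
    """Run a user-supplied per-pixel function over the whole frame,
    the way a programmable GPU runs a shader where fixed logic was."""
    return [[shader(x / width, y / height) for x in range(width)]
            for y in range(height)]

# A user-defined "shader": a radial gradient. With fixed-function
# hardware, a new effect like this would have required new silicon.
def radial(u, v):
    d = ((u - 0.5) ** 2 + (v - 0.5) ** 2) ** 0.5
    return max(0.0, 1.0 - 2.0 * d)

frame = render(64, 64, radial)   # 64x64 grid of brightness values
```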
From page 58...
... Alluding to the rise in resolution of displays, the subject of the program's next talk, he said the higher the resolution, the more elements one might want to process, something that offered graphics a future beyond the PC "as more and more displays and computing elements find themselves embedded in all sorts of applications." The graphics industry would pursue the higher end of the market -- with its "'hell-and-brimfire, make-it-the-best-you-can-at-any-cost' type of solutions" -- because that was the segment driving innovation, which then bled into the mainstream product line. As the largest consumer of wafers from the world's largest fab, NVIDIA drove technology, said Mr.
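The pixel arithmetic behind that observation, using a few common resolutions of the period (the resolutions and refresh rate are ordinary examples, not figures from the talk):

```python
# Per-frame pixel counts at a few common resolutions, and the
# resulting per-second processing load at a 60 Hz refresh rate.
for name, (w, h) in {"XGA": (1024, 768),
                     "HDTV": (1920, 1080),
                     "QXGA": (2048, 1536)}.items():
    pixels = w * h
    print(f"{name:4s} {pixels:>9,} pixels/frame, "
          f"{pixels * 60 / 1e6:6.0f}M pixels/s at 60 Hz")
```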
From page 59...
... In 2002 DuPont formulated five growth platforms -- Agriculture & Nutrition, Electronics & Communication Technologies, Performance Materials, Coatings & Color Technologies, and Safety & Protection -- in the second of which displays was housed. The Electronics & Communication Technologies platform comprised four business units: Electronic Technologies, covering electronic materials, where DuPont was the world's second-largest supplier; Fluoroproducts, of which an outstanding example was Teflon®; Display Technologies; and Imaging Technologies, where the company had focused on professional imaging for the printing industry.
From page 60...
... The fact that the consumer PC market was the lowest-price market was no source of contentment among the display fabricators, who had taken to looking to other applications and market segments in search of profitability. One advantage display makers enjoyed over chip makers was that the former could change their substrates almost on a routine basis -- there had been seven changes over the previous 15 years -- with the result that profitability was enhanced.
From page 61...
[FIGURE 18: Display applications, arrayed by diagonal size (roughly 0.1 to 100 inches) against image quality (low, medium, high). Examples range from watches, clocks, and calculators at the small end, through cell phones, PDAs, digital cameras, gaming devices, and head-mounted displays, to notebook computers, PC monitors, ATMs, instrumentation panels, HDTVs, large public displays, and projection screens.]
From page 62...
[FIGURE 19: Display technologies.]
From page 63...
... Keys said that the display industry in the United States "must move away from LCD technology to have a future hope, not only from performance but also from applications." In part because of the industry's problems achieving profitability, and in part because no killer market application existed as yet for a flexible display, industry would not make the necessary moves alone and needed the government to participate, he said. "The manufacture paradigm related to cost, as well as the form factor, will come from the above functions," he explained.
From page 64...
... "If we all had systematic metrics," he posited, "then it would be clear: You'd publish who's the best according to some abstract measurement, and all the customers would buy that, and the game would be over." Still, he acknowledged that it was problematic to neglect the things one could not measure and said the industry should persist in trying to find a way to measure them. In the graphics industry, said Mr.
From page 65...
... Bregman observed that many external metrics or benchmarks used in industry were aimed at comparing products at a particular moment, making them unsuitable for the kind of longitudinal study in which economists seeking to connect product performance to productivity were engaged. Industry had been known to change benchmarks from year to year.
From page 66...
... Recalling earlier allusions to internal technical metrics on the one hand and to external metrics such as "usability" and "customer likeability" on the other, Victor McCrary of the National Institute of Standards and Technology asked Mr. Malachowsky and Dr.
From page 67...
... LCDs had reached a pinnacle in color fidelity, he noted, but he acknowledged that fidelity was highly dependent on the color filter and that the need to add further layers, which introduced color shifts and angle dependencies, was a disadvantage. To do what Dr.
From page 68...
... Describing the market for graphics processors as half in the hands of Intel -- with NVIDIA and ATI, a Canadian firm, owning the other half -- Mr. Malachowsky characterized a typical road map as "soft and fuzzy, and quantized into nonspecifics: 'photorealistic,' 'cinematic quality,' 'speeds and feeds at double or triple algorithmically.'" To the extent that his own company's road maps were accurate, it was only because "we're progressing like that and we'll call it that when we get there regardless." On an industry-wide level, there might be projections for a few years out or for a generation or two out for each product segment, but there was nothing analogous to the semiconductor industry's road map.
From page 69...
... Mr. Malachowsky added that NVIDIA's virtual manufacturing business model, under which it owned the manufacturing line but not the infrastructure, was designed to extract more value from the product chain.
From page 70...
... NVIDIA had entered the workstation market two years before and, at the medium and low end, had established dominant market share -- producing around 20 million processors per quarter -- and had put in a minimum feature set in order to be in a position to supply extremely fast machines for very focused, high-end manufacturing customers.

