Introduction and Overview

January 1, 1994, marked the tenth anniversary of the divestiture of AT&T. That event started an amazing period of rapid change in the telecommunications industry. Few predicted a decade ago the effects divestiture, deregulation, the exponential growth of technology, and increasingly intense international competition would have on the telecommunications/information infrastructure in the United States by 1994.

In 1984 it was quite clear what the telecommunications/information infrastructure was and who defined it. It was, in essence, the telephone and broadcast networks. The defining players were AT&T, the Federal Communications Commission (FCC), and the broadcasters. You got only the connectivity and services that were offered; compared with what is available today, it was not much.

All of this has changed radically. Instead of being defined by monopoly suppliers and regulators, the telecommunications/information infrastructure has become more closely defined by both market demand and the explosion of supporting technologies that have been brought to market by myriad suppliers. There has been much movement away from a supplier-defined infrastructure to a user- and market-defined infrastructure.

There is now an enormously complex array of networks and services, all interconnected and all changing almost daily. The distinction between broadcast and point-to-point services (e.g., in electronic mail) has blurred. So have the separations among classes of service—voice, data, video, and so on—as more services become digital. Boundaries for the production, distribution, and use of information are vanishing. Many services that were wired in 1984 are becoming wireless, and many that were wireless are becoming wired. The line between communication and computation is becoming ever fuzzier. Users are demanding ever-greater functionality, and suppliers are scrambling to respond to and lead these user demands.

All indications are that, as revolutionary as these changes are, they are just the beginning. Scarcely a day passes without the heralding of some event by the media as an indication that the day of the "information superhighway" is upon us. As we are swept along by the onrush of events, many questions arise. What, in fact, is the telecommunications/information infrastructure?1 Who defines it, since AT&T and the FCC no longer uniquely do? What are reasonable roles for the various private stakeholders in the infrastructure's evolution? What should become of the government's traditional roles of regulation and various forms of infrastructure investment? On whom should the benefits arising from the new infrastructure devolve and in what proportions? Who should choose these beneficiaries? A national debate now swirls around these questions.

A Computer Science and Telecommunications Board (CSTB) workshop, titled "The Changing Nature of Telecommunications/Information Infrastructure," explored these questions. In particular, it tried to illuminate the spectrum of opinion on a broad range of potential government
actions for nurturing the development of the infrastructure and to identify where consensus might be easy to achieve and where it might not. The workshop was not designed to delve deeply into the underlying technical issues, but rather to identify key technical trends and questions as they relate to economic, legal, and regulatory possibilities.2

This volume captures the proceedings of the workshop, building on its discussions and those of the steering committee that organized it. It contains the papers presented at the workshop, grouped into three parts reflecting the activities of the three panels and introduced by overviews prepared by the panel chairs. The papers present many different perspectives, but two distinctively different ones dominate: that of the analyst/academic and that of the practitioner/user. The former set provides the somewhat-removed sense of direction and mission important for policy development; the latter is "from the trenches," presenting the concerns of those who deal with concrete problems of application. While the gap between the two perspectives is often large, they complement each other. Policy analysts must consider the broad public interest and how to serve it. Practitioners must say what it means in practice to pursue policy goals and must identify problems and opportunities that cannot be deduced from policy principles alone.

For example, the practical experience of Citibank with networks, as explained by panelist Colin Crook, illuminates some of the prospects presented by the availability of new infrastructure. There are huge implications for the organization of economic activity (such as the ability to separate the locations for processing information from the activities generating that information) that are rarely addressed systematically in discussions of policy implications.
Such insights from practice can help extend and test the more abstract analyses emerging from the policy community.

This introductory chapter summarizes the workshop discussions, capturing key panel issues and cross-cutting themes. It draws on remarks offered by participants and places their comments and the papers into perspective by referring to subsequent developments in industry and government.

PART 1—SETTING THE STAGE

Defining the telecommunications/information infrastructure is not trivial, for restricted definitions can artificially limit discussion. An appropriate conceptual framework is needed. So also is an appreciation for the scope of applications, features, and benefits desired in the modern infrastructure and for the shortcomings of the current infrastructure. Such considerations suggest the challenges faced in developing tomorrow's infrastructure, and they were the focus of Panel 1.

Charles Firestone of the Aspen Institute, one of the two keynote speakers of Panel 1, advanced a broad definition that served as a valuable foundation for the workshop's efforts and helps elucidate a number of thorny issues. The infrastructure, according to Firestone, comprises the following elements of electronic communications:

- Production of information in film, video, audio, text, or digital formats;
- Distribution media, that is, telephony, broadcasting, cable, and other electronic transmission and storage media; and
- Reception processes and technologies such as customer-premise equipment, videocassettes, satellite dishes, and computers.

This unconventional definition bridges both the telecommunications and information dimensions of the evolving National Information Infrastructure. According to Firestone, it "places attention on an increasingly important but often overlooked area of regulation, that of reception.
As First Amendment cases move toward greater editorial autonomy by the creators, producers, and even distributors of information, attention will have to be focused on reception for filtering and literacy concerns." Conversely, as choice at the reception end is empowered—for example, as the consumer
has more power to choose a telephone service provider or which subset of 500 TV channels to subscribe to—the need for government regulation of production and distribution is reduced.

According to Firestone, each of the three infrastructure components—production, distribution, and reception—has already passed through two stages of government regulation as the consumer has become more and more empowered and is now embarking on a third. An understanding of this history helps to explain why certain options are the focus of attention today, and which of these may ultimately be preferable.

Stage 1 is a regime of scarce resources (e.g., information, spectrum, equipment), centralization, monopolization, vertically dominant enterprises, and the potential for anticompetitive practices. In this stage the government's regulatory role is to control monopolies and to try to approximate the efficiencies of competition where none exist. The agenda is dominated largely by lawyers and a search for equity in the distribution of the scarce resources.

Stage 2 is a regime of abundant resources, intense competition, decentralization of control, and deregulation. The government steps back from its earlier omnipresent regulatory role, and the free market starts to replace monopolies and oligopolies. The agenda is taken over by economists and a drive for market efficiencies.

Firestone likened Stages 1 and 2 to infancy and adolescence. In the mature and complex world into which the infrastructure is heading, he believes that yet another paradigm—adulthood—is needed. This paradigm must redress the balance, as yet unachieved, among a number of cherished values: not just equity and efficiency but also liberty, community, and participatory access. The agenda in Stage 3 might be set, Firestone suggested, by political scientists and the quest for democratic access to infrastructure abundance.
Just as the conceptual framework for the infrastructure has evolved, so, too, have the technologies and the businesses that implement it. The second keynote speaker of Panel 1, Robert Lucky of Bell Communications Research (Bellcore), reviewed the past century's technological evolution of infrastructural capabilities, concentrating on the explosion of technology in the past decade. Along with that explosion has come the rapid growth of whole new industries to support it—for example, the hundreds of software firms that specialize in various aspects of the production, distribution, and reception of information. Indeed, the movement of computer-based intelligence to the periphery of the network has made some of the distinctions among production, distribution, and reception quite fluid; for instance, some components of switching—and thus distribution in Firestone's framework—are being carried out on desktop personal computers connected to the network and may now be partially identified with reception.

However, Lucky pointed out that the local loop has not changed much; it still represents a bottleneck. As Lucky put it, unlike the backbone network, there is only one subscriber on the local loop to bear its cost: "In the end, you are on your own, and there is no magic invention that makes this individual access cheap." He noted that local access represents about 80 percent of the total investment in the (existing wireline) network and that, incrementally, it will take on the order of $25 billion to add broadband optical fiber to that access.3 Yet there are persistent efforts to bring more capability to the local exchange and to the local loop itself. "Moreover," Lucky complained, "we are going to do it twice, or maybe even three times…. [That multiplicity] is called competition."

Lucky's concern about access to the local loop was shared by other workshop participants.
Some worried about the effects on infrastructure development of continued monopolization of the local exchange and local loop, although new possibilities for competition in local exchange are opening up. Others concentrated on the economic, social, and technical costs that might be incurred if access to the local loop—especially the loop's extension to broadband multiple services—is not approached by regulators with extreme care. This topic is elaborated on later in this introduction. Another key point addressed by Lucky was the contrast between the telephone network and the Internet, the former built primarily for voice and largely during Firestone's Stage 1, the latter
built primarily for data (but more open in architecture) and largely during Stage 2. Lucky highlighted the contrast, noting the differing (yet evolving) pricing schemes:

If you send a fax over the Internet, it really appears to be free. You go to the telephone on your desk, and if you send the same fax, it costs you a couple of dollars…. I can tell you honestly that I don't understand this…. I have had economists and engineers digging down, … [thinking] there will be bedrock down there somewhere. I cannot find it…. The biggest single difference … is a large subsidy for residential service built into the [telephone network] tariff…. [Internet] violates the rules of life—someone must be in charge; someone must pay.

The flat-fee charging structure associated with the Internet to date is consistent with the nature of the cost structure. Internet costs do not tend to vary with usage, nor do they have the access and settlement fees or other components found in regulated telephony.4 The Internet experience may be instructive in examining how use of demand-side policies to stimulate network growth requires sophisticated use of the pricing mechanism so that user values can be signaled to suppliers. On the other hand, other cost elements, as Lucky suggests, may ultimately be factored in, whether to support social programs such as universal network access, or to support a growing set of interconnected networks and information resources, which may give rise to access and settlement fees.

The value of the infrastructure derives ultimately from how it is used. Four application areas examined during the workshop may drive infrastructure development: financial services, health care, education and schools, and libraries. These applications are characterized by many innovations and accomplishments but also by frustration and mismatches between desires and practical implementations.
A fifth application, entertainment—much subject to mass media hyperbole but not explicitly addressed at the workshop—is also generally expected to drive infrastructure development. However, the broadband and multimedia applications associated with entertainment remain unproven, and market trials have been largely unsatisfying to both consumers and service providers. The four application areas addressed at the workshop span a range of private and public interests. They illustrate an inexorable tendency toward international connectivity and place into bold relief the tension among Firestone's five values—equity, efficiency, liberty, community, and participatory access.

At one extreme on the scale of values are banking and financial services (described by panelist Crook), which lead in the use of modern infrastructure technology. This technology is so essential to modern banking, contended Crook, that Citibank is progressing under the "assumption that the new infrastructure for the bank essentially is a network." Efficiency is the major driving force that has led to a banking subnetwork that supports 100 billion transactions per year involving transfers of hundreds of trillions of dollars. With these dimensions, explained Crook, "there are more financial transactions in the telephone network than there are in the U.S. economy, excluding low-level cash payments." Compared to efficiency, banking places relatively little emphasis on the other four Firestone values, except for the privacy component of liberty.

At the other extreme on the values scale are applications in education, health care, and libraries.
The very titles chosen for their workshop papers by panelists Robert Pearlman, a private consultant ("Can K-12 Education Drive on the Information Superhighway?"), and by Clifford Lynch of the University of California ("Future Roles of Libraries in Citizen Access to Information Resources Through the National Information Infrastructure") underscore the concerns many have about equity, community, and participatory access. As Lynch put it, "We will be challenged as a society to define a base level of information resources that we believe must be available to all members of our society, regardless of the ability to pay." This challenge was a recurring theme
during the workshop and was often raised in connection with universal service, as explored further in the section "Cross-cutting Issues" below. Considerable analysis, advocacy, and debate are being aimed, in the abstract, at demands for equity and access. Conversely, developments in such specific application areas as health care, covered by panelist Edward Shortliffe of Stanford University's School of Medicine, suggest that policy objectives in specific areas also will be important in determining the practical demands that will be made of the telecommunications/information infrastructure in the future. The "holy paradigm" for infrastructure development should be sensitive and flexible enough to respond to both policy requirements and user demands as they evolve and interact.

It is sometimes tempting to identify the information infrastructure with only the physical elements of its technology. The insights brought by panelists and attendees from their application areas show how far off the mark that view is. As Robert Pepper of the FCC observed at the workshop,

[W]hen we talk about infrastructure, we tend to think about wires, hardware. Infrastructure is far more than that. It is people, it is laws, it is the education to be able to use systems. If you think about the highway system, we tend to think about bridges and interstates, but the infrastructure also includes the highway laws, drivers' licenses, McDonalds along the roadside, gas stations, the people who cut the grass along the highways, and all of those support systems. You cannot talk about infrastructure in the telecom-information sector without also talking about the human support systems.

PART 2—REGULATION AND THE EMERGING TELECOMMUNICATIONS INFRASTRUCTURE

Roger Noll of Stanford University described two foci of debate about the merits of regulation.
One is philosophical, considering legitimate boundaries to government coercion: Does regulatory policy go too far or not far enough in trading off individual liberty for collective rights? (Note the reprise here of the theme of values.) The other focus is policy oriented and is concerned with an instrumental question: How will the performance of a regulated industry be changed by imposing regulation? Noll suggested that even within the limited context of the historical use of regulation in the United States, the philosophical issue is unlikely to be resolved and is largely irrelevant: "Citing a meritorious policy objective is not sufficient to justify the conclusion that regulation is warranted." This conclusion reflects a recognition that regulation, while intended to correct for market failures, can itself create inefficiencies, a point made by Noll and others, including Philip Verveer of Willkie, Farr, and Gallagher, who remarked at the workshop:

[I]f we are interested principally in efficiency—and I think we are … we would be better off trying to clear out as much [regulation] as possible, recognize that there are some exceedingly legitimate issues that the local telephone industry confronts with respect to historic obligations that may no longer be sustainable or appropriate, and also that there may clearly be distributional concerns with respect to low-income people or folks who live in high-cost areas.

Consequently, maintained Noll, the debate should focus on "how to design the details of regulation to ameliorate to the maximum feasible extent the inherent infirmities of the regulatory process." He admonished that, whatever actions are taken, they must be carefully structured to be feasible within the very particular U.S. political system.

Workshop discussions during Panel 2 illustrated the extent to which we have progressed into and accepted Stage 2 of Firestone's schema: deregulation and the search for market efficiency—what Eli Noam of the Columbia Institute for Tele-Information, Columbia University, called a "post-deregulatory agenda." Indeed, there were virtually no advocates of Stage 1 regulation; the acceptance of Stage 2 was acknowledged (perhaps grudgingly) even among stakeholders, such as public interest advocates, traditionally in favor of strong regulatory positions. A consensus position seemed to be that uses of regulation have been appropriately reduced in the past decade and that in the future they should be invoked only with extreme care.

Despite relative agreement on the principle of regulatory restraint, workshop discussions reflected some of the disagreement that exists about what to do in practice; the debate on the uses and details of regulation was not one-sided. The following contrapuntal quotations illustrate the differences:

Panelist Robert Harris of the University of California at Berkeley: "In too many instances, state regulation has become a major obstacle to competition, deployment of new technology, and development of new services. The best route to the information infrastructure of the future is not through more regulation but through different regulation and less regulation."

Panelist Dale Hatfield of Hatfield Associates: "In either case—with limited competition (between telephone companies and cable companies) or a two-player oligopoly—the resulting rivalry would hardly meet the test for robust competition that would justify full deregulation of the local exchange carriers. Thus, policymakers and regulators would be well advised to exhibit a healthy amount of skepticism regarding the lifting of the existing line of business restrictions and deregulating local exchange carriers."
Panelist Nina Cornell, a private consultant: "[A]lthough it has become fashionable to argue that regulation of local exchange carriers is impeding the development of new technologies, the proposed cure—deregulation—would be worse than the status quo…. [R]egulation, however, needs a change of focus."

Panelist Thomas Long of the California organization Toward Utility Rate Normalization (TURN): "[R]egulators should … hunker down for at least 5 to 10 years of hard work … [in] protecting consumers and competitors from monopoly power…. [But] managing infrastructure improvements should not be the endeavor that keeps regulators busy in the next decade."

As Cornell also noted, decisions about regulation should be cast within a broader-than-traditional context, considering antitrust enforcement as well as classical regulatory options. This outlook was reflected by Verveer, who was lead counsel in the antitrust investigation and prosecution of AT&T. "Divestiture," he said, was itself "… an act of deregulation … [eliminating] the principal regulator … AT&T."

As noted in the quotations above, an issue of continuing uncertainty is timing: When is the transition to a sufficiently competitive outcome complete, allowing for further rollback of regulation? Timing has been a central concern in debates over lifting of the Modified Final Judgment constraints, encouragement or permission of entry into local exchange competition, and aspects of the universal service debate. Timing of federal regulatory decisions should also reflect actions at the state level, some of which were discussed by Harris. The states appear to have an evolving and varied role in overcoming the types of underinvestment traps described by Noam and Bridger Mitchell (then of the RAND Corporation) when new network technologies are either at their initial
stages of adoption or when networks are maturing without pushing penetration to the point of achieving universal service. A function of technology development, business development, public policy, and the interaction of these factors, timing is an inherently difficult point on which to achieve consensus.

PART 3—PUBLIC INVESTMENT IN INFRASTRUCTURE

Through several forms of direct investment, the federal government, in particular, has had and can have a significant impact on the shape and growth of the telecommunications/information infrastructure. As Panel 3 keynote speaker Walter Baer of the RAND Corporation noted, federal infrastructure investments have a long history, often motivated by defense considerations. These investments go back at least to 1843, when Morse telegraphy was supported by a congressional appropriation. As summarized by Baer during Panel 3's deliberations, investment vehicles include financial incentives (e.g., tax credits), research and development (R&D) funding, support of operating systems, development of applications, creation of information and associated resources, and support for agency activities related to standards setting. Each of these areas itself may present several options.

As an example of the impact of government investment, Robert Kahn of the Corporation for National Research Initiatives examined the history of the Internet, beginning with the Department of Defense's development of the ARPANET packet-switched network, to illustrate how the federal government can involve itself in hands-on prototyping and procurement in an R&D undertaking. The government can also have less project involvement; it can undertake some form of benevolent partnering with nongovernmental entities or it can provide more hands-off oversight and steering of the enterprise.
However, Baer particularly cautioned about the need for strong market feedback through industry involvement and cost sharing whenever the government's role extends beyond the R&D stage.

A strong point of agreement by Panel 3, as articulated by Baer and panelists Charles Jackson (of Strategic Policy Research Inc.) and Laura Breeden (then of FARNET), was that public investment in the telecommunications/information infrastructure is minuscule compared to private investment, in a ratio on the order of 1:50. Speaking for the Clinton administration at the workshop, Michael Nelson of the Office of Science and Technology Policy emphasized the dependence on private investment: "The administration does not have $100 billion sitting around. We are not going to build this network. We need to find incentives to encourage the private sector to spend the money that is needed."

As Nelson's remarks and the discussion of regulation suggest, there are ways for the federal government to influence private spending even if it does not invest directly. Clearly, who in industry does the investment, when, and how will depend on the environment, itself shaped by policy. Long cautioned that increasing regulated telephone rates has the effect of making ratepayers involuntary investors in advanced telecommunications infrastructure. This, he asserted, has the effect of sparing the regulated firms from competing for capital like other firms and also results in those ratepayers who do not require advanced services nonetheless helping to defray the costs of those who do. But Jackson noted that an environment that limits returns on local loop investment, for example, might drive local exchange carriers to a "harvest strategy," leaving new infrastructure construction to new entrants.
A regulatory response to Jackson's concern has been a move toward price-cap rather than rate-of-return regulation, which allows carriers to increase their return on investment (and, presumably, their investment) by improving their efficiency. As discussed below, because of the rapid changes in telecommunications technologies, many think it is unlikely that regulators can accurately predict the investment behavior their policy actions will induce. This situation gives rise to the further complication, noted by Panel 2 member Robert Crandall of the Brookings Institution, that forcing (or subsidizing) investments by regulated carriers
in the telecommunications/information infrastructure creates pressure on regulators to limit competition to the extent necessary to ensure a fair return on these investments. Thus, subsidized investments in new technologies can become vehicles for locking in the regulated status quo and precluding the development of competitive markets in advanced services.

The difficulty of predicting who will do what, and when, was dramatized by an event that commenced during the workshop. In the middle of the workshop, an unscheduled presentation was made announcing the proposed merger of Bell Atlantic and Tele-Communications Incorporated (a major cable TV operator), an announcement that led to much speculation and analysis in the subsequent months. The February 1994 decision by the parties not to proceed with that merger—partly on the complaint that new FCC constraints on cable TV pricing would make the merger less profitable—underscores the problems that both government and industry face in predicting and planning.

There was considerable consensus at the workshop on the need to concentrate government investment at points of market failure so as to maximize that investment's leverage. A basis for the consensus came from the conceptualization of three phases of network development by Noam of Panel 2 and was expanded on by Mitchell and Kahn during Panel 3's deliberations. (These phases should not be confused with Firestone's three stages of regulatory evolution.) In the first phase, the start-up phase when the network size is small, the cost per average user is higher than the benefits accruing to that user; the network has not yet achieved "critical mass." The second phase occurs after critical mass is attained, when enough users have joined the network to reduce the average cost per user (because of the spreading of fixed costs), and the presence of more users has increased the utility (benefits) to all users.
This is the regime of profitable commercial operation, where average benefits exceed average costs. As the network continues to expand, higher-marginal-cost users join, increasing the average cost while not substantially increasing network utility by their presence. These developments lead to the third phase, universal service.

Two concentration points for government investment, it was broadly agreed by workshop panelists, are phases 1 and 3. In the early stages of network development, before critical mass is attained, the government should invest to offset network start-up costs and to subsidize services (e.g., library databases) that add high utility to the infrastructure but that may not be able to bear implementation costs by themselves. It is here that government investment in fundamental research and in experimental testbeds has traditionally had significant payback, epitomized by the ARPANET and Internet experiences described by Kahn. During this period, explained Mitchell, investments can serve to lower costs, facilitating the expansion of supply, or they can serve to expand demand by generating new or improved applications. At a much later stage, phase 3, investments can be made to achieve equity and universality of access by users whose marginal cost is excessive or who cannot afford the cost of access. A number of participants argued that since even Noam's phase 2—commercial viability—has not yet been achieved in the "information superhighway," it is very premature to consider investments in universal service now. (See "Cross-cutting Issues" below.) A third concentration point for government investment, it was agreed, is standards setting, an area discussed in Kahn's paper.

Note that by focusing on the net benefit from network technology, Noam's model appears to imply that there is some way of identifying which technologies are winners and which are not, in order to target investments.
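Noam's three phases can be sketched numerically. The functional forms and numbers below are my own illustrative assumptions, not from the workshop: per-user benefit grows logarithmically with network size (a simple network effect), while users are assumed to join in order of increasing connection cost, so the average cost per user first falls as fixed costs are spread and later rises as high-marginal-cost users join.

```python
import math

def average_benefit(n, v=1.0):
    """Benefit per user; assumed to grow with network size (network effect)."""
    return v * math.log(1 + n)

def average_cost(n, fixed=500.0, base=1.0, slope=0.002):
    """Average cost per user: shared fixed cost plus rising marginal costs.

    The k-th user to join is assumed to have marginal cost base + slope*k,
    i.e., low-cost users connect first.
    """
    total_marginal = n * base + slope * n * (n + 1) / 2
    return (fixed + total_marginal) / n

def phase(n):
    """Classify network size n into the three (stylized) phases."""
    b, c = average_benefit(n), average_cost(n)
    if b >= c:
        return 2  # profitable commercial operation
    # Below break-even: phase 1 if average cost is still falling (fixed
    # costs not yet spread), phase 3 if it is rising again because
    # high-marginal-cost users are joining.
    return 1 if average_cost(n + 1) < c else 3

for n in (50, 2000, 10000):
    print(n, round(average_benefit(n), 2), round(average_cost(n), 2), phase(n))
```

With these made-up parameters, a 50-user network is pre-critical-mass (phase 1), a 2,000-user network is commercially viable (phase 2), and a 10,000-user network has entered the regime in which serving marginal users costs more than it returns (phase 3), which is where universal-service subsidies enter the discussion.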
Furthermore, even if all technologies are winners in some absolute social accounting sense, to the extent that they can substitute for one another, there is still the problem of picking the most beneficial. How we come by that knowledge or determine what are useful rules of thumb for making choices among alternatives is a difficult and critical question that was not well addressed by workshop participants. But this very question of targeting technologies does suggest a vital role for R&D—to evaluate options and the costs and benefits of new technologies. A recent NRC report suggests that it is possible to encourage development and deployment
of the information infrastructure in ways that leave open choices of implementing technologies (CSTB, 1994). When government support extends beyond the R&D stage, however, it is important to assess and respond to market feedback.

Despite the cautions expressed above, several workshop participants were passionate in their conviction that federal investments in infrastructure-related R&D have been and will continue to be enormously productive. The skepticism expressed on this point by others is in part symptomatic of the broader fiscal context in which federal investment in R&D is being reexamined. Because of changing constraints on government investment, Kahn speculated that the federal government probably could not repeat what it had done in launching the National Research and Education Network (NREN) program, which built on and fueled development of the Internet. That is, although the Clinton administration's National Information Infrastructure (NII) initiative is sometimes perceived as involving both substantial government investment and control, neither is realistic. Kahn posed the question of how the process that resulted in NREN and its successes could be institutionalized as part of the R&D business. The larger question is how the Internet model can be leveraged to advance the NII.5

CROSS-CUTTING ISSUES

Although policymakers face choices among various options, some of which can be grouped under the headings of regulation or public investment, several issues cut across policy regimes. Such issues were raised by all three panels and include accommodating rapid technological change, standards and standards setting, and democratization of the infrastructure. Verveer listed a similar set of issues: interconnection and standards, open entry to the local exchange, universal service, and telephone company integration into adjacent markets.
Verveer's last issue relates to the others as a factor affecting timing and direction, because the telephone network (and the cable networks as well) was originally built for efficient delivery of one service—voice communications—whereas the emerging concept of the NII implies an architecture or technical framework that is more general and flexible, and thus more capable of supporting multiple services (CSTB, 1994). Technological changes and standards will be prerequisites; the nature of the technology and architecture, in turn, will affect who benefits from the infrastructure and when, as well as what policy intervention(s) will be most effective.

Accommodating Rapid Technological Changes

Rapidly changing technology poses major challenges for infrastructure policy. The rapid shift in the telephone network from analog to digital switching technology, described by Lucky, was but the first of several waves of change that are transforming, integrating, and adding complexity to the telecommunications/information infrastructure. The rapid change (and uncertainty) is today underscored by the contrasts—previously noted—between the telephone network and the Internet. As Lucky and Kahn explained, these networks involve very different uses of technology and very different technical architectures, and their cost structures and patterns of use are also very different. Complicating the situation is the fact that existing understanding and models were conditioned on the telephone infrastructure, whereas different technologies and architectures—such as the Internet approach to networking, wireless transmission, and competition from or interconnection with cable networks—will change many assumptions (e.g., about urban-rural cost differentials). Furthermore, the emphasis on intelligence at the periphery of the network, which is characteristic of the Internet, will have implications for the logic of network operations, its structure,
and especially the types of public infrastructure services that will be in demand in the future. Conversely, the need for peripheral intelligence also highlights problems associated with variations in ability to pay. For example, Pearlman's discussion of conditions in education shows how limited resources for communication outside the traditional public information infrastructure, in this case computers and telecommunications facilities in public schools, constrain the use of public infrastructure to promote educational policy objectives. Both sets of concerns—network operations and affordability—will be affected by decisions about network architecture. The openness of an Internet-style architecture, for example, implies a movement toward the unbundling of network facilities and services, which will, in turn, affect costs, competition, and innovations in both fundamental technologies and the services that build on them (CSTB, 1994).

The rapidity of technological change is also reflected—along with a certain amount of media "hype"—in the proposed broadband multimedia applications of the NII. While typical Internet applications (e.g., electronic mail, file transfer) and other narrowband applications are now available with no more than the current twisted-pair telephone connections, implementation of a broadband NII has enormous technical, economic, and policy implications.6 It will require billions of dollars, but so far—some at the workshop asserted—its applications are unproven. The skeptics therefore questioned the wisdom of the pell-mell dash to an advanced-feature NII. In response, Kahn, one of the primary architects of the Internet (via its predecessor, ARPANET), replied that the original plans for ARPANET provoked the same sort of skepticism from the research community, the very user set that initially benefited most from it.
The skepticism dissolved only when an ARPANET testbed was available. Kahn asserted that, until a committed user community has developed for the advanced-feature NII, skepticism toward it will remain. He felt strongly that a government-sponsored testbed is the way to develop such a user community.

While infrastructure investment discussions typically focus on hardware elements, substantial portions of the emerging telecommunications/information infrastructure consist of software—amounting to billions of dollars of investment. Software will become increasingly important because it is what makes possible the development of common (and differentiated) infrastructural services, which greatly enhance the usefulness of the physical components of the infrastructure. The spur to the growth in Internet usage resulting from the introduction of the Mosaic interface to the World Wide Web illustrates the impact of software. But one often unappreciated problem, noted by Alfred Aho, then of Bellcore, is that substantial amounts of embedded software present a bottleneck to change. Another problem, noted by Baer, is a tax structure that sometimes makes it difficult to reward or encourage investments in developing new kinds of software; for example, federal research and experimentation tax credits in place since 1981 have not always been interpreted to be applicable to software development.

The rise of software as a vehicle for adding value to underlying network services has enabled the growth of network-based systems integration. Noam, Breeden, and others remarked on the competition between network service providers and systems integrators. Reflecting on the evolving patterns of competition, several participants speculated about which capabilities and services could or would be regarded as commodities and what commoditization may mean for technology development and deployment as well as the evolution of the industry.
The criticisms leveled at regulation during the Panel 2 session reflected an appreciation for the association between past regulation and the relatively slow innovation and modernization in telecommunications. Part of the problem is a human one, the difficulty faced by regulators trying to decide which technologies are preferable. Observed Crandall,

[T]he technology is changing so quickly in this area and the number of players is so potentially large that trying to use regulation as the instrument for inducing investment in infrastructure is fraught with enormous dangers. Even if regulation could
bring about the type of investment that you think is desirable today, it would be inefficient to do it through regulation, it would be extremely costly, and what I fear is if it turned out to be a mistake, it would be very difficult to correct that mistake.

Crandall's concerns were complemented by Lynch's observations on how information technologies are inducing changes in the complex system of information providers of which libraries are a part. The process of change, apparent in citation and publishing practices, is ongoing. Neither the end results nor their sensitivity to coercive policy interventions, such as regulation, can be forecast.

Concerns about the risks of the slow pace of regulatory processes and the costs of mistakes imply assumptions that (1) the more rapidly technology advances the better and (2) there will be no cause for regret over the outcomes if we just let natural economic forces take their course. Both assumptions call for further analysis. Indeed, a CSTB committee that considered these issues has argued that market forces may not readily yield either the unifying, open architecture or the kinds of general and flexible technology required to maximize the societal benefits implied by much of the NII rhetoric (CSTB, 1994). To minimize mistakes, commented Cornell, participation by multiple players should be encouraged "so that lots of new ideas can be tried in the marketplace." The opening of the local exchange to competition is consistent with this view, although the proliferation of alliances across industries (e.g., between telephone and cable companies or between communications and software or hardware systems companies) suggests new possibilities for reducing healthy competition among approaches.
Also, as Hatfield observed, competition will be affected by the structural and human knowledge constraints on cable companies providing telephone service and on telephone companies providing cable service. All this having been said, there was broad consensus on the poor fit between the political process by which regulations are made and the process of innovation. As Noll noted, "Technological progress is not something that happens through consensus and compromise. It happens with crazy people going out and doing things that others did not think could be done." The fact that much current and anticipated technological change is not incremental underscores the weaknesses of "the incremental consensus approach of regulation and policymaking," according to Pepper, who suggested that "there may be an opportunity to have some nonincremental change on the policy side."

Standards and Standards Setting

Standards are essential for making a multifaceted, multiprovider infrastructure work. They can shape the nature of the services and capabilities that are available, as well as how they are implemented (see Box 1). Although, as noted above, many comments were made at the workshop about the difficulties and therefore the undesirability of having bureaucrats select specific technologies or players, Pepper noted that that mind-set should not preclude a major government role in the standards arena:

[T]here is a very important role for the government to establish the framework within which the smartest engineers in industry develop standards…. [T]here is a balancing because people…who might lose in a standards-setting process too often try to use the process against the setting of standards that might be somebody else's standard.

BOX 1
The Economic Role of Standards

Joseph Farrell
University of California at Berkeley

Standards are an explicit or implicit agreement to do things—such as encode information—in common ways so that different services can work together, information can be exchanged easily, and so forth. Standards may be dictated, benevolently or not, by an authority or powerful player (often a seller or buyer). This was historically the main mode of standardization in telecommunications, with AT&T in the United States and governments elsewhere determining the standards. But as the telecommunications industry has become less concentrated, authority-dictated standards are no longer available, and other means of reaching agreement must be developed instead. These methods include the use of a formal or informal consensus process ("committees," or voluntary standards organizations) and a process of letting "de facto" standards emerge from a positive-feedback process among many industry participants.

None of these mechanisms for generating standards always generates a "good" standard, a rapidly available standard, or any standard at all. Government-set standards may be unduly influenced by pressure groups or simply reflect a lack of sophistication in underfunded government agencies, and standards set by big players may be chosen to preserve their position. Consensus negotiations may be protracted, and the outcomes may reflect bargaining power more than technical or economic merit. Finally, the economic forces that usually tend to make de facto marketplace outcomes economically efficient are at best weak in contexts where standards are important, as the recent economic literature on the subject has shown.

Part of the problem with standards development is that standards not only make interconnection and information exchange easier, but they also affect the nature of competition.
Open standards level the playing field, reducing the commercial advantages conferred by size or historical dominance. They also facilitate specialization, especially specialized entry. These complex competitive effects often give participants in standards-setting efforts mixed motives, or even create incentives for sabotage.

Even if all participants want a standard, problems remain. Perhaps likelier than a choice of "the wrong" standard is the wrong timing of the standards decision: a choice of standard may freeze the fundamental technology (although it may accelerate nonfundamental innovation), and so the timing of standardization requires a subtle balancing of the benefits with the foreclosure of options. With hindsight, the Japanese and European governments probably chose high-definition television (HDTV) standards prematurely, slightly too soon for full digitization, whereas the United States (fortuitously, and by a hair's breadth) probably waited long enough. Some would say that the same is true of digital wireless telephony. But without hindsight, the timing problem is hard: opinions will genuinely differ on the payoff for waiting (most experts in 1990 thought that all-digital HDTV within the confines dictated by the Federal Communications Commission (FCC) was not possible, until General Instrument showed otherwise), and people experience different trade-offs between speed of standardization and the technical quality of the standard chosen. Moreover, if a market is developing in advance of standardization, participants may become locked in and standards deliberations become moot. Many standards organizations, including the International Telecommunication Union and the International Electrotechnical Commission, have begun to pay more attention to the speed and timeliness of their standards development, which should reduce unintended delays but will probably do nothing for misjudgments about optimal timing.
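The positive-feedback path to a de facto standard, mentioned above, can be illustrated with a toy adoption model of my own construction (it is not part of Farrell's text): each new adopter picks a standard with probability equal to that standard's current market share, so a small early lead is self-reinforcing.

```python
import random

def de_facto_race(n_adopters=10_000, seed=7):
    """Toy positive-feedback adoption race between two standards, A and B.

    Each new adopter chooses standard A with probability equal to A's
    current market share (each standard starts with one seed adopter),
    so early random leads are self-reinforcing -- the lock-in dynamic
    behind many de facto standards.
    """
    rng = random.Random(seed)
    a, b = 1, 1  # seed adopters for standards A and B
    for _ in range(n_adopters):
        if rng.random() < a / (a + b):
            a += 1
        else:
            b += 1
    return a / (a + b)  # final market share of standard A

# Rerun the race under different random histories (seeds).
shares = [de_facto_race(seed=s) for s in range(10)]
print([round(s, 2) for s in shares])
```

Across different seeds, the final share of standard A ranges widely: which standard "wins" can be an accident of early adoption history rather than of technical merit, which is one reason the economic forces pushing de facto outcomes toward efficiency are weak.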
In the United States, "government" standards setting often is closer to government-overseen standards setting. For instance, in HDTV, the FCC created the Advisory Committee on Advanced Television Service, involving industry participants, and asked it to recommend a standard. This approach echoes the FCC's approach with earlier generations of television standards: indeed, the current television "NTSC" standards are named for the National Television Systems Committee. Compared to European governments, for instance, the U.S. government has also been quite reluctant to set standards.

Government involvement may be less risky in standards setting than in some other forms of regulation, because even standards set with government involvement are often voluntary and can be ignored if unsuitable enough: if there is a clearly superior alternative, it may replace a poorly chosen standard. For example, the U.S. government tried to use its purchasing power to encourage the use of Open Systems Interconnection (OSI) computer networking protocols, but attractive products were not readily forthcoming, and most other buyers continued to prefer the more established TCP/IP protocol suite. As a result, the government recently withdrew its procurement specification of OSI. Of course, this is at best an imperfect check on errors—the government's choices are powerful and can impose bad outcomes. But government involvement in standards, if properly managed, may be more like "indicative planning" and less like compulsory regulation. This can potentially help private parties coordinate without coercion.

Economics Literature on Standardization

Economists have long recognized that some products are more valuable when more people use them; the phenomenon we now call "network effects" was studied many years ago by Leibenstein (1950). But the modern literature took off when economists became interested in how firms might make strategic choices in the arena of standards—particularly whether to try to become compatible with rivals or insist on remaining incompatible; for recent discussions, see Katz and Shapiro (1994) and Besen and Farrell (1994). Further references can be found in these latter two articles and in a survey by David and Greenstein (1990).

In characterizing the FCC's involvement in standards for high-definition television (HDTV), Pepper attested to the role of functional standards in militating against overly short-term technical solutions.
Similarly, Kahn's remarks addressed how government funding of Internet protocol development and standardization resulted in an architecture that embraces heterogeneity (i.e., heterogeneous hardware, software, underlying transmission technologies, and capacities). Several workshop participants commented on elements of the standards-setting process that they believed should be promoted. Lucy Richards of the Department of Commerce, for example, commended the FCC for pushing the HDTV standards process to involve not just broadcasters but also the computer industry and small innovative technology firms. Vinton Cerf of MCI, drawing on his experiences with the Internet, noted that interoperability standards must involve a process of implementation and testing that shows that independent implementations can, indeed, interoperate.

Although they have received the most attention in discussions of information infrastructure, interoperability standards are only one of the many kinds of standards needed. Another kind of standard relates to the encoding and formatting of the information to be transmitted, stored, and accessed via the infrastructure. Comments made by Shortliffe describing difficulties in the health care context appear to apply more generally; they are relevant to the development of applications in several areas. Shortliffe noted, for example, that because it is difficult to encode electronically the subjective knowledge and impressions about quality of care that are important in making informed medical and health insurance decisions, there may be a danger that reliance on electronic formats will exclude useful data and reduce efficiency. There is thus a need for more work on information classification or categorization systems, which should properly be considered part of the infrastructure.
Shortliffe commented on the standards-like nature of such classification systems, noting the need to eliminate inconsistencies across information classification or categorization systems that
impede effective use of data already collected. His observations from health care were echoed by the more general assertions of William Gillis of Motorola that investing in a uniform organization and structure for data would make access to and use of data more efficient and could ultimately lower the costs of accessing and publishing data.

A wag once pinpointed the beauty of standards: everyone can have his own. The irony identifies the tension in standards setting. Were governments to have the authority and responsibility for setting unique, definitive standards, one could easily have bureaucratic infighting leading to ossification of standards while technology leapfrogs them—witness the endless international bickering over narrowband integrated services digital network (ISDN) standards while ISDN became all but obsolete, or the federal promotion of the Open Systems Interconnection (OSI) scheme through GOSIP while markets moved toward TCP/IP in data network technology. On the other hand, if companies compete to set proprietary standards to gain market share, chaos may be the result. A popular recent middle ground—establishment of multicompany consortia, sometimes subsidized by government, to create de facto industry standards—has been attacked by larger companies on the grounds of inefficiency and by smaller companies on the grounds of exclusivity.

Workshop participants seemed to agree that the frequently referenced government roles in fostering the TCP/IP standards through support of Internet development and use (i.e., setting standards by supporting testbed development and research projects) and the HDTV standard through competitive testing for spectrum allocation (i.e., setting standards by forcing competing interests to cooperate) may be models for future government action.
Democratization

Many workshop participants, beginning with Firestone, appeared to envision a greater democratization of information access and use. This theme was explored in discussions of universal service, the changing roles of such information providers as libraries, and increasing access and use in education. The Internet was at least implicitly recognized as a model of democratization, since it has demonstrated the ability of small players to become providers of content to a global network market. On the Internet, even individuals can make Moving Picture Experts Group (MPEG)-encoded movies available to large numbers of people; no longer is movie distribution limited to the large studios. These features of the Internet reflect its technology and architecture and therefore cannot be assumed to translate automatically to the larger, more complex NII.

The fact that existing policy mechanisms may be ill suited to promote democratization was recognized. As Pepper observed, "[T]he traditional regulatory process all too often puts agencies like the FCC in the business of refereeing between the private interests of providers as opposed to putting users and user needs at the forefront." Industry faces similar problems. Crook, for example, lamented that the "most complicated part of networking … is how to deliver things to ordinary people who walk in off the street."

Education represents an applications area that is fundamental to the democratization of infrastructure access and use—and one that epitomizes the challenge of achieving that democratization. Difficulties experienced in integrating the information infrastructure into education underscore the importance of the human and institutional factors needed to realize the potential benefits promised by the hardware and software components. As Pearlman recounted, there are already many examples of K-12 schools obtaining access to events, people, and information around the world, sometimes in real time.
However, since most of the enabling services and resources are underwritten by parties other than the educators or students, "one of the characteristics that unites all these wonderful examples is that most have truly demonstrated their commercial unavailability," Pearlman pointed out.

Another route for using public institutions to democratize access to information and the information infrastructure is public libraries, which, as noted by Carol Henderson of the American Library Association, serve as community institutions and information providers. Workshop participants suggested, however, that no single role was likely to emerge for libraries. Reflecting on the capabilities made possible by technology, for example, Linda Roberts of the Department of Education speculated that, with access to the information infrastructure, individuals could have libraries in their own homes. Moreover, based on experience in education, such personal libraries would not necessarily contain information derived only from the publishing industry; that is, not only might the role of libraries change, but so, too, might the role of other information providers, in part because individual infrastructure users are likely to be information providers as well as consumers. The many possible ways in which libraries could change were addressed by several speakers and summarized by Lynch:

[I]t is important to recognize that libraries are part of a much more complex system of information providers and consumers which includes publishers; the government, which creates a certain amount of public domain information; scholars who use it; [and] researchers…. At the same time libraries are struggling to modernize into an environment where much of their content is electronic. Information technology in networks is going to cause some fundamental reassessment of the roles of other groups within which libraries are part of the public system.

Fundamental to democratization is the achievement of universal service, which will be shaped by regulation—driving private investment—and by public investment.
George Gilder, a private consultant, cautioned that "you cannot instantly create universal service," but the political and moral imperatives of striving for it pervaded the workshop discussions. Universal service has been advanced as an essential principle by the Clinton administration, Congress, and a range of consumer, industry, and public interest advocacy groups; it was also the focus of a set of commentaries commissioned by the administration in October 1993 and published in March 1994 (NTIA, 1994). Despite a tendency in the public debate to polarize the possibilities with respect to those who have access and those who do not, there will clearly be a spectrum of possible outcomes, which may itself evolve over time.

The recognition that network evolution passes through stages and achieves critical mass over time suggests that universality may naturally be delayed. As Noam explained, "There is a connection between critical mass and universal service. And that is you would never get to that second problem without having resolved the first problem." Moreover, although competition may contribute to network expansion, Noam cautioned against assuming it would solve the universal service problem—competition can help make production of network services more efficient, but that issue is different from equitable distribution. Competition can also complicate the challenge of collecting funds for universal service subsidies if the current model of collection by service providers is presumed to continue. Alternative models for collecting and distributing communications-related subsidies may be needed.

Two main dimensions of the universal service problem were discussed by workshop participants. One, clearly the more general concern, is access by individuals with low incomes regardless of where they live.
Workshop participants suggested that a critical factor in terms of cost and timing would be the choice of mechanism—some vehicles for extending service to individuals who could not afford it on their own are more efficient than others; the fact that one vehicle has been used in the past should not be the basis for selecting a vehicle in the future. The other dimension, on which a greater range of opinions was expressed at the workshop, is access in rural areas. Some skepticism surfaced about whether the historic rationale for subsidizing
rural access can be justified. Commented Crandall, "[T]here is no particular reason at this stage of our economic development to suggest that people who live in the country are so poor that they could not pay for the full cost of their telephone service."

The tension among the values articulated by Firestone (equity, efficiency, liberty, community, and participatory access) was expressed in the range of opinions on the roles for private and public action in achieving democratization. For example, some felt that if infrastructure access were as important to schools, libraries, and health care organizations as it is to banks, these organizations would rationally reallocate their resources to buy such access at the expense of other needs. Others expressed the belief that schools and libraries, if left to their own resources, would never have the wherewithal to buy the access they need; lacking government intervention, they claimed, a large information-disadvantaged class will arise. The confounding factor of politics also was invoked as a constraint on access and use of the information infrastructure. Commented Noll,

[I]f the world is such that the politics of education select against the delivery of information to students in favor of something else, it is exactly the same politics that will drive a regulatory institution. They are not separate. They are run by the same state legislature and governor.

CONCLUSION

The decade since the divestiture of AT&T and the ensuing deregulation have wrought huge changes. In reflecting on these changes, Verveer (a key player in the 1984 events) characterized some of them (Box 2). The telecommunications/information infrastructure has become a significant and increasingly important component of the social overhead that contributes to the productivity and growth of other economic activities.
This social overhead factor is one reason for treating the development of policies for the telecommunications and information industries as somehow different from the development of policies for other industries. The issues discussed at the workshop illuminate several possible future roles for government. Pepper identified six:

1.  Goal-setting and leadership;
2.  Setting the regulatory framework, which provides incentives for investment;
3.  Facilitating or establishing the framework for the standards-setting process;
4.  Reducing risk at the margins (e.g., by supporting R&D and developing innovative applications);
5.  Procurement, which provides incentives for investment; and
6.  Addressing market failures on the communications and information sides.

The breadth of opinions aired at the workshop shows that decisions on each possible government role will not be easy. As Michael Roberts of EDUCOM remarked, "Certainly, this [workshop] has shown how primitive the policy space is around the NII. We have to manage that issue, not wish hopefully that it will go away." The failure of the 103d Congress to pass telecommunications reform legislation underscores the value of more fully understanding the issues and perspectives discussed in this report.

In short, the workshop elicited lively and at times contentious discussion of key issues of governmental action in the development of the telecommunications/information infrastructure. A surprising degree of consensus on some of the immediate issues emerged. Some of these points of consensus are as follows:
BOX 2
Lessons from the Postdivestiture Decade
Philip Verveer, Wilkie, Farr, and Gallagher

-  Competition and its counterpart, deregulation, work.
-  The Modified Final Judgment's (MFJ) institutional structure is fragile but sustainable as long as the Department of Justice lends material support or the presiding judge is a person of exceptional ability and background.
-  Realignment of the comparative authority of federal and state regulators is needed in the wake of the Supreme Court decision in Louisiana Public Service Commission v. Federal Communications Commission, 476 U.S. 355 (1986).
-  The salient question about the line of business restrictions is not whether they will be removed but when.
-  The divestiture and the arrangements that surround the MFJ, including the line of business restrictions, made way for technological and consumer demands, just as the cast in the AT&T case believed they would.

-  Government spending on the NII, as large as it is, is still a very small percentage of total spending on the information infrastructure. The government should therefore not be thinking about building the NII, but rather about creating conditions that promote private-sector investment. Key leverage areas are precompetitive R&D (in which a free market tends to underinvest) and regulatory policy.
-  The deregulatory trend of the past two decades has been basically healthy and should be continued, although with some caution; there are still roles for regulators, but these are much changed. Regulation, particularly state regulation, has often impeded the adoption of innovative technologies. Given the exponential advance of technology, incremental changes in regulation in the future could be a major stumbling block to implementation of the NII.
-  Universal service is a long-term goal that must await the NII's achievement of critical mass and profitability.
-  As emphasized by the Clinton administration's NII initiative, potential applications in the areas of education, health care, and libraries could serve valuable social goals, but it must be recognized that (1) K-12 schools form a "cottage industry" that is woefully unprepared to use an NII, with most classrooms lacking even the most rudimentary high-technology tools, including access to telephone lines, and (2) health care providers are typically a decade or more behind in the use of information technology.
-  Electronic distribution of information may so greatly change the economics and other aspects of publishing and libraries that the future forms of these institutions are unpredictable.
Fundamental philosophical differences never lie far from the surface, however, indicating that the current debate over development of the telecommunications/information infrastructure will not soon die out. Some of these differences are as follows:

-  There are those who think that our inability to predict the course and outcome of technological developments severely restricts the utility of any governmental intervention in market processes, whether regulation or investment, and those who believe that we know enough about the general patterns by which infrastructure industries develop to make useful policy prescriptions.
-  Even within the consensus, noted above, that government investment in precompetitive R&D for the NII will continue to be productive, a few workshop participants expressed skepticism that anything like the success of the Advanced Research Projects Agency in developing the seeds of the Internet could be repeated.
-  There are those who would immediately reduce regulation to the vestigial areas where technological abundance has not been achieved (e.g., the local exchange), or eliminate it altogether, trusting market efficiencies. Others argue that regulators have decades of work ahead of them (e.g., in setting terms of interconnection and allocating costs) before they perish in the full blaze of Firestone's Stage 2.
-  There are those who abhor any government intervention in the standards-setting process, claiming that market mechanisms ("shakeouts") are more efficient, and those who insist that, without a government role, interoperability and compatibility will suffer greatly.

The persistence of disagreement on these and other issues, and the related likelihood that reform of telecommunications/information infrastructure policy will be an ongoing process, suggest that such issues are fruitful fodder for research as well as further debate.
The papers presented in this volume, as well as the introductory and discussion material for Parts 1 through 3, provide perspectives on potential roles and positions for the federal government, illuminating the pros, cons, and trade-offs.

REFERENCES

Besen, Stanley M., and Joseph Farrell. 1994. "Choosing How to Compete: Strategies and Tactics in Standardization," Journal of Economic Perspectives 8(2):117-131.

Computer Science and Telecommunications Board (CSTB), National Research Council. 1994. Realizing the Information Future: The Internet and Beyond. National Academy Press, Washington, D.C.

David, Paul, and Shane Greenstein. 1990. "The Economics of Compatibility Standards: An Introduction to Recent Research," Economics of Innovation and New Technology, Vol. 1. Harwood Academic, Chur, New York.

Katz, Michael L., and Carl Shapiro. 1994. "Systems Competition and Network Effects," Journal of Economic Perspectives 8(2):93-115.
Leibenstein, Harvey. 1950. "Bandwagon, Snob, and Veblen Effects in the Theory of Consumers' Demand," Quarterly Journal of Economics, Vol. 64.

National Telecommunications and Information Administration (NTIA). 1994. 20/20 Vision: The Development of a National Information Infrastructure. U.S. Department of Commerce, Washington, D.C.

NOTES

1.  This report uses the term "telecommunications/information" because the facilities and concepts associated historically with the telecommunications infrastructure and more recently with the information infrastructure are fundamentally intertwined. That linkage, in fact, is fundamental to much of the discussion captured in this report.

2.  A recent CSTB report, Realizing the Information Future (CSTB, 1994), addresses many relevant technical issues.

3.  This represents the best estimate available at the time, recognizing that the costs of upgrading the telephone network from copper to optical fiber have been falling.

4.  See Chapter 5 in CSTB (1994).

5.  For a discussion of related issues, see CSTB (1994).

6.  Of course, there are broadband NII capabilities today, most notably in cable television systems. As typically discussed, references to "a broadband NII" embrace a bigger mix of services, greater two-way service, and more integration of services than has been typical of cable television offerings.