"Computer and Information Technology in Biomedical and Neuroscience Research." Mapping the Brain and Its Functions: Integrating Enabling Technologies into Neuroscience Research. Washington, DC: The National Academies Press, 1991.
packet-switching technology, began a project to interconnect different packet-switching networks. A series of computer communication protocols was developed for this purpose, along with “gateways” to interlink different LAN and WAN computer networks. The TCP/IP protocol suite that resulted is used today to interconnect 5,000 networks in 35 countries, supporting more than 300,000 computers that range from supercomputers to workstations and even personal computers. The resulting system, called the Internet (Quarterman, 1989), is operated on a collaborative, cooperative basis involving government, military, university, and industrial resources, along with thousands of volunteers who keep the system operating.
In 1988, the National Science Foundation began work on the NSFNET to interconnect a small number of supercomputer centers and to sponsor the creation of intermediate-level, or regional, networks to provide access to the NSFNET and its supercomputing resources. Some 13 supercomputer centers and 400 universities are linked to the NSFNET, along with more than 2,000 other networks whose traffic the NSFNET supports. The NSFNET and its “entourage” are now an integral, vital component of the international Internet.
The full utility of such large-scale networking can be realized only if standards are established to enable effective communication among the computer systems that constitute the network (National Academy of Sciences, 1989). The Internet Activities Board and its subsidiary groups, the Internet Engineering Task Force and the Internet Research Task Force, are responsible for guiding research and development of standard protocols for Internet computer communication. The quest for useful, yet practical, standards for the communication of numeric, text, image, and other signal data continues and could play an important role in the creation of an effective national neural circuitry database.
One of the most critical limitations of networks for scientific applications is the capacity, or bandwidth, of the present links. Bandwidth refers to the amount of data that can be transmitted per second. For example, NSFNET began with links capable of transmitting 56 kilobits of information per second. A full-text article from a scientific journal averages 20 kilobytes, or 160 kilobits; an abstract averages 16 kilobits. Thus, NSFNET's initial bandwidth was sufficient to transmit three to four abstracts, but not even one entire journal article, per second. In 1989, NSFNET upgraded its system to links capable of transmitting 1.5 megabits per second—large enough for 10 full-text articles or approximately 100 abstracts. Upgrades now in progress will increase the NSFNET backbone-link bandwidth to 44.7 megabits per second, almost a 30-fold increase in the amount of information transmitted every second. Traffic in the NSFNET backbone has reached
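The throughput figures above can be reproduced with a simple back-of-the-envelope calculation. The short Python sketch below uses the item sizes stated in the text (a full-text article averaging 20 kilobytes, or 160 kilobits, and an abstract averaging 16 kilobits); the link labels are the NSFNET speeds discussed above, and the computation ignores protocol overhead, so the results are idealized upper bounds rather than measured transfer rates.

```python
# Idealized items-per-second estimates for the NSFNET link speeds
# discussed in the text. No allowance is made for protocol overhead.

ARTICLE_KBITS = 160   # average full-text journal article (~20 kilobytes)
ABSTRACT_KBITS = 16   # average abstract

def items_per_second(link_kbps: float, item_kbits: float) -> float:
    """Ideal number of items a link can carry per second."""
    return link_kbps / item_kbits

links = [
    ("56 kbit/s (initial NSFNET)", 56),
    ("1.5 Mbit/s (1989 upgrade)", 1_500),
    ("44.7 Mbit/s (backbone upgrade)", 44_700),
]

for label, kbps in links:
    print(f"{label}: "
          f"{items_per_second(kbps, ARTICLE_KBITS):.1f} articles/s, "
          f"{items_per_second(kbps, ABSTRACT_KBITS):.1f} abstracts/s")
```

Running this recovers the figures in the text: the 56 kbit/s links carry 3.5 abstracts but only 0.35 of an article per second, while the 1.5 Mbit/s links handle roughly 9 full articles or 94 abstracts per second, and the 44.7 Mbit/s backbone raises each rate almost 30-fold again.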