
Computing and Data Processing
Pages 257-282

The Chapter Skim interface presents what has been algorithmically identified as the most significant single chunk of text on each page of the chapter.


From page 257...
... economy is coming to be widely recognized. As a consequence, a major national focus (the High Performance Computing Program)
From page 258...
... formed the national supercomputer centers and began the national NSFNET network. These computational resources were financed from divisions of the NSF separate from disciplinary divisions.
From page 259...
... Thus, the national supercomputer centers give researchers a chance to experiment with the future. Essentially all active researchers need convenient access to good workstations, as well as "clear channel" coupling into the national network.
From page 260...
... As discussed in the "Array Telescope Computing Plan", a conceptual proposal submitted by the National Radio Astronomy Observatory (NRAO) to the National Science Foundation (NSF)
From page 261...
... The context for these findings is the assumption that current support from non-astronomical funding sources for the national network and the supercomputer centers continues throughout the decade. Arrangement of This Report Section II reviews some major challenges and technology trends encountered in the transformation to a digital astronomy - on both theoretical and observational grounds.
From page 262...
... Massively parallel real-time processing is constrained by the problem of transferring parallel data from a serial data stream at sufficiently high data rates. As in biological systems, analog image preprocessing at the detector becomes an advantage.
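As a rough illustration of this constraint, the short Python sketch below estimates the serial readout bandwidth of a hypothetical detector array; the array format, frame rate, and digitization depth are purely illustrative assumptions, not figures from the report.

# Back-of-the-envelope readout bandwidth for a hypothetical detector array.
# The format, frame rate, and bit depth below are illustrative assumptions.
pixels_x, pixels_y = 2048, 2048      # assumed detector format
frame_rate_hz = 100                  # assumed frames per second
bits_per_pixel = 16                  # assumed digitization depth

bits_per_second = pixels_x * pixels_y * frame_rate_hz * bits_per_pixel
print(f"Serial readout rate: {bits_per_second / 1e9:.1f} Gbit/s")
# About 6.7 Gbit/s, far more than a single serial channel of the era could
# carry, which is why analog preprocessing at the detector becomes attractive.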
From page 263...
... If, instead, an astronomer at his or her home institution could monitor the data as it was obtained remotely, with the same data rate as he or she would have at the site, and if telescopes could be dynamically scheduled as weather or other conditions change, both the utilization of the telescopes and the productivity of astronomers would increase. The high performance national network will provide for "teleobserving" - the remote control of telescope systems and the real time transport of the data to the astronomer.
From page 264...
... There is a long tradition of using simplified analytic models to capture the essence of complicated astrophysical phenomena. Desktop computers are becoming increasingly important in supporting this work.
From page 265...
... We just now have the first generation of these students receiving degrees and becoming professional astronomers. The percentage of professional astronomers who will be carrying out large computational simulations will grow rapidly over the next decade (assuming that the national supercomputer centers remain adequately supported)
From page 266...
... It will be essential that modern modular software standards be followed and that software be written to be portable to a variety of computers and usable over national high speed networks. Both the national observatories and the national supercomputer centers must take the lead in the astronomical software development effort.
From page 267...
... Efforts are underway at the national centers to develop, distribute, and support national users with new application software for astrophysical fluid dynamics research, incorporating the most accurate algorithms available for modeling astrophysical fluids. Versions will be developed for dynamics in two and three spatial dimensions, incorporating the important physical effects of self-gravity, magnetic fields, radiation, and the thermodynamic properties of the gas.
From page 268...
... These packages provide standardized, well-documented methods to analyze a large percentage of the types of data produced with ground-based telescopes. In optical astronomy, CCDs have become the dominant detector for many purposes, and have thus become - at least temporarily - a de facto standard, further simplifying the task of archival research.
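As one concrete, present-day illustration of what a standardized, self-describing data format buys archival research, the Python sketch below reads a hypothetical FITS-format CCD frame with the modern astropy library; neither the file name nor the library comes from the report.

# Read a hypothetical archived CCD frame stored in the standard FITS format.
# The file name is an assumption; astropy is a modern library used here only
# to illustrate how self-describing data simplifies archival analysis.
from astropy.io import fits

with fits.open("ccd_frame.fits") as hdul:     # hypothetical archive file
    header = hdul[0].header                   # observation metadata
    data = hdul[0].data                       # 2-D CCD image as a numpy array

print(header.get("OBJECT", "unknown target"))
print("Image shape:", data.shape)
print("Mean counts:", data.mean())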
From page 269...
... Impediments to Establishing a Data Archive for Ground-Based Astronomy Data obtained from ground-based telescopes have traditionally been the property of the observer in perpetuity. Establishing proprietary periods and development of data archives would break this tradition.
From page 270...
... Finally, the scientific community should have free access to the archive. It is desirable to develop cost-effective, useful archives of digital data from ground-based astronomical observations, available over a high-speed national network.
From page 271...
... IV. HIGH-PERFORMANCE DATA PROCESSING: OBSERVATIONAL IMAGES AND THEORETICAL SIMULATIONS This section presents four independent "case studies" - individual visions by panel members of the computational frontiers of astronomy in the 1990s.
From page 272...
... The largest memory presently available to astrophysicists at NSF supercomputer centers (1 Gigabyte) is about 2.5 × 10³ times smaller.
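A rough worked estimate, under assumed and purely illustrative grid dimensions, suggests why a single gigabyte falls far short of what three-dimensional simulations need:

# Rough memory estimate for one snapshot of a 3-D fluid simulation.
# Grid size and variable count are illustrative assumptions.
n = 1024                  # assumed cells per dimension
variables = 5             # e.g. density, three velocity components, energy
bytes_per_value = 8       # double precision

total_bytes = n**3 * variables * bytes_per_value
print(f"Memory needed: {total_bytes / 2**30:.0f} GiB")    # about 40 GiB
# Even this single snapshot exceeds a 1-gigabyte memory many times over;
# the full requirements discussed in the text are larger still.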
From page 273...
... At the same time, some knowledge of the plasma's behavior is often essential to constructing a credible large-scale model of the astrophysical system in question. Thus in order for astrophysical plasma physics to produce quantitative results that can be meaningfully related to astronomical data, an iteration must be performed between the microphysics (simulations of microscopic plasma processes)
From page 274...
... Similarly, source terms for non-thermal or relativistic particles can be developed using plasma simulations, and then applied when the appropriate conditions emerge in a large-scale macroscopic model. Computational Requirements: Microscopic Plasma Simulations Most state-of-the-art microscopic plasma simulation codes are currently being run on multiprocessor vector supercomputers, particularly if more than one spatial dimension is involved.
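To give a flavor of what such a microscopic plasma simulation involves, the following is a minimal one-dimensional electrostatic particle-in-cell sketch in Python; the normalized units and all parameters are illustrative, and it is a pedagogical toy rather than a production code.

import numpy as np

# Minimal 1-D electrostatic particle-in-cell (PIC) sketch in normalized units
# (plasma frequency = 1). Cloud-in-cell deposition, FFT Poisson solve,
# velocity/position push. All parameters are illustrative assumptions.
ng, L = 64, 2 * np.pi          # grid cells, periodic domain length
npart = 10000                  # macroparticles
dx = L / ng
dt = 0.05
qm = -1.0                      # electron charge-to-mass ratio
q = -L / npart                 # macroparticle charge (mean electron density = 1)
rho_ion = 1.0                  # neutralizing ion background

rng = np.random.default_rng(0)
x = rng.uniform(0, L, npart)               # positions
v = 0.1 * np.sin(2 * np.pi * x / L)        # small velocity perturbation

k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)  # wavenumbers for the Poisson solve
k[0] = 1.0                                 # avoid division by zero (mean field is zero)

for step in range(200):
    # deposit charge on the grid (cloud-in-cell weighting)
    xg = x / dx
    i0 = np.floor(xg).astype(int) % ng
    w1 = xg - np.floor(xg)
    rho = np.zeros(ng)
    np.add.at(rho, i0, q * (1 - w1) / dx)
    np.add.at(rho, (i0 + 1) % ng, q * w1 / dx)
    rho += rho_ion

    # solve d2(phi)/dx2 = -rho with an FFT, then E = -d(phi)/dx
    rho_k = np.fft.rfft(rho)
    rho_k[0] = 0.0
    phi_k = rho_k / k**2
    E = np.fft.irfft(-1j * k * phi_k, n=ng)

    # gather the field at the particles and advance them
    Ep = E[i0] * (1 - w1) + E[(i0 + 1) % ng] * w1
    v += qm * Ep * dt
    x = (x + v * dt) % L

print("mean kinetic energy:", 0.5 * np.mean(v**2))

Production codes follow the same deposit, solve, gather, and push cycle, but in two or three dimensions, with electromagnetic fields and vastly more particles, which is what pushes them onto multiprocessor vector supercomputers.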
From page 275...
... The design of the data system must emphasize computational power, fast data transfer paths, flexibility, and expandability. Almost as critical as processing the raw data is providing the astronomer with a powerful workstation for exploration of the reduced images and catalogs.
From page 276...
... High-efficiency CCD imagers covering most of the usable focal plane, together with specialized on-line computers running automatic image classification software, will radically alter our ability to observe the universe. Case Study D: A "Typical" Large VLA Data Processing Request It might be difficult to grasp the magnitude of the computing problem for VLA data without an example given in some detail.
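For a rough sense of scale before such a detailed example, the Python sketch below counts the visibilities in a hypothetical VLA continuum observation; the 27 antennas (and hence 351 baselines) are factual, while the channel count, polarization products, integration time, observation length, and record size are illustrative assumptions.

# Rough scale of a hypothetical VLA observation's visibility data.
antennas = 27
baselines = antennas * (antennas - 1) // 2   # 351 for the VLA
channels = 16                                # assumed spectral channels
polarizations = 4                            # assumed polarization products
dump_time_s = 10                             # assumed integration time
hours = 8                                    # assumed observation length
bytes_per_vis = 8                            # assumed size of one complex visibility

n_vis = baselines * channels * polarizations * (hours * 3600 // dump_time_s)
print(f"{n_vis:,} visibilities, roughly {n_vis * bytes_per_vis / 1e9:.1f} GB raw")
# Calibration, gridding, Fourier transforming, and deconvolving these data
# multiply the arithmetic well beyond what the raw volume alone suggests.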
From page 277...
... V. NATIONAL HIGH-PERFORMANCE NETWORKING: OBSERVATIONAL IMAGES AND THEORETICAL SIMULATIONS In this section, we describe a major national experiment in networking, just getting underway, whose goal is to determine how remote users will be able to interact at high speeds with remote supercomputers, observatories, and digital archives.
From page 278...
... BIMA—A High Performance Computing Observatory on the Gigabaud Testbed Future supertelescopes will have as an essential component a very high speed data link between the sensor and a computer. Real-time radio astronomy would revolutionize the field by permitting an observer using a synthesis array to see an image of the radio sky as the observations were being made.
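A minimal sketch of the computation behind such real-time imaging, in Python: complex visibilities are gridded onto a regular (u, v) plane and Fourier transformed to form a "dirty" image. The random visibilities below stand in for a real correlator stream, and the grid size is an illustrative assumption.

import numpy as np

n = 256                                     # assumed image/grid size
nvis = 5000                                 # assumed number of visibilities
rng = np.random.default_rng(1)
u = rng.integers(0, n, nvis)                # placeholder (u, v) cell indices
v = rng.integers(0, n, nvis)
vis = rng.normal(size=nvis) + 1j * rng.normal(size=nvis)   # placeholder data

grid = np.zeros((n, n), dtype=complex)
np.add.at(grid, (v, u), vis)                # nearest-cell gridding

dirty = np.fft.fftshift(np.fft.ifft2(grid)).real
print("dirty image peak:", dirty.max())
# Repeating this continuously as data arrive is what real-time radio
# astronomy demands, hence the need for a very fast sensor-to-computer link.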
From page 279...
... The remote researcher will be able to process, over the network, much of the existing raw data that has not yet been viewed or evaluated. Remote Control of Fourth Dimension Supercomputers Tools will be developed in the gigabaud network testbed project to build applications that support real-time collaboration among multiple remote scientists on the scientific and computational aspects of a simulation running in real time on a supercomputer.
From page 280...
... For collaborative interactive data exploration, this also requires gigabit transfer rates. Long-term improvements, possible only with a gigabit-per-second wide-area network, will allow the display of simulation output, interactive control of the simulation, and interactive analysis of the output to take place concurrently at multiple, separate workstations on the network.
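A rough estimate with assumed, illustrative frame parameters shows why such interactive display pushes toward gigabit rates:

# Sustained rate needed to stream uncompressed simulation imagery.
# Frame size, color depth, and update rate are illustrative assumptions.
width, height = 1024, 1024     # assumed frame size
bytes_per_pixel = 3            # assumed 24-bit color
frames_per_second = 15         # assumed interactive update rate

bits_per_second = width * height * bytes_per_pixel * 8 * frames_per_second
print(f"Required sustained rate: {bits_per_second / 1e9:.2f} Gbit/s")
# About 0.38 Gbit/s for a single stream; several concurrent collaborators or
# larger frames quickly exceed a gigabit per second.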
From page 282...
... VANDEN BOUT, National Radio Astronomy Observatory; JACQUELINE H. VAN GORKOM, Columbia University; J

