
7 Building Computing Systems of Practical Scale
Pages 127-158



From page 127...
... In doing so, we often raise new questions about fundamental issues. Started as an attempt to share scarce computing resources, the Internet has become a ubiquitous global utility service, powering personal and commercial transactions and creating domestic and international policy challenges.
From page 128...
... Sudan reviews the history of cryptography and the security mechanisms that underlie secure Web protocols and other forms of secure computer communication. Also evident is another example of how new opportunities arise when we find a way to eliminate a significant premise of a technology -- in this case, the advance exchange of decryption information, or the codebook.
From page 129...
... Layering and Abstraction Several design principles, many of them sharpened by years of experience building early operating systems like Multics, helped shape the Internet architecture. The most important of these was to employ multiple layers of abstraction (see earlier essays)
From page 130...
... Here, early Internet researchers recognized the need to keep the common interfaces minimal, thereby placing the fewest constraints on the future users of the Internet, including both the designers of the underlying technologies upon which it would be built and the programmers who would write the next generation of applications. This allows for autonomy among the entities that connect to the Internet: they can run whatever operating system they want, on whatever hardware they want, as long as they support the agreed-upon interface.
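A toy encapsulation sketch can make the narrow, agreed-upon interface concrete. All names and header formats below are invented for illustration; real protocols such as TCP/IP use far richer headers, but the principle is the same: each layer treats the layer above as an opaque payload.

```python
# Minimal layered-encapsulation sketch (hypothetical, illustrative names):
# each layer prepends its own header and never inspects the payload it carries.

def app_layer(message: str) -> bytes:
    """Application layer: produce raw payload bytes."""
    return message.encode("utf-8")

def transport_layer(payload: bytes, port: int) -> bytes:
    """Transport layer: prepend a 2-byte destination-port header."""
    return port.to_bytes(2, "big") + payload

def network_layer(segment: bytes, dst_addr: int) -> bytes:
    """Network layer: prepend a 4-byte destination-address header."""
    return dst_addr.to_bytes(4, "big") + segment

packet = network_layer(
    transport_layer(app_layer("hello"), port=80),
    dst_addr=0x0A000001,  # 10.0.0.1
)
# The network layer depends only on the transport layer's byte interface,
# and vice versa -- either side can be replaced without the other noticing.
```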
From page 131...
... to which they belong. This means that machines are assigned hierarchical addresses, such that finding a path from a source machine to a destination machine reduces to the problem of finding a path to the destination domain, which is then responsible for delivering the data to the right physical segment, and finally to the destination machine.
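The reduction from machine-level to domain-level delivery can be sketched as a longest-prefix lookup: a router keeps one entry per domain rather than one per machine. The address format and table entries below are simplified assumptions, not real routing state.

```python
# Hierarchical forwarding sketch (simplified, hypothetical routes):
# the most specific matching prefix determines the next hop.

ROUTES = {
    "10.1": "link-A",    # all machines in domain 10.1.x.x
    "10.2": "link-B",    # all machines in domain 10.2.x.x
    "10.2.7": "link-C",  # a more specific sub-domain overrides its parent
}

def next_hop(addr: str) -> str:
    """Return the next hop for an address via longest-prefix match."""
    parts = addr.split(".")
    # Try the most specific prefix first, shortening one label at a time.
    for n in range(len(parts), 0, -1):
        prefix = ".".join(parts[:n])
        if prefix in ROUTES:
            return ROUTES[prefix]
    return "default"

print(next_hop("10.2.7.44"))  # forwarded toward sub-domain link-C
```

Once the packet reaches the destination domain, that domain (not the distant router) is responsible for the final delivery, which is exactly the division of labor the excerpt describes.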
From page 132...
... Concluding Remarks The Internet is arguably the largest man-made information system ever deployed, as measured by the number of users and the amount of data sent over it, as well as in terms of the heterogeneity it accommodates, the number of state transitions that are possible, and the number of autonomous domains it permits. What's more, it is only going to grow in size and coverage as sensors, embedded devices, and consumer electronic equipment become connected.
From page 133...
... Certainly this is reflected in the set of applications that run on the Internet, ranging from video conferencing to e-commerce, but it is also now the case that the Internet has grown so complicated that the computer scientists who created it can no longer fully explain or predict its behavior. In effect, the Internet has become like a natural organism that can only be understood through experimentation, and even though it is a deterministic system, researchers are forced to create models of its behavior, just as scientists model the physical world.
From page 134...
... The new medical consumer arrives at the doctor's office better informed. The Pew Center for Internet Life reports that "fifty-two million American adults, or 55% of those with Internet access, have used the Web to get health or medical information," and of those, "70% said the Web information influenced their decision about how to treat an illness or condition" (Fox and Rainie, 2000)
From page 135...
... Items that would otherwise have been discarded can now find their way to just the person who needs them, leading to a less wasteful society. The remainder of this paper discusses three application areas where the impact of Internet technology merits special attention: the expansion of scientific knowledge, entertainment, and education.
From page 136...
... Popular types of Internet-based games include traditional games (like bridge and chess), fantasy sports, action games, and massively multiplayer games (MMPs)
From page 137...
... While traditional multiplayer games allow a handful of people to interact in the same game space, MMPs support thousands. It's likely we are just at the beginning of their growth in popularity.
From page 138...
... AquaMOOSE looks much like the purely entertainment environments many students find so compelling, but time spent there is educationally valuable. We need to develop more such environments to encourage students to choose to spend their free time wisely.
From page 139...
... MOOSE Crossing is a text-based virtual world (or "MUD") in which kids 8 to 13 years old learn creative writing and object-oriented programming from one another (http://www.cc.gatech.edu/elc/moose-crossing/)
From page 140...
... In answering a question, one child may tell another, "I got confused by that too at first." The online community provides a ready source of role models. If, for example, girls are inclined to worry that programming might not be a cool thing for a girl to do, they are surrounded by girls and women engaging in this activity successfully and enjoying it.
From page 141...
... In the Palaver Tree Online project (http://www.cc.gatech.edu/elc/palaver/, the dissertation research of Georgia Tech PhD student Jason Ellis), middle-school students learn about history from elders who lived through it.
From page 142...
... The Palaver Tree Online community makes this not only possible but also relatively easy for the elders, students, and teachers. Teachers are already overwhelmed with work, and any successful school-based learning technology needs to make their lives easier, not harder.
From page 143...
... Ellis, J., and A. Bruckman, 2001, "Palaver Tree Online: Supporting Social Roles in a Community of Oral History," paper presented at the CHI 2001 Conference on Human Factors in Computing Systems, Seattle, Wash.
From page 144...
... She wishes to transmit to Bob her account number and password, and yet does not want her Internet service provider to learn them. The potential for commerce over the Internet relies critically on the ability to implement this simple scenario.
From page 145...
... formally asserted that secret communication is possible whenever the communicating parties, Alice and Bob, share some secret information. Furthermore, this secret information had to have some randomness associated with it, in order to prove the security of the transmission.
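A concrete instance of this classical setting is the one-time pad: if Alice and Bob share a random secret key as long as the message, XOR-ing the message with the key yields a transmission that is provably secret against any eavesdropper. The sketch below assumes nothing beyond Python's standard library.

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    """XOR two equal-length byte strings, byte by byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"attack at dawn"
key = secrets.token_bytes(len(message))  # the shared random secret
ciphertext = xor(message, key)           # what the eavesdropper sees
recovered = xor(ciphertext, key)         # Bob, who knows the key, decrypts
assert recovered == message
```

Note the two premises the excerpt identifies: the key must be shared in advance, and it must be genuinely random. Removing the first premise is exactly what public-key cryptography, discussed next, accomplishes.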
From page 146...
... Diffie and Hellman noticed that several forms of computation transform meaningful information into incomprehensible forms. (As a side remark they note that small programs written in a high-level language immediately reveal their purpose, whereas once they are compiled into a low-level language it is hard to figure out what the program is doing!)
From page 147...
... The protocol can be used to exchange information secretly, where secrecy is now defined using a computational notion of indistinguishability: no efficient algorithm can distinguish between a transcript of a conversation that exchanges a secret m and one that exchanges a different secret m'. (Giving a precise formulation is slightly out of scope.)
From page 148...
... and the triples (g^x1, g^x2, g^x3) and (g^x1, g^x2, g^(x1*x2)) seem to contain the same amount of randomness, to any computationally bounded program that examines these triples, when x1, x2, x3 are chosen at random, independently of each other.
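The assumption about these triples is what makes textbook Diffie-Hellman key exchange work: the values g^x1 and g^x2 can be sent in the clear, yet the shared value g^(x1*x2) remains hidden from eavesdroppers. A minimal sketch follows, with toy parameters chosen only for illustration; real deployments use much larger, carefully chosen groups.

```python
import secrets

# Toy Diffie-Hellman parameters (illustration only; not secure at this size).
p = 2_147_483_647   # a prime (2^31 - 1)
g = 7               # a public base, assumed here for the sketch

x1 = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
x2 = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

A = pow(g, x1, p)   # Alice publishes g^x1 mod p
B = pow(g, x2, p)   # Bob publishes g^x2 mod p

# Each party raises the other's public value to its own secret exponent:
shared_alice = pow(B, x1, p)   # (g^x2)^x1 = g^(x1*x2) mod p
shared_bob = pow(A, x2, p)     # (g^x1)^x2 = g^(x1*x2) mod p
assert shared_alice == shared_bob  # both arrive at the same secret
```

The eavesdropper sees g, p, g^x1, and g^x2, which corresponds to the first two components of the triples above; the indistinguishability assumption says this reveals essentially nothing about g^(x1*x2).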
From page 149...
... And the science rises to meet the new challenge by creating a rich mathematical structure to study, analyze, and solve the new problems. The solutions achieved (and their deployment in almost every existing Web browser!
From page 150...
... Keeping this in mind, one hopes that cryptographic research can continue to thrive in the future uninhibited by external pressures. REFERENCES Blum, Manuel, and Silvio Micali, 1984, "How to Generate Cryptographically Strong Sequences of Pseudorandom Bits," SIAM Journal on Computing 13:850-864.
From page 151...
... The principal foundation is a body of core computer science concepts relating to data structures, algorithms, programming languages and their semantics, analysis, computability, computational models, and so on; this is the core content of the discipline. The second is a body of engineering knowledge related to architecture, the process of engineering, tradeoffs and costs, conventionalization and standards, quality and assurance, and others; this provides the approach to design and problem solving that

2. "Develop" -- Software engineering lacks a verb that covers all the activities associated with a software product, from conception through client negotiation, design, implementation, validation, operation, evolution, and other maintenance.
From page 152...
... As a result, for example, an analysis program can produce a symbolic description of the path for a machine tool; another program can take this symbolic description as input and produce a symbolic result that is the binary machine code for a cutting tool; and that symbolic representation can be the direct control program for the cutting tool. Notations for symbolic description of control and data enable the definition of software, both the calculations to be performed and the algorithms and data structures.
From page 153...
... Designing systems as related sets of independent components allows separation of independent concerns; hierarchy and other relations help explain the relations among the components. In practice, independence is impractical, but software designers can reduce the uncertainty by using well-understood patterns of software organization, called software architectures.
From page 154...
... Engineering Fundamentals The systematic method and attention to pragmatic solutions that shape software engineering practice are the practical, goal-directed method of engineering, together with specific knowledge about design and evaluation techniques. Engineering quality resides in engineering judgment.
From page 155...
... Both the scale and the diversity of knowledge involved in many modern software applications require the effort and expertise of numerous people. They must combine software design skills and knowledge of the problem domain with business objectives, client needs, and the factors that make creative people effective.
From page 156...
... Understanding the widely followed research strategies helps explain the character of this research area and the reasons software engineering researchers do the kinds of research that they do. Physics, biology, and medicine have well-refined public explanations of their research processes.
From page 157...
... Development includes all the synthetic activities that involve creating and modifying the software, including the code, design documents, documentation, and so on. Evaluation includes all the analytic activities associated with predicting, determining, and estimating properties of the software systems, including both functionality and extra-functional properties such as performance or reliability.
From page 158...
... This commitment to dealing with the real world, warts and all, means that software engineering researchers will often have to contend with impure data and under-controlled observations. Most computer science researchers aspire to results that are both theoretically well grounded and practical.

