Cutting Edge Technologies (1984)

Software and Unorthodox Architectures: Where Are We Headed?
Pages 27-36

From page 27...
... I have asked this question of many people working on the leading edge of computer science research, and their typical answer is "Give us $2 million and about two years." While I have not carried the experiment further, I know that if I were to pursue it, in two years the answer would be either "Here it is, and it's answering correctly 75 percent of the queries" or "Give us another $2 million and another two years." The point here is that with the exception of computer science theory and compiler design, there is very little science in this field, which in my opinion is still largely experimental in nature.
From page 28...
... Traditional software generation cannot be substantially improved with the techniques that are in hand today or that have been developed over the last 20 years. While hardware cost/performance improves at the rate of roughly one decade (that is, a factor of ten) per decade, the corresponding software improvement is by comparison insignificant: during the last 10 years, software productivity, measured as lines of program produced per programmer per year, has increased by about 50 percent.
From page 29...
... Different functional programs can be combined in the same way that complex mathematical functions can be composed out of simpler ones. It is too early to predict how well this approach will work, and it is difficult at this time to bend traditional programs into the functional mold, for example those used in data base management, where side effects are the governing mechanism.
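To make the composition analogy concrete, here is a minimal sketch in Haskell, a language in the functional tradition this passage describes. The pipeline and its names (normalize, deduplicate, arrange, report) are hypothetical illustrations, not drawn from the text: each stage is a pure function, and the whole program is built by composition, exactly as mathematical functions are.

module Main where

import Data.Char (toUpper)
import Data.List (nub, sort)

-- Three small pure functions, each with no side effects.
normalize :: String -> [String]       -- split text into upper-case words
normalize = words . map toUpper

deduplicate :: [String] -> [String]   -- drop repeated words
deduplicate = nub

arrange :: [String] -> [String]       -- put words in alphabetical order
arrange = sort

-- The composed program: built with (.), just like f . g . h in mathematics.
report :: String -> [String]
report = arrange . deduplicate . normalize

main :: IO ()
main = mapM_ putStrLn (report "the quick fox passed the lazy fox")

Because each stage is side-effect free, the stages can be rearranged or replaced independently; it is precisely this property that programs built around side effects, such as data base updates, lack.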
From page 30...
... In my opinion the answer is no, for this reason: because software involves paper and pencil rather than the traditionally difficult task of assembling hardware into complex physical systems, there is a tendency to assume that software ought to be easier. This is even reflected in the legal distinction made between software and hardware: one is patentable while the other is "copyrightable." Compare, however, the design of the airline office program mentioned earlier with the design of a complex physical system like a jumbo jet: not the routine twentieth design of a jumbo jet, but the original design of such an airplane.
From page 31...
... When I am told that a certain program or machine has a friendly user interface, I get a bit worried, because I find that I need 40 or 50 hours to become truly acquainted with a new machine, and I am a computer professional. Such a relationship may be friendly compared with a life-threatening one, but it is not a relationship we should have with something that sits next to our desk.
From page 32...
... Large companies that have begun to establish networks of distributed systems are approaching them from an era in which dumb terminals became progressively more intelligent in the presence of a central, powerful program that controlled everything in the system. It is now becoming clear that these interconnected machines need to be autonomous and that there can be no central "executive." The reason for this decentralization lies in the desire to build arbitrarily large systems: a centrally directed system has an upper size limit of perhaps 50 to 100 nodes, for the same reason that an individual can converse simultaneously with only a small number of people.
From page 33...
... If we focus on greater program intelligence, which is the key to making these machines truly useful to us, then we shall discover that a forefront-research intelligent program today occupies 3 or 4 megabytes and, incidentally, takes about 30 to 70 man-years to develop. So a personal computer that is truly friendly, that tries to comprehend some free-format English and to provide certain intelligent services, involves the equivalent of 4 or 5 of today's forefront intelligent programs, i.e., some 15 to 20 megabytes of primary memory.
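As a quick check of the arithmetic behind this estimate, using the rounded figures just quoted (roughly 4 megabytes per forefront program):

\[
4 \times 4\,\mathrm{MB} = 16\,\mathrm{MB}, \qquad 5 \times 4\,\mathrm{MB} = 20\,\mathrm{MB},
\]

consistent with the quoted range of some 15 to 20 megabytes of primary memory.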
From page 34...
... The second and perhaps biggest challenge before us is to discover a minimal semantic base, a minimal form of "computer English" that is understood by all machines and that simultaneously maximizes local autonomy and application cohesiveness. The third challenge is to ensure that the information carried on distributed systems is protected and that the signatories of messages are indeed the purported agents and not impostors.
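One way to picture such a minimal semantic base is as a small, fixed vocabulary of message forms that every machine agrees to understand, while everything application-specific stays local. The Haskell sketch below is purely illustrative: the message forms and names (Query, Store, Relay, NodeId) are hypothetical, and it addresses only the shared-vocabulary challenge, not the signature-verification one.

module Main where

-- A hypothetical "computer English": a deliberately small data type
-- that every node on the network can parse. Anything not expressible
-- in this vocabulary remains local to the individual machine,
-- preserving its autonomy.
newtype NodeId = NodeId Int
  deriving (Show, Read)

data Request
  = Query String          -- ask a named question
  | Store String String   -- store a value under a key
  | Relay NodeId Request  -- forward a request to another node
  deriving (Show, Read)

-- Any node can parse any message using the shared vocabulary alone,
-- without knowing anything about the sender's internal design.
parseMessage :: String -> Maybe Request
parseMessage s = case reads s of
  [(r, "")] -> Just r
  _         -> Nothing

main :: IO ()
main = print (parseMessage (show (Relay (NodeId 7) (Query "time"))))

Keeping the vocabulary minimal is what lets the base maximize both properties at once: nodes share just enough to cooperate, and no more.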
From page 35...
... Finally, I think that, following our experience with single-processor machines, there will be special-purpose multiprocessor systems dedicated to specific applications, which they will always carry out without the need, or the ability, to be reprogrammed for other applications.

SUMMARY AND CONCLUSION

In the software domain, programs will move from the artisan to the mass-manufacturing era, and they will manifest themselves progressively more as hidden computers or mass-manufactured applications on diskettes.

