Appendix B
Pages 116-170


From page 116...
... Although the discussion is wide-ranging and concerned primarily with topics distant from those usually discussed by the Department of the Navy and DOD modeling communities, understanding how those topics relate to one another is essential for appreciating both the potential and the enormous intellectual challenges associated with advanced modeling and simulation in the decades ahead.

BACKGROUND

Perhaps the most generic trend in technology is the creation of increasingly complex systems together with a greater reliance on simulation for their design and analysis.
From page 117...
... In considering how to construct such a base, we observe a unifying theme in VE: Complexity is a by-product of designing for reliable predictability in the presence of uncertainty and subject to resource limitations. A familiar example is smart weapons, where sensors, actuators, and computers are added to counter uncertainties in atmospheric conditions, release conditions, and target movement.
From page 118...
... We then discuss briefly significant lessons from software engineering and computational complexity theory. There are important generalizable lessons, but as we point out repeatedly software engineering is not a prototype for VE.
From page 119...
... Finally, we include a section on what we call "soft computing," a domain that includes "complex-adaptive-systems research," fuzzy logic, and a number of other topics on which there has been considerable semi-popular exposition. Our purpose is to relate these topics to the broader subject of VE and to provide readers with some sense of what can be accomplished with "soft computing" and where other approaches will prove essential.
From page 120...
... can occur even in simple settings such as a rigid coin in a vacuum with no external forces, not even gravity. With zero initial velocity, the coin will remain stationary, but the smallest initial nonzero velocity will cause the coin to drift away with distance proportional to time.
From page 121...
... The point is that linear systems can exhibit extreme sensitivity to initial conditions because of exponential growth. Of course, STIC (sensitivity to initial conditions) is a matter of degree.
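To make this concrete, here is a minimal sketch (ours, not from the appendix) of two trajectories of the scalar linear system dx/dt = a*x with a > 0; the growth rate a = 1.0 and the initial gap of 1e-4 are arbitrary illustrative choices:

```python
import numpy as np

# Two trajectories of the linear system dx/dt = a*x with a > 0, started
# from nearby initial conditions. Their separation grows like exp(a*t):
# even this linear system is extremely sensitive to initial conditions.
a = 1.0                          # growth rate (illustrative choice)
t = np.linspace(0.0, 10.0, 101)
x1 = 1.0 * np.exp(a * t)         # trajectory from x(0) = 1.0
x2 = 1.0001 * np.exp(a * t)      # trajectory from x(0) = 1.0001
gap = np.abs(x2 - x1)
print(gap[0], gap[-1])           # the 1e-4 gap grows by a factor of e^10 ~ 22,000
```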
From page 122...
... and so on. This still has, in the small, the same exponential growth as the linear system, but its orbits stay bounded.
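The logistic map at parameter r = 4 is a standard example of this combination (our illustration, not the model the appendix has in mind): nearby orbits separate exponentially in the small, yet every orbit remains trapped in [0, 1]:

```python
# The logistic map x -> r*x*(1-x) at r = 4: locally exponential divergence
# of nearby orbits, but globally bounded motion within [0, 1].
def orbit(x0, n=40, r=4.0):
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = orbit(0.2)
b = orbit(0.2 + 1e-10)
print(min(a), max(a))                        # the orbit never leaves [0, 1]
print(abs(a[0] - b[0]), abs(a[35] - b[35]))  # a 1e-10 gap grows to order one
```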
From page 124...
... These models will necessarily be nonlinear if they are to reproduce the fluttering motion, as this requires a nontrivial nonlinear model for the fluids. Bifurcation is related to chaos in that bifurcation analysis has often been an effective tool to study how complex systems transition from regular behavior to chaotic behavior.
From page 125...
... If the coin flexibility interacts with the fluid sufficiently, we could quickly challenge the state of the art in computational fluid dynamics.
From page 126...
... If only average or typical behavior is of interest, this can be easily evaluated with a modest number of repeated Monte Carlo simulations with random initial conditions. In this case the presence of parametric uncertainty adds little difficulty beyond the cost of a single simulation.
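As a minimal sketch of this point (ours, with a made-up toy model standing in for one run of the coin simulation), average behavior can be estimated simply by sampling random initial conditions and averaging the outcomes:

```python
import random

def simulate(v0):
    # Hypothetical stand-in for one run of the coin-toss model: the outcome
    # (heads = 1, tails = 0) depends on an uncertain initial velocity v0.
    return int(100.0 * v0) % 2

random.seed(0)
n = 10_000
runs = [simulate(random.uniform(0.9, 1.1)) for _ in range(n)]
print(sum(runs) / n)  # estimate of the average outcome; the statistical
                      # error shrinks like 1/sqrt(n), so a modest number of
                      # repeated simulations suffices for typical behavior
```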
From page 127...
... This too can be computationally intractable, depending on the number of parameters and the functional dependence on them. In some cases, exact calculation of probability density functions can be easier, and numerical methods for evaluating probability density functions by advanced global optimization are a current research topic.
From page 128...
... The coin may exhibit erratic and apparently random motion as it falls. While we could in principle model the fluid dynamics of the atmosphere in detail, the error-complexity tradeoff is very unfavorable.
From page 129...
... It is currently beyond the state of the art to do computational fluid dynamics (CFD) for a moving vehicle in a way that would allow the use of CFD codes in vehicle simulation.
From page 130...
... Indeed, robust control theory often models parameters, noise, and unmodeled dynamics in such a way that control design can be viewed as a differential game between the controller and "nature." Such methods could be quite useful for designing robust strategies. We will not pursue this subject further here, except
From page 131...
... Uncertainty Management

The motivation for introducing uncertainty mechanisms into our models is to predict the range and distribution of possible outcomes of our experiments. What is somewhat misleading about coin tossing with respect to the broader VE area is the small scale of the experiment and its relative homogeneity.
From page 132...
... In the coin flipping experiment, terms such as inviscid flow might be mentioned, but not something like the assumption of chemical equilibrium. Problems with conventional modeling arise in VE because the consumer of a model will be a larger model in a hierarchical, heterogeneous system of models, possibly with no intervention by human experts.
From page 133...
... These are some of the fundamental limitations on the predictability of models, which will not be eliminated by advances in computational power, measurement technology, or scientific understanding. Thus in developing robust VE, there are certain "hard" limits on predictability, and it is important to understand and quantify the limits on the predictability of full system models in terms of the uncertainties in component models.
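A minimal sketch of this kind of quantification (ours, with two made-up component models) propagates component-level uncertainties to bounds on the full-system prediction by sampling:

```python
import random

# Hypothetical two-component system: component A's output feeds component B.
# Each component carries its own uncertainty; sampling propagates both to
# bounds on the system-level prediction.
def component_a(u, eps_a):
    return 2.0 * u * (1.0 + eps_a)   # A: gain with relative error eps_a

def component_b(y, eps_b):
    return y ** 2 + eps_b            # B: nonlinear map with additive error

random.seed(0)
outs = []
for _ in range(10_000):
    eps_a = random.uniform(-0.05, 0.05)   # +/-5% component-A uncertainty
    eps_b = random.uniform(-0.1, 0.1)     # +/-0.1 component-B uncertainty
    outs.append(component_b(component_a(1.0, eps_a), eps_b))
print(min(outs), max(outs))  # system-level spread (~[3.51, 4.51] around the
                             # nominal 4.0) is wider than either component's
```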
From page 134...
... From Homogeneous to Heterogeneous Systems

Heterogeneity in modeling arises from the presence in complex systems of component models dealing with material properties, structural dynamics, fluids, chemical reactions, electromagnetics, electronics, embedded computer systems, and so on, each represented by a separate and distinct engineering discipline. Often, modeling methods in one domain are incompatible with those in another.
From page 135...
... This disaggregation approach is essentially the only way that complex system models can be developed, but it leads to a number of difficult problems, including the need to connect heterogeneous component models, and the need for multi-resolution or variable granularity models. Perhaps more importantly, this neo-reductionist approach to modeling may represent a reasonable scientific program for discovering fundamental mechanisms, provided one never wants to
From page 136...
... The standard approach to developing variable error-complexity component models is to allow multi-resolution or variable granularity models. Simple examples of this include using adaptive meshes in finite element approximations of continuum phenomena, or multi-resolution wavelet representations for geometric objects in computer graphics.
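For the wavelet case, a minimal sketch (ours; a single Haar decomposition level, the simplest instance of such representations) shows the basic idea of trading resolution for complexity:

```python
import numpy as np

# One level of a Haar wavelet decomposition: "approx" is a half-resolution
# version of the signal, "detail" holds what is needed to reconstruct it
# exactly. Iterating on "approx" yields a multi-resolution representation
# in which detail is kept only where the error-complexity tradeoff warrants.
def haar_level(signal):
    s = np.asarray(signal, dtype=float).reshape(-1, 2)
    approx = (s[:, 0] + s[:, 1]) / np.sqrt(2.0)
    detail = (s[:, 0] - s[:, 1]) / np.sqrt(2.0)
    return approx, detail

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
a, d = haar_level(x)
print(a)  # coarse version: 4 numbers instead of 8
print(d)  # details: zero where the signal is locally constant
```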
From page 137...
... We will argue that uncertainty management together with dynamics and interconnection are the key to understanding both these successes and failures and the future challenges.

Boeing 777

Boeing invested more than $1 billion (and insiders say much more)
From page 138...
... What are the next steps in CAD for projects like the 777? Broadly speaking, they involve much higher levels of integration of the design process, both laterally across the various subsystems, and longitudinally from preliminary design, through testing, manufacturing, and maintenance.
From page 139...
... n(n-1)/2 ≈ 4.5 x 10^12 possible intersections to be checked. Although this grows only quadratically with the number of parts (not the exponential growth we are so concerned with elsewhere)
From page 140...
... intersection checks to computing one pairwise component check and a few bounding boxes. The bounding boxes have simple geometries, so they are more easily checked than the components, but they need to be constructed.
From page 141...
... Note that if we introduce uncertainty in our description of the components, it can drastically increase the computation required to do pairwise checking and make the bounding box approach even more attractive. Actually, while the basics of solid modeling are well developed, there is no standard approach to uncertainty modeling even here, and many questions remain open.
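A minimal sketch of the idea (ours, in 3-D with axis-aligned boxes; the report's bounding boxes need not be axis-aligned): boxes act as a cheap pre-filter, and geometric uncertainty can be absorbed simply by inflating each box by a margin:

```python
# Axis-aligned bounding boxes (AABBs) as a cheap pre-filter: only pairs
# whose boxes overlap need the expensive exact intersection test, pruning
# most of the n*(n-1)/2 candidate pairs. Uncertainty is absorbed by
# inflating each box, avoiding costly exact checks on uncertain shapes.
def boxes_overlap(a, b, margin=0.0):
    # a, b: ((xmin, ymin, zmin), (xmax, ymax, zmax))
    return all(a[0][i] - margin <= b[1][i] and b[0][i] - margin <= a[1][i]
               for i in range(3))

part1 = ((0.0, 0.0, 0.0), (1.0, 1.0, 1.0))
part2 = ((1.05, 0.0, 0.0), (2.0, 1.0, 1.0))
print(boxes_overlap(part1, part2))              # False: exact check skipped
print(boxes_overlap(part1, part2, margin=0.1))  # True: uncertain parts may touch
```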
From page 142...
... FIGURE B.10b Cost versus number of points at which aerocoefficients are obtained.
... chance to discuss computational fluid dynamics (CFD)
From page 143...
... One important point to note is that despite earlier euphoric visions of the role of CFD in aircraft design that suggested it would almost entirely replace wind tunnels, only a tiny fraction of the millions of aerodynamic simulations generated for a modern aircraft design are done using CFD. The remainder continue to be done with physical models in wind tunnels, and this is not expected to change in the foreseeable future.
From page 144...
... Such numerical airflow simulations are the subject of computational fluid dynamics (CFD)
From page 145...
... Uncertainty Management in Commercial Aircraft Design

What is particularly interesting for this appendix is the fact that, according to Paul Rubbert, the chief aerodynamicist for Boeing, uncertainty management has become the dominant theme in practical applications like aircraft design. He claims that uncertainty management is replacing the old concept of "CFD validation." He argues that both CFD and wind tunnels are "notorious liars," yet modern aerodynamic designs are increasingly sensitive to small changes and error sources.
From page 146...
... We must be realistic and cautious about the way in which VE technology should interact with this process.

VLSI Design

Design Today

Our third case history involves very large scale integration (VLSI)
From page 147...
... Placement and routing also tend to be the most complex and time-consuming of the various VLSI design phases, since these problems are computationally intractable. Extensive verification and testing occur at each level, with subsequent iteration and modification.
From page 148...
... Unfortunately, as discussed below, by following the software engineering paradigm, the VLSI design process also inherits the problems inherent in that area. The key feature of VLSI design that has traditionally greatly simplified uncertainty management is that the logic level could be effectively decoupled from the physical level.
From page 149...
... The confluence of these trends suggests that fundamental physics considerations will become a dominant component of VLSI design, the distinction between digital and analog circuits will become increasingly blurred (certainly, analog considerations will migrate up well into the higher levels of the VLSI design process), and uncertainty management will become a much more difficult problem.
From page 150...
... The hope is that future VE software environments, including VR features, will relieve design engineers from the tedious, repetitive, and routine tasks that still dominate much of engineering design and let them focus on the critical decisions involving uncertainty management in the face of cost constraints. The fear is that it will also give highly unqualified people the illusion that they can click a few menu items, run some multidisciplinary optimization, and design a new airplane or automobile that will actually work in the real world.
From page 151...
... Some computer programs now contain over 15 million lines of code, and the programs for NASA's proposed space station will have more than 80 million lines of code. Despite modern programming tools such as interactive debuggers and visual programming environments, the average productivity of professional software engineers working on large systems is only a couple of dozen lines of code per day.
From page 152...
... Most code is by definition written by average programmers, and programming will always be a very labor-intensive activity; it is therefore crucial to select software team leaders and chief architects carefully. In summary, the difficulties we encounter in software engineering are not unique to computer programs, but rather are fundamental to many areas of science and engineering that are concerned with building large, complicated systems (and indeed result from the limits of our current technology and our own inherently limited capabilities)
From page 153...
... Thus, despite the various difficulties, software engineering has made great strides and contributions over the years as well.

Computational Complexity Theory

Since the beginning of the 20th century, a number of practical global optimization problems have been extensively studied.
From page 154...
... Implications for Virtual Engineering

Both the history of software engineering and the theory of computational complexity have important implications for VE.
From page 155...
... FAMOUS FAILURES OF COMPLEX ENGINEERING SYSTEMS

In this section we will briefly review case studies of famous failures of engineering systems: the Titanic, the Estonia ferry sinking, the Tacoma Narrows Bridge collapse, subsynchronous resonance in power systems, telephone and power system outages, the Denver airport baggage handling system, and Ariane 5. While each of these failures was due partly or primarily to factors beyond engineering or technical considerations, we will concentrate on the technical issues.
From page 156...
... However, a weak door lock was one of the main reasons for the 1994 Estonia ferry disaster that caused the deaths of more than 800 people. The ferry's bow visor, a huge top-hinged door at the front of the ferry that swung up to allow vehicles to be driven into and out of the ferry's car deck, was secured by several locks.
From page 157...
... This is a classic example of uncertainty management gone awry. The capacitors were introduced to improve the stability on the electrical side and reduce the potential vulnerability to electrical side disturbances, but they had the unanticipated effect of destabilizing the mechanical side.
From page 158...
... Denver Airport Baggage Handling System

The automated system was supposed to improve baggage handling by using a computer tracking system to direct baggage contained in unmanned carts that run on a track. Originally scheduled for completion in March 1994, the unfinished $234 million project helped postpone the opening of the airport until February 1995.
From page 159...
... Yet we see an accelerating trend to build increasingly complex systems because uncertainty management demands that we introduce complexity in our models. Let us illustrate this now with two simple and familiar current examples: smart weapons and airbags.
From page 160...
... All these solutions again highlight that the design is driven by uncertainty management, and complexity is introduced as a by-product. What these two examples illustrate is a kind of conservation principle that is at work in complex systems.
From page 161...
... Furthermore, scaling of problem size can make the interaction of these issues overwhelming. As we will see, control theory addresses uncertainty management explicitly, but from a very narrow perspective.
From page 162...
... The situation is changing dramatically, and the trend is toward more integration of system design and control design, but we need to accelerate this trend, and control theorists must expand their vision and make greater contact with other disciplines. Although control theory by itself offers only a piece of a potential foundation for a theory of VE, it provides a very important complement to dynamical systems and computer science because uncertainty management is the central issue in automatic control systems.
From page 163...
... This view of control is no longer relevant even to today's design environment, where the systemwide control engineer's view of performance is needed at the earliest design stages. As cost-effective uncertainty management correctly takes its place as the dominant design issue, control engineers are forced to play a broader role, and control theory must catch up just to address the current needs, let alone the expanded needs of future VE.
From page 164...
... While mathematical sophistication is a strength of control theorists, they must overcome the natural distance this tends to create with other engineering disciplines. This is one reason why control theory has been applied to dynamical systems and computational complexity with some early successes, but has achieved less success in other areas.
From page 165...
... So far, this is the only known method that successfully handles both parametric uncertainty and unmodeled dynamics and overcomes to some extent the intractability of these problems. While the generalized bounding box methods (they are not called this in robust control theory, but are referred to with a variety of other, more technical terms)
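As a toy analogue of such bounding methods (our illustration; the actual robust-control machinery, such as structured singular value analysis, is far more sophisticated), interval arithmetic encloses the range of a function over a whole parameter box at once, at the price of conservatism:

```python
# Interval arithmetic: a crude "bounding box" over parametric uncertainty.
# We enclose the range of g(p) = p^2 - 3p for p in [1, 2] in one shot;
# the enclosure is guaranteed to contain the true range, but conservative.
def i_mul(a, b):
    prods = (a[0]*b[0], a[0]*b[1], a[1]*b[0], a[1]*b[1])
    return (min(prods), max(prods))

def i_sub(a, b):
    return (a[0] - b[1], a[1] - b[0])

p = (1.0, 2.0)
g = i_sub(i_mul(p, p), i_mul((3.0, 3.0), p))
print(g)  # (-5.0, 1.0): guaranteed bounds; the true range is [-2.25, -2.0]
```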
From page 166...
... The telephone and power system failures and the Denver airport baggage handling system fiasco are more clearly examples where uncertainty management in complex systems went awry. These highly interconnected and automated systems are intended to improve performance and robustness and at the same time reduce cost, and they generally do so with respect to the uncertainties and objectives that are considered primary in these systems.
From page 167...
... We touched briefly and informally on all these topics except statistics. CASE here means computer-aided software engineering, and complexity theory is computational complexity in theoretical computer science.
From page 168...
... It is here that fuzzy logic and soft computing hold the greatest promise. Advocates argue, though, that fuzzy logic is ideally suited to handle uncertainty of type 2 as well.
From page 169...
... Based on our experience, which tends to be in hard areas of engineering rather than, say, softer problems of military combat modeling, we remain skeptical. Despite the strong market demands for commercial software to assist in global search in such problems as VLSI design and analysis of uncertain dynamical systems, genetic algorithms have had almost no impact relative to more mathematical approaches such as branch and bound, and problem-specific heuristics.
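For readers unfamiliar with branch and bound, here is a minimal sketch (ours; the one-dimensional Lipschitz variant is an illustrative choice) of the idea for global minimization: subdivide, bound each piece from below, and prune pieces that cannot beat the best value found so far:

```python
import heapq, math

def branch_and_bound(f, lo, hi, L, tol=1e-4):
    # Global minimization of f on [lo, hi], assuming |f'| <= L. Each interval
    # gets the lower bound f(mid) - L*width/2; intervals whose bound cannot
    # improve on the incumbent "best" by more than tol are pruned.
    best = f((lo + hi) / 2)
    heap = [(best - L * (hi - lo) / 2, lo, hi)]
    while heap:
        bound, a, b = heapq.heappop(heap)
        if bound > best - tol:
            continue                      # prune: cannot beat the incumbent
        m = (a + b) / 2
        for c, d in ((a, m), (m, b)):     # branch into two subintervals
            fm = f((c + d) / 2)
            best = min(best, fm)
            heapq.heappush(heap, (fm - L * (d - c) / 2, c, d))
    return best

print(branch_and_bound(lambda x: math.sin(3 * x) + 0.5 * x, 0.0, 4.0, L=3.5))
```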
From page 170...
... Hard complexity equals information theory, algorithmic complexity, computational complexity, dynamical systems, control theory, CASE/CAD, nonequilibrium physics, statistics, numerical analysis, and so on. This appendix has clearly advocated the relative importance of "hard" over "soft" complexity in VE.

