Suggested Citation:"2 Case Studies and Lessons Learned." National Research Council. 2008. Integrated Computational Materials Engineering: A Transformational Discipline for Improved Competitiveness and National Security. Washington, DC: The National Academies Press. doi: 10.17226/12199.


2 Case Studies and Lessons Learned

In the 1990s, mechanical engineers began to build and apply integrated computational systems to analyze and design complex engineered systems such as aircraft structures, turbine engines, and automobiles. By integrating structural, fluid, and thermal analysis codes for components, subsystems, and full assemblies, these engineers performed sensitivity analyses, design trade-off studies, and, ultimately, multidisciplinary optimization. These developments provided substantial cost savings and encouraged the continued development of more advanced and powerful integrated computational systems and tools. Using these systems and tools, aircraft engine design engineers have reduced the engine development cycle from approximately 6 years to less than 2 years while reducing the number of costly engine and subcomponent tests.1 Integrated product development (IPD) and its primary computational framework, multidisciplinary optimization (MDO), form the core of this systems engineering process (Boxes 2-1 and 2-2).

IPD and MDO have revolutionized U.S. industry, but materials have not been part of this computerized optimization process. While the constraints of diverse materials systems strongly influence product design, they are considered only outside the multidisciplinary design loop; an example of this is shown for hypersonic vehicles in Box 2-2. In a large, multidisciplinary engineering environment, an integrated product development team (IPDT)—that is, the group of stakeholders

1. Michael Winter, P&W, "Infrastructure, processes, implementation and utilization of computational tools in the design process," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.

BOX 2-1
IPD and MDO

A key cultural change introduced into many U.S. industrial sectors toward the end of the twentieth century was the integrated product development (IPD) process, which dramatically improved the execution and efficiency of the product development cycle. IPD is also known as "simultaneous engineering," "concurrent engineering," or "collaborative product development." Its central component is the integrated product development team (IPDT), a group of stakeholders who are given ownership of and responsibility for the product being developed. In an effective IPDT, all members share a definition of success and contribute to that success in different ways. For example, systems engineers are responsible for the big picture. They initially identify development parameters such as specifications, schedule, and resources. During the design process, they ensure integration between tools, between system components, and between design groups. They are also responsible for propagating data throughout the team. Design engineers have responsibilities specific to their capabilities and disciplines. Typically, design engineers determine the scope and approach of analysis, testing, and modeling. They define the computational tools and experiments required to support the development of a design and its validation. Manufacturing engineers ensure that the components can be made with the selected manufacturing process, often defining the computational simulation tools that are required. Materials engineers provide insights into the capabilities and limitations of the selected materials and support development of the manufacturing process. IPDTs can range in scale from small and focused to multilevel and complex. Regardless of size, however, the defining characteristic of an IPDT is interdependence.
The key to a successful IPDT is that the team members, and the tools they use, do not work in isolation but are integrated throughout the design process. This approach may entail communicating outside the original company, country, or discipline. Owing to the demonstrated success of the IPD process, many engineering organizations, particularly at large companies, have invested considerable human and capital resources to establish a work-flow plan for their engineering practices and product development cycles as executed by the IPDT.

The capability and dynamics of the IPD process are illustrated by the execution of a computationally based multidisciplinary design optimization (MDO). Modern engineering is a process of managing complexity, and MDO is an important computational tool that helps the systems analysts to do that. For example, a modern gas turbine engine has 80,000 separate parts and 5,000 separate part numbers,1 including 200 major components requiring three-dimensional computer-aided engineering (CAE) analysis with structural finite element and computational fluid dynamics codes. This CAE analysis can easily require over 400 person-years of analytical design and computer-aided design (CAD) support. The only rational way to accomplish this engineering feat organizationally is by means of an IPDT. Owing to the development and validation of computational engineering analysis tools such as finite-element analysis, computer-based MDO has become routine for many systems or subsystems to improve efficiency and arrive at an optimized design or process. Computer-based MDO automates work flow, automates model building and execution, and automates design exploration. A block diagram of the relevant analytical tools utilized by MDO is shown in Figure 2-1-1. Figure 2-1-2 shows the "electronic enterprise" required to support automated MDO. This includes libraries of validated and certified analysis tools; an integration framework; and electronic process or work-flow maps and IPD tools for work-flow management, collaborative engineering, and secure business-to-business information sharing. MDO has allowed the IPDT to focus on product design decisions based on the results of MDO rather than on generating data. This has greatly improved the robustness of final product designs and dramatically reduced the time to reach design solutions.

1. Michael Winter, P&W, "Infrastructure, processes, implementation and utilization of computational tools in the design process," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.

FIGURE 2-1-1 The computational tools required for successful MDO. SOURCE: Michael Winter, P&W, "Infrastructure, processes, implementation and utilization of computational tools in the design process," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.

who are given ownership of and responsibility for the product under development—operates with materials as a static, limiting constraint on the overall IPD rather than as an optimizable parameter. Typically the list of materials from which a choice is to be made is either taken as a fixed constraint or consists of a small subset of materials that are evaluated statically outside the optimization loop. This approach narrows the design space, resulting in suboptimal vehicle performance in an application with low margins for error. Conversely, the development of the

FIGURE 2-1-2 A typical electronic enterprise required to support computationally based IPD. NOTE: Individual codes and tools are depicted as labeled boxes (for example, A is shown as Unigraphics, C is Ansys, F is Fluent, and so on). B2B, business to business. The flow through this figure shows the individual steps involved in establishing the computational process. Adapted from Michael Winter, P&W, "Infrastructure, processes, implementation and utilization of computational tools in the design process," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.

right class of advanced materials, which is an inherently expensive process, could be better justified and motivated by integration of materials into the MDO computational process.

During the late 1990s, materials engineers who worked on IPDTs witnessed these achievements and began to consider the need for similar computational methods for analysis and development of materials. The need to shrink the growing discrepancy between the system design cycle time and the typical time to develop

BOX 2-2
An Example of MDO

The MDO approach to design has become a powerful vehicle for broad exploration of design space at relatively low cost. An example is the development of hypersonic vehicles for space access and defense applications, where there is a complex interdependence of vehicle structure and aerodynamic and thermal loads; static and dynamic structural deflections; propulsion system performance and operability; and vehicle control. Since many conditions of hypersonic flight cannot reasonably be replicated in any current or foreseeable wind tunnel, designs are not fully validated until actual flights are conducted. Thus the integration of validated analytical design tools with automated data transfer between disciplines in an MDO platform is essential for arriving at realistic but innovative and high-performing solutions. A tool integration scheme used by Boeing Phantom Works for the design of hypersonic vehicles is shown in Figure 2-2-1. This MDO scheme could, for example, be used to design an air-breathing, reusable, hypersonic flight vehicle, where strong interactions between aerodynamics, propulsion, aerothermal loads, structures, and control have a substantial impact on the optimal shape of the entire vehicle. With an MDO platform, thousands of potential vehicle shapes can be explored within the time frame of days (see Figure 2-2-2). Materials tools are notably absent from the MDO tool set (Figure 2-2-1).

FIGURE 2-2-1 Boeing's integration of analytical tools for design of hypersonic vehicles. SOURCE: K.G. Bowcutt, "A perspective on the future of aerospace vehicle design," American Institute of Aeronautics and Astronautics Paper 2003-6957, December 2003.

FIGURE 2-2-2 Side and top views of vehicle shape explored in an MDO analysis of hypersonic vehicle design space. This evaluation allows vehicle performance at hypersonic speeds to be explored. A materials analysis is not part of this process. SOURCE: Kevin Bowcutt, Boeing.
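The kind of automated design-space sweep described in Box 2-2 can be sketched in miniature. The toy script below is not from the report; both surrogate models and every constant in them are invented for illustration. It couples a stand-in aerodynamic model with a stand-in structural model and scores a few thousand candidate shapes, discarding infeasible ones:

```python
# Toy surrogate models: invented formulas standing in for real
# aerodynamic and structural analysis codes.
def lift_to_drag(slenderness, nose_angle_deg):
    # Hypersonic L/D improves with slenderness, degrades with blunt noses.
    return 4.0 * slenderness / (1.0 + 0.15 * nose_angle_deg)

def structural_mass(slenderness, nose_angle_deg):
    # Long, thin bodies need more structure to resist bending loads.
    return 1000.0 + 250.0 * slenderness ** 1.5 - 5.0 * nose_angle_deg

best = None
evaluated = 0
for slenderness in [x / 10 for x in range(10, 81)]:   # 1.0 .. 8.0
    for nose in range(2, 31):                         # 2 .. 30 degrees
        evaluated += 1
        ld = lift_to_drag(slenderness, nose)
        mass = structural_mass(slenderness, nose)
        if mass > 4000.0:                 # structural feasibility constraint
            continue
        score = ld / (mass / 1000.0)      # crude figure of merit
        if best is None or score > best[0]:
            best = (score, slenderness, nose)

print(f"evaluated {evaluated} shapes; best score {best[0]:.2f} "
      f"at slenderness {best[1]}, nose angle {best[2]} deg")
```

A production MDO framework replaces these one-line surrogates with full CFD and FEA codes driven through an integration framework, but the loop structure is the same: generate a candidate, run each discipline, check constraints, keep the best.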

a new material (8-20 years) provided further motivation. Materials engineers recognized that by building such tools and methods for materials, a process now termed integrated computational materials engineering (ICME), materials could be incorporated into the overall engineering system, thereby allowing full systems optimization.

Within the last decade, industrial organizations, using their own and government funds, have begun to explore the application of ICME to solve industrial problems, to estimate the payoff for ICME implementation, and to identify development needs. For example, the Defense Advanced Research Projects Agency (DARPA) launched the Accelerated Insertion of Materials (AIM) initiative in 2001 to apply ICME to jet engine turbine disks and composite airframe structures. The goal of AIM was to establish an ICME tool set, integrate materials analysis with design engineering, and demonstrate the benefit of ICME in terms of reduced materials development time and cost. In that same time frame, similar efforts were launched in other industrial sectors to improve product performance and provide information that would be impossible or extremely costly to obtain using experimental methods.

In general, case studies reviewed by the committee demonstrate that while ICME is still an emerging discipline it can nevertheless provide significant benefits, such as lower costs and shorter cycle times for component design and process development, lower manufacturing costs, and improved prognosis of subsystem component life. In some cases, these benefits would have been impossible to achieve with any other approach, regardless of cost.

In this chapter, several case studies involving ICME are presented along with a discussion of how other scientific disciplines have successfully undertaken large-scale integrated efforts. Case studies where the return on investment (ROI) could be documented are reviewed in more detail. The chapter ends with some lessons learned from these programs.

Case Studies—Current Status and Benefits of ICME

In this section, the current status of ICME is assessed by relating some case studies that illustrate the ICME processes. The vast majority of design engineers see material properties as fixed inputs to their design, not as levers that may be adjusted to help them meet their design criteria. Until materials, component design,

John Allison, Mei Li, C. Wolverton, and XuMing Su, "Virtual aluminum castings: An industrial application of ICME," JOM (Journal of The Minerals, Metals & Materials Society) 58(11): 28-35.

The committee notes that none of the information provided to the committee was sufficient for the committee to make an independent assessment of the ROI. This is not surprising, because providing such information would involve releasing proprietary or private/classified information. The committee therefore reports the ROI information provided to it without any independent assessment.

and the manufacturing process become fully integrated, designers will not be able to optimize product properties through materials processing. Using an ICME approach, however, this optimization can be accomplished in a virtual environment long before the components are fabricated. The case studies discussed here attempt to show why that is so. For clarity, the case studies are divided into examples that use ICME to (1) integrate materials, component design, and manufacturing processes; (2) predict component life; and (3) develop manufacturing processes. While each ICME case study achieved different levels of integration and implementation, all of them resulted in substantial benefits. The case studies chosen for inclusion here generally had some level of detailed information on the ROI and other benefits of the ICME activity; this was a self-reported assessment, however, and validating the ROIs was beyond the scope of the committee. The majority of these case studies involved metallic systems. The committee identified similar examples in nonmetallic systems such as polymers and semiconductors, but they either lacked any detailed information on ROI or did not approach full ICME integration. For example, in the area of integrated circuits, models for dielectric constants, electromigration, dislocation generation, and barrier layer adhesion have not yet been integrated with CAD circuit design models or "equipment" models for lithography or processing. The committee notes, therefore, that while the examples shown in this report all involve metallic systems, it believes that ICME will be applicable and will demonstrate significant benefits for integrating materials, component design, and manufacturing processes across all materials regimes, including nanomaterials, biological materials, polymeric materials, ceramics, functional materials, and composites. However, the challenges can be formidable.
For instance, the modeling of polymeric materials is extraordinarily complex at all length scales, and owing to the long chains of the polymer backbones, the gap between feasible atomistic simulations and solid polymer properties is still substantial. Another challenge for industrial polymers is the plethora of suppliers, who offer many proprietary variants of these materials. However, it is this kind of complexity that should motivate an ICME approach.

Sadasivan Shankar, Intel, "Computational materials for nanoelectronics," Presentation to the committee on May 29, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.

Dureseti Chidambarrao, IBM, "Computational materials engineering in the semiconductor industry," Presentation to the committee on May 29, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.
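The distinction drawn earlier in this section, between material properties treated as fixed design inputs and materials treated as levers inside the optimization loop, can be made concrete with a toy sizing problem. Every number below (densities, moduli, the stiffness target, and the scaling law) is invented for illustration:

```python
# Toy sizing problem: choose a panel thickness t to carry a fixed load at
# minimum weight, where bending stiffness scales as modulus * t**3.
materials = {
    # name: (density, modulus) -- illustrative values in arbitrary units
    "steel":    (7.8, 200.0),
    "aluminum": (2.7,  70.0),
    "titanium": (4.5, 110.0),
}
REQUIRED_STIFFNESS = 500.0  # arbitrary units

def min_weight(density, modulus):
    # Smallest t satisfying modulus * t**3 >= REQUIRED_STIFFNESS,
    # then weight is proportional to density * t.
    t = (REQUIRED_STIFFNESS / modulus) ** (1.0 / 3.0)
    return density * t

# "Material as fixed input": the designer sizes with the default material.
fixed = min_weight(*materials["steel"])

# "Material inside the loop": material choice becomes a design variable.
best_name = min(materials, key=lambda m: min_weight(*materials[m]))
best = min_weight(*materials[best_name])

print(f"fixed material (steel): weight {fixed:.2f}")
print(f"optimized choice ({best_name}): weight {best:.2f}")
```

Even in this caricature, letting the material enter the loop finds a lighter feasible design than sizing around a default choice; in a real IPD setting the same widening of the design space is what integration of materials models into MDO would buy.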

Integrating Materials, Component Design, and Manufacturing Processes in the Automotive Sector

The virtual aluminum castings (VAC) methodology developed by the Ford Motor Company offers one detailed case study for integrating materials, component design, and manufacturing. The methodology was based on a holistic approach to aluminum casting component design; it modified the traditional design process to allow the variation in material properties attributable to the manufacturing process to flow into the mechanical design assessment. Fully funded by Ford to address specific power-train components, the VAC methodology was implemented by the company for cast aluminum power-train component design, manufacturing, and CAE. As discussed below, VAC has resulted in millions of dollars in direct savings and cost avoidance. For VAC to be successful, it required:

• Models of the structure evolution and resulting physical and mechanical properties of aluminum alloy systems, using the classic processing–structure–property flowchart depicted in Figure 2-1,
• The ability to link the various models together while maintaining computational efficiency and simplicity,
• Modifications to the traditional design process to allow for spatial variation of material properties across a component to be considered, and
• Extensive validation of the predictive models.

Quantitative processing–structure–property relationships were defined using a combination of science-based models and empirical relationships and linked to quantitative phase diagram calculations. As shown schematically in Figure 2-1, the influence of all the manufacturing processes (casting, solution treatment, and aging) on a wide variety of critical microstructural features was captured in computational models.
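The processing–structure–property linkage that VAC captures can be caricatured as a chain of composed models, each feeding the next. The sketch below uses a standard textbook scaling for secondary dendrite arm spacing versus cooling rate and an invented aging-response term; every constant is illustrative, not Ford's:

```python
# A caricature of the processing -> structure -> property chain of Figure 2-1.
# Functional forms are generic textbook scalings; all constants are invented.

def sdas_um(cooling_rate_K_per_s):
    # Secondary dendrite arm spacing coarsens as cooling slows (~rate^-1/3).
    return 40.0 * cooling_rate_K_per_s ** (-1.0 / 3.0)

def yield_strength_mpa(sdas, aging_hours):
    # Finer structure strengthens; aging strengthens up to a peak-aged point.
    aging_term = 60.0 * aging_hours / (1.0 + 0.1 * aging_hours ** 2)
    return 120.0 + 400.0 / sdas + aging_term

# Chain the models for one location: cools at 5 K/s, aged for 4 h.
structure = sdas_um(5.0)
prop = yield_strength_mpa(structure, 4.0)
print(f"SDAS {structure:.1f} um -> yield strength {prop:.0f} MPa")
```

In VAC the corresponding links are full simulation codes (casting solvers, precipitation models, and so on), but the data flow is the same: process variables in, microstructural quantities out, then properties out, evaluated point by point across the component.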
These microstructural models were then used to predict the key mechanical and physical properties (fatigue, strength, and thermal growth) required for cast aluminum power-train components. Microstructural modeling was required at many different length scales to capture the critical features required to accurately predict properties. The outputs of this processing–structure–property information are predictions of manufacturing-history-sensitive properties. These predicted properties and their spatial distributions were mapped into the component and subsystem finite-element analysis (FEA) of operating engines by which cast aluminum component durability (performance) is predicted.

John Allison, Mei Li, C. Wolverton, and XuMing Su, "Virtual aluminum castings: An industrial application of ICME," JOM (Journal of The Minerals, Metals & Materials Society) 58(11): 28-35.

Thermal growth, a common term used for aluminum alloys, represents the dimensional changes brought about by precipitation of phases with different volumes. It is used as a critical input in advanced component durability procedures.

FIGURE 2-1 The processing–structure–property flowchart for cast aluminum alloys for engine components. The chart shows the wide variety of individual models required for ICME. The properties listed are required as the critical inputs to durability (performance) analysis of cast aluminum components.

Both commercial software and Ford-developed codes are used in the VAC system. Commercially available casting-simulation software (ProCAST, MagmaSoft), thermodynamic and kinetic modeling software (Pandat, Thermo-Calc, Dictra), first-principles code (VASP), and FEA software for stress analysis (ABAQUS) are used, integrating the methodology with the standard codes used in the manufacturing, materials science, and design environments. However, many additional codes were developed during the course of the program for specific tasks such as the interfacial heat-transfer coefficient (IHTC) optimization (OptCast), microstructure evolution (MicroMod, NanoPPT, MicroPore), physical or mechanical property evolution (LocalYS, LocalTG, LocalFS), and the resultant stresses and component durability (QuenchStress, HotStress, Hotlife). These codes, developed internally by Ford Research and Advanced Engineering, involved coordinating the fundamental research efforts of five universities across the United States and the United Kingdom. Substantial effort was applied to developing efficient links between the output of the casting modeling and structure and property prediction tools to feed seamlessly into the FEA codes and to facilitate reorganization of the design process.

Figure 2-2 shows the process flow for predicting the influence of the casting and heat treatment process on the spatial distribution of yield strength in a typical cast aluminum engine block. The finite element or finite difference mesh is the basis for defining the geometry of the part. This geometric mesh is then used with commercial casting software to predict the flow of molten metal during casting and the influence of the casting process and the geometry on the local thermal history. The spatial distributions of the key microstructural features are predicted based on this local thermal history. Finally, the spatial distributions of strength are predicted based on the distribution of these key local microstructural features. Normally, a manufacturing modeling analysis is initiated only after final design and durability prediction are completed. In the new ICME process, however, the manufacturing analysis is done prior to the durability prediction to (1) provide a more realistic durability prediction and (2) allow for full component design improvements by adjusting both the manufacturing process variables and the component dimensions. Although limited manual optimization of the manufacturing process and product geometry is feasible, computational limitations preclude full multiattribute optimization. Finally, to ensure the acceptance of not only the new tools but also the new design process, extensive validation of the approach was essential. Figure 2-3 shows an example of a typical validation of the LocalYS predictions.

FIGURE 2-2 The ICME process flow for Ford's VAC tool for local property prediction of yield strength (LocalYS). The flow starts with a geometric (CAD) representation of the component. This CAD is then used as input to the simulations of filling and solidification (thermal). The outputs of these simulations are used to predict microstructural quantities. Finally, these microstructural quantities are used to predict the manufacturing-history-sensitive spatial distributions in yield strength. Specific commercial codes and Ford-developed programs and subroutines that have been integrated into the process are identified at each step: initial geometry (CAD geometry, boundary conditions, and mesh); filling (OPTCAST, for an accurate filling profile); thermal analysis (ProCAST, OPTCAST, with fraction-solid curves from Thermo-Calc); microstructure, including Al2Cu (MicroMod and Pandat micromodel, Dictra solution treatment, NanoPPT and Pandat aging model); and yield strength (LocalYS). SOURCE: John Allison, Ford Motor Company.

FIGURE 2-3 Validation of ICME predictions is essential to gain acceptance by the engineered product development community. This figure represents a typical experimental validation of VAC local yield strength predictions for a wide range of components, heat treatments, and casting conditions. The data points represent matched pairs of model predictions and experimental validation. The dotted and dashed lines represent statistical bounds. SOURCE: John Allison, Mei Li, C. Wolverton, and XuMing Su, "Virtual aluminum castings: An industrial application of ICME," JOM (Journal of The Minerals, Metals & Materials Society) 58(11): 28-35.

tool in VAC. Yield strength was measured in a wide variety of locations in castings manufactured under a multitude of processing conditions in a number of different engine components. The necessary degree of accuracy was determined in consultation with product design engineers to build confidence in these predictions. Note the greater model fidelity required in regions of typical production.
In the development of new engines, the cast aluminum engine block and cylinder head are critical components. Before VAC was used, these components were generally developed via iterative engine testing, rework, and retesting. Traditional CAE durability analysis provided a starting point, but because it did not account for the influence of manufacturing on properties and residual stresses, this analysis was approximate and these critical components often failed during new engine development. This resulted in costly redesign, retooling, and program delays. The benefits of VAC include a 15-25 percent reduction in new cylinder head or block development time, a reduction in the number of component tests required for assurance testing, and a shorter cycle time for the casting or heat treatment process, giving Ford a cumulative saving of millions of dollars. The VAC system continues to be improved, with future modifications extending it to the high-pressure die-casting process as well as to magnesium alloys, further increasing the potential savings. The VAC program involved approximately 25 people, according to Ford, and $15 million in expenditures over 5 years. Well over 50 percent of this effort was for experimental work in either model development or validation.
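The model chain at the heart of VAC (local thermal history, then microstructure, then local yield strength) can be sketched in a few lines. The correlations below, a power-law relation between cooling rate and secondary dendrite arm spacing and a Hall-Petch-like strength relation, are generic textbook forms with invented constants chosen only to show how such models compose; they are not Ford's LocalYS equations.

```python
# Illustrative sketch of a VAC-style model chain: thermal history ->
# microstructure -> local yield strength. Model forms and constants are
# generic correlations with invented numbers, not Ford's actual LocalYS.

def sdas_from_cooling_rate(rate_k_per_s, a=40.0, n=1.0 / 3.0):
    """Secondary dendrite arm spacing (um) via a power-law correlation."""
    return a * rate_k_per_s ** (-n)

def yield_strength_from_sdas(sdas_um, sigma0=80.0, k=400.0):
    """Hall-Petch-like local yield strength (MPa) from arm spacing (um)."""
    return sigma0 + k / sdas_um ** 0.5

def predict_local_ys(cooling_rates):
    """Chain the two models over a field of nodal cooling rates (K/s)."""
    return [yield_strength_from_sdas(sdas_from_cooling_rate(r))
            for r in cooling_rates]

# Slowly cooled (thick) regions come out coarser and weaker than
# rapidly cooled (thin-wall) regions.
field = predict_local_ys([0.5, 5.0, 50.0])
```

In the real tool set each stand-in function is replaced by a validated, physics-based code, and the chain is evaluated at every node of the casting mesh so that the FEA durability analysis sees a spatially varying strength field.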
The VAC technology is projected to have saved Ford up to $100 million in cost avoidance by resolving product issues and between $2 million and $5 million per year in reduced testing requirements and reduced development time, providing an estimated combined ROI of well over 7:1.8,9,10 These achievements would not have been possible without both the upfront investment in ICME tool development and the acceptance of a change to the manufacturing culture associated with modifying the design analysis to allow the results of manufacturing simulations of the component to be used as inputs to the component life predictions.
VAC was initially developed for a single alloy system. It served as a platform for the ICME development of additional cast aluminum alloys at a substantially lower cost. For example, a recently developed VAC tool set for a new alloy for aluminum cylinder heads for diesel engines was completed in less than half the time and at a cost of between 10 and 20 percent of that required for development of the initial VAC tool set. Ford has extended the ICME approach to coupling analysis of sheet steel stamping with predicting

8 John Allison, Mei Li, C. Wolverton, and XuMing Su, "Virtual aluminum castings: An industrial application of ICME," JOM (Journal of The Minerals, Metals and Materials Society) 58(11): 28-35.
9 Private communication, John Allison, July 2007.
10 NRC, Retooling Manufacturing: Bridging Design, Materials, and Production, Washington, D.C.: The National Academies Press (2004).

crash performance for vehicle bodies. It is also developing advanced approaches for coupling the casting analysis of magnesium and aluminum body components with crash performance predictions.

Integrating Materials, Component Design, and Manufacturing Processes in the Aerospace Sector

The DARPA AIM program is another example of an effort to integrate material properties in the design and manufacturing optimization process. As depicted in Figure 2-4, when the program was initiated in 2001, the materials discipline was not yet integrated into the turbine engine design stream. Within 1 year, material behavior models were integrated sufficiently to enable the execution of statistically designed matrices and generation of response surfaces. Within 2 years, the material behavior models were fully integrated, enabling execution of the several case studies (described below). At that point the fundamental infrastructure was in place,

FIGURE 2-4 Integration of the materials discipline into the aeroengine disk design process at P&W during the DARPA AIM program. Over the 4 years, the materials discipline was at first absent altogether from the automated design process. It then participated as statistically derived response surfaces and advanced to integration as physically based models, eventually becoming integrated into complex mechanical systems with multiple material performance characteristics.

facilitating the addition of incremental capabilities (complex systems, additional material properties). Once the significant investment in the underlying ICME infrastructure had been made, the incremental investment required to extend the capability and applicability was significantly reduced.
In its AIM program, Pratt & Whitney (P&W) used this approach to optimize the design and manufacturing of jet engine turbine rotors. The design figures of merit were rotor weight and burst speed. Decreasing rotor weight provides both direct cost savings and improved system performance. Increasing rotor burst speed improves system performance. The maximum benefit in both figures of merit was achieved with the simultaneous optimization of the disk design (geometry) and the manufacturing process (material properties), requiring the integration of commercially available engineering analysis codes and material behavior models developed at universities. An initial constraint in this program was the lack of appropriate mechanical property models for the turbine rotor materials. Once the models were developed, it became apparent that key features of material structure (precipitate sizes and distributions, for example) were not available for use in them. Thus, in spite of the fact that the material had been in service for many years, substantial materials characterization was still required.
A key element of the analytical code integration was the use of optimization software in conjunction with a parametrically defined geometry in the FEA codes, enabling both the component configuration and the material processing to be concurrently optimized. The ICME process was validated in three different exercises to determine how well the methodology would improve the burst capability of a turbine engine rotor, as illustrated in Figure 2-5.
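The concurrent geometry-plus-process optimization described above can be illustrated with a toy grid search over one geometric variable (disk web thickness) and one process variable (aging temperature). Every surrogate model and number below is invented for illustration; the actual AIM work coupled FEA, process models, and commercial optimization software rather than closed-form stand-ins.

```python
# Toy sketch of concurrent geometry + process optimization: minimize
# disk weight subject to a burst-speed floor, searching thickness and
# aging temperature together. All model forms and numbers are invented.

def strength_mpa(aging_temp_c):
    # Invented aging response: peak strength at 750 C, parabolic falloff.
    return 1000.0 - 0.02 * (aging_temp_c - 750.0) ** 2

def weight_kg(thickness_mm):
    return 0.8 * thickness_mm  # invented linear weight surrogate

def burst_speed_rpm(thickness_mm, aging_temp_c):
    # Invented surrogate: burst speed grows with strength and thickness.
    return 15.0 * (strength_mpa(aging_temp_c) * thickness_mm) ** 0.5

def optimize(min_burst_rpm=2000.0):
    """Grid search; returns (weight, thickness, aging temp) of best design."""
    best = None
    for t in [10 + 0.5 * i for i in range(41)]:   # 10-30 mm web thickness
        for temp in range(650, 851, 10):          # 650-850 C aging
            if burst_speed_rpm(t, temp) >= min_burst_rpm:
                cand = (weight_kg(t), t, temp)
                if best is None or cand < best:
                    best = cand
    return best

w, t, temp = optimize()
```

The point of the sketch is structural: fixing the process first and then optimizing geometry (or vice versa) explores only one axis of the feasible region, whereas the joint search can trade a hotter aging treatment against a thinner, lighter web while still meeting the burst constraint.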
Taking only the rotor weight reductions into consideration, system life-cycle benefits on the order of $200 million have been projected to be possible for a typical turbine engine application and fleet. Reductions of $3 million in the cost of design data testing are achievable with a sixfold reduction in test costs. To fully realize these benefits, the ICME methodology needs to be expanded to include other critical models of material behavior. This would require an additional $20 million investment. Taking into consideration the increased investment, these estimated benefits have been projected to yield an ROI of around 10:1 for a fully developed ICME methodology.11 However, the investments needed to extend this approach in a more pervasive fashion to other material systems used by P&W have not been forthcoming. Chapter 4 discusses industrial inertia in some detail.

11 Private communication with Jack Schirra, committee member; also, Leo Christodolou, DARPA, "Accelerated insertion of materials," Presentation to the committee on November 20, 2006. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.

FIGURE 2-5 Full integration of materials science with the design process: turbine disk validation studies. This chart summarizes the application of integrated engineering analysis tools to a series of case studies for aeroengine disk applications. Models integrated: component processing, microstructure evolution, microstructure-based strength model, component geometry, and stress. Case Study 1 represents the current state of the art and shows that integration of structural engineering analysis tools with part design can identify optimal configurations that result in measurable performance improvements. Case Study 2 represents the scenario where ICME enables the improvement of a production part capability without requiring a system-level redesign and product reconfiguration. The cost for a rotor redesign and reconfiguration effort can be on the order of $200 million depending on fleet size. Case Study 3 represents the full application of ICME in the design process and shows that system-level performance can be substantially improved over the current state of the art.

Integrating Materials, Component Design, and Manufacturing Processes: Other Examples

Other examples of integrating materials, component design, and manufacturing processes can be found in Table 2-1, along with their corresponding benefits. In all of these ICME efforts, the potential cost savings are significant. It is worth noting, however, that in many cases maximizing benefits from ICME will require additional investments and may encounter the cultural barriers identified in Chapter 4.
While some ICME efforts represent large programs at major manufacturers, Table 2-1 shows that small companies are also making important contributions

TABLE 2-1 Examples of ICME for Integrated Manufacturing, Materials, and Component Design

Company | Case Study | Benefits
Ford Motor Co. | Virtual aluminum castings | 15-25 percent reduction in product development time, substantial reduction in product development costs, optimized products (greater than 7:1 ROI)
General Electric Company/P&W/Boeing | Accelerated Insertion of Materials | Potential 50 percent reduction in development time, potential eightfold reduction in testing, potential improvement in component capability
Livermore Software Technology Corporation/ESI Group/Ford (separate studies) | Use of stamping CAE output as material property input for crash CAE in automotive structures | Enabled use of advanced high-strength steels in body structures for significant weight savings
Naval Surface Warfare Center | Development of an Accelerated Insertion of Materials system for an aluminum extrusion | Development in progress; optimization of thermomechanical history of AA6082 during the extrusion of sidewall panels for a littoral combatant ship
Knolls Atomic Power Laboratory (Lockheed Martin Corporation)/Materials Design | Fracture of high-strength superalloys in nuclear industry | Optimization of critical embrittling/strengthening impurities
Toyota central R&D labs/Materials Design | Clean Surfaces Technology Program—develop dopants for UV photocatalyst | Reduced time to identify new dopants, new products (air purifiers)
QuesTek | Alloy development | Reduced insertion risk, reduced experimental cost
Boeing | Airframe design and manufacturing | Development in progress; Boeing estimates a reduction in material certification time of 20-25 percent (3-4 years)

to ICME, often with government sponsorship. A number of small companies that offer new tools for developing new alloys or processes are in the start-up stage and are somewhat reliant on government funding.
QuesTek has developed several alloys for both commercial (automotive, consumer products) and government (Army, Navy, NASA) applications, while Materials Design, Inc., has developed materials ranging from ultraviolet catalysts (Toyota) to optimized superalloy chemistries for the nuclear industry (Knolls Atomic Power Laboratory). The committee concluded

that the agility of small companies is an asset in developing new approaches to the design process, including ICME.

Integrated Materials Prognosis (Component Life Estimation)

In some cases, quantifying a component's lifetime is irrelevant; the component will last longer than the system. However, for certain components, the lifetime is critical both for design decisions and for subsequent system maintenance and support. The current methodology for predicting lifetime is based on accelerated aging tests. Such tests are expensive, uncertain, and sometimes impossible. By integrating the component environment with material property evolution, ICME can be used to predict bounds on component lifetime. The longer the lifetime of the component, the greater the savings associated with such an ICME approach.
One study involving an integrated materials prognosis used synergistic computational and experimental efforts at Los Alamos National Laboratory (LANL) and Lawrence Livermore National Laboratory (LLNL). In the nuclear weapons stockpile, credible estimates of component lifetimes are required to plan for the future refurbishment and manufacturing needs of the weapons complex. One of the most important of these components is the pit, that portion of the weapon that contains the fissile element plutonium. The U.S. government had proposed construction of a modern pit facility, and a key variable in planning both the size and schedule for this facility is the minimum estimated lifetime for stockpile pits. To support this effort, the National Nuclear Security Administration (NNSA) of the DOE sponsored a program to provide predictions of primary-stage pit lifetimes owing to plutonium aging. Because accelerated aging tests of plutonium are difficult and expensive, and performance tests of pits are prohibited by national policy (in voluntary compliance with international treaty), an integrated computational approach was required.
The funded program was based on analyses of archival underground nuclear-explosion testing (UGT) data, laboratory experiments, and computer simulations. By combining (1) the results of past UGTs with pits of various ages, (2) experimental and theoretical investigations of the metallurgical properties of plutonium containing various combinations of impurities, and, finally, (3) computer simulations of primary performance with model plutonium properties varying with age, the groups at LANL and LLNL were able to provide estimates of pit lifetimes such that any new pit facility construction could be scheduled appropriately.12
A combination of commercial codes (Thermo-Calc, QMU-Crystal Ball), molecular dynamics codes, design sensitivity tools, and specific codes developed

12 Dimitri Kusnezov, National Nuclear Security Administration, "Mission challenges in large-scale computational science," Presentation to the committee on December 1, 2006. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.

for this particular program was used. The laboratories used the Quantification of Margins and Uncertainty (QMU) methodology to assess pit lifetimes based on simulations of primary performance. Age-dependent models derived from data, science-based computational methods, and conservative assumptions were used to calculate the change in pit lifetime owing to the change in material properties. To quote a review of this program, "For these systems it is important that each contribution to the lifetime be well understood and validated. In a sense, the issue is not one of accounting for aging but of managing the margins and uncertainties that are already present at zero age, and this is best done by understanding the trade-offs involved and the consequent mitigation strategies that can be applied."13
The LANL/LLNL pit lifetime prognosis program involved approximately 200 people over 7 years, at a total cost of $150 million—clearly a large-scale effort. Because the results showed much longer pit lifetimes than initially expected, the construction of a manufacturing facility with an approximate cost of $1.5 billion was not required, giving an ROI of 10:1. Even if a manufacturing facility had been required, this ICME program cost no more than an equivalent accelerated aging effort would have, and it provided more reliable results.14
One key lesson from this case study was that only 10-20 percent of the program cost was for computation. A heavy investment was made in experimental data to provide subscale/accelerated validation for the models. The cost of computation for this application was modest because the program leveraged the prior investments of DOE's Advanced Scientific Computing program.15,16 So the reported $150 million cost considerably underestimates the actual cost.
The fact remains, however, that not only was the ROI quite high, but these results could not have been provided using conventional prototyping and component testing without violating national nuclear policies.
Integrated materials prognosis is perhaps the least mature of any of these areas of ICME. However, there are a few other examples of current programs that

13 R.J. Hemley, D. Meiron, L. Bildsten, J. Cornwall, F. Dyson, S. Drell, D. Eardley, D. Hammer, R. Jeanloz, J. Katz, M. Ruderman, R. Schwitters, and J. Sullivan, Pit Lifetime, National Nuclear Security Report JSR-06-335, U.S. Department of Energy, January 11, 2007, p. 17.
14 Louis J. Terminello, LLNL, "Synergistic computational/experimental efforts supporting stockpile stewardship," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2007.
15 Joseph C. Martz and Adam J. Schwartz, "Plutonium: Aging mechanisms and weapon pit lifetime assessment," JOM, September 2003.
16 R.J. Hemley, D. Meiron, L. Bildsten, J. Cornwall, F. Dyson, S. Drell, D. Eardley, D. Hammer, R. Jeanloz, J. Katz, M. Ruderman, R. Schwitters, and J. Sullivan, Pit Lifetime, National Nuclear Security Report JSR-06-335, U.S. Department of Energy, January 11, 2007, p. 17.
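The QMU bookkeeping referred to above can be reduced to a small sketch: a performance margin M (best estimate minus requirement) is compared with an aggregated uncertainty U, and confidence requires the ratio M/U to sit comfortably above 1. The aggregation rule and every number below are illustrative assumptions, not the laboratories' actual methodology or data.

```python
# Minimal sketch of QMU-style margin/uncertainty accounting.
# Aggregation rule and all numbers are invented for illustration.

def confidence_factor(best_estimate, requirement, uncertainties):
    """Return M/U: margin over aggregated uncertainty."""
    margin = best_estimate - requirement
    # Illustrative aggregation: root-sum-square of independent terms.
    total_u = sum(u * u for u in uncertainties) ** 0.5
    return margin / total_u

def assessment_ok(cf, threshold=1.0):
    """A design is credible only when margin clearly exceeds uncertainty."""
    return cf > threshold

cf = confidence_factor(best_estimate=10.0, requirement=6.0,
                       uncertainties=[1.0, 1.5, 2.0])
```

This framing matches the quoted review: the task is less about modeling aging per se than about tracking how aging erodes a margin that already carries uncertainty at zero age.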

TABLE 2-2 Examples of ICME for Integrated Material Prognosis (Life Estimation)

Company or Organization | Case Study | Benefits
IBM/Intel | Electromigration | Not provided
P&W/General Electric | Engine systems prognosis | In progress; potential to provide substantial savings for maintenance of military fleet
Lockheed Martin/Sandia National Laboratories/U.S. Navy | Aging of solder joints | No other feasible way to estimate 50-year life expectancy

are ongoing, and the benefits of these programs are still only estimates of future implementation, as shown in Table 2-2.

Developing a Manufacturing Process

The most widespread applications of ICME have been in the component manufacturing process. While large-deformation finite element analysis (FEA) and computational fluid dynamics (CFD) commercial codes are now widely used for process modeling, the incorporation of microstructure evolution in these simulations is relatively new, indeed in its infancy in terms of commercialization. For the early adopters, there exists the potential for improving yields, reducing cycle time and energy costs, and optimizing a process for a given material or desired structure.
The Controlled ThermoMechanical Processing (CTMP) of tubes and pipes program provides one case study of the integration of materials modeling into the development of a manufacturing process. The Timken Company makes steel tubes that are subsequently formed into bearings, gears, or pipes. Tube fabrication is a batch process, and customers request alloys of diverse composition and properties. Historically, both forming and annealing processes were specified in an ad hoc manner, which decreased yields. The ultimate goals of the CTMP program were reduced process variability and optimization of the tube-making process by means of real-time control.
Improved control and more consistent properties lead to increased yields, improved performance in customer applications, and reduced energy, cost, and cycle time. This program, led by the Alloy Steel Business of the Timken Company, was co-funded by DOE, with Timken providing approximately 50 percent of the cost. Program costs totaled approximately $10 million over 5 years and required extensive collaboration among more than 25 different industrial, academic, and government facilities in the United States and around the world.

The main components of the CTMP program were the development of analytical tools, measurement techniques, process simulation, and product response. The two most important analytical tools of the program were the tube optimization model (TOM) and the virtual pilot plant (VPP). TOM offers a PC-based framework for developing a tube-making process that generates the desired microstructure. VPP allows process and equipment scenarios to be evaluated in computer simulations rather than in the production facilities. TOM was initially developed for use with three different grades of steel; however, many grades were added to make the models more broadly applicable. Timken made extensive use of a "design of experiments" approach to develop data-driven relationships between controllable factors in the manufacturing process and the material microstructure, which were then hard-wired into TOM.17 The majority of codes in TOM were developed within the program, such as those that describe controlled slow cooling, inverse heat conduction, austenite decomposition, thermal expansion coefficients, recrystallization, grain growth, flow stress, and some specialized finite element mill models (ELROLL). Commercial codes used were the QuesTek MCASIS codes for continuous cooling transition curves, various finite element (FE) or finite difference (FD) codes (ABAQUS, DEFORM, SHAPE), and optimization codes such as EPOGY in the VPP. Direct, on-line measurements of austenitic grain size using a laser-ultrasonic gauge and an eccentricity gauge were used to validate in-process grain size, building on an earlier DOE program in this area. TOM was used to assess the impact of various manufacturing process parameters on the machinability of the finished material, a critical customer requirement.
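The "design of experiments" approach described above can be sketched with a two-level full factorial: run every high/low combination of the process factors and estimate each factor's main effect on the measured response. The factor names and the stand-in response function below are invented for illustration; they are not Timken's TOM relationships.

```python
# Sketch of a two-level (2^3) full-factorial design of experiments with
# main-effect estimation. Factor names and the fake linear response are
# invented placeholders, not data from the CTMP program.
from itertools import product

factors = ["finish_temp", "cooling_rate", "percent_reduction"]

def run_trial(levels):
    # Stand-in for a mill trial or simulation returning, e.g., grain size.
    ft, cr, pr = levels
    return 50.0 + 8.0 * ft - 5.0 * cr - 2.0 * pr  # invented response

def main_effects():
    """Average response change per factor over coded levels (-1/+1)."""
    design = list(product([-1, +1], repeat=len(factors)))
    results = [(levels, run_trial(levels)) for levels in design]
    effects = {}
    for i, name in enumerate(factors):
        hi = [y for lv, y in results if lv[i] == +1]
        lo = [y for lv, y in results if lv[i] == -1]
        effects[name] = (sum(hi) - sum(lo)) / len(hi)
    return effects

effects = main_effects()
```

Fitted relationships of this kind, derived from structured trials rather than one-factor-at-a-time experiments, are what the program "hard-wired" into TOM as fast surrogate models.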
For the case of machining tubes into automotive gears, this capability allows the steel microstructures to be optimized to extend the lifetime of broaching tools.
The benefits of the CTMP program can be demonstrated by reviewing some of its elements. The thermal-enhanced spheroidization annealing (T-ESA) "process recipe" is an annealing cycle that reduces the time and energy requirements involved in the spheroidization heat treatments applied to 52100 and other homogeneous high-carbon steels. The recipe has been perfected and implemented, reducing the process cycle time 33-50 percent and energy costs by $500,000 per year. The accumulated annual savings from other direct benefits are estimated at nearly $1 million, not counting the technical staff time that would have been required to execute those studies without TOM. Moreover, TOM mill simulation and a mill trial demonstrated, with limited experiments, a strong potential for application of a minimum-capital manufacturing process. If new capital equipment were installed, the benefits from a single

17 Design of experiments is a systematic approach to investigation of a system or process. A series of structured tests is designed in which planned changes are made to the input variables of a process or system. The effects of these changes on a predefined output are then assessed.

application of TOM/VPP would be in the millions of dollars. Finally, a preferred microstructure was discovered for maximizing gear broach tool life. Since cutting tools cost roughly half as much as gears, the industry-wide savings could amount to more than $10 million annually.
Obviously not all ICME case studies have realized their full potential. For example, while the direct financial benefits at Timken were reportedly limited because there was no capital investment in an optimized plant, the experience there shows it is reasonable to expect that an ROI of 3:1 to 10:1 could be realized over the entire industry if capital investments were made. Commercialization of some of the codes developed in the CTMP program is planned, however, and would increase the use of ICME in this industry. The CTMP program has demonstrated that science developed into applicable technology can differentiate domestic products and further the cause of increasing U.S. competitiveness in the global market.18
There are a number of other examples in which manufacturing vendors have begun to embrace ICME, as shown in Table 2-3. While this area of integrated materials and manufacturing promises to provide the most immediate benefits to industry, it must be recognized that additional development and validation remain before these new techniques can be more widely utilized.

LESSONS LEARNED FROM OTHER DISCIPLINES

In addition to investigating the current status of ICME, the committee also explored some major integration efforts in other scientific and engineering fields. Integration experiences in other disciplines teach important lessons about accomplishing major technical and cultural shifts. It is important to note that fields that have achieved successful integration share some advantages over materials engineering. First, they enjoy a cohesive data structure, a common mathematical framework, and well-defined objects for investigation.
For example, genomics describes a single kind of data, gene sequences, while astronomy focuses on a common set of celestial objects. In contrast, materials engineers working on a large array of engineering components must know about the physical and mechanical properties of the materials used as well as their spectroscopic and two-dimensional and three-dimensional microscopic characteristics. Given the diversity of the information and how this information is applied, the materials challenges more closely resemble the challenges of bioinformatics. In general, communities that have benefited from integration of information have set explicit goals for information acquisition and

18 For more information on the CTMP program, see Timken Company, Final Report: Controlled Thermo-Mechanical Processing of Tubes and Pipes for Enhanced Manufacturing and Performance, November 11, 2005. Available at http://www.osti.gov/bridge/purl.cover.jsp?purl=/861638-qr9nuA/. Accessed May 2008.

TABLE 2-3 Examples of ICME in Materials Manufacturing Processes

Company | Case Study | Benefits
The Timken Company | CTMP | Potential to reduce manufacturing cost by 20 percent with new heat treat/mill on-line
Ladish Co./Rolls-Royce/General Electric Company/P&W/Boeing/Timet | Titanium modeling | The summary of the business case for this project is a 4.83:1 ROI with a $24.442 million return within a standard 10-yr analysis period
Ladish Co./General Electric Company | Processing science methodology for metallic structures | Cost avoidance for production of forged aerospace components
Special Metals Corporation (a PCC company) | Four case studies in alloy development using Thermo-Calc | Savings of more than $500,000 in development materials, estimated $1.5 million in new revenue
Böhler Schmiedetechnik | Optimizing the forging of critical aircraft parts by the use of finite element coupled microstructure modeling | Cost avoidance for production of forged aerospace components
Carmel Forge/Scientific Forming Technologies Corporation | Grain size modeling for Waspaloy | Cost avoidance for production of forged aerospace components
RMI Titanium Company | Rolling modeling for development of fine grain titanium sheet | Faster, more cost-effective optimization of rolling parameters and pass schedules at RMI
Alcoa Howmet | Grain size modeling | Improved prior beta grain size control in investment casting of Ti-64

successfully executed "team science" projects that have established national centers and open-access databases of fundamental information. In the following sections specific lessons learned from the genomics, bioinformatics, and astronomy communities are highlighted.
Genomics and Bioinformatics

One of the most significant integration efforts in modern science was the human genome project (HGP).19 This well-coordinated, $3 billion, 13-year project

19 For more information, see http://www.ornl.gov/sci/techresources/Human_Genome/home.shtml. Accessed October 2007.

was funded in the United States by the DOE and the National Institutes of Health (NIH) and was conducted in collaboration with the United Kingdom, Japan, France, Germany, China, and other countries. The project determined the complete sequence of the three billion DNA subunits (bases), identified all human genes, and made them accessible for further biological study. At any given time, HGP involved over 200 researchers, and its successful completion required large-scale funding, the coordination of important technologies (for example, rapid, high-throughput sequencing capabilities and databases), and evolving principles surrounding intellectual property and publication.20 All human genomic sequence information generated by the centers that had been funded for large-scale human sequencing was made freely available in the public domain to encourage research and development and to maximize the benefit to society. Further, the sequences were to be released as soon as possible and finished sequences submitted immediately to public databases. To promote coordination of activities, it was agreed that large-scale sequencing centers would inform the Human Genome Organization (HUGO) of their intention to sequence particular regions of the genome. The information was presented on the HUGO Internet page, which directed users to the Web pages of individual centers for more detailed information on the status of sequencing in specific regions. Although HGP was completed in 2003, NIH continues to fund major coordinated sequencing projects.
As an example of the magnitude and type of efforts funded by NIH, the sequence of the Rhesus monkey was recently completed, the result of a $20 million effort involving 100 researchers that was approved in 2005.21 While the resources expected to be made available to ICME are likely to be fewer than were applied to mapping the human genome, there are significant lessons to be learned by the ICME community from HGP by considering how that community initially organized and set goals and by considering the potential impact of large-scale, coordinated projects based on the gathering and organization of data. At the outset of the HGP, quantitative goals for information acquisition (gene sequencing), databases, computational advances, and the human infrastructure were set. In 1991, 5-year goals were established, with funding of $135 million from NIH and DOE. The 1991 goals included these:22

• Improve current methods and/or develop new methods for DNA sequencing that will allow large-scale sequencing of DNA at a cost of $0.50 per base pair.

20 Rex Chisholm, Northwestern University, "Community computational resources in genomics research: Lessons for research," Presentation to the committee on May 29, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
21 Elizabeth Pennisi, "Boom time for monkey research," Science 316 (2007): 216.
22 For more information on the HGP's 5-year plan, see http://www.genome.gov/10001477. Accessed February 2008.

• Develop effective software and database designs to support large-scale mapping and sequencing projects.
• Create database tools that provide easy access to up-to-date physical mapping, genetic mapping, chromosome mapping, and sequencing information and allow ready comparison of the data in these several data sets.
• Develop algorithms and analytical tools that can be used in the interpretation of genomic information.
• Support research training of pre- and postdoctoral fellows starting in FY 1990. Increase the numbers of trainees supported until steady-state "production" of about 600 per year is reached by the fifth year.

The technology for rapid, low-cost sequencing advanced quickly, while the needs for databases and computational tools continued to evolve.23 Interestingly, the training goals were not met, "because the capacity to train so many individuals in interdisciplinary sciences did not exist." The establishment of interdisciplinary research centers, with significant participation from nonbiological scientists, was identified as an ongoing need throughout the program.24 ICME shares many of the same challenges: rapid, low-cost experimentation, development of databases and computational tools, and the need for interdisciplinary training. The HGP also drove developments in bioinformatics, a field that lies at the intersection of biology, computer science, and information science and is defined by the NIH as "research, development, or application of computational tools and approaches for expanding the use of biological, medical, behavioral or health data, including those to acquire, store, organize, archive, analyze, or visualize such data." Bioinformatics generally makes use of publicly available databases that can be mined for associating complex disorders (analogous to material properties) with different versions of the same gene (analogous to microstructures).
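The mining analogy described above can be made concrete with a minimal sketch. The records, field names, and numbers below are entirely hypothetical and do not come from any actual database; the point is only the pattern of grouping shared data by a "variant" (here, a microstructural class) and comparing a property across groups, much as bioinformatics associates disorders with gene variants.

```python
# Illustrative sketch (hypothetical data): mining a shared database to
# associate a material property with microstructure "variants," in the way
# bioinformatics associates complex disorders with versions of a gene.
from statistics import mean

# Hypothetical records, as they might appear in an open materials database.
records = [
    {"microstructure": "fine_grain",   "yield_mpa": 910},
    {"microstructure": "fine_grain",   "yield_mpa": 905},
    {"microstructure": "coarse_grain", "yield_mpa": 840},
    {"microstructure": "coarse_grain", "yield_mpa": 835},
]

def mean_property_by_variant(rows, key="microstructure", prop="yield_mpa"):
    """Group records by microstructural variant and average the property."""
    groups = {}
    for row in rows:
        groups.setdefault(row[key], []).append(row[prop])
    return {variant: mean(values) for variant, values in groups.items()}

print(mean_property_by_variant(records))
# → {'fine_grain': 907.5, 'coarse_grain': 837.5}
```

Such a query only works if contributors use an agreed-on vocabulary for fields like `microstructure`, which is exactly the taxonomy lesson the genetics community reports.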
In the case of NIH-sponsored genetic research, genetic data must be published in publicly accessible databases. Similarly, genetics-oriented research journals require that such information be made publicly available before the relevant paper may be published. The range of length scales associated with bioinformatics data and their application poses a challenge as complicated as that posed by materials. Examples of informatics databases include GenBank (genetic sequences), EMBL (nucleotide sequences), SwissProt (protein sequences), EC-ENZYME (enzyme database), RCSB PDB (three-dimensional biological macromolecular structure data from X-ray crystallography), GDB (human genome), OMIM (Mendelian inheritance in man

23 F. Collins and D.J. Galas, "A new five-year plan for the U.S. Human Genome Project," Science 262 (1993): 43-46.
24 Francis S. Collins, Ari Patrinos, Elke Jordan, Aravinda Chakravarti, Raymond Gesteland, and LeRoy Walters, "New goals for the U.S. Human Genome Project: 1998-2003," Science 282 (1998): 682-689.

data bank), and PIR (protein information resource). The National Center for Biotechnology Information (NCBI),25 which provides access to these and other databases, was created in 1988 as a national resource for molecular biology information. The mission of the NCBI includes the generation of public databases, the research and development of computational biology tools, and the dissemination of biomedical information. An important lesson learned by the genetics community is that a key first step in establishing standards is agreement on a taxonomy (that is, an agreed-on classification scheme) and a vocabulary that ensures interoperability of data and models.26 Database curators play a key role in ensuring the quality of the data and determining what data are needed by the community. Bioinformatics database development, maintenance, and curation are funded by NIH. A small model organism database might cost $400,000 annually, including the services of three curators, two database programmers, and the principal investigator.27 One of the cultural lessons from bioinformatics is that transitioning from bench science to big science involves a recognition on the part of researchers that although their data are credited to them in the database, once those data are used by others they are no longer tied to them. This transition also requires a realization that analysis of a researcher's data by others does not diminish the researcher but increases the impact of the work. One presentation suggested that this required a transformation from a "hunter-gatherer" research model to a "collective farming" model, in which coordination and collaboration are the central elements.28 There are still no similar, publicly funded databases, informatics efforts, or comprehensive training programs for ICME.
In fact, there are substantial barriers to developing them: lack of funding, lack of standards, diverse data classes, proprietary data ownership, and cultural barriers to building the needed collaborations among materials science, engineering, computational science, and information technology. However, there is a clear opportunity to capitalize on the large body of public high-quality data by establishing open-access, mineable databases analogous to those in the bioinformatics world. An effort on the part of the emerging ICME community to set quantitative and specific goals for materials characterization, databases, computational models, and training will be needed for this discipline to mature.

25 For more information see http://www.ncbi.nlm.nih.gov/. Accessed February 2008.
26 Rex Chisholm, Northwestern University, "Community computational resources in genomics research: Lessons for research," Presentation to the committee on May 29, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
27 Cate L. Brinson, Northwestern University, "Materials informatics—what, how and why: Analogy to bioinformatics," Presentation to the committee on May 30, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
28 Ibid.

Open Science Grid and Sloan Digital Sky Survey

Fields from climate change to gaming have also benefited from integration of scientific models and collaborative frameworks involving large-scale databases or computing needs. Some efforts such as climate change modeling and genetics involve large-scale, central coordination; others such as gaming are more organic in nature and based on a free, open source software (FOSS) paradigm.29 These activities have been enabled largely by the Internet and tend to be Web based, with networks of geographically distributed scientists and code developers working together. This process, known in the United States as cyberinfrastructure and in Europe as e-Science, has given rise to the new field of grid computing, in which computers are shared.30 The committee was briefed on two such efforts, the Open Science Grid (OSG)31 and the Sloan Digital Sky Survey.32,33 These efforts were initiated because of the need for large-scale (petascale) computing and storage within the astronomy and physics community. The Sloan Digital Sky Survey provides over 40 terabytes (TB) of raw data and 5 TB of processed catalogs to the public. The data challenge in this field is the integration of disparate types of data about astronomical objects (stars, galaxies, quasars), including images, spectroscopy data (acquired by an array of experimental techniques at various wavelengths), and astrometric data, along with the large volumes of data (2 to 4 TB per year). Tools developed for the automated data reduction efforts that make the survey possible have involved more than 150 person-years of effort. A lesson learned in this activity was that information is growing exponentially and planning for this data explosion is important. The OSG comprises a grid or distributed network of over 70 sites on four continents accessing more than 24,000 central processing units.
Establishment of

29 W. Scacchi, "Free and open source development practices in the game community," IEEE Software (January 2004): 59-66.
30 Daniel Clery, "Infrastructure: Can grid computing help us work together?" Science 313 (July 2006): 433-434.
31 OSG is a consortium of software, service, and resource providers and researchers from universities, national laboratories, and computing centers across the United States. It brings together computing and storage resources from campuses and research communities into a common, shared grid infrastructure over research networks via a common set of middleware. The OSG Web site says the grid offers participating research communities low-threshold access to more resources than they could afford individually. For more information on the OSG, see http://www.opensciencegrid.org/. Accessed February 2008.
32 Paul Avery, University of Florida, "Open Science Grid: Linking universities and laboratories in national cyberinfrastructure," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.
33 Alex Szalay, Johns Hopkins University, "Science in an exponential world," Presentation to the committee on May 30, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.

this grid was funded (1999-2007) by $38 million from DOE and NSF. The OSG serves a diverse set of disciplines including astronomy, astrophysics, genetics, gravity, relativity, particle physics, mathematics, nuclear physics, and computer science. According to information provided to the committee, a key lesson learned from the OSG is that over half the challenges associated with its establishment were cultural. Thus significant effort was and is required in project and computational coordination and management, education, and communication. The OSG has a well-developed communication Web site, monthly newsletters, and annual summer schools for participants. Key technical challenges include commercial tools that fall short of the needs for grid computation, requiring OSG collaborators to invent the software.

Summary and Lessons Learned

Several consistent lessons emerged from the case studies reported to the committee. These lessons set out a context for a path forward for ICME, which is discussed in Chapter 3 and Chapter 4.

Lesson Learned 1. ICME is an emerging discipline, still in its infancy.

Although some ICME successes have been realized and articulated in the case studies, from an industrial perspective ICME is not mature and is contributing only peripherally. While some companies may have ICME efforts, product design often goes on without materials modeling. Significant government funds have been expended on developing tools for computational materials science (CMS), and many of these tools are sufficiently advanced that they could be incorporated into ICME models. However, government and industry efforts in integrating CMS tools and in applying them to engineering problem solving are still relatively rare.

Lesson Learned 2. There is clearly a positive return on investment in ICME.

Performance, cost, and schedule benefits drive the increasing use of simulation.
ICME shows promise for decreasing component design and process development costs and cycle time, lowering manufacturing costs, improving material life-cycle prognosis, and ultimately allowing for agile response to changing market demands. Typical reductions in product development time attributed to use of ICME are estimated to be 15 to 25 percent, with best-case ROIs between 7:1 and 10:1. Less quantifiable, but potentially more important, ICME often offers solutions, whether for design decisions or lifetime prognoses, that could not be obtained in any other way.
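The scale of the claimed benefit can be illustrated with back-of-the-envelope arithmetic. The program cost and baseline cycle length below are hypothetical placeholders; only the 7:1 to 10:1 ROI range and the 15 to 25 percent cycle-time reduction come from the case studies.

```python
# Illustrative arithmetic combining the report's quoted ranges with a
# hypothetical ICME program. Neither dollar figure below is from the report.

program_cost = 10.0e6  # hypothetical ICME investment, in dollars

# Best-case ROI range quoted in the case studies: 7:1 to 10:1.
best_case_returns = (7 * program_cost, 10 * program_cost)
print(f"Best-case returns: ${best_case_returns[0] / 1e6:.0f}M "
      f"to ${best_case_returns[1] / 1e6:.0f}M")

# Typical cycle-time reduction quoted: 15 to 25 percent.
baseline_cycle_months = 48  # hypothetical development cycle length
savings = tuple(baseline_cycle_months * r for r in (0.15, 0.25))
print(f"Cycle-time savings: {savings[0]:.1f} to {savings[1]:.1f} months")
```

The exercise makes the point in the text plain: even at the low end of the quoted ranges, the return dwarfs the investment for a program of realistic size.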

Lesson Learned 3. Achieving the full potential of ICME requires sustained investment.

Because most ICME efforts are not yet mature, they are just beginning to realize benefits from the investment in them. Several case studies, including the CTMP and AIM programs, highlighted the need for additional, sustained commercial investments, whether in human resources, code development, or capital equipment, to achieve the full ICME payoff. Government programs, in particular, often fund the initial investigation of a concept but leave follow-up to others. Developing ICME into a mature discipline will take a considerable investment from government and industry.

Lesson Learned 4. ICME requires a cultural shift.

Several case studies articulated that the cultural changes required to fully benefit from ICME should not be underestimated. For ICME to gain widespread acceptance, shifts are required in the cultures of industry, academia, and government. The design philosophy must shift away from separating product design analysis from manufacturing process optimization. The engineering culture must shift toward increasing confidence in and reliance on computational materials engineering models and depending less on databases and physical prototypes. Materials researchers and other data generators must shift toward an open-access model that uses data in a standard format. Each of these cultural changes is challenging and must be supported with education and resources.

Lesson Learned 5. Successful model integration involves distilling information at each scale.

Models that explicitly link different length scales and timescales are widely viewed as a laudable goal; however, the very few cases where this goal has been met required computational resources that are not widely available.
In most successful ICME case studies, length scales and timescales were integrated by reducing the information at each scale to simple, computationally efficient models that could be embedded into models at other scales. When ICME is treated as an engineering undertaking, this approach to incorporating information at each length scale, timescale, or location proves effective. However, it requires experts with sufficient understanding of a particular materials system to be able to judge which material response issues are essential.

Lesson Learned 6. Experiments are key to the success of ICME.

Estimates from several case studies indicated that 50 to 80 percent of the expense of developing ICME tools related to experimental investigations. As models

for materials properties developed, it was often the case that earlier materials characterization was insufficient since it had been conducted primarily for the purpose of quality control. Experiments were also required to fill the gaps where theories were not sufficiently predictive or quantitative. Finally, experimental validation is critical to gaining the acceptance of the product engineering community and to ensuring that the tools are sufficiently accurate for the intended use. An important function of computational models is to capture this experimental knowledge for later reuse.

Lesson Learned 7. Databases are the key to capturing, curating, and archiving the critical information required for development of ICME.

A number of case studies, both within and outside the materials engineering discipline, highlighted the need for large-scale capture and dissemination of critical data. One showed the negative impact of failing to archive data in a recoverable form.34 To create and utilize accurate and quantitative ICME tools, engineers must have easy access to relevant, high-quality data. Both open-access and proprietary databases permit the archiving and mining of the large, qualified, and standardized data sets that enable ICME.

Lesson Learned 8. ICME activities are enabled by open-access data and integration-friendly software.

Integration of computational models requires the transfer of data between models and customized links—that is, input and output interfaces—within models. To enable data transfer, information must be stored in accessible, standardized formats that can interface with various models. To facilitate model input and output, software must be designed to allow easy integration of user-developed subroutines, preferably through the use of open architectures that enable plug-in applications.
While many of the main commercial codes used in design analysis allow user-definable subroutines, manufacturing simulation codes vary greatly in that capability and in the sophistication of the interfaces. To make fullest use of ICME, both databases and model software must be designed with open integration in mind.

Lesson Learned 9. In applying ICME, a less-than-perfect solution may be good enough.

Several case studies emphasized that ICME can provide significant value even if

34 Jonathan Zimmerman, Sandia National Laboratories, "Helium bubble growth during the aging of Pd-tritides," Presentation to the committee on March 13, 2007. Available at http://www7.nationalacademies.org/nmab/CICME_Mtg_Presentations.html. Accessed February 2008.

it is less than 100 percent accurate. While scientists may focus on perfection, many existing theories, models, and tools are sufficiently well developed that they can be effectively integrated into an ICME engineering methodology. Sensitivity studies, understanding of real-world uncertainty, and experimental validation were key to gaining acceptance of, and value from, ICME tools with less than 100 percent accuracy. Judging a reasonable balance between efficiency and robustness requires a team with expertise.

Lesson Learned 10. Development of ICME requires cross-functional teams focused on common goals or "foundational engineering problems."

All the successful ICME efforts discussed in the report were carried out by cross-functional teams made up of experts in materials, design, and manufacturing who were well versed in technology integration and had a common goal. Since many of the required tools are still under development, both engineering and research perspectives must be represented in ICME teams. Successful ICME required integration of many kinds of expertise, including in materials engineering, materials science, mechanics, mechanical engineering, physics, software development, experimentation, and numerical methods. An important part of this lesson is the selection of a common goal (such as a foundational engineering problem) that includes (1) a manufacturing process or set of processes, (2) a materials system, and (3) an application or set of applications that define the critical properties and geometries.
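The data-transfer idea in Lesson Learned 8 can be sketched minimally: one model writes its output in a standardized, self-describing format, and a downstream model reads that format rather than depending on the upstream code. The schema (field names, units) and the strength constants below are hypothetical illustrations; no existing materials data standard or validated model is implied.

```python
# Minimal sketch of model-to-model data transfer via a standardized format.
# Schema and constants are hypothetical; only the pattern matters: the
# downstream model depends on the shared format, not on the upstream code.
import json

# A process model might emit a record like this.
process_output = {
    "material": "Ti-6Al-4V",
    "grain_size": {"value": 12.0, "units": "micrometer"},
    "source_model": "forging_simulation",
}

serialized = json.dumps(process_output)  # the standardized interchange step

def property_model(record_json):
    """Downstream property model: reads the shared format and applies a
    hypothetical Hall-Petch-style grain-size-to-strength estimate."""
    record = json.loads(record_json)
    d = record["grain_size"]["value"]   # grain size, micrometers
    sigma0, k = 450.0, 600.0            # illustrative constants, MPa units
    return sigma0 + k / (d ** 0.5)      # estimated yield strength, MPa

print(f"Estimated yield strength: {property_model(serialized):.0f} MPa")
```

Because the interface is the serialized record, either model can be replaced independently, which is the practical meaning of "integration-friendly" software in this lesson.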
