1
Introduction

THE BIENNIAL ASSESSMENT PROCESS

This introductory chapter first describes the biennial assessment process conducted by the National Research Council’s (NRC’s) Army Research Laboratory Technical Assessment Board (ARLTAB). It then identifies important research areas that involve crosscutting collaboration across the Army Research Laboratory (ARL) directorates and notes the linkage between the Army Research Office (ARO) and the ARL directorates.

The charge of ARLTAB is to provide biennial assessments of the scientific and technical quality of ARL. These assessments include the development of findings and recommendations related to the quality of ARL’s research, development, and analysis programs. The Board is charged to review the work of ARL’s six directorates but not to review two key elements of the ARL organization that manage and support basic research: the Army Research Office and the Collaborative Technology Alliances (CTAs). Although the primary role of the Board is to provide peer assessment, it may also offer advice on related matters when requested to do so by the ARL Director; such advice focuses on technical rather than programmatic considerations. The Board is assisted by six NRC panels that focus on particular portions of the ARL program. The Board’s assessments are commissioned by ARL itself rather than by one of its parent organizations.

For this assessment, ARLTAB consisted of six leading scientists and engineers whose experience collectively spans the major topics within the scope of ARL. Six panels, one for each of ARL’s directorates,1 report to the Board. Each Board member sits on a panel, five of them as panel chairs. The panels

1 The six ARL directorates are the Computational and Information Sciences Directorate (CISD), Human Research and Engineering Directorate (HRED), Sensors and Electron Devices Directorate (SEDD), Survivability and Lethality Analysis Directorate (SLAD), Vehicle Technology Directorate (VTD), and Weapons and Materials Research Directorate (WMRD). The Board does not have a panel specifically devoted to the Army Research Office, which is another unit of ARL, but all Board panels examine how well the development of ARO and ARL is coordinated.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.


range in size from 9 to 19 members, whose expertise is tailored to the technical fields covered by the directorate(s) that they review. In total, 95 experts participated, without compensation, in the process that led to this report. The Board and panels are appointed by the National Research Council with an eye to assembling balanced slates of experts without conflicts of interest and with balanced perspectives. The 95 experts include current and former executives and research staff from industrial research and development (R&D) laboratories, leading academic researchers, and staff from Department of Energy national laboratories and federally funded R&D centers. Twenty-six of them are members of the National Academy of Engineering, 5 are members of the National Academy of Sciences, 3 are members of the Institute of Medicine, a number have been leaders in relevant professional societies, and several are past members of organizations such as the Army Science Board and the Defense Science Board.

The Board and its panels are supported by NRC staff, who interact with ARL on a continuing basis to ensure that the Board and panels receive the information that they need to carry out their assessments. Board and panel members serve for finite terms, generally 4 years, staggered so that there is regular turnover and a refreshing of viewpoints. Biographical information on the Board members appears in Appendix B, along with a list of the members of each panel. Appendix A provides information summarizing the organization and resources of ARL and its directorates.

Preparation and Organization of This Report

The current report is the fifth biennial report of the Board. Its first biennial report was issued in 2000, and annual reviews by the Board were issued in 1996, 1997, and 1998. As with the earlier reviews, this report contains the Board’s judgments about the quality of ARL’s work (Chapters 2 through 7 focus on the individual directorates).
The rest of this chapter explains the rich set of interactions that support those judgments. The amount of information that is funneled to the Board, including the consensus evaluations of the recognized experts who make up the Board’s panels, provides a solid foundation for a thorough peer review. This review is based on a large amount of information received from ARL and on panel interactions with ARL staff. Most of the information exchange occurs during the annual meetings convened by the respective panels at the appropriate ARL sites. Both at scheduled meetings and in less formal interactions, ARL evinces a very healthy level of information exchange and acceptance of external comments.

The assessment panels engaged in many constructive interactions with ARL staff during their annual site visits in 2007 and 2008. In addition, useful collegial exchanges took place between panel members and individual ARL investigators at outside meetings as ARL staff members sought additional clarification about panel comments or questions and drew on panel members’ contacts and sources of information.

Each panel meeting lasted 2½ days, during which time the panel members received a combination of overview briefings by ARL management and technical briefings by ARL staff. Prior to the meetings, some panels received extensive materials for review, including selected staff publications. The overview briefings brought the panels up to date on ARL’s long-range planning. This context-building step is needed because the panels are purposely composed mostly of people who—while experts in the technical fields covered by the directorate(s) that they review—are not engaged in work focused on Army matters. Technical briefings for the panels focused on the R&D goals, strategies, methodologies, and results of selected projects at the laboratory. Briefings were targeted toward coverage of a

representative sample of each directorate’s work over the 2-year assessment cycle. Briefings included poster sessions that allowed direct panelist interaction with other projects that either were not covered in the briefings or had been covered in prior years.

Ample time during both overview and technical briefings was devoted to discussion, both to clarify the relevant panel’s understanding and to convey the immediate observations and understandings of individual panel members to ARL’s scientists and engineers. The panels also devoted sufficient time to closed-session deliberations, during which they developed consensus findings and identified important questions or gaps in panel understanding. Those questions or gaps were discussed during follow-up sessions with ARL staff so that the panel was confident of the accuracy and completeness of its assessments. Panel members continued to refine their findings, conclusions, and recommendations during written exchanges and teleconferences among themselves after the meetings.

In addition to the insights that they gained from the panel meetings, Board members received exposure to ARL and its staff at Board meetings each winter. Also, some Board members attended the annual ARL Program Formulation Workshop in 2007 and 2008; at these workshops the ARL directorates discussed their programs with the directorates’ customers and stakeholders. In addition, several panel members attended the 2007 and 2008 symposia that highlighted progress among ARL’s Collaborative Technology Alliances, and selected CTA projects performed by ARL researchers were presented during panel meetings.

Assessment Criteria

Within the general framework described above, the Board developed and the panels applied detailed assessment criteria organized in the following six categories (Appendix C presents the complete set of assessment criteria):
1. Effectiveness of interaction with the scientific and technical community—criteria in this category relate to cognizance of and contribution to the scientific and technical community whose activities are relevant to the work performed at ARL;

2. Impact on customers—criteria in this category relate to cognizance of and contribution in response to the needs of the Army customers who fund and benefit from ARL R&D;

3. Formulation of projects’ goals and plans—criteria in this category relate to the extent to which projects address ARL strategic goals and are planned effectively to achieve stated objectives;

4. R&D methodology—criteria in this category address the appropriateness of the hypotheses that drive the research, of the tools and methods applied to the collection and analysis of data, and of the judgments about future directions of the research;

5. Capabilities and resources—criteria in this category relate to whether current and projected equipment, facilities, and human resources are appropriate to achieve success of the projects; and

6. Responsiveness to the Board’s recommendations—with respect to this criterion, the Board does not consider itself to be an oversight committee. The Board has consistently found ARL to be extremely responsive to its advice, so the criterion of responsiveness encourages discussion of the variables and contextual factors that affect ARL’s implementation of responses to recommendations rather than an accounting of responses to the Board’s recommendations.

During the assessment, the Board considered the following questions posed by the ARL Director:

1. Is the scientific quality of the research of comparable technical quality to that executed in leading federal, university, and/or industrial laboratories both nationally and internationally?

2. Does the research program reflect a broad understanding of the underlying science and research conducted elsewhere?

3. Does the research employ the appropriate laboratory equipment and/or numerical models?

4. Are the qualifications of the research team compatible with the research challenge?

5. Are the facilities and laboratory equipment state of the art?

6. Does the research reflect an understanding of the Army’s requirement for the research or the analysis?

7. Are programs crafted to employ the appropriate mix of theory, computation, and experimentation?

8. Is the work sufficiently unique and appropriate to the ARL niche?

9. Are there especially promising projects that, with application of adequate resources, could produce outstanding results that could be transitioned ultimately to the field?

Preparation of the Report

This report represents the Board’s consensus findings and recommendations, developed through deliberations that included consideration of the notes prepared by the panel members summarizing their assessments. The Board’s aim with this report is to provide guidance to the ARL Director that will help ARL sustain its process of continuous improvement. To that end, the Board examined its extensive and detailed notes from the many Board, panel, and individual interactions with ARL over the 2007-2008 period. From those notes it distilled a shorter list of the main trends, opportunities, and challenges that merit attention at the level of the ARL Director. The Board used that list as the basis for this report.
Specific ARL projects are used to illustrate these points in the following chapters when it is helpful to do so, but the Board did not aim to present the Director with a detailed account of 2 years’ worth of interactions with bench scientists. The draft of this report was subsequently honed and reviewed according to NRC procedures before being released.

The approach to the assessment by the Board and its panels relied on the experience, technical knowledge, and expertise of its members, whose backgrounds were carefully matched to the technical areas within which the ARL activities are conducted. The Board and its panels reviewed selected examples of the standards and measurements activities and the technological research presented by ARL; it was not possible to review all ARL programs and projects exhaustively. The Board’s goal was to identify and report salient examples of accomplishments and opportunities for further improvement with respect to the technical merit of the ARL work, its perceived relevance to ARL’s definition of its mission, and specific elements of the ARL resource infrastructure that is intended to support the technical work. Collectively, these highlighted examples for each ARL directorate are intended to convey an overall impression of the laboratory while preserving useful mention of suggestions specific to projects and programs that the Board considered to be of special note within the set of those examined. The Board applied a largely qualitative rather than quantitative approach to the assessment; future assessments may be informed by further consideration of various analytical methods. The assessment is currently scheduled to be repeated annually and reported biennially.

CROSSCUTTING ISSUES

The Board has regularly encouraged ARL to continue to support new interdisciplinary initiatives, including those that require collaboration across ARL directorates.
The Board was provided a welcome

opportunity by the ARL Director to examine ARL plans for three crosscutting Strategic Technology Initiatives (STIs): Advanced Computing, System of Systems Analysis, and Applications of Neuroscience to Enhancement of Soldier Performance. Panels examined additional crosscutting work that had previously been encouraged by the Board in the following areas: information fusion, information security, ad hoc wireless networks, and system prototyping and model verification and validation. Technical details of these crosscutting research areas are presented in the following chapters; a brief summary of the Board’s impressions of these areas is presented here.

Advanced Computing

ARL’s Strategic Technology Initiative in Advanced Computing gives clear indication that ARL views high-performance computing as a critical technology driven by requirements from a variety of applications, including armor and armaments, atmospheric modeling, aerodynamics, and computational biology, across multiple directorates. In addition, ARL’s strategic plans include attention to petascale computing and to the investigation of software developments that will be needed to take advantage of potential applications of advanced computing. ARL’s use of advanced computing for basic science is still evolving, with several new projects showing promise. This STI was at its inception when examined by the Board, and ARL is examining, appropriately, the following issues as plans are detailed and projects are implemented:

1. ARL applications drivers, both current and emerging;

2. Current ARL capabilities in simulation and modeling;

3. Opportunities for new algorithmic and software technologies to have an impact on ARL work;

4. Implications for high-performance computing requirements at ARL, including hardware, software stack, middleware libraries, and applications codes;
5. Movement of relevant high-performance applications to multicore embedded high-performance computers, reflecting a transition from processing in machine rooms to computing on the battlefield;

6. A method for verification and validation; and

7. Strategic planning issues, including building core competencies, developing team structures, and seeking opportunities for leveraging across applications, domains, and directorates.

System of Systems Analysis

The Survivability and Lethality Analysis Directorate (SLAD) has continued to work on methodologies aimed at assessing the effectiveness of systems of systems (SoS), which has been a continuing recommendation of the Board. However, SLAD’s methodological development has focused increasingly on the System of Systems Survivability Simulation (S4), a fine-grained, event-driven simulation whose development is focused on human decision-making processes. Methodological development focusing on a complementary methodology, the Mission and Means Framework (MMF), has essentially stopped; MMF is an approach to decomposing missions and systems in order to analytically identify links between subsystems and mission performance. The Board continues to recommend strongly that SLAD add a third leg to its platform of SoS methodologies. This third methodology should provide enough fidelity to enable the meaningful study of scenarios in order to identify any major system-level impact of, for example, communications bandwidth; intelligence, surveillance, and reconnaissance; and precision weaponry, without modeling fine-grain entities such as packet-level communications or details of terrain.

Developing this methodology in collaboration with an extramural team other than the one that has been developing the S4 tool would stimulate needed fresh perspectives in SoS analysis.

Applications of Neuroscience to Enhancement of Soldier Performance

The neuroscience group in the Human Research and Engineering Directorate (HRED) has responded well to new opportunities in this important arena. ARL is developing needed collaborations with the relevant research community through its proposed use of the Collaborative Technology Alliance mechanism, joining industry and academic groups. HRED’s organization of the 1-day workshop that took place on May 8, 2008, was an excellent method of informing neuroscientists and cognitive scientists of the Army’s needs and of quickly evaluating various research groups that could respond to a CTA announcement. The HRED staff indicated that the pending CTA announcement would focus on a few areas that seem most promising for basic research and that have clear applications to the Army’s needs. ARL should develop advanced cognitive performance models to form the basis for hypotheses to be tested and to relate various neurological measurements to the prediction of human performance capabilities and mental workload.

Information Fusion

One of the most important new technology processes to emerge over the past few years is information fusion, or knowledge discovery, whereby disparate pieces of data are combined to yield higher-level knowledge, or information, that becomes actionable intelligence when presented in a sufficiently concise form and at the right time. Exciting developments in this area were demonstrated particularly in the Sensors and Electron Devices Directorate (SEDD).
The Board continues to encourage ARL to explore multidirectorate efforts to select some manageable set of problems—from sensing through the processing and presentation of information to the soldier—and to develop reasonably robust solutions for those problems that will help define the overall information fusion landscape and thus more general architectures. The Board continues to recommend crosscutting activities in this area, especially between SEDD and the Computational and Information Sciences Directorate (CISD), and especially with a close tie-in with the Network Science Division, which will form the veins for the tactical data driving such fusion.

Information Security

Information security remains an issue of great concern today in the wired computer network arena, both military and private, and it is of growing concern to the military as it moves to ad hoc networks formed from groups of warfighters. Therefore, the Board has encouraged ARL to develop crosscutting efforts in this area, especially in the establishment of testing facilities and organizations that help identify the specific challenges (both common and unique) faced by the Army and recognize when the best of commercially viable technologies provide at least interim solutions. The proposed new Network Sciences CTA and Army-wide Information Assurance Center of Excellence—both addressed by CISD—seem to be appropriate moves to expand ARL’s capabilities in the area of information security in significant ways. While mobile ad hoc networks (MANETs) are an important challenge, ARL should maintain global and long-term thinking with respect to traditional networks as well, since MANET-like systems will be increasingly integrated with traditional networks.
SLAD supports information assurance testing, determination of compliance with Army regulations and policies, and analysis and identification of critical system and network vulnerabilities that could potentially be exploited by an adversary, as well as development of mitigation strategies for all system

and network vulnerabilities. Such efforts present opportunities to drive issues rather than to react when systems are presented to SLAD for testing and analysis, and the experience gained through testing and analysis of specific systems should be proactively leveraged to develop a methodology for overall network vulnerability assessment and to define specific metrics for evaluating performance in this area.

Ad Hoc Wireless Networks

Ad hoc networks are electronic networks with the following characteristics: the individual nodes attempting to communicate come in and out of contact with one another; the nodes can move dynamically (and thus affect which other nodes they may be in contact with); and they may encounter environmental constraints (e.g., power, bandwidth, real time, security) not present in traditional networks. Such networks, particularly wireless ones, are beginning to permeate many of ARL’s projects, from sensor networks distributed over the battlefield, to dynamic intelligence networks aboard unmanned aerial vehicles, to intra- and intersoldier networks. Regarding the last of these, the extensive planning of studies by HRED on the effects of a variety of information networks on soldier performance is noted. Additionally, successful development of the ARL Blue Radio prototype by SEDD lays an excellent foundation for understanding at a deep level how the physics of radio transmission on the battlefield needs to interact with the flow of required information.2 The Board has encouraged ARL to consider efforts to bring together the disparate groups engaged in these endeavors so that cross-fertilization of approaches, code, and subsystems can engender progress across the board.

A particularly important ARL response in the area of ad hoc wireless networks has been the establishment of the Mobile Network Modeling Institute.
The institute has a charter to work with external and internal organizations on end-to-end models of MANETs for tactical purposes before they are developed and to allow those models to guide both development and deployment activities. This institute is clearly appropriate, with the potential to develop large-scale networked radio codes strongly matched to emerging Army needs. However, unless this work is supported by a strong experimental component to validate and verify the models, there is a risk of its falling short of ambitious goals.

System Prototyping and Model Verification and Validation

A continuing challenge for ARL is to ensure that appropriate verification and validation activities—validating that the models developed during research programs actually reflect reality, and verifying that the codes or systems constructed to match those models are in fact correct implementations of them—are applied to projects whose results rely heavily on models. ARL should continue to explore carefully opportunities to exploit its high-performance computing and modeling resources for applications such as hardware prototyping, predictive performance modeling of systems, and verification and validation of multiscale analysis and forecast models, for use in areas such as battlefield weather and the HRED Improved Performance Research Integration Tool (IMPRINT) modeling of soldier performance in advanced hardware systems. Continued progress in this area should reduce significantly the costs of system hardware and software development and testing. The Board also continues to encourage ARL to consider ways of capturing the results of the many field tests that it performs every year relative to such phenomena so that these results can be searched later for answers to questions not yet asked today.
2 Blue Radio is a small, wireless network interface card that was designed by ARL as a demonstration platform for implementing sensor networks, particularly ones that will be placed randomly on the ground and thus must rely heavily on surface wave propagation rather than on free-space propagation.

LINKAGE BETWEEN ARMY RESEARCH LABORATORY AND ARMY RESEARCH OFFICE

The Board is not charged to review the work funded by the Army Research Office, which is an organizational entity within ARL. ARO is a significant basic research asset, accounting for a substantial fraction of the total ARL basic research (6.1) budget. Considering the important role that basic research has had in the development of Army-relevant technologies and the similar high-payoff role that it could have in the future, the Board requested an opportunity to learn how the work portfolio of ARO is integrated into the activities normally reviewed by the Board. In response, ARL and ARO presented to each panel summaries of those 6.1 programs that ARO sponsors which are relevant to the ARL work reviewed by the given panel. The level of ARO collaboration varies across the directorates; in general, ARO demonstrated increasing attention to such collaboration, and the Board looks forward to continuing improvements in ARO’s cognizance and support of the missions of the directorates.