
5
Achieving the Objectives of NASA’s Applied Sciences Program

This chapter summarizes the committee’s findings with respect to three questions. (1) What are users’ expectations about the ability of the data and tools provided through NASA’s Applied Sciences Program (ASP) to meet their needs? (2) How appropriate is ASP’s Integrated Systems Solution Architecture (ISSA) for describing the transition of NASA products to realized societal benefits? (3) Does ASP’s strategic planning process need other mechanisms for coordinating and tracking results? These questions tie together ideas put forth in previous chapters.

USER EXPECTATIONS

What are users’ expectations about the ability of the data and tools provided through NASA’s ASP to meet their needs? While the committee did not assess user needs, it came away with general observations on users’ expectations (Chapters 3 and 4, Appendix B). Users emphasized data needs (including a mechanism to specify new sensor types that would provide useful data) over the need for new modeling or data analysis tools, and very little discussion was devoted to any need for new models. Users tend to employ NASA data in their own models to aid in decision support rather than using NASA models. Users appeared to need (1) better access to NASA data to replace existing field data; (2) improved monitoring and prediction capabilities for management, planning, and regulatory compliance; and (3) a mechanism to provide continuity in sensing technologies. Users generally felt no mechanism existed by which to convey their needs on a routine basis.


The committee’s findings are summarized for the following areas:

  • data continuity;

  • geospatial and temporal resolution;

  • data quality;

  • format interoperability;

  • unused data;

  • data on physical characteristics of the environment;

  • data latency;

  • the Applications Implementation Working Group website; and

  • benchmark reports.

All findings in this section point to the general need for enhanced user feedback mechanisms and processes for considering user needs.

Data Continuity

The user community is particularly concerned about data continuity. Older satellite systems and instruments, such as Landsat, offer familiarity to users. Landsat has provided continuous global coverage since 1972 and AVHRR since 1979. While the Land Remote Sensing Policy Act of 1992 identifies a privately funded and managed system as the preferred option for a successor to Landsat, commercial data providers indicate that there is an insufficient market to justify the private investment required to fly a commercial Landsat-like system. Users are reluctant to build practical applications on NASA remote sensing data streams that may not exist in the near future or that are considered experimental. Operationally, it is difficult to substitute other data for Landsat data and obtain equivalent results. Because the licensing provisions for commercial companies restrict sharing of information in its original form with other users, commercial vendors are reluctant to build software that accommodates new instruments considered experimental. The issue of continuity transcends the Landsat and AVHRR datasets in that commercial applications depend on reliable data delivery over the long term. If a dataset is not available at a crucial time or is not scheduled for operational delivery, the cost of developing products from these research-level data is not justified economically.


Geospatial and Temporal Resolution

Numerous user groups indicate that they need data with a spatial resolution better than that offered by Landsat-7 ETM+ (NASA’s highest-resolution platform) to conduct their day-to-day local and regional work (e.g., Barnard, 2006; NSGIC, 2006; Walthall, 2006; Worthy, 2006). Many users said that to support their day-to-day management and regulatory decisions they need data with spatial resolution finer than 10 meters. However, NASA appears to have decided not to develop measurement capabilities at resolutions finer than 10 meters, stating that the commercial sector should provide data at this scale. Review of this practice may be necessary if government, in particular, is to be more cost-effective in its data collection efforts and avoid overlap. Where NASA’s role in serving the national need for higher-resolution data has been restricted by reluctance to launch the necessary sensors, other federal agencies have partnered with the National States Geographic Information Council to cover this critical need (Box 5.1). This agency coordination, however, is not a solution for providing the frequent, synoptic national or global coverage required for many national needs.

BOX 5.1

Imagery for the Nation

The Imagery for the Nation Project is an example of coordination by federal, state, and local government agencies to meet a need for high-resolution digital imagery for a wide range of applications, including natural resource management, agriculture, land use planning, and homeland security. The National States Geographic Information Council (NSGIC) developed the concept for Imagery for the Nation in 2004 and proposed it to the Federal Geographic Data Committee (FGDC) and National Orthophoto Program Committee (NOPC) in 2005. The project will collect and disseminate standardized nationwide aerial color imagery products at 1-meter, 1-foot, and 6-inch spatial resolutions with repeat imagery every one to five years depending on location, population density, and image resolution. The imagery acquired through this project will remain in the public domain and will be archived to ensure its availability for posterity.

No federal funds have been committed to this project, but funds have been requested for FY 2009. The projected cost is $111 million per year for the first three years, with an anticipated cost saving of 25 percent over current, less coordinated data purchases by individual federal, state, and local agencies. Both the U.S. Department of Agriculture and the U.S. Geological Survey (USGS) support the concept; if implemented, USGS would manage the program.

SOURCE: http://www.nsgic.org/hottopics/iftn/imagery_forthe_nation.pdf and http://www.nsgic.org/hottopics/iftn/briefing_document.pdf.


Data Quality

NASA data quality was considered excellent by users with whom the committee interacted. In addition, users felt that validation and verification of accuracy and geographic and temporal adjustment were adequate. ASP is concerned about the loss of scientific integrity of the data and models when a partnering agency assumes responsibility for continued operations without continuing support from ASP. Partnering agencies are concerned that NASA does not provide products of appropriate resolution or format to satisfy their needs and apparently has no mechanism to address these requirements. The National Oceanic and Atmospheric Administration (NOAA) stated, for example, that many NASA global products are reprocessed internally by NOAA to meet the spatial, temporal, and format requirements of the National Ocean Service. In some cases NOAA reformats NASA global products and uses them at their existing spatial and temporal scales, even though these are not optimal for NOAA’s needs. There appears to be no means for ASP to ensure the integrity of products once responsibility is assumed by a partnering agency, despite NOAA’s long history of cooperation with NASA. Concerns over data integrity could be alleviated, in part, if NASA were to provide data at a resolution and in a format needed by end users. Developing feedback mechanisms as outlined in this report would also give NASA and ASP greater confidence that data integrity is maintained after the data have been adopted and incorporated into partners’ operations.

Format Interoperability

Progress in providing data in formats that enhance interoperability is slower than many users would prefer. The large size of NASA’s data holdings exacerbates the formatting problem. Achieving a balance between maintaining these large global data products and meeting the needs of local users is challenging and has not been resolved. NASA’s data products can be accessed through eight government Distributed Active Archive Centers (DAACs). The DAACs process, archive, document, and distribute data from Earth Observing System (EOS) satellites and measurement programs. The DAACs tend to distribute products in hierarchical data format (HDF), while the modeling and GIS communities tend to use other formats. Converting the files can be tedious and time-consuming. Many users indicated that they would like to have NASA data in a format that enhances interoperability with their other non-NASA information products and software.
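To make the conversion burden concrete, the sketch below shows one common way a user might extract a single science dataset from an HDF-EOS granule and write it out as a GeoTIFF that standard GIS software can read directly. This is a minimal, hypothetical example using the open-source GDAL library rather than any tool provided by ASP; the granule name and output file name are placeholders.

    # Hypothetical sketch (Python with GDAL bindings): convert one subdataset
    # of an HDF-EOS granule to GeoTIFF. File names are placeholders.
    from osgeo import gdal

    gdal.UseExceptions()

    # HDF-EOS granules expose their contents as named subdatasets.
    granule = gdal.Open("MOD13Q1_example_granule.hdf")
    subdatasets = granule.GetSubDatasets()  # list of (name, description) pairs
    for name, description in subdatasets:
        print(description)  # inspect what the granule contains

    # Translate the first subdataset into a GeoTIFF. In practice the user must
    # know which subdataset holds the variable of interest (e.g., NDVI).
    gdal.Translate("example_output.tif", subdatasets[0][0], format="GTiff")

Even with a utility like this, users must still identify the correct subdataset, apply scale factors and fill values, and often reproject from the sensor’s native grid, which is part of why many described format conversion as tedious and time-consuming.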


Unused Data

Large quantities of potentially useful data may go unanalyzed. Causes of this lack of analysis may be related to funding, to the fact that the applications community may not yet have learned how to use the data, or to concerns on the part of the applications community about lack of continuity. In other cases the operational agency’s (or other users’) requirements may be at odds with NASA’s research priorities, or limitations on the spatial resolution of NASA instruments may preclude data use. Of the data produced by NASA’s 17 Earth observing satellites, only those from Landsat and the Moderate Resolution Imaging Spectroradiometer (MODIS) are in high demand by the user community. In a recent round of proposals received by ASP, 80 percent proposed using MODIS data (Birk, 2006).

Data on Physical Characteristics of the Environment

Among NASA sensors the committee found a range of measurement capabilities for determining physical characteristics of the environment. Remote sensing of land surface and atmospheric characteristics is the most advanced (for example, vegetation condition, fire fuels, pollutant concentrations, weather forecasts). Less advanced are capabilities for measuring coastal and subsurface characteristics, for example, water quantity and quality, levee integrity, and fisheries (Roffer, 2006). Ample opportunity seems to exist for development of new sensors and for applications research to respond to these and other potential markets.

Data Latency

Time lags between data acquisition, processing into products, and delivery to users were cited as a problem by some users, particularly in the commercial weather forecasting community. It was unclear what efforts were being considered to speed the transfer of data to ground-based processing centers. Responsibility for rectifying this situation does not rest solely with NASA but may also involve other partners such as NOAA.

Applications Implementation Working Group Website

ASP managers stated that the main way users learn about NASA data and tools is by visiting the Applications Implementation Working Group (AIWG) website. However, committee members and other users found the site confusing and needed NASA staff assistance to find the data they needed. ASP, through the Geosciences Interoperability Office, is developing the Earth Science Gateway, which is intended to enable better access to and use of Earth observations and model products; it is currently in the beta test phase. Further development of this approach may allow data and visualization tools to be made available through open-standard protocols.

Benchmark Reports

The NASA Research, Education, and Application Solutions Network (REASoN) and Research Opportunities in Space and Earth Sciences (ROSES) research programs funded by ASP are starting to produce benchmark reports. These reports are reviewed by ASP personnel but receive little outside peer review. To date, NASA and ASP have not conducted a transparent review of how the few existing benchmark reports have fared in terms of their effectiveness in engaging the broader community. After more benchmark reports are released, it would be useful to obtain feedback from the beneficiaries of the NASA products.

NASA’S INTEGRATED SYSTEMS SOLUTION ARCHITECTURE

How appropriate is the ISSA (Figure 2.2) for describing the transitioning of NASA products to realized societal benefits? This architecture characterizes an integrated system that connects basic scientific observations through a number of intermediary analytic steps to outputs directly relevant to decision makers. In this framework, Earth observations are inputs to models that simulate the dynamic processes of Earth. These models produce outputs, such as predictions and forecasts, that inform decision-support tools: typically computer models that assess events (e.g., forest fires, hurricanes), relationships among environmental conditions and other scientific metrics (e.g., epidemiological data), or resource availability. Outcomes are decisions about policy or management issues such as food supply or natural disasters.

Many of the outputs are not being used by the intended users. These models are typically developed by software engineers based on what they think the end user wants. An opportunity exists for ASP to bring together software engineers and end users to ensure that what NASA is developing assists the users in making their decisions. Two-way dialogue ought to be incorporated in this process (see Boxes 5.2, 5.3, 5.4).


BOX 5.2

Avoiding the Risk of One-time Experiments

The Decadal Study (NRC, 2007a, p. 141) stated that “unless there is sustained institutional support for interactions between the scientific producers and the applications users of information, there is a risk that even successful examples of Earth science applications become ‘one-off’ experiments that are not repeated over time. Of the examples identified in the previous section [of NRC, 2007a], for example, only those involving weather forecasting had institutional mechanisms designed specifically to foster such two-way interactions. In the other cases, the two-way interactions occurred early in program development through the activities of principal investigators, but there is no clear institutional mechanism to ensure that improvements in observations, methods, or changing needs can propagate through the systems. In sum, to be successful, the use of Earth science data for applications of benefit to society will require research as well as data. Such research will improve our understanding of successful transitions from research data to societal applications, processes of information adoption and use outside the scientific community, and decision-making under uncertainty. It will also require sustained communications with potential users of scientific information.”

Program managers can be particularly effective in connecting the main functions of the systems architecture and providing iterative dialogue. In this way, ASP managers serve an important function in understanding and documenting whether all the pieces in the chain leading from inputs to societal benefits exist, whether they are adequately connected, how various organizations at the federal, state, and local levels fit into that system, and who will manage the flow and how. ASP indicated early in this study that its official responsibility was for the left-hand side of Figure 2.2 and that it had little official influence over what occurred on the right-hand side (once products had been transferred to its partners). ASP’s ISSA, in its present form, is not adequate to effectively move products from research to applications that benefit end users. Much of the approach appeared too linear and unidirectional, with basic research converted into societal benefits without user feedback. Making the transition from research to societal benefits requires explicit identification of which outcomes and impacts are needed so that activities and resources can be targeted accordingly, a function that the feedback loop will enhance.


BOX 5.3

Forging Stronger Connections to User Requirements

ASP’s process may ideally be conceived as including the feedback loop in the research-to-operations transition that bridges the "valley of death" (NRC, 2003) by providing clear channels for agencies to transmit requirements to NASA. USDA sought but could not find such channels. In addition, there are no clear links between ASP and some of the major observing programs (e.g., the Ocean Research Interactive Observatory Networks [ORION], the National Ecological Observatory Network [NEON], and the Integrated Ocean Observing System [IOOS]). There is no systematic mechanism for NASA to assess or learn of user needs, including those of the federal agencies.

The following examples illustrate opportunities and models that ASP could follow to forge stronger connections to users and their requirements.

  1. There are operational requirements laid out in the various presidential directives issued over the past five years (e.g., the U.S. Ocean Action Plan, Climate Change, Exploration) that can be mined. For example, the U.S. Office of Science and Technology Policy/National Science and Technology Council has initiated a long-term process to define an Ocean Research Priorities Plan.

  2. Formal structures have been set up to link management and operations with research and education (e.g., the Subcommittee on Integrated Management of Ocean Resources [SIMOR] of the White House’s Committee on Ocean Policy). ASP could represent NASA in this structure.

  3. The infrastructure and regional contacts developed over the years by the Space Grant Consortium and the Geospatial Extension Specialist Program (see also Chapter 4) could be used to implement an outreach and extension program similar to that of NOAA’s Sea Grant Program and Climate Science Applications Extension Program (see http://cals.arizona.edu/climate/). These types of grant programs could be linked as additional elements of the research-to-operations feedback mechanism serving NASA, NOAA, and other entities.

  4. The longtime partnership between the National Weather Service (NWS) and the private sector, which results in both general and tailored weather forecast and warning products that are widely acknowledged as valuable, is a good model upon which to build the user feedback loop. NWS and commercial meteorological products have applications ranging from scientific research to human safety, transportation, agriculture, and daily forecasts.

  5. New findings and research results from NASA’s basic Earth science research (core funding) are potential candidates for applied remote sensing science. There are currently no formal links between basic Earth science research and ASP through which results obtained during the basic remote sensing science initiatives are communicated to ASP.


BOX 5.4

Developing and Implementing Applications at the U.S. Naval Meteorology and Oceanography Center

The U.S. Naval Meteorology and Oceanography Center (METOC) has established an effective approach to developing and implementing applications. The model used by METOC involves two major steps.

  1. A rigorous process to identify requirements that includes

    • Formal engagement of all stakeholders to obtain input;

    • A process for feedback and product/application review; and

    • Metrics.

  2. A production system organized by capabilities that includes

    • Data acquisition;

    • Aggregation of similar data types into specific products;

    • Fusion of dissimilar but complementary data, which can be “overlaid” onto a four-dimensional model (space/time) after georeferencing;

    • An efficient distribution system that delivers the right product to the right customer at the right time, regardless of location; and

    • A customer service program (help, technical documents, literature, discussion forums).

STRATEGIC PLANNING

Does the strategic planning process need other mechanisms for coordinating and tracking results? ASP’s strategic planning process does not include explicit identification of which outcomes and impacts it wants to achieve with which audiences, making effective targeting of activities and resources more difficult. The strategic plan of ASP is a broad document with many goals, not all of which are clear or accompanied by practical steps for implementation.

The committee found that metrics are not currently being collected by ASP to gauge progress toward achieving its goals. Useful metrics for this purpose would include those that measure the outcomes and impacts actually achieved by NASA products, as reported by NASA partners. One of the more difficult problems for ASP would be setting priorities among the 12 application areas in the absence of an overarching national strategy on environmental issues.

Metrics need to assess the process as well as progress in the transition of research to practical applications and demonstration of societal benefit. “Process” refers to the level of planning, type of leadership, availability of resources, and accessibility of information. Metrics used by ASP are currently focused almost exclusively on quantitative measures of website usage and very little on adoption of benchmark reports (Chapter 2).

Performance measures are currently lacking to demonstrate that ASP’s decision-support tools are helping decision makers make better choices. Incentive structures enacted by the Government Performance and Results Act (GPRA) of 1993 and related policies put into place by the Office of Management and Budget require federal agencies to set strategic goals and to measure program performance against those goals. These policies tend to emphasize, rather than deter, disconnections between outputs and outcomes. Program Assessment Rating Tool (PART) evaluations apply only to individual federal agencies. The Act does not apply to multiagency programs, such as the Climate Change Science Program (CCSP), to which NASA outputs contribute. This relinquishing of control over achievement of goals in multiagency collaborative efforts is a significant deterrent to collaboration. The NRC report Thinking Strategically addresses this explicitly (NRC, 2005b).

Some of ASP’s output contributes to interagency activities such as the CCSP. A study of the appropriate use of metrics for the CCSP (NRC, 2005b) recommended a general set of metrics that this committee believes would be useful for ASP to consider as it prioritizes the outcomes and impacts it most wants to achieve:

  • Process metrics (measure a course of action taken to achieve a goal);

  • Input metrics (measure tangible quantities put into a process to achieve a goal);

  • Output metrics (measure the products and services delivered);

  • Outcome metrics (measure results that stem from use of the outputs and influence stakeholders outside the program); and

  • Impact metrics (measure the long-term societal, economic, or environmental consequences of an outcome).

These performance measures include both qualitative measures (e.g., productivity, research quality, relevance of research to the agency’s mission, leadership) and quantitative measures.

Methods for gaining feedback from academics and the user community come almost exclusively from the external peer review of ASP proposals. Review of completed projects is conducted exclusively internally, and the present system has no end-user review of projects. The committee looked for other methods through which ASP might have gained information and feedback but found little evidence that any had been employed. These other methods include:

  • Workshops for potential users to disseminate the lessons learned from their projects.

  • An advisory committee with representatives from multiple government levels and the private sector to monitor the program. Such a committee could consult with the interagency committees and industry representatives and organizations.

  • Competitive solicitations that take a more user-inspired basic research approach as defined together with the user community.

  • Collaborative, user-driven dialogue to identify what information users need to address a problem, what NASA can offer, and how the needs and possible responses to those needs might converge. Users may leave with a different understanding of what they need, and NASA may come to a different understanding of what to offer or how products could be offered.

Such methods would lead to more relevant data streams and research results that would be more likely to be used. In addition, these methods would improve perceptions of the legitimacy of the process and the credibility of the knowledge it produces, given the increased transparency and openness that user engagement requires (NRC, 2006).

The science produced by NASA will be more useful and effective if its products are designed so that they are perceived by multiple stakeholders to be credible, salient, and legitimate. The process for gaining feedback needs to be perceived not only as scientifically credible but also as salient to users’ concerns and generated through legitimate means. These properties increase the likelihood of ultimate effectiveness. Because different stakeholders will have different standards, a central challenge is getting multiple key actors all to see that a product meets their individual salience, credibility, and legitimacy criteria (NRC, 2007c; Mitchell et al., 2006; Cash et al., 2003).

SUMMARY

  1. NASA operates on the premise that Earth observation data and tools are intended for scientific research. However, consideration of how additional benefits might accrue from Earth science data products is critical in informing public understanding of investment in these services.

  2. For applications, the consistency of the information stream can outweigh the gain in information provided by a new research development. Thus, advances in technology for applications or models will need to be significant improvements to overcome the cost of making the transition to a new product. If information delivery is erratic or not available when required, the advanced product will lose its value and will not be incorporated into the application area.

  3. The process that connects the development of model results and other decision-support tools to their end users is what matters if expected benefits are to be realized. Considerably more effort is needed to improve the understanding of all parties involved in the process, including communication between the research community and the ultimate users of the information. A revised strategy that incorporates a feedback mechanism and engages regional, local, and private users as well as other federal agencies is needed if ASP is to better realize the societal benefits resulting from its data and tools. A fundamental issue is the transfer from a research asset to an operational one that provides long-term, consistent information useful to a user community. The current structure does not guarantee the transition from research to operations; it assumes that once research results have been demonstrated as useful in a benchmark report, a plan will be developed by someone else to transfer the capacity to an operational configuration that serves users on a routine basis.

  4. A strong need exists for ASP to reach out to users to get a complete understanding of the requirements process. Currently, numerous agencies at multiple levels of government are collecting similar data at different spatial and temporal resolutions while others process the data internally in duplicative fashion, and not always in a validated, scientific manner. Such an approach is neither efficient nor cost-effective. Better access to data to replace existing field data collection methods and to improve monitoring and prediction capabilities for management, planning, and regulatory compliance is also needed. The committee heard little about the need by users for different tools, particularly models, provided by NASA. Furthermore, no formal mechanism exists for agencies or other users to provide specifications or needs to NASA to develop new sensing technologies.

  5. ASP’s strategic planning process does not incorporate explicit identification of what outcomes and impacts it most wants to achieve, with which audiences, making efficient prioritization of its activities and resources difficult.

  6. No clear links appear to exist between ASP and some of the major observing programs, for example, the Ocean Research Interactive Observatory Networks (ORION), the National Ecological Observatory Network (NEON), and the Integrated Ocean Observing System (IOOS).

  7. Program managers can be particularly effective in connecting the elements of the Earth science framework, including by providing iterative dialogue. To do so, it is important for ASP managers to understand and document whether all the links in the chain from inputs to impacts exist, whether they are adequately connected, how various organizations at the federal, state, and local levels fit into that system, and who will manage the flow and how.

  8. ASP’s strategic planning process does not include explicit identification of what outcomes and impacts it most wants to achieve with which audiences, making efficient prioritization of its activities and resources difficult. Setting priorities among the 12 application areas in the absence of an overarching national strategy on environmental issues or a formal mechanism for collaboration across agencies would be a critical problem.


