Executive Summary

The structure of the United States and global economies has changed during the last two decades in at least three major ways. First, what used to be as simple as tracking domestic research and development (R&D) spending by a small number of large U.S. manufacturers has now blossomed into the need to monitor science, technology, and innovation (STI) activities across the globe and across a wide range of sectors, beyond manufacturing.

Second, the type of information available to track innovation, R&D, and even the science, technology, engineering, and mathematics (STEM) workforce has changed as well. Historically, statistical agencies have relied on probability surveys to collect consistent and unbiased information. In recent years, however, the amount of raw data that is easily available online has soared, opening up possibilities for new STI indicators. Microdata from administrative records and other sources have been increasingly used to produce measures of capacities and trends in the global STI system. Also, frontier methods are emerging for monitoring the number of new product introductions through sophisticated web-scraping algorithms or tracing networks of scientists engaged in research. These data sources, although promising, may have uncertain biases and other deficiencies.
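As a toy illustration of the web-harvesting idea (not any method NCSES or others actually use), a scraper for new product introductions might scan press-release headlines for launch announcements. The page content and matching pattern below are fabricated for illustration, using only the Python standard library:

```python
from html.parser import HTMLParser
import re

# Fabricated sample page: stand-in for scraped press-release listings.
SAMPLE_PAGE = """
<html><body>
  <h2 class="headline">Acme launches new widget line</h2>
  <h2 class="headline">Quarterly earnings call scheduled</h2>
  <h2 class="headline">Beta Corp introduces new sensor platform</h2>
</body></html>
"""

# Illustrative pattern: a launch/introduce verb followed by "new".
LAUNCH_PATTERN = re.compile(r"\b(launch(es|ed)?|introduc(es|ed))\b.*\bnew\b", re.I)

class HeadlineCounter(HTMLParser):
    """Count headlines that look like new-product announcements."""
    def __init__(self):
        super().__init__()
        self._in_headline = False
        self.product_announcements = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h2" and ("class", "headline") in attrs:
            self._in_headline = True

    def handle_endtag(self, tag):
        if tag == "h2":
            self._in_headline = False

    def handle_data(self, data):
        if self._in_headline and LAUNCH_PATTERN.search(data):
            self.product_announcements += 1

parser = HeadlineCounter()
parser.feed(SAMPLE_PAGE)
print(parser.product_announcements)  # → 2
```

A production indicator would of course need a sampling frame of sources, deduplication, and validation against survey benchmarks; the sketch only shows why such counts are cheap to produce yet carry the uncertain biases noted above.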

Third, the statistical mission of the National Center for Science and Engineering Statistics has also changed recently, expanding to include the condition and progress of U.S. STEM education and the broader question of U.S. competitiveness in science, technology, and R&D.

The combination of these three factors raises questions about whether the agency's statistical activities are properly focused to produce the information that policy makers, researchers, and businesses need. The questions become especially acute given the downturn in the U.S. economy and the importance of innovation in producing new job opportunities.

To answer these questions, the panel was charged to conduct a study of the status of the science, technology, and innovation indicators that are currently developed and published by the National Science Foundation’s National Center for Science and Engineering Statistics (NCSES). In carrying out its charge, the panel undertook a broad and comprehensive review of STI indicators from different countries, including Japan, China, India, and several countries in Europe, Latin America, and Africa. We also closely examined alternative methodologies for collecting relevant data. Our goal was not to come to any particular conclusion, but to keep an open mind to possibilities for improving and revamping the NCSES suite of statistical activities.

FINDINGS

Our first finding is that the depth and breadth of STI indicators across the globe is truly remarkable. Many countries are putting a high priority on collecting information on innovation and related activities, and they are gathering high-quality data.

Second, no country seems to have “cracked the code” in terms of a clearly superior set of STI indicators. Everyone still seems to be figuring out the right questions to ask. For example, when it comes to R&D, does it matter where R&D is done? Where the R&D is used? Or where the resulting intellectual property is located legally? Obviously, it would be great to have information on all three, but no one really knows which of these factors is the most important.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




Third, the panel did not find any little-known, proven STI indicators and methodologies used by other countries that could be easily and inexpensively adopted by NCSES. New technologies for data collection are very promising, but none of them is ready for implementation at a federal statistical agency. This does not mean that NCSES’s STI indicators cannot be improved. Indeed, this report contains several recommendations, with more to follow in the final report, that propose ways and means for NCSES to improve its STI indicators program.

PLANS FOR FURTHER WORK AND FINAL REPORT

The panel’s final report will offer a comprehensive set of recommendations (including those in this interim report) with priority rankings and implementation strategies, as well as a roadmap for how the recommendations relate to one another. Those recommendations are likely to require longer lead times for data and tool development, as well as coordination with specific divisions of other statistical agencies in the United States and abroad, than those included in this interim report. We will address the net value added of proposed indicators, and we expect to specify which data and indicators can be eliminated by NCSES. Criteria for prioritization will include policy utility and greater comparability of U.S. STI indicators with those published by foreign organizations.

To develop those recommendations, the panel will carry out a wide range of work, including:

• gap analyses of current STI indicators;
• performance tests of key STI indicators;
• ways to improve measures of innovation, technological diffusion, and other key elements in understanding innovation;
• new data developments at the U.S. Patent and Trademark Office;
• the use of microdata;
• the possibilities of developing subnational indicators;
• data linking;
• the role of institutions and regulations; and
• NCSES’s potential role in coordination of federal STI statistics.
RECOMMENDATIONS

This interim report recommends near-term action by NCSES along two dimensions: (1) development of new policy-relevant indicators that are based on NCSES survey data or on data collections at other statistical agencies; and (2) exploration of new data extraction and management tools for generating statistics, using automated methods of harvesting unstructured or scientometric data and data derived from administrative records. Our six near-term recommendations are in descending priority order. The first five are about new and revised indicators; the sixth concerns new processes and techniques.

RECOMMENDATION 1: The National Center for Science and Engineering Statistics should explore methods of using existing longitudinal data on labor force mobility related to science, technology, and innovation activities in the United States and abroad. This work should include gap analyses and workshops with statistical agencies to determine how to achieve efficient management of datasets and statistics for human capital indicators. The agency should also use its own data resources, especially the Business Research and Development and Innovation Survey, for new employment measures.

RECOMMENDATION 2: The National Center for Science and Engineering Statistics should develop new indicators on innovation, based on data from its Business Research and Development and Innovation Survey (BRDIS). The agency should develop comparative statistics for its BRDIS data using the same cutoffs used by countries in the Organisation for Economic Co-operation and Development.

RECOMMENDATION 3: The National Center for Science and Engineering Statistics should begin to match its Business Research and Development and Innovation Survey data to data from ongoing surveys at the U.S. Census Bureau and the Bureau of Labor Statistics to create indicators of firm dynamism. This is a necessary first step for developing data linkages that yield measures of activities by high-growth firms, and of births and deaths of businesses linked to innovation outputs. These measures should be established by geographic and industry sectors and by business size and business age. Such measures would be an important step in furthering international comparability on innovation indicators. NCSES should conduct its own sensitivity analysis to fine-tune meaningful age categories of high-growth firms.

RECOMMENDATION 4: The National Center for Science and Engineering Statistics should more fully use data from its Business Research and Development and Innovation Survey to provide indicators on payments and receipts for R&D services between the United States and other countries.

RECOMMENDATION 5: The National Center for Science and Engineering Statistics should host working groups in the near future to further develop indicators on subnational science, technology, and innovation activities. Participants in the working groups should be both users and providers of the data. A main focus of the discussion should be on data reliability, particularly at fine geographical scales.
Potential indicators should include subnational research and development statistics, and subnational science, technology, engineering, and mathematics workforce statistics.

RECOMMENDATION 6: The National Center for Science and Engineering Statistics should fund exploratory activities on frontier data extraction and development methods. These activities should include

• research funding or prize competitions to harness the computing power of data specialists with a view to (a) analyzing existing public databases to develop better indicators of science, technology, and innovation activities and (b) analyzing the huge and growing amount of information on the Internet for similar purposes;
• pilot programs or experiments to produce a subset of indicators using web tools; and
• convening a workshop of experts on multimodal data development, to explore the new territory of developing metrics and indicators from surveys, administrative records, and scientometric sources.
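The kind of record linkage called for in Recommendation 3 can be sketched with a toy example. All firm identifiers, records, and age cutoffs below are fabricated for illustration; actual linkage of BRDIS to Census Bureau or BLS business registers would involve confidential microdata, probabilistic matching, and disclosure review:

```python
# Toy BRDIS-style records: firm_id -> reported an innovation (yes/no).
brdis = {
    "F001": True,
    "F002": False,
    "F003": True,
}

# Toy business-register records: firm_id -> (founding_year, employment).
register = {
    "F001": (2019, 40),
    "F002": (1995, 1200),
    "F003": (2021, 15),
}

def age_class(founding_year, reference_year=2023):
    """Bucket firms into coarse age categories (cutoffs are illustrative;
    Recommendation 3 asks NCSES to fine-tune these via sensitivity analysis)."""
    age = reference_year - founding_year
    if age <= 5:
        return "young (0-5 yrs)"
    return "mature (6+ yrs)"

# Inner join on the shared firm identifier, then count innovators per age class.
counts = {}
for firm_id, innovated in brdis.items():
    if firm_id in register and innovated:
        founding_year, _employment = register[firm_id]
        bucket = age_class(founding_year)
        counts[bucket] = counts.get(bucket, 0) + 1

print(counts)  # → {'young (0-5 yrs)': 2}
```

The same join key would support the other tabulations the recommendation names, such as counts by geography, industry, or business size.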