Suggested Citation:"6 Program Management." National Research Council. 2009. An Assessment of the SBIR Program at the Department of Defense. Washington, DC: The National Academies Press. doi: 10.17226/11963.

6
Program Management

6.1
INTRODUCTION

Management of the DoD SBIR program is characterized by two central elements: (1) the tremendous diversity of objectives, management structures, and approaches among the different services and agencies that fund SBIR programs at DoD; and (2) the consistent pursuit of improvements that enable the program to better meet its objectives.

The review that follows is focused on describing the mainstream of DoD practice, and where relevant, divergences from it among the agencies and components. It concentrates on describing current practices and recent reforms.

These reforms also affect the way in which assessment must be made. The significant lags between award date and commercialization mean that comprehensive outcomes are only now available for awards made in the mid to late 1990s. However, management practices have changed, often significantly, since the time of those awards.1 Hence it is not methodologically possible to build a one-to-one relationship between outcomes and management practices.

1

For example, a number of major internal changes followed the 1995 Process Action Team (PAT) review. These led to a reduction in the lag between receipt of proposals and award announcement from 6.5 months to 4 months for Phase I, and from 11.5 months to 6 months for Phase II. The Fast Track Program was also established, which both accelerates the decision-making process and increases the level of funding for Phase II projects which obtain matching funds from third-party investors. DoD also required all SBIR Phase II proposals to define a specific strategy for moving their technology rapidly into commercial use.

FIGURE 6-1 SBIR timeline at DoD.

SOURCE: Michael Caccuitto, DoD SBIR/STTR Program Administrator and Carol Van Wyk, DoD CPP Coordinator. Presentation to SBTC SBIR in Rapid Transition Conference, Washington, DC, September 27, 2006.

The latter must therefore be assessed primarily through interviews, focused on current practice, with awardees, agency staff, and other stakeholders.2

Finally, it is worth noting that DoD processes are quite complex—unsurprising, given the high volume of proposals and awards, and the wide variety of Service and Agency objectives. However, it is possible to provide an overview of core activities, as seen in Figure 6-1. Each phase of the SBIR program will be reviewed in turn. Figure 6-1 shows the significant pre-solicitation activities focused around topic development, some of the funding initiatives in place (Fast Track and Phase II Enhancements), and the potential role of Phase III which, as we shall see, should be part of very early activities within the SBIR framework.

2

The continuing, at times incremental, nature of these changes, set against the longer-term, often circuitous processes of firm growth and commercialization of SBIR awards, complicates efforts to relate program management techniques to performance outcomes. Thus, results measured for awards made ten years ago may not adequately describe how well a service or agency is managing its SBIR program today.


6.2
TOPIC GENERATION AND PROCEDURES

Management of the DoD SBIR program has been largely decentralized to individual services and agencies. The exception is that the Office of the Deputy Director of Research and Engineering (DDR&E) uses the topic review process to exert centralized control over the definition of the SBIR topics included in official solicitations.

Informal DoD topic review under the lead of the DDR&E began in 1996, following the recommendations of the 1995 PAT review. A formalized process for topic review began in 1997. It was designed to promote the closer alignment of service and agency R&D with overall DoD R&D priorities, to avoid duplication, and to maintain the desired degree of specialization in the R&D activities of the respective services and agencies.

Ultimate decision authority on the inclusion of topics in a solicitation lies with the Integrated Review Team, which contains representatives from each of the awarding components. Topics are reviewed initially at DDR&E and then returned to the agencies for correction of minor flaws, for revision and resubmission, or as discards.

This review process is not necessarily popular with topic authors or program managers, as it limits their authority. Some senior managers have stated that they believe the DDR&E offices are not close enough to the programs to make these kinds of decisions effectively. The process also reduces responsiveness to changed circumstances (such as 9/11), as it increases the time required for approval of possible topics. However, the review process has, according to DoD staff, improved the clarity of topic descriptions and forced the technical points of contact (TPOCs) to address how possible applicants might transition into Phase III.3 DoD has tried to find ways of mitigating the negative effects of review, for example by providing means to reduce delays for “hot” topics, such as the Navy quick-response topics awarded in the aftermath of 9/11.

FIGURE 6-2 Topic review process.

SOURCE: Developed from interviews with DoD staff.

BOX 6-1

Acquisition Liaisons

To further foster coordination between its R&D and acquisition programs, DoD mandates that each major acquisition program designate as SBIR liaison an individual who is:

  • knowledgeable about the technology needs of the acquisition program, and

  • responsible for technology infusion into the program.

These liaisons interface with the SBIR program managers within DoD and with the SBIR contractor community. Their role is to integrate SBIR technologies into their acquisition programs.

Contact information for the liaisons is listed on a DoD SBIR Web site so that both DoD laboratory personnel and SBIR contractors have, in theory at least, an efficient means of communicating with their end customers in acquisition programs at all stages of the SBIR process. The liaisons may author topics or cause them to be authored.

However, speakers at the NRC Phase III Conference observed that agencies sometimes worked around the mandate by assigning numerous liaison roles to a single individual as a pro forma matter, rendering the function effectively useless.a

aSee National Research Council, SBIR and the Phase III Challenge of Commercialization, Charles W. Wessner, ed., Washington, DC: The National Academies Press, 2007.

Topics originate in service laboratories or in program acquisition offices. The laboratories are focused on developing technologies to meet the ongoing research needs of their organization. Some awarding organizations within DoD do not have their own laboratories. In practice, these organizations frequently turn to the “in-house” expertise of the service laboratories both to transform mission requirements into R&D topics, and to suggest topics relevant to the organizations’ requirements.

Topic authors frequently serve in a dual capacity. After their topic has been accepted and an award made, they become the technical monitors for the resulting contract. Thus, even though these technical monitors are often insulated from the day-to-day needs of the front-line commands, they are charged with developing technologies that are relevant to DoD’s overall performance and to the specific mission needs of the funding components.

3

Carol Van Wyk, presentation to the Navy Opportunity Forum, Reston, VA, 2004.

Further changes in topic generating procedures are under way. Missile Defense Agency (MDA) representatives, for example, described a reengineering of topic generation procedures intended to shift the focus of projects away from those based on program elements to what was termed a more MDA-centric approach, aimed at generating topics that enhance the agency’s technical performance capabilities and fill agency-wide gaps in existing systems.

Primes are also invited to suggest SBIR topics through informal discussions with laboratory personnel or SBIR program managers at scientific meetings, technology conferences, and trade shows, as well as at DoD’s own outreach workshops. Other channels for input include prerelease discussions with topic authors, and ongoing contacts between firms and technical monitors for current SBIR awards. Case study interviews with firms indicated that these informal4 channels are a recognized and generally accepted facet of the SBIR program. Firms do express some ambivalence about the propriety of these informal channels, and about the frequency with which they affect the selection of topics, and thus the distribution of awards.

There is no formal process within DoD through which firms can suggest topics, so SBIR program managers have no information about how often firm-suggested topics are adopted. The iterative review and revision process for DoD topics is also such that the ultimate topic released may differ substantially from that originally proposed. And, of course, many proposed topics do not make it through the review process.5

Overall, the vast majority of topics are agency-driven. DoD has established seven criteria,6 which are used in the review of potential SBIR topics:

  • Topics will solicit R&D and not procurement.

  • Topics must involve technical risk; i.e., technical feasibility is not yet established.

  • Topics will fall within one or more of the DoD key technology areas.

4

The Army section of Solicitation 06-2 contains the following: “Small Businesses are encouraged to suggest ideas that may be included in future Army SBIR solicitations. These suggestions should be directed to the SBIR points-of-contact at the respective Army research and development organizations listed in these instructions.”

5

It is important to note that there are no hard data on the extent of firm influence on topics, and there are unlikely to be any in the future. When a firm wins an award on a topic that it is perceived to have initiated, other firms suggest that selected SBIR competitions are “wired.” However, if a firm fails to win a topic it suggested, that information rarely becomes public, so information asymmetries produce an unbalanced perspective.

6

See National Research Council, The Small Business Innovation Research Program: An Assessment of the Department of Defense Fast Track Initiative, Charles W. Wessner, ed., Washington, DC: National Academy Press, 2000.

  • Topics will allow the performing company “significant flexibility,” stressing innovation and creativity.

  • Topics will include examples of possible Phase III dual-use applications.

  • Topics will not duplicate each other.

  • Topics will be clearly written.

The services and agencies initially submit draft topics to OSD six months before the scheduled closing date of a solicitation. OSD and DDR&E conduct a detailed one-month review using the criteria above. The topics are then accepted or returned to the originating service or agency for revision. The originators may revise and resubmit for a second DDR&E review. Topics that fail the second review are returned to the services and agencies, which may appeal the rejection to the Integrated Review Team.

All ultimately approved topics are returned to the originators for final certification, followed by publication of the pre-release.

In addition, each service and agency has its own review process that precedes the DoD-wide OSD review. The duration of these processes depends on numerous factors, including the size of the agency (larger agencies have more topics to review). Thus, the Army topic review is a centralized online process that takes four months, while the less centralized, but also online, Air Force SBIR Topic Submission Module may be active six months ahead of the OSD review process.

6.3
PRE-RELEASE

An overriding consideration at DoD is that its procedures for managing the SBIR solicitation process comply with Federal Acquisition Regulations (FAR). At the same time, DoD wants to provide as much information as allowable to firms interested in submitting proposals. FAR prohibits contact between an agency issuing a solicitation and prospective bidders once a solicitation has been issued, other than written questions to the contracting officer. The contracting officer in turn must make the question and answer available to all prospective bidders. This balancing act is managed through the pre-release process.

Pre-release is an important DoD SBIR initiative, one that has won considerable praise from small businesses. During pre-release, DoD posts the entire projected solicitation on the Internet about two months before the solicitation opens. Each topic includes the name and contact information of the topic author/monitor. Interested firms may discuss with the topic author/monitor the problem that the government wants to solve, as well as their intended approach.

From the firm’s perspective, a short private discussion with the topic author can often help avoid the cost of preparing a proposal if the firm’s capabilities do not match those required to compete successfully. Alternatively, the discussion


can help refine the firm’s understanding of the technical requirements embedded in the solicitation and lead it to prepare a more focused and competitive proposal.

Discussions can also open new possibilities for the firm, as it learns that its capabilities are better suited for other topics, other acquisition programs, or the unmet needs of prime contractors. Firms new to SBIR often get procedural information, and are steered to the appropriate DoD Web sites for further information. In all, the pre-release period has generally been viewed positively both by DoD and by participating firms.

Pre-release concludes when the solicitation is formally opened, about 45 days before the closing date. Almost all released topics coincide with those in the pre-release, although occasionally a few topics from the pre-release are not included in the solicitation. At this stage, all mention of topic authors is removed from the topics. The formal solicitation is posted on the Internet, at <http://www.dodsbir.net/>. As mentioned above, firms can ask questions, but the answers from the contracting officer are posted for view by all potential proposing firms.

6.4
SELECTION PROCEDURES

6.4.1
Phase I Contract Selection

Since the SBIR program’s inception at DoD, all SBIR awards have been contracts awarded on a competitive basis. The solicitation identifies the evaluation criteria for both Phase I and Phase II. Contracts generally require a deliverable, and for most Phase II SBIR contracts in recent years, the deliverable is a prototype. Having a prototype is often the first step in demonstrating commercial potential. Only firms that are completing a Phase I project can be considered for a Phase II award.

Beginning in 2003, DoD put a single contracting officer in the Defense Contracting Office, Washington, DC, in charge of the solicitation process. This civilian position is designed to provide prompt, proper attention to logistical problems that might adversely affect the timely submission of proposals, such as the January 2003 overloading7 of the electronic site that caused some firms to miss the original submission deadline. The lead contracting officer can now decide to extend the proposal deadline or otherwise modify requirements.

Once the Phase I proposal deadline date has passed, each DoD component takes charge of the proposals submitted in response to its topics. The selection

7

DoD SBIR Solicitation 2003-1 was the first to require all proposals to be submitted electronically. It cautioned firms: “As the close date draws near, heavy traffic on the web server may cause delays. Plan ahead and leave ample time to prepare and submit your proposal.” However, most firms waited until the closing 24 hours, resulting in network overload and a decision to reopen the solicitation. DoD subsequently increased server capacity substantially, and no later solicitation has encountered similar problems of overload and delay.


process varies considerably among services and agencies, from a centralized decision process in the Army to decentralized processes in Navy and Air Force. SBIR technical monitors are involved in the evaluation teams and recommendation of proposals to varying degrees. Contracting officers make the final selections and awards.

The actual selection process is quite diverse. DoD has about 30 separate awarding elements (e.g., the Navy has 8 to 10). Each solicitation publishes the evaluation criteria, and each evaluator prepares a written evaluation of each proposal, but there is no published standard procedure governing how an element selects its evaluators (generally there are three reviewers per proposal) or what happens after the initial evaluation. Proposals are evaluated on their own merit, not necessarily in comparison to other proposals, before the agency decides how many awards to make. There may be one or more tiers of technical management within an element making recommendations before the proposals reach the contracting officer. Ultimately, some topics will receive only one award while others may receive several. These decisions may be based solely on the quality of the proposals, or may also take into account the diversity of technical approaches, the importance of the topic, and available funding.

A contracting officer is designated as the Source Selection Authority (SSA),8 with responsibilities defined in Federal Acquisition Regulations (FAR).9 Adherence to these regulations is necessary to avoid protests about selection procedures being filed with the General Accounting Office (GAO). Proposal evaluations are legally based solely on the factors specified in the solicitation. These include:

  1. The soundness, technical merit, and innovation of the proposed approach and its incremental progress toward topic or subtopic solution.

  2. The qualifications of the proposed principal/key investigators, supporting staff, and consultants. Qualifications include not only the ability to perform the research and development but also the ability to commercialize the results.

  3. The potential for commercial (government or private-sector) application and the benefits expected to accrue from this commercialization.

Where technical evaluations are essentially equal in merit, cost to the government is considered as a tiebreaker. The solicitation also states that, “Final decisions will be made by the DoD component based upon these criteria and

8

Unless the agency head appoints another individual for a particular acquisition or group of acquisitions, which is rarely done for SBIR.

9

For example, in accordance with FAR 15.303(b)(1), the SSA shall “establish an evaluation team, tailored for the particular acquisition, that includes appropriate contracting, legal, logistics, technical, and other expertise to ensure a comprehensive evaluation of offers.”


consideration of other factors including possible duplication of other work, and program balance.”10

In general, firms speak positively of the fairness of the award selection process. Some interviewees privately note that certain firms have extensive contact with DoD officials and are thus better able to have their specific technologies “built into” the topic selection process, giving them an inside track in selected competitions. Importantly, interviewees also report that this advantage does not automatically lead to an award.

6.4.2
Phase II Selection Procedures

A Phase II proposal can be submitted only by a Phase I awardee, and only in response to a request from the agency. The latter condition is unique to DoD’s SBIR program. A Phase II application is initiated neither by a solicitation nor by the awardee. Although the formal evaluation criteria remain the same, the commercialization factor weighs more heavily in Phase II selection.

DoD components use different processes to determine which firms to invite for the Phase II competition. These vary from a decision made by the technical monitor for Phase IIs at DARPA to a centralized process like that used at MDA.11 The latter provides a template for Phase II decisions, in which the recommendation is based on several criteria:12

  • The Phase II prototype/demonstration (what is being offered at the end of Phase II?).

  • Phase II benefits/capabilities (why is it important?).

  • Phase II program benefit (why is it important to an MDA program?).

  • Phase II partnership (who are the partners and what are their commitments? Funding? Facilities? This also can include Phase III partners).

  • Potential Phase II cost.

These criteria address the basic business case for a Phase II invitation. Providing answers requires communication between the program office, the Phase I SBIR awardee, and the Phase I technical monitor.

Selection processes may be centralized, with a fixed date for submission of all of the Phase II proposals for that year (as in the Army), or decentralized to component commands or laboratories, as in the Navy and Air Force, where decisions are made as the individual proposals are received and evaluated. Scoring procedures vary among components, with some using primarily qualitative assessment ratings and others a more quantitative approach to scoring.13

10

This quote from section 4.1 of DoD SBIR Solicitation 2005.3 had been identical in every solicitation since 1983. It was changed slightly in 2006 such that the other factors are now specified in section 4.2.

11

An MDA program begins the process for a Phase II invitation by making a recommendation (all MDA topics are sponsored by MDA programs).

12

Criteria provided by Mike Zammit, MDA SBIR Program Manager, in an interview on September 22, 2005.

6.4.2.1
Commercialization Review

Under the 1992 Reauthorization, DoD established a Company Commercialization Report (CCR) as part of any SBIR proposal from firms that had won 15 or more Phase II awards over the previous five years. DoD extended the CCR requirement to all firms in 1997, and made submission electronic in 1999. By 2000, the DoD CCR required firm information in addition to the sales and funding information on all prior Phase II awards. Firm information includes identification data as well as annual revenue, number of employees, and issuance of an IPO: all indicators that can be used to gauge firm development. The CCR also requires firms to state the percentage of revenue derived from SBIR, a measure of dependency on SBIR.

The CCR permits firms to provide additional information, such as the non-commercial impact (mission impact, cost savings, reliability improvements, etc.) of their SBIR projects. These factors, coupled with specific results from prior Phase II awards (sales, including the customer; additional funding, by source; incorporation into a DoD system), along with the numerical score of the Company Achievement Index (CAI), are used to evaluate a firm’s past performance in commercializing its prior SBIR awards.

The CAI compares how well a firm has commercialized its Phase II awards relative to other firms with a like number of contracts awarded in the same timeframe. Although external discussion often focuses solely on the numerical CAI, the CCR actually provides the evaluator with a comprehensive picture of which the CAI is one component. Even when the CAI is extremely low, which in principle forfeits one half of the commercialization score, that result may be overridden based on the more complete picture.

In addition to the required Company Commercialization Report, each Phase II proposal must contain a two-page commercialization strategy, addressing the following questions:

  • What is the first product that this technology will go into?

  • Who will be your customers, and what is your estimate of the market size?

  • How much money will you need to bring the technology to market, and how will you raise that money?

  • Does your company contain marketing expertise and, if not, how do you intend to bring that expertise into the company?

  • Who are your competitors, and what is your price and/or quality advantage over them?

The commercialization strategy must also include a schedule showing estimated commercial returns (i.e., amount of additional investment, sales revenue, etc.) one year after the start of Phase II, at the completion of Phase II, and after the completion of Phase III.

13

To give an example of how this works at a component: at MDA, the TPOC recommends a Phase II invitation. The recommendation goes to the MDA SBIR Working Group and, on approval, then to the MDA SBIR Steering Group (which decides based on the same criteria plus funding availability). The steering group recommendation then passes to the MDA Selection Official, who has final authority.

Finally, proposed cost-sharing by a third party has been an accepted tie-breaker between equivalent proposals since the program’s inception. In the early 1990s, MDA (then known as SDIO/BMDO) began emphasizing co-investment as evidence of commercialization potential. Matching funds became a formal requirement for some parts of DoD SBIR with the implementation of Fast Track in 1996. The ratios used and the source requirements for third-party funds vary among components.

6.4.3
Composition of Selection Panels

Selection panels are composed of DoD personnel. Two or three technical experts at the laboratory level review each proposal. Proposals are judged competitively on the basis of scientific, technical, and commercial merit, in accordance with the selection criteria listed above.

Responsibility for each topic has been clearly established prior to the Phase I solicitation, so reviewers can access their proposals electronically immediately after the solicitation closes. This significantly shortens the decision cycle. If a proposal is beyond the expertise of the designated reviewers, the person with overall topic responsibility will obtain additional reviewers.

6.4.4
Fairness Review

Firms whose proposals were rejected can request a debriefing, which indicates how the proposal was scored on each specific evaluation criterion. The criteria discussed at debriefings must include only those that can fairly and properly be used for determining source selection. If practicable, the contracting officer and at least one engineer or scientist knowledgeable in the applicable field of technology conduct the debriefing, offering feedback on the strengths and weaknesses of the proposal and on how it might have been improved. The debriefing aims to ensure that the applicant fully understands why the proposal was not selected.

As recounted by firms and SBIR program officers, submitting an SBIR proposal to DoD is a learning process. Firm interviews indicate that many were initially unsuccessful, and that all had “losers” as well as “winners.” There are many workshops available where firms can learn how to submit good SBIR proposals, but experience, including debriefings, is often the best teacher.14

6.4.5
Program Manager Role

Currently, the role of the SBIR program manager at the awarding agencies and components is largely administrative. It entails monitoring award decisions, reporting, and the expenditure of contract funds. Notably, program managers do not currently make award decisions at any of the components.

This role has changed over time. For example, prior to 1993, the Army program manager did decide who to fund, largely based on which R&D organizations first submitted sound recommendations for funding. Through the late 1990s, the MDA SBIR program manager had considerable influence over final decisions on awards.

In many cases, the maximum award given by a component is smaller than that allowed under SBA guidelines. Successful Phase II outcomes, which demonstrate the value of additional funds, are also often the basis for adding non-SBIR program funding. Because of the way DoD records awards, this makes it appear that DoD is awarding contracts much larger than SBA SBIR guidelines allow. But the selection procedures and the authority for additional funds lie with the acquisition program or the R&D organization, not the SBIR program manager.

6.4.6
Resubmission Procedures and Outcomes

If a Phase I proposal is not selected for award, firms may submit a very similar proposal for a topic in a subsequent solicitation, or submit a proposal in response to the solicitation of a different agency. A firm may also submit very similar proposals to two or more DoD components or other agencies in the same solicitation if each component has an appropriate topic. In such cases, the firm must note that the other proposals are being submitted, and if any proposal is awarded, the firm must inform the other agencies.

Resubmission of rejected Phase II proposals is more difficult. In most of DoD, aside from the Army, a rejected Phase II proposal cannot be resubmitted for the same Phase I topic. The Army allows resubmission of a rejected Phase II proposal, or submission of a Phase II proposal based on a Phase I award from a prior solicitation year. The Navy also encourages its staff to find relevant Phase I projects that did not go on to Phase II, both from the Navy and from other agencies and services, to meet new, related needs in a more timely fashion.

14

When an agency makes an award for a topic that received only one proposal, that fact must be reported to SBA. DoD does not make such single-proposal awards: not every topic results in an award, and all awards result from competition.


6.5
POST-AWARD TRAINING AND ASSISTANCE

DoD provides considerable information about sources of assistance for potential participants and awardees on its Internet sites. Program managers participate in workshops at national and regional SBIR conferences, and at various outreach activities to provide training to firms interested in participating or improving their performance in the SBIR program.

The Navy Transition Assistance Program (TAP), formerly the Commercialization Assistance Program (CAP), was recently reoriented; the name change emphasizes the program's mission orientation. Initial participation is required for all Navy Phase II recipients, although not all choose to complete the entire program.

TAP is a 10-month program offered exclusively to SBIR and STTR Phase II award recipients. The program aims to (1) facilitate DoD use of Navy-funded SBIR technology; and (2) assist SBIR-funded firms to speed up the rate of technology transition through development of relationships with prime contractors, and by supporting preliminary strategic planning for Phase III. TAP also underwrites the Navy’s Opportunity Forum, an annual event attended by prime contractors, other private sector companies, and representatives from various DoD agencies as well as SBIR awardees.

6.6
OUTREACH: PROGRAM INFORMATION SOURCES

The DoD Web site15 provides extensive, in-depth information supporting the preparation of proposals and the negotiation of contracts.

The DoD SBIR Help Desk, 1-866 SBIR HELP, is available to answer general and administrative questions. During pre-release, technical monitors answer technical questions about topics and agency needs.

DoD sponsors or participates each year in National SBIR Conferences. In addition, when state or regional organizations sponsor SBIR events, one or more DoD SBIR program managers (depending on the size of the event) generally participate. Such events provide information on the program, including classes on specific aspects, and usually offer opportunities for firms to have one-on-one meetings with a DoD program manager to address individual questions.

The schedule below was taken from the DoD Web site in December 2006. National conferences are published a year in advance, whereas other events are not usually known more than a quarter in advance.


National SBIR Conferences

  • 2006 Fall National SBIR/STTR Conference, Milwaukee, WI, November 6–9, 2006.

  • 2007 Spring National SBIR Conference, Research Triangle Park, NC, April 30–May 3, 2007.

  • Beyond Phase II: Ready for Transition Conference, Crystal City, VA, August 20–23, 2007.

  • 2007 Fall SBIR Conference, Richardson, TX, October 29–November 1, 2007.

Other Events Where DoD SBIR Will Be Present

  • Innovative Transitions 2006, Virginia's 12th Annual SBIR Conference, Herndon, VA, December 4–5, 2006.

6.7
FUNDING GAPS AND FUNDING INITIATIVES

Funding gaps between Phase I and Phase II proposals continue to present a financial problem for many SBIR awardees, especially for start-up and other smaller firms. The standard adjustment for firms addressing this gap is to reduce work on the project and to redeploy personnel to other funded projects. Larger firms with multiple SBIR awards or considerable prior experience with the program appear to treat the gaps as routine, if annoying, business liquidity problems.

For firms that do not have other sources of funding, funding gaps can require managers to shut down projects, lay off staff, and go without salary for several months. An especially irksome aspect of the funding gap, reported by some firms, is that delays in funding do not always lead DoD technical monitors to adjust the scheduling of Phase II deliverables.

Over the years, DoD has implemented a number of initiatives to help address these funding gap issues. Some of these are discussed below.

6.7.1
Reducing the Time to Contract

DoD has now formally introduced the objective of reducing the Phase II funding gap from an average of 11.5 months to 6 months.

DoD has limited influence over the actual pace of work under the Phase I award and over how quickly firms prepare their Phase II submission following completion of this initial work. As with Fast Track, DoD can encourage early submission. However, Phase I research can itself result in a change of direction for Phase II, so an early Phase II proposal may sometimes be inappropriate.

When DoD uses a centralized selection process, all Phase II proposals for that component are due the same day. But since the Phase I contracts are awarded by different contracting officers, some Phase I contracts are awarded before others; thus this part of the gap may vary. If the process is not centralized, Phase II evaluation may begin as soon as a proposal is received, which eliminates part of the potential gap.

The Phase II selection process itself is not the primary source of the Phase II funding gap. Most of the Phase II funding gap occurs after the Phase II award selection. While Phase I awards are small enough for the contracting officer to apply simplified contracting procedures, Phase II awards are too large for such procedures, and require a complex process consistent with FAR regulations.

Since 1996, DoD has substantially reduced the Phase I–II gap by speeding the evaluation process and conducting most post-selection procedures in parallel rather than sequentially. One of the most time-consuming activities is the audit of the firm’s accounting procedures to ensure compliance with the FAR. After the audit relating to a firm’s first Phase II is completed, no subsequent pre-award audits are required; however, since DoD attracts so many new entrants each year, many awardees do require an audit. The time involved includes scheduling with the extremely busy Defense Contract Audit Agency, conducting the audit, changing the firm’s procedures if required, and reinspecting if needed. Firms new to the SBIR program are informed of the requirement prior to Phase I, are provided with information on the accounting requirements, and are encouraged to begin the process early. All components have reduced the gap to six months or less, and have established procedures to provide gap funding.

6.7.2
SBIR Fast Track

As early as 1992, DoD’s Ballistic Missile Defense Organization (BMDO) had begun to reward applications whose technologies demonstrated private sector interest in the form of investment from outside sources. This BMDO “co-investment” initiative was effectively an informal “Fast Track” program.

In October 1995, DoD launched a broader Fast Track initiative to attract new firms and encourage commercialization of SBIR-funded technologies throughout the department. The initiative aims to improve commercialization through preferential evaluation and by providing tools for closing the Phase I–Phase II funding gap. The program expedites the review of Phase II proposals that have demonstrated third-party financing for their technology, and provides them with continuous funding during the traditional funding gap. Under Fast Track, third-party financing means investment from another company or government agency, or investment in the firm from venture capital or some other private source. Internal funds do not qualify as matching funds.

The matching rates depend on whether the proposing firm has won previous SBIR Phase II awards.

Projects that obtain such outside investments and thereby qualify for the Fast Track will (subject to qualifications described in the solicitation):

  • Receive interim funding of $30,000 to $50,000 between Phases I and II;

  • Be evaluated for Phase II award under a separate, expedited process (the expedited decision-making process is acceptable to the agency because outside funding provides an additional form of validation for the commercial promise of the technology); and

  • Be selected for Phase II award provided they meet or exceed a threshold of “technically sufficient” and have substantially met their Phase I technical goals.

Fast Track focuses SBIR funding on those projects that appear most likely to be developed into viable new products that DoD and others will buy and that will thereby make a contribution to U.S. military or economic capabilities. More broadly, the Fast Track program seeks to shorten government decision cycles in order to interact more effectively with small firms focused on rapidly evolving technologies.

Outside investors may include such entities as another company, a venture capital firm, an individual investor, or a non-SBIR, non-STTR government program; they do not include the owners of the small business, their family members, and/or affiliates of the small business.

Small companies report that they have found Fast Track to be an effective tool for encouraging investors to provide additional funds, since it offers a match of between $1 and $4 in DoD SBIR funds for every $1 of third-party investment. In effect, each outside investment brings the firm additional nondilutive capital.
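The matching arithmetic above can be sketched in a few lines. The $1-to-$4 range of SBIR dollars per dollar of third-party investment comes from the text; the specific two-tier schedule below is a hypothetical illustration (the actual rates depend on prior Phase II history and are defined in each solicitation), as are the function and variable names.

```python
def fast_track_match(third_party_investment: float,
                     prior_phase_ii_awards: int) -> float:
    """Sketch of Fast Track matching under an assumed two-tier schedule.

    The 4:1 and 1:1 tiers here are illustrative only; DoD sets the
    actual matching rates in each solicitation.
    """
    ratio = 4.0 if prior_phase_ii_awards == 0 else 1.0  # assumed tiers
    return ratio * third_party_investment

# Under the assumed 4:1 tier, a first-time awardee attracting $100,000
# of outside investment would qualify for $400,000 in matching SBIR funds.
print(fast_track_match(100_000, prior_phase_ii_awards=0))  # 400000.0
```

The nondilutive-capital point follows directly: the matched SBIR funds arrive without any equity claim on the firm.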

Based on commissioned case studies, surveys, and empirical research, the National Academy’s 2000 Fast Track report found that the Fast Track initiative was meeting its goals of encouraging commercialization and attracting new firms to the program,16 as well as increasing the overall effectiveness of the DoD SBIR Program. The Academy recommended that Fast Track be continued and expanded where appropriate.

In recent years, the data suggest that firms and program managers increasingly prefer Phase II Enhancement to Fast Track. Using the award year of the original Phase II as a baseline,17 the data indicate that for Phase II awards made in 1997, 7 percent were Fast Track and 2 percent were subsequent winners of Phase II Enhancement. For 2002, 4 percent were Fast Track and 18 percent were Phase II Enhancement. For 2003, Fast Track awards fell to 2 percent.18

16

It is important to note the limitations of this research. The first limitation concerns the relatively short time that the Fast Track program had been in place, which necessarily limited the Committee’s ability to assess the impact of the program. Second, although the case studies and surveys constituted what was clearly the largest independent assessment of the SBIR program at the Department of Defense, the study was nonetheless constrained by the limitations of the case-study approach and the size of the survey sample.

17

Phase II Enhancements for a 2002 Phase II are actually awarded in 2004.

6.7.3
Phase II+ Programs

Phase II+ or Phase II SBIR Enhancement programs began in 1999 in the Army and the Navy. The Army provided a dollar-for-dollar match of up to $100,000 against third-party investment funds for projects aimed at extending Phase II R&D efforts beyond the current Phase II contract to meet the needs of the investor, and at accelerating the Phase II project into the Phase III commercialization stage. The Navy program provided a 1:4 match against third-party funding of up to $250,000. Other services and agencies soon followed suit.19

The services and agencies vary widely in their implementation of enhancement programs, and these programs have also changed over time. The Army now defines “third-party investor” to mean Army (or other DoD) acquisition programs as well as the private sector. The Air Force selects a limited number of Phase II awardees for the Enhancement Program, which addresses technology barriers that were discovered during the Phase II work. These selected enhancements extend the existing Phase II contract award for up to one year, and provide a 1:1 match against up to $500,000 of non-SBIR funds.

The Navy essentially breaks its overall Phase II funding into a smaller-than-maximum Phase II contract plus an option. The option is expected to be fully costed and well defined in the Phase II proposal, describing a test and evaluation plan or further R&D. Navy Phase II options typically fund an additional six months of research.

The Navy has now introduced a new Phase II Enhancement Plan to encourage transition of Navy SBIR-funded technology to the fleet. Since the law (PL102-564) permits Phase III awards during Phase II work, the Navy will provide a 1:4 match of Phase II to Phase III funds that the company obtains from an acquisition program. Up to $250,000 in additional SBIR funds can be provided against $1,000,000 in acquisition program funding, as long as the Phase III is awarded and funded during the Phase II.20
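The Navy's 1:4 enhancement match reduces to a simple capped ratio. The sketch below restates that arithmetic; the dollar figures are from the text, while the function name is ours.

```python
def navy_enhancement_match(acquisition_funds: float) -> float:
    """Additional SBIR funds under the Navy Phase II Enhancement Plan:
    $1 of SBIR money per $4 of acquisition-program (Phase III) funding,
    capped at $250,000 (a cap reached at $1,000,000 of Phase III funds).
    """
    return min(acquisition_funds / 4, 250_000)

print(navy_enhancement_match(600_000))    # 150000.0
print(navy_enhancement_match(1_500_000))  # 250000 (cap applies)
```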

MDA also has a Phase II Enhancement policy. While not guaranteed, MDA may consider a limited number of Phase II enhancements on a case-by-case basis. Both the MDA and Navy programs are focused exclusively on supporting the transition of technologies into the services, not on private sector commercialization. The Air Force program has similar requirements.

18

DoD awards database.

19

DoD’s FY2006 solicitation states: To further encourage the transition of SBIR research into DoD acquisition programs as well as the private sector, each DoD component has developed its own Phase II Enhancement policy. Under this policy, the component will provide a Phase II company with additional Phase II SBIR funding if the company can match the additional SBIR funds with non-SBIR funds from DoD acquisition programs or the private sector. Generally, enhancements will extend an existing Phase II contract for up to one year and will match up to $250,000 of non-SBIR funds.

20

DoD Small Business Resource Center, available at <http://www.dodsbir.net/ft-ph2/>.

6.7.4
DoD Programs for Closing the Phase I-Phase II Gap

DoD services and agencies vary in the maximum level of support they provide through Phase I awards, which affects whether they see a need for separate gap-funding initiatives in addition to Fast Track. DoD Phase I awards are typically $60,000 to $100,000 in size, and generally last six to nine months. Table 6-1 summarizes the provisions of each component’s Phase I and Phase II awards.

6.7.4.1
Navy

The Navy accepts only Phase I proposals with a base effort not exceeding $70,000, to be completed over six months. Options for contract extensions, not exceeding $30,000 and three months, are available to help address the transition into the Phase II effort. Phase I options are funded only after receipt of a Fast Track proposal or after the decision to fund the Phase II has been made. The Navy has thus effectively divided the permitted Phase I funding into two components; the second is used as bridge funding between Phase I and Phase II as necessary.

6.7.4.2
Air Force

The Air Force Phase I proposal covers a nine-month effort and can cost no more than $100,000 in total. Submission of the Phase II proposal at six months, along with an interim Phase I report, provides an additional funded period of three months while the Phase II proposal is being evaluated.

6.7.4.3
Army

The Army has implemented a Phase I Option that can be exercised to provide gap funding while a Phase II contract is being negotiated. The Phase I maximum at Army is $70,000 over six months. The Phase I Option—which must be proposed as part of the Phase I proposal—covers activities over a period of up to four months with a maximum cost of $50,000. Only projects that receive an Army Phase II award are eligible to exercise the Phase I Option. Phase II funding is then reduced to keep the total cost for SBIR Phase I and Phase II at a maximum of $850,000.
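The Army figures above are internally consistent with the Phase II ceiling shown in Table 6-1, as this small check illustrates (all dollar amounts are from the text; the variable names are ours):

```python
# Army SBIR funding arithmetic, per the description above.
PHASE_I_BASE = 70_000      # six-month Phase I base effort
PHASE_I_OPTION = 50_000    # gap-funding option, up to four months
COMBINED_CAP = 850_000     # maximum total for Phase I plus Phase II

# When the option is exercised in full, Phase II funding is reduced so
# the combined total stays within the cap:
phase_ii_ceiling = COMBINED_CAP - PHASE_I_BASE - PHASE_I_OPTION
print(phase_ii_ceiling)  # 730000, the Phase II "Year 1 + Year 2" NTE in Table 6-1
```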


TABLE 6-1 Summary of Agency-Specific Proposal Submission Requirements

Component | Upload Technical Proposal | Online Form Cost Proposal | Phase I Cost, Duration | Phase II Cost, Duration
Army | Required | Required | Base: NTE $70,000, 6 months; Option: NTE $50,000, 4 months | Year 1 + Year 2: NTE $730,000, 24 months
Navy | Required | — | Base: NTE $70,000, 6 months; Option: NTE $30,000, 3 months | No limit, but in general Base: $600,000, 24 months; Option: $150,000, 6 months
Air Force | Required | Required | NTE $100,000, 9 months | No limit, but in general $750,000, 24 months
DARPA | Required | — | NTE $99,000, 8 months | Base + Option: no limit, but in general $750,000, 24 months
MDA | Required | Required | NTE $70,000, 6 months (min) | Base: NTE $750,000, 24 months (min); Option: NTE $250,000
DTRA | Required | — | NTE $100,000, 6 months | NTE $750,000, 24 months
SOCOM | Required | — | NTE $100,000, 6 months | NTE $750,000, 24 months
NIMA | Required | — | NTE $100,000, 9 months | Base: NTE $250,000, 12 months; Option: NTE $250,000, 12 months
OSD/Army | Required | Required | NTE $100,000, 6 months | NTE $750,000, 24 months
OSD/Navy, OSD/AF, OSD/SOCOM, OSD/DHP | Required | — | — | —
CBD/Army, CBD/Navy, CBD/AF, CBD/SOCOM | Required | Required | Follows the Army, Navy, Air Force, or SOCOM cost and duration requirements shown above

NOTE: NTE = not to exceed; — = not specified in the source table.

SOURCE: Department of Defense Web site, accessed May 2007.

6.7.4.4
DARPA

Phase I proposals cannot exceed $99,000 and cover a six-month effort. Phase I contracts can be extended only if the DARPA TPOC decides to “gap” fund the effort to keep a company working while a Phase II proposal is being generated. The amount of gap funding depends on the funding available to the TPOC.

6.7.4.5
MDA

MDA accepts Phase I proposals not exceeding $100,000, covering six months’ work. Fast Track applications must be received by MDA 120 days prior to the Phase I award start date. Phase II proposals must be submitted within 180 days of the Phase I award start date. Phase I interim funding is not guaranteed. If awarded, it is usually limited to a maximum of $30,000. However, this funding is in addition to the $100,000 maximum awarded for Phase I.

6.7.4.6
USSOCOM

The maximum amount of SBIR funding for a USSOCOM Phase I award is $100,000, and the maximum time frame for a Phase I proposal is six months.

6.8
DOD SBIR PROGRAM INITIATIVES

The first solicitation of FY1996 marked the start of new initiatives resulting from the Process Action Team (PAT), which had been chartered by the Principal Deputy Under Secretary of Defense for Acquisition. These initiatives attempted to reduce the time between the start of proposal evaluation and eventual funding, and to address the need for improved communications between DoD and potential or current applicants.

6.8.1
Enhanced Applicant Information and Communications

One important initiative to improve information flows between DoD and applicants, the establishment of pre-release consultations, has been discussed above. Companies were also given access to better information and answers to DoD SBIR questions via DoD Web sites. A copy of a successful SBIR proposal was posted electronically, as were model Phase I and Phase II contracts.

Program outreach activities were enhanced by initiating pre-release on the Internet and in the Commerce Business Daily, where proposed solicitation topics are made available about 45 days prior to the formal release of the solicitation. OSD—in coordination with the component programs and OSADBU—also advertises the SBIR program at conferences likely to reach minority- and woman-owned small technology companies.


In 1997, the DoD SBIR Home Page (<http://www.acq.osd.mil/osbp/sbir/>) began offering electronic access to SBIR proposals and contracts, abstracts of ongoing SBIR projects, solicitations for the SBIR and STTR programs, the latest updates on both programs, hyperlinks to sources of business assistance and financing, and other useful information. The posting of Phase I abstracts shortly after award notification allowed investors to identify Phase I projects in which to invest. The 1997 solicitation also established the Commercialization Achievement Index (CAI) format for commercialization review.

DoD also established a 1-800 SBIR hot line21 to answer general questions about the DoD SBIR program. This hot line was expanded in 1996 to provide assistance and information relevant to proposal preparation strategy, contract negotiation, government accounting requirements, and financing strategies.

6.8.2
Electronic Submission

In FY1999, the Navy required, and the Ballistic Missile Defense Organization (BMDO)22 encouraged, electronic submission of proposal cover sheets and abstracts. The Navy also directed that future Phase I and Phase II summary reports be submitted electronically. By the second solicitation of FY1999, DoD had established a submission site (<http://www.dodsbir.net/submission>), which required all proposers to register and to provide commercialization information on their prior Phase II awards electronically.

In 2000, the first entirely electronic submission of proposals occurred in DoD. CBD required, and USSOCOM allowed, complete proposals to be submitted electronically. The 2000 solicitation also stressed that DoD was using commercialization of technology (in military and/or private sector markets) as a critical measure of performance.

The last paper version of a DoD solicitation was printed and distributed during October 2002. All DoD SBIR solicitations have been available electronically since 1997. After 2002, the only source for the DoD solicitation was the submissions Web site.

In the first full use of electronic submissions, in January 2003, DoD received substantially more proposals than expected. The large number of submissions in the last three hours before the deadline23 led to computer problems that resulted in several companies submitting late proposals. DoD reopened the solicitation briefly to allow these companies to compete, and has not suffered similar problems since.

21

By the first solicitation of FY1997, the hot line had been renamed the SBIR/STTR Help Desk, and both a fax number and an email address were provided in addition to the phone number as alternate means for obtaining answers to SBIR questions.

22

BMDO was the follow-on organization to SDIO and the predecessor of MDA.

23

Prior to this solicitation, most components required a mailed hard copy in addition to the electronic submission. Since the hard copy had to arrive by the closing date, most small businesses had to complete their proposal online one or two days before the deadline to allow for mail delivery time. This first solicitation of 2003 was the first time no hard copy was required, resulting in many last-minute submissions.

6.9
REPORTING REQUIREMENTS

Phase I final reports are generally required within 210 days of the award. Most are filed earlier, since Phase I funding is generally complete in 180 days, and the report is needed for a Phase II evaluation. As of 2004, all Phase I and Phase II reports must be submitted electronically on the DoD submission site.

Reports fulfill the contractual requirement for a deliverable. Their use varies widely depending on the initiative of the technical monitor and the specific technology being investigated. However, discussions with agency staff suggest that more use could be made of these reports, especially if better tools were available for interested parties to search them.

6.10
EVALUATION AND ASSESSMENT

The 1996–1997 Study of Commercialization of DoD SBIR was the only formal study conducted.24 In each solicitation cycle, firms must submit their Company Commercialization Report (CCR) as part of their proposals.

In addition to use in evaluation, the DoD SADBU aggregates some of the information in the CCR and uses it to brief the DoD Principal Deputy for Acquisition and Technology on progress in the SBIR program. Using the information in the CCR, components identify successful projects and contact the firms to develop information for success stories and outreach brochures. Several components conduct annual awards ceremonies to recognize outstanding SBIR projects.

Since 1992, GAO has conducted a number of external reviews of the program or of aspects of the program. These include:

  • GAO/RCED-92-37. SBIR Shows Success but Can Be Strengthened. This is the first baseline study of the program. It surveyed 100 percent of all Phase II awards from 1984–1987. It was conducted in 1990–1991.

  • GAO/RCED-95-59. Interim Report on SBIR. Based on agency interviews conducted in 1994 and 1995, this report examined the quality of research and the duplication of projects.

  • GAO/RCED-98-132. Observations on the SBIR. This report compared BRTRC’s 1996 DoD survey (100 percent of Phase II awards from 1984–1992) with the original GAO 1991 survey. It included an agency SBIR award database and interviews.

  • GAO/RCED-99-114. Evaluation of SBIR Can Be Strengthened. This assessment focused on the use of commercialization records in proposal evaluation.

24

BRTRC, Commercialization of DoD Small Business Innovation Research (SBIR) Summary Report, October 8, 1997, DoD Contract number DAAL01-94-C-0050, Mod P00010.

  • GAO-07-38. Small Business Innovation Research: Agencies Need to Strengthen Efforts to Improve the Completeness, Consistency, and Accuracy of Awards Data.

In response to a congressional mandate for a review of SBIR at the five leading agencies, DoD has commissioned the NRC to undertake the current study. This review follows the previous NRC report on the Fast Track program at DoD which compared Fast Track firms with the regular SBIR program at DoD. During the NRC study’s gestation, DoD program managers also commissioned a smaller, more focused study by RAND that was just recently completed.25

  • NRC Fast Track.

  • Navy Output Report (private).

  • PART.

  • Program report (50 slides).

  • NavAir S&T report.

6.11
ADMINISTRATIVE FUNDING

The decentralized organization of SBIR at Defense makes it difficult to determine precisely how much administrative funding is spent on SBIR, or where that funding comes from. The DoD SBIR office is currently engaged in an effort to gather this information, but does not believe that precise accounting is likely, given the wide variety of inputs into the selection and management process, almost none of which is directly charged to any SBIR budget line.26

Prior to the establishment of SBIR, each agency was presumed to be adequately staffed and funded to administer its R&D budget, and SBIR constituted only a change of direction, not an increase in R&D spending, so no additional administrative funding was anticipated. The SBIR legislation prohibits federal agencies from using any of the SBIR set-aside to administer the program. DoD thus incurs costs to administer the SBIR program—and interviews with staff suggest that it is more expensive to operate a program with hundreds of small contracts than with a single large contract—but receives no offsetting line item appropriation.

Each service and agency has had to absorb the costs of managing its SBIR program out of existing budgets. Within the components, this decentralization continues. For example, the Navy SBIR program office controls the budget for

25

Bruce Held, Thomas Edison, Shari Lawrence Pfleeger, Philip Anton, and John Clancy, Evaluation and Recommendations for Improvement of the Department of Defense Small Business Innovation Research (SBIR) Program, Arlington, VA: RAND National Defense Research Institute, 2006.

26

Interview with Michael Caccuitto, DoD SBIR/STTR Program Administrator, November 27, 2006.


its office; each major Navy component (such as NAVSEA or NAVAIR) controls its own SBIR program budget, and so on, down to the laboratory level.

At the service or agency level, there is an SBIR program manager and perhaps a program office, which includes contract staff support, as well as a budget that covers travel expenses. Within the larger components there are SBIR managers (and offices in some cases) at lower level commands and development agencies. Some positions are full time; other SBIR managers have additional duties as well.

At the project level, there are large numbers of technical monitors (TPOCs), who work part time on one or more SBIR projects. Their salary and travel are not specifically associated with SBIR in the components. Similarly, no separate budgets exist to support the contracting officers and legal support necessary for the operation of the SBIR program.

At the DoD level, the DoD SADBU controls the budget for that office. Similarly the DDR&E controls its budget. Neither of these SBIR-associated offices allocates or controls the SBIR administrative budget of any component.

Even if line item amounts were available for contract, legal, audit and finance support, these budgets would likely not include salaries, travel, and other expenses for the hundreds of technical monitors throughout DoD, who may spend five to fifty days a year writing topics, reviewing SBIR proposals, or monitoring SBIR awards as the Contracting Officer’s Technical Representative (COTR). Imputations of the costs incurred by these activities are possible, but have not been done. Thus, no estimate of the cost to DoD of managing its SBIR program currently exists.

Yet while precise budgeting is not possible under the current organizational and financial architecture, it is clear that some agencies provide substantially more administrative funding than others.27 The Navy in particular has funded its SBIR program administration at a level of approximately $20 million.28 This has allowed the Navy SBIR program to innovate in important ways: via the Transition Assistance Program (TAP), for example, and through enhanced evaluation and assessment efforts. This level of agency commitment is not matched at the other agencies, where significantly less administrative funding is available.

For purposes of both evaluation and management, it is important to better understand the program’s operations and the impact of its various procedural innovations. Doing so requires more management and evaluation resources, as the Navy has demonstrated. Given the substantial size of the current SBIR program at Defense, additional management funds would seem warranted and are likely to be cost-effective.

27

It should be noted, however, that close comparisons are not straightforward, because each DoD agency funds its administrative work differently. This is especially true in the SBIR program, where so many personnel (TPOCs, administrators, topic reviewers, proposal reviewers) work on SBIR alongside other projects without being attached to any SBIR line item.

28

John Williams, Navy SBIR Program Manager, private communication.
