Suggested Citation:"7 Evaluation and Award." National Research Council. 1989. Beyond FTS2000: A Program for Change: Appendix A -- FTS2000 Case Study. Washington, DC: The National Academies Press. doi: 10.17226/10446.

7 EVALUATION AND AWARD

In which the FTS2000 proposals are evaluated and awards are made to AT&T and US Sprint.

THE SOURCE SELECTION PLAN

With the receipt of proposals from AT&T, Martin-Marietta, and US Sprint, the project now moved into the evaluation and award phase. Chapter 6 described the formation of the final RFP and the events leading up to the submission of the three proposals. In parallel, GSA had completed the source selection plan (Note 1). The source selection plan included:

· the organization, responsibilities, and resources to undertake the evaluation and award process;
· the evaluation method, including evaluation criteria, the scoring system, and the scoring logistics; and
· the plan for security during the evaluation process.

THE SOURCE SELECTION ORGANIZATION AND RESPONSIBILITIES

The source selection organization is shown in Figure 7-1. Its main components are:

· the source selection authority;
· the advisory committee; and
· the source selection evaluation board.

The source selection authority, required by government procurement regulations, is the single government official who makes the award decision, taking into account the recommendations of the evaluation board and the advisory committee. As described in the last chapter, Golden had agreed to serve as the

[FIGURE 7-1 (organization chart): the Source Selection Authority (the Administrator of GSA); the Advisory Committee (Ray Kline, chairman; James Burrows; Roger Cooper; Roscoe Egger; Thomas Morris; Eli Noam; Donald Scott; Jon Seymour); the Telecommunications Interagency Management Council, legal counsel, and the contracting officer; and the Source Selection Evaluation Board (chairperson Bruce Brignull, GSA), supported by advisors (e.g., a CPA firm and an interagency task force) and comprising four panels: technical (chairperson Sherman Naidorf), transition (chairperson Dave Cleveland), management and business (chairperson Martin Wagner), and cost (chairperson James Owens).]

FIGURE 7-1 FTS2000 Source Selection Organization. SOURCE: General Services Administration news release, "FTS2000 Procurement Advisory Committee Named by Golden," April 22, 1988.

source selection authority for the procurement. There were, however, administrator changes. Golden left GSA on March 4, 1988, having ensured the receipt of proposals, a viable procurement, and a robust RFP. The role of source selection authority now became the responsibility of the new acting administrator and, after his resignation later that summer, of his successor. Continuity in keeping the project one of GSA's highest priorities was ensured by Rep. Brooks.

The advisory committee was not a required part of the government regulations for evaluation and award of procurements. It was, however, a device used successfully in very large procurements, for example, at NASA and the DOD. An advisory committee provides an independent body to oversee the selection. The committee was also to advise the administrator of GSA concerning the procurement, and its key responsibility was to assure the administrator that the evaluation plan was adequate and, most important, that it was followed. The committee was composed of a mixture of senior government employees and experts from outside government. As with everyone in the selection organization, the members had to be cleared according to the security procedures.

The source selection evaluation board, required by regulation, contained most of the resources devoted to the evaluation and award process. It was composed of a mixture of GSA staff, staff from GSA's customer agencies, and the MITRE Corporation. The board's major responsibilities were to evaluate the proposals and produce the evaluation reports, with recommendations for the awards of the two contracts.
Proposals were to be submitted in four separate volumes: a technical proposal, containing the executive summary, a checklist showing adherence to the mandatory and optional requirements, a cross-referenced chart showing compliance with the evaluation criteria, an analysis of the service requirements, and the detailed technical approach; a management proposal, with the management response to the RFP, showing the ongoing management plan for network operations, the transition and implementation descriptions, the approach being taken to meeting NSEP requirements, and corporate qualifications; a business proposal, with the government-required declaration forms, representations and certifications, qualification and financial information, and the vendor's annual report; and a cost proposal, containing all of the elemental cost data, which was to be delivered as a computer model. As a consequence of having four different proposal volumes to evaluate, the evaluation board consisted of four panels, each with defined responsibilities for certain sections of the proposals. The technical panel was responsible for evaluating the technical volume and the NSEP section of the management volume. The management and business

panel was responsible for the remainder of the management volume and the business volume. The transition panel was responsible for the transition section of the management volume. The cost panel was responsible for the cost volume. Each of the four panels was headed by a chairman (a government employee) who was responsible for managing the individual panel effort. The panels were composed of government employees supported by MITRE. The complete evaluation board was headed by a chairman (a GSA employee), and the whole board was supported, as needed, by government employees, MITRE experts, and private consultants. All participants had to be cleared as described below under security.

Finally, of major importance in the source selection organization, there was the contracting officer. This is the single individual responsible for executing the contracts consistent with federal procurement law, policies, and regulations. The officer also acted as chief negotiator and served as the single point of contact with the vendors.

THE EVALUATION AND AWARD PROCESS

The evaluation process was broken into several tasks, the main ones of which were:

· initial evaluation of the proposals, to screen them for completeness, roughly score them, and generate questions, and also, during this task, to develop raw scores and apply weights, hence completing a first-round ranking;
· establishment of the competitive range, to determine whether it was reasonable that all proposals be considered.
This would be done by the contracting officer and the advisory committee;

· notification of the vendors and preparation for the site visits, for operational and capability demonstrations, to validate claims in the proposals and complete a second round of scoring;
· negotiations with each vendor and development of any necessary amendments to clarify items; and
· development of the request for best and final offers, their submission, final evaluation, development of the source evaluation board reports, briefing of the source selection official, selection of the winners, and award of the contracts.

This was planned to be accomplished in 26 weeks--a most ambitious schedule for a process that frequently takes as much as 18 months on large procurements. However, this schedule would bring the award into November 1988, when the presidential election was due. There is always a concern in major government efforts that a new administration might change direction or halt a major project (like

FTS2000) while it evaluates the benefits. There was little risk of this, as the Republican administration supported the project and Brooks, the congressional supporter, was a powerful Democrat. However, Brooks was expected to move after the election from his role as chairman of the House Government Operations Committee into the more senior role of chairman of the House Judiciary Committee. Hence, his focus on FTS2000 would inevitably change.

The evaluation process included a definition of the evaluation factors. By regulation, the evaluation factors, weights, and scoring mechanisms had to be determined before proposals were accepted. Technical, management, and business qualifications and cost were all accorded approximately equal weights in the evaluation. (Originally, before the contract was broken into two awards, which put much of the transition responsibility on the government, the transition carried about one-sixth of the weight. This was now less.) As technical, management, and business scores became more equal, the importance of prices would increase.

Pass/fail checklists had been developed for the team's use from the RFP specification. Basically these asked: Is the item there? Does it comply with the performance requirement? Is it up to the required standard? The technical subfactors that were scored were: understanding of the requirements and whether they were all addressed; the soundness of the approach; the flexibility and growth potential of the architecture; network control and diagnostics; and migration to switched ISDN. The management and business subfactors were: quality of the transition and implementation plan; experience in managing multisupplier projects; experience in managing large telecommunications systems; management and operations plans; the service oversight center; billing systems; and maintenance of system performance under stress and failure conditions (including emergency preparedness and national security requirements).
In the evaluation of cost, the offeror was to make the calculations for the cost figures used in the award and to provide the pricing model and tables. The cost evaluation was based on the total cost to the government of FTS, FTS2000, and temporary interconnections--more cost meant fewer points. The cost panel verified offerors' calculations and submissions, but beyond that, most of its effort went to ensuring common understanding, common definition of items, and hence common costs.

SECURITY

With these products, GSA was now ready to move to the evaluation process. The transition from RFP development to evaluation and award involved establishing all the components of the source selection plan, handing over control and authority to the source selection organization, and "buttoning up" the secure facility in which the evaluation would be accomplished. Conceptually, GSA was creating a secure, self-contained bubble into which the proposals would be received and out of which would eventually

come the award decisions. This concept of a secure bubble was very important, as no information concerning the bids would be allowed to move between the bubble and the outside world until the award was made. Very soon, the implementors of the source selection plan were discussing actions, people, information, and transactions in terms of "in the bubble" or "out of the bubble." Any competitive government procurement follows such principles. However, the size of the procurement and its controversy dictated much more visible, stringent, and serious measures than normal to ensure the bubble would not be breached.

Even prior to release of the initial RFP, senior management had decided to establish a secure facility and protect the bid information according to the procedures for information classified "secret." The information could not legally be so classified, but it could be protected accordingly. These procedures were chosen because there are no established protection procedures for unclassified information of such sensitivity. New procedures established just for this procurement would likely have been faulty. Hence, using established procedures that were well documented and well understood by many organizations, supported by an infrastructure of clearance organizations, cleared facilities, and so on, would be the most reliable and robust approach.

The new team was to be housed in secure facilities on the premises of the MITRE Corporation. MITRE already undertook classified work and hence had facilities established for this purpose, regularly audited by the Defense Investigative Service. The initial requirements for protection of FTS2000 bid materials were exactly the same as for secret materials, and the need-to-know principle was applied at all times. Security encompassed building security, the use of picture badges, and the use of safe files.
Panel work areas were secured by locked entrances with changeable cipher locks, and, in almost all cases, it was ensured that walls ran true floor to ceiling. The room that would contain the cost proposals was fitted with alarms and monitored continually from a central station. The rules for working with secret materials were followed, including, for example, no unofficial visitors, no shop talk outside of work areas, no phone calls to discuss materials, protection of notes and drafts (as with all documents) within the document control system, and no removal of materials from work areas.

Document control consisted of all sensitive materials being entered into a computerized registry, with a receipt system for checking materials in and out. MITRE secure document custodians kept all of the registered documents, inventories were kept and checked regularly, and safe files were used for after-hours storage. Reproductions could be made only if approved by the contracting officer or the chairman of the evaluation board and if carried out under security's supervision. All copies were entered into the document registry. Transmission of documents, for example, between buildings, could only be approved by the contracting officer, using double-wrapped bags and a chain of receipts. Destruction of documents had to be reflected in the registry, with disposal of waste in burn bags and incineration by MITRE security staff.

All staff in the selection organization were cleared to work on bid

materials. As part of this process, all staff were briefed on the relevant government ethics laws and regulations and the security procedures. Each had to document and declare no conflict of interest, by means of a statement of employment and financial interests, a certification regarding conflict of interest, and a certification regarding nondisclosure of information.

Most of the staff who would be in the bubble and governed by these procedures, however, had not worked under secret classification previously. As much of the effectiveness of the classification process lies in good work habits and a culture of protection, these had to be forcibly, and quickly, established with the team. In the classified arena, the quickest way to test procedures is to try to breach them and correct the mistakes that are found. This technique was utilized effectively. The button-up date was selected as April 27, 1988. Prior to that, senior team members attempted breaches to establish where there were holes and to reinforce the seriousness of the process. A security check of the special facility was made by GSA's security management on April 26. On April 27 readiness to receive proposals was declared and the project was formally handed over from the old team to the source selection organization--the new team.

Shaking down security in the first few days was not without its incidents. On the day of delivery of proposals, AT&T apparently tried to get access to the facility in an effort to assure itself of the adequacy of the procedures. It was rebuffed. A reporter who tried at the same time was also rebuffed. An alleged breaching incident in the early days (later proved to be without foundation) was brought to the attention of congressional interests. People on the new team were called to have depositions taken, with questioning about security and who had handled materials in the first few days.
The MITRE guards and the people at the reception desk were also deposed (Note 2). Congressional staff also arrived and were denied entrance. They called MITRE officials to escort them into the premises, and during escorted visits to nonsecure parts of the facility they reviewed the secure areas from their perimeter. Congressional suggestions for changes to the physical security based on these reviews were acted upon. All of these endeavors helped to ensure that security was taken seriously and was tight. As the evaluation progressed, additional measures were taken, including the sweeping of areas for bugs. After best and final offers came in, intermittent searches of briefcases were made by security officials.

CHANGING TEAMS

Everyone in the organization established inside the bubble was new to the project. The only person left in the project with any length of experience of FTS2000 was Kowalczyk, who became the primary person interfacing with the bubble on behalf of the GSA administrator. As a consequence of this wholesale change to a new team, if the RFP

had not been accurate and detailed, if the source selection plan had not been robust, and if the new team members had not been mature and highly motivated to reach contract award, the project could have broken down at this point. Taking over this huge and complex task without the ability to reach outside the bubble for advice and help was a monumental undertaking, and it is to the credit of all concerned that the process continued smoothly to a conclusion. As the team was formed and responsibilities were handed over from the old team, the new members refined the organization and supplemented their panels as they felt necessary. They then openly reached agreement that they would work to ensure that the intent of the specification was not changed. This was the greatest risk in enclosing a new team out of touch with the original team. However, the bubble had to be firmly closed. In the early stages, as the team looked to Kowalczyk to solve problems for them, he firmly pushed the problems back into the bubble to accelerate the new team's gelling as a self-contained unit.

EVALUATION

With the team and procedures in place, at the end of April 1988, just over three years after the announcement of the project, the proposals came in (Note 3). US Sprint's proposal arrived a day early, and AT&T's and Martin-Marietta's arrived on the appointed day, April 29, 1988. The proposals were massive, and this alone required immediate changes to the processes and procedures for handling documents. In total the proposals constituted 2,700 cubic feet of paper--enough to fill a room 9 feet high, 30 feet long, and 10 feet wide from floor to ceiling. Logging in, breaking the proposals up into segments for each panel, and moving the documents to the evaluation teams took three days longer than the four days planned. The receipt was not without its trying moments. For example, one vendor had labeled its volumes wrongly.
Since the rigor of the government process could have voided its proposal for this simple mistake, the vendor had to scramble to relabel the volumes for log-in and acceptance. The proposals included the cost models, which in turn included the hardware to run them. In AT&T's case, the cost model was presented on a personal computer. In the other two cases, the cost model was accessed by means of a terminal hooked up via communications to the vendor's off-site computer. This required additional security procedures, which had been put in place before receipt, and all equipment was swept for possible bugs.

The team worked hard throughout the evaluation. The only official day off from April to the award in December was Thanksgiving. The official team schedule, covered in shifts, was a 14-hour day, 7 days a week. Hotel rooms were available in the vicinity of the evaluation facilities for people who could not make it home at key points, and in fact the chairman of the evaluation board lived in a hotel next to the site for a significant portion of the time. This was the kind of dedication it took to keep on schedule. It is to the credit of the new team, formed from a variety of people from different agencies, organizations, and affiliations,

that they met it all with good grace and camaraderie.

The court actions of the prior year, the bad press over the winter, and the questioning and depositions that continued throughout the evaluation had a chilling but temporary effect upon the team. Some people bought liability insurance; some took the position that they would not give depositions to congressional staff; some said they would not talk to lawyers. Some panel members got their own lawyers, and MITRE offered the services of lawyers for its people. To the end, people were loath to sign some documents. But as the bubble closed around them and they knit into a team, the chill subsided, so that in no way did it hamper the government position in negotiations.

The technical evaluation proved a smoother process than the cost evaluation, and the schedule was adhered to, although it took very long work hours. The cost evaluation was more problematic because dilemmas were created by bids that differed in their interpretation of cost elements. The need to establish a common understanding in all segments of the proposals was particularly acute in the cost volumes. There was a need to ensure consistent understanding across all proposals of what was bid, and a need to ensure absolutely that the competing vendors had bid the same things, if the bottom-line prices were to be meaningful. There was also a need to check for errors and consistency. This was a long, slow process, and it meant no firm indication of the relative positions of the price proposals until six weeks before award.

Under the scheme for two awards, the bidding was structured in such a way that each bidder had to present proposals for each of the two sets of requirements, one representing 60 percent of the government's traffic (Network A) and one representing 40 percent (Network B). GSA evaluated the proposals for Network A and effectively eliminated the successful vendor from consideration for an award under Network B.
No proposal was submitted for the entire requirement, thereby effectively preventing GSA from considering the trade-offs between a single award and a multiple award. Of course, the evaluation team and the advisory committee had access to information concerning the Network B proposal of the successful offeror for Network A. However, given the realities of the procurement, and the clear direction that two awards be made as a matter of national policy, there was no ability to make an award to a single vendor. As a result, there was no ability to officially judge the potential impact of the competition on the total cost to the government of two awards versus one award.

PROPOSALS

The three proposals were different. There had been much speculation during development of the RFP as to how each vendor would configure its network. AT&T, with a large, national software-defined network, could have defined a virtual private network within its national net, allowing the economy of scale of the large network to carry through. Rough modeling by GSA indicated that economics between major

concentrations of government traffic would probably call for dedication of circuits. This indicated that a hybrid solution of dedicated and virtual facilities might be possible. Prior speculations about Martin-Marietta, which did not have a national network, had indicated approaches different from AT&T's (though one of its subcontractors, MCI, had a national network allowing solutions similar to those of AT&T and US Sprint). Martin-Marietta's division of responsibilities among its major subcontractors (MCI, Northern Telecom, Tymnet, and the BOCs) was unknown. However, the events that took place in court the previous year indicated a high probability that the BOCs would provide switching. Consequently, GSA had conjectured a dedicated private line architecture with BOC switching, and perhaps MCI handling off-net traffic at the head of the network. US Sprint, with a new national fiber network that was software controlled, like AT&T's, was in a position to offer a software-defined virtual network.

The actual bids brought some surprises (Note 4). AT&T bid a dedicated private line fiber network with 18 nodes, each consisting of a #5ESS switch. On the surface this may seem to have a price disadvantage compared with a software-defined network. However, it did have the advantage that all costs associated with provision of FTS2000 services were easily identifiable. Hence, as a regulated company, AT&T would find the subsequent tariff more easily defensible against possible protest by the loser or by other common carriers. From GSA's point of view this architecture also had an advantage in that it meant an immediate financial commitment by AT&T in ordering the 18 switches. Hence, it would provide an incentive for AT&T to speed transition in order to generate revenues, as was proved after award. The AT&T proposal was simply outstanding.
There were the fewest questions about technical issues, the fewest problems with cost, and the fewest questions about possible misunderstandings arising from its proposal. One area where AT&T did recast its proposal was the transition from the old FTS. Here, it had proposed a series of very large cutovers that, in response to the evaluation team's concerns, it modified into a more gradual process.

Martin-Marietta's and US Sprint's proposals were not of the same quality as AT&T's, although they were comparable in quality with each other. Martin-Marietta had proposed a situation not unlike GSA's role in the old FTS, that of an integrator of a dedicated private line network. Martin-Marietta's roles for its teaming partners were a surprise. Martin-Marietta had defined significant roles for itself, but consequently the staff it proposed to lead teams were weak compared with the equivalent staff from the common carriers. Also, compared with the common carriers, the experience Martin-Marietta demonstrated in telecommunications was shallow. And because of its use of subcontractors, the whole design was not as strongly integrated. For example, some information was to be moved by paper and magnetic tapes because no interface existed between components. The proposal showed that this would be worked out, but it inevitably raised questions as to the certainty of this. Finally, the company's prices were not as competitive, but this did

not become clear until the end. During the evaluation, as weaknesses were identified and questions posed to all vendors, Martin-Marietta, unlike AT&T and US Sprint, substantially redid its network design. This raised more questions, for example, about blockage rates that seemed unduly high in portions of the network. This, in turn, raised questions as to the validity of the design. In effect the company had recast the network to cut prices, but this cut the technical quality. One aspect of Martin-Marietta's proposal that was excellent, however, was its transition plan.

US Sprint, as had been anticipated by many, proposed a virtual software-defined network within its national net. This was a simpler and better strategy for Sprint than for AT&T because Sprint had no problems of questions being raised about public tariffs. The initial quality of US Sprint's proposal was not high. As was described earlier, the company had to some extent always been playing catch-up, first because of the merger between GTE Sprint and U.S. Telecom, and second because of the in-and-out relationship with its partner, EDS. EDS had finally left the teaming arrangement in February 1988, scarcely two months before submission of proposals. To some extent US Sprint's initial submission was a foot in the door. While every initial proposal had problems, Sprint's had the most. However, the company worked to substantially improve the bid over the duration of the evaluation.

To summarize broadly: although Martin-Marietta was consistently below its competition, the differences between it and US Sprint were not very large compared with the differences between both of them and the leading AT&T proposal. However, it should be emphasized that there were three viable proposals even to the end, and any of the three vendors was eminently qualified to provide the government's telecommunications.
Because of this, there were no problems with establishing the competitive range and passing all three proposals through for consideration up to negotiations and award.

Hundreds of questions were generated by the team throughout the evaluation process. As soon as they were generated, they were sent out to the vendors to keep the process moving. AT&T's answers were quick and clear. It was obvious that it had devoted many resources to the effort. Even though AT&T's answers were generated quickly, the company continued its strategy of moving deliberately and cautiously and was most detailed as to the meaning of every word. It argued for delays when it felt they were necessary. Martin-Marietta, on the other hand, in clarifying questions, wanted quick response times even though it was recasting its network. It pushed to minimize delays. The evaluation team got the impression that Martin-Marietta's bid team, which had been put together with people from all over the country, had been partly disbanded. Sprint proved very willing to respond quickly but seemed to have fewer resources than AT&T. Thus it supported AT&T on delaying the schedule for answering some questions. All in all, however, the vendors were very responsive to the questions, even though they frequently had only 24 hours to reply.

NEGOTIATIONS

With the competitive range established, and with all proposals found worthy, the team then undertook site visits to verify certain aspects of the proposals. In some ways this was a welcome break, even though the pace did not slacken. Initially there were discussions of how many team members should go and who they should be; eventually it was decided that almost everyone should go except the cost panel. Following the site visits, further evaluation was accomplished, and the government team was then ready for negotiations.

It was a normal, although spirited, negotiation, complex more than anything else because of the difficulty of establishing a common understanding of cost elements. Negotiations, however, were a critical point in the life of the project. It was difficult to strike a balance between the amendments that would be issued to all three bidders for final resolution and the need to keep them all in the bidding process. If at any time a vendor had felt it was not likely to win, and had withdrawn, GSA would have been left with two awards to make and only two bidders. In that position, GSA would have had to abandon the project, resetting the clock back to 1985. The conduct of negotiations was therefore the point at which the project could break down. GSA was particularly concerned with AT&T. AT&T's mistrust had virtually died as the project was handed to the new team for evaluation and the security procedures were put in place. However, it was still difficult to establish a sense of trust and competence so that AT&T would remain.

Negotiations also presented some of the final opportunities for the vendors to jockey for position. They proceeded smoothly, even in sessions with as many as 60 people in a room lasting several days. As the negotiations drew to a close, final amendments were drawn and the call was made to the vendors for best and final offers, which would then be evaluated for award.
AWARD

Due principally to delays in refining the pricing, the award date, originally October 30, was jeopardized. The team came under intense pressure from Congress to complete the award by November. It was at this point that the advisory committee came to the fore.

The advisory committee had met several times and had been briefed on progress and issues. It had been a major player in changing the early specificity of the scoring scheme to ensure that differences among bids would be more easily discernible at later stages. It had also advised on proceeding with the evaluations of best and final offers in parallel so that problems could surface quickly. The advisory committee now took a leadership role, advocating that the team take the time necessary to ensure a proper award; otherwise the project risked a mistake that could result in a sustainable protest after the award. The committee provided a counterforce to congressional pressures, and the schedule publicly slipped to December 10.

By December, the final evaluation was completed and preparation was made for award of the larger portion (60 percent of the requirements) to AT&T and the smaller portion (40 percent) to US Sprint. To avoid speculation on the part of vendors or in the press, GSA called the vendors to tell them to come in and see the contracting officer because of problems with the procurement. The vendors raced in on December 7, 1988, and to their surprise the award announcement was made (Note 5).

FTS2000 was awarded only four years, to the week, after formal approval was given to go ahead within GSA, and only five years after GSA decided to replace the network and began to look for a suitable strategy. This was a very speedy schedule for the largest procurement ever undertaken by the civilian agencies of government and the largest replacement project in the history of information technology. GSA could be proud of its achievement.

FINAL THOUGHTS

A case study necessarily has to cut off at some point, though projects continue to develop and do not stop. The point of award was selected to mark the end of this case, but the FTS2000 project continues to move through major activities which, in their turn, could provide interesting lessons. The most immediate event after award that should be mentioned in closing is that Martin-Marietta chose not to protest the award, though a protest by the losing vendor had been anticipated from the beginning. This was to Martin-Marietta's credit, as protests have become the norm rather than the exception. Had Martin-Marietta protested the award, this case would not have been published for perhaps another year.

As stated in the Foreword, one of the purposes of this document is to serve as a case study for teaching purposes. Consequently, no conclusions are drawn (other than that success had an element of chance unique to the times).
It is up to the individual reader to determine the lessons learned from the FTS2000 procurement. However, FTS2000 has demonstrated many things that should be pointed out. FTS2000 will affect the telecommunications industry, the taxpayer, and government operations as it is implemented and used.

For the telecommunications industry:

· FTS2000 has demonstrated that customers can obtain lower service prices outside of standard tariff offerings through market power and competition, instead of the traditional approach of engineering a private system.

· FTS2000 has demonstrated the existence of a viable, competitive, common-carrier industry at large demand levels. It has also demonstrated that telecommunications needs can be met by a variety of organizations, not necessarily the traditional common carriers. Although Martin-Marietta's bid lost the competition, it was a viable proposal, and one with many advantages over public alternatives.

· FTS2000 was the largest corporate award to a non-AT&T common carrier and demonstrated US Sprint's ability to bid and win large, complicated arrangements.

· FTS2000 is a major commitment by the world's largest organization, the U.S. federal government, to integrated services and ISDN.

· FTS2000 was a deliberate insertion of the newest technology into government to help it face the challenges of the 1990s.

· FTS2000 presents a unified source of supply and management method to the agencies of the government to meet voice, data, video, and integrated service needs.

· FTS2000 provides a massive infrastructure on which government agencies can build new systems.

· FTS2000 makes possible new applications, such as a national unified electronic mail system for government and high-speed fax between government locations. Both herald a new era of intragovernment communication.

For the taxpayer, FTS2000 means annual savings well in excess of $150 million, enough to support a major government program to address education needs, skills upgrading, or homelessness.

For the operations of government, FTS2000 has proved that:

· agencies can buy the best, most advanced technology; and

· agencies can undertake successful, speedy procurements.

Events will continue to unfold for FTS2000 in the 1990s. The transition, the largest in the history of technology, is yet to be undertaken. Many issues remain to be tested, such as:

· mandatory versus voluntary use (Note 6);

· the effectiveness of competition within the contract at the fourth and seventh years;

· the cost versus benefit of two awards compared to one;

· whether, in spite of the Freedom of Information Act, prices can be kept secret as desired by the vendors (Note 7); and

· whether the telecommunications user industry will really attempt to capitalize on the example of FTS2000.
