

The National Academies of Sciences, Engineering, and Medicine
500 Fifth St. N.W. | Washington, D.C. 20001

Copyright © National Academy of Sciences. All rights reserved.




5
Current Ballistic Image Databases: NIBIN and the State Reference Databases

Computerized image analysis systems, such as the Integrated Ballistics Identification System (IBIS), brought the promise of overcoming some of the limitations of dealing effectively with open case files of ballistics evidence. Well utilized, a ballistic image database maintained by an individual law enforcement agency could now surpass previous limitations of time and human recall. "Human memory or selected bullet or cartridge casing photographs [were] the only tools normally available" to draw connections between ballistics evidence in different cases; "it [was] not normally feasible to be able to link cases beyond a few weeks or months unless investigative intelligence otherwise links the cases" (Tontarski and Thompson, 1998:642). But another vexing challenge to traditional examination remained: the ability to draw connections between cases handled by different law enforcement agencies and in different geographic areas. To meet this need—to make it easier for agencies in a geographic region to submit evidence for imaging and comparison and to make it possible to highlight possible connections between cases across geographic lines—a wider network of ballistic imaging sites was necessary. Over the course of the 1990s, the National Integrated Ballistic Information Network (NIBIN) emerged and developed to meet this need.

In this chapter we describe the NIBIN program, two main policy options for which—maintenance as is or enhancement by various means—we are charged to assess. We describe the historical evolution of the program in Section 5–A and its current structure in Section 5–B. We then turn to various measures of the network's usage (5–C) and performance (5–D). Lastly, we describe in some detail in Section 5–E the existing reference ballistic image databases operated by the states of Maryland and New York. NIBIN and the state databases are decidedly not directly connected—as described in that section, the systems are physically walled off from each other as well as being distinctly different in their definition and composition. However, they are based on the same technical platform, and lessons from observing the state databases in operation can also inform possible enhancements for the NIBIN program. We return to the NIBIN policy options in Chapter 6. As with Chapter 4, a summary and our conclusions on the evidence in this chapter are in Chapter 6.

5–A  Evolution of the NIBIN Program

5–A.1  Early Development

The program that evolved into NIBIN began in 1992 with the development by the Bureau of Alcohol, Tobacco, Firearms, and Explosives (ATF) of the CEASEFIRE initiative, the objective of which was to "[enter] into a national computer system all data obtained from firearms seized as a result of a criminal investigation by ATF personnel" (NIBIN Program, 2001). Though oriented around a particular intervention by ATF personnel, the scope of the initiative was broader: "ATF intended to allow State and local law enforcement agencies to use and retrieve information for investigative purposes, and to submit information from their own firearms-related criminal investigations" (Thompson et al., 2002:10). Work on the database component developed in stages, beginning in 1993 with a partnering between the ATF National Laboratory Center in Ammendale, Maryland, and the Washington, DC, Metropolitan Police Department. This initial pilot work made it possible "to evaluate the impact of operator variability on image quality and matching, networking limitations, and ease of operator use for data entry, as well as correlations and system maintenance" (Tontarski and Thompson, 1998:646). The program grew to include other regional affiliations between ATF laboratories and major state and local law enforcement agencies: partnerships emerged between the ATF Atlanta laboratory and the Georgia Bureau of Investigation, and between the ATF Walnut Creek, California, laboratory and the Oakland Police Department and Contra Costa County Sheriff's laboratories. Also in 1993, the BULLETPROOF system—the bullets-only predecessor to IBIS (see Section 4–A)—was adopted as CEASEFIRE's hardware and software platform.

In 1995, ATF developed a set of criteria for participation in CEASEFIRE by state and local law enforcement agencies, including the population and firearms-related crime rates of areas; ATF also considered "known firearms trafficking routes that cross jurisdictional lines" in selecting sites (NIBIN Program, 2001:6). Priority was given to agencies that had demonstrated

willingness to participate in joint investigative programs with the ATF. This initial set of criteria developed into the guidelines now used to evaluate new applicants to host NIBIN sites; see Box 5-1.

BOX 5-1
Criteria for Participation in the NIBIN Program

To request participation in the NIBIN program, an executive of a state or local law enforcement agency had to submit a letter including the following information:

•  the population of the area to be served by automated ballistics technology,
•  the number of firearms-related violent crimes in the area serviced by the requesting agency,
•  statistics on firearms-related assaults and homicides for the previous year,
•  the number of firearms recovered by the requesting agency for the previous year,
•  the number of firearms traced by the requesting agency during the previous year,
•  whether the requesting agency had a firearms/toolmark examiner,
•  whether the requesting agency would dedicate staff to support the data entry of ballistics information into the IBIS equipment,
•  whether the requesting agency had a bullet and casing recovery system,
•  whether the requesting agency had sufficient space that was climate controlled for placement of the equipment,
•  whether the agency would allow other agencies to use the IBIS equipment if the requesting agency received it, and
•  whether the agency would enter into a memorandum of understanding (MOU) with the ATF regarding the administration of the program.

SOURCE: Reproduced from U.S. Department of Justice, Office of Inspector General (2005:85–86).

5–A.2  DRUGFIRE, Interoperability, and a Unified System

By the mid-1990s the looming problem of two potentially overlapping national databases became more apparent: the ATF continued to pursue development of CEASEFIRE at the same time that the Federal Bureau of Investigation (FBI) worked with law enforcement agencies to populate the DRUGFIRE database (see Box 5-2). The DRUGFIRE and CEASEFIRE systems were initially complementary, in that DRUGFIRE was focused on imaging cartridge case evidence and CEASEFIRE on imaging bullets. However, as both systems continued to develop, and as the technology underlying both programs was upgraded to handle both bullets and cartridge cases, practical concerns about redundancy (the need to maintain two systems) and resources came into greater relief.

BOX 5-2
DRUGFIRE

"Conceived by the FBI in 1991 as a part of the response to a call from the Office of National Drug Control Policy for an emergency action plan to help the Washington, DC Police cope with the rising tide of drug-related violence," the DRUGFIRE system was established in 1993 using a computer system developed by Mnemonics Systems, Inc. (Denio, 1999:383). DRUGFIRE differed from the successor IBIS in terms of which types of evidence were implemented first. IBIS began with BULLETPROOF (analyzing bullets) and later added the capacity to image cartridge cases (BRASSCATCHER). At the time of the Office of National Drug Control Policy (1994) benchmark evaluation of DRUGFIRE and BULLETPROOF, DRUGFIRE was limited solely to the analysis of cartridge casings; the ROTOSCAN add-on to acquire bullet images was developed around 1996 (Tulleners, 2001:2-2).

In their approach to acquiring and analyzing cartridge case evidence, DRUGFIRE and NIBIN diverge in two important respects. First, for purposes of comparison with other exhibits, DRUGFIRE "looked only at the breech face marks and not the firing pin impressions," while IBIS can separately generate scores and rankings by breech face, firing pin, and ejector marks. More fundamentally, DRUGFIRE used oblique illumination (side light), "much as used by firearms examiners at their comparison microscopes," while IBIS uses only radial illumination (center light) images for scoring purposes. The DRUGFIRE system typically required the acquisition of "two breech face images at 90-degree orientation" per exhibit (Tulleners, 2001:2-6).

Using DRUGFIRE technology, "cartridge cases are searched at approximately ten images per second; bullets are searched at approximately one image per second." Accordingly, "for large databases, users are encouraged to use filters based on class characteristics so that the number of images passed to the automated search is drastically reduced" (Denio, 1999:384).

As of May 1999, DRUGFIRE installations were located in 150 sites (Denio, 1999); Boesman and Krouse (2001) report that about 171 law enforcement agencies participated in DRUGFIRE between 1993 and 2001. Tulleners (2001:D-1) surveyed ballistic image database usage by a number of California law enforcement agencies, including the DRUGFIRE data collected from agencies in southern California (including the Los Angeles Police Department) and maintained by the Orange County Sheriff's Department. Across southern California, DRUGFIRE was credited with 431 cold hits on a total of 37,494 entries from centerfire weapons (about 78 percent of which were test fires from recovered firearms and 22 percent were evidence cartridges).

In 1995 the Office of National Drug Control Policy requested an independent technical "benchmark evaluation" of the BULLETPROOF and DRUGFIRE technologies to inform a comparison between the systems. The tests performed during this benchmark evaluation suggested strengths in both systems. However, the evaluation concluded that "processing casings and projectiles on a common versatile platform would best fulfill ballistic imaging requirements." This recommendation added impetus to the development of BRASSCATCHER as a counterpart to BULLETPROOF, and the combined IBIS system became the norm in existing and new CEASEFIRE sites in 1996.

In January 1996 the FBI and ATF jointly agreed in a memorandum of understanding that IBIS and DRUGFIRE equipment should be made interoperable—specifically, that both systems "are able to (1) capture an image according to a standard protocol and in conformity with a minimum quality standard and (2) exchange images electronically in such a manner that an image captured on one system can be analyzed and correlated on the other" (NIBIN Program, 2001:7). The joint effort was dubbed the NIBIN system. Accordingly, a contract was established with the National Institute of Standards and Technology to study the technical interoperability of the two systems. Ultimately, however, true technical interoperability of the systems—converting the data in each system so that it could be used on both the DRUGFIRE and IBIS platforms—was not achieved. Instead, in 1999 a new memorandum of understanding established a partnership structure: the technical platform of the ATF program (IBIS) was adopted as the hardware/software standard, and the network would be constructed using the high-speed secure infrastructure maintained by the FBI.

The partnership between the FBI and ATF in building NIBIN was further cemented by the structure of the NIBIN executive board (consisting of one senior ATF executive, one senior FBI executive, and an executive from a state or local law enforcement agency) and its technical working groups. However, by October 2003, it was recognized that having two agencies responsible for different aspects of the same national program was an ineffective management arrangement. Accordingly, network responsibilities and authority were transferred from the FBI to ATF, and ATF became solely responsible for all aspects of the NIBIN program.

5–A.3  Full Implementation

State and local law enforcement agencies were added to the network in stages. An initial network connecting several northeastern agencies was set up in late 1998 (McLean, 1999:392), and the steps toward full rollout of the program were formalized in a strategic plan in 2000. The largest push in the rollout occurred during a 2-year deployment in 2001–2002, "in which 160 sites have received IBIS equipment," moving toward a "completed" network of "approximately 233 sites" (Thompson et al., 2002:11).

Thompson et al. (2002:11–12) note that "agencies may become part of the NIBIN program in two ways: through inclusion on the tentative deployment list or by nomination." In addition to signing a memorandum of understanding—agreeing to abide by ATF's regulations for use of the equipment, including the entry of evidence from crime-related guns only (Thompson et al., 2002:12)—

An agency must commit its own resources to the NIBIN program. . . . Agencies joining NIBIN must commit to maintaining adequate staff to support the program, and will need a comparison microscope and access to a bullet recovery system to testfire firearms. Agencies receiving a Remote Data Acquisition Station (RDAS) must have a firearms examiner available to evaluate correlation results; in some labs it is helpful to have trained technicians make entries into the IBIS system, freeing examiners to review results and confirm hits by examination of the original evidence. . . . Partner agencies must commit to entering as much crime gun evidence into the unit as possible, and to sharing intelligence information and evidence with other law enforcement agencies.

An audit report on the NIBIN program by the U.S. Department of Justice (DOJ), Office of Inspector General (2005) indicates that "the ATF has not made any plans to deploy IBIS equipment to additional agencies beyond [those] that have already received it," save for case-by-case requests by individual agencies and relocation of equipment from low-usage sites. However, the report suggests efforts to expand NIBIN technically by linking it with the ATF's N-Force case management system and to the National Tracing Center. "The ATF is also conducting a pilot program called 'COPS and DOCS,' which joins together health care and law enforcement professionals who recover firearms evidence and enter it into NIBIN. . . . When gunshot victims are brought into the hospital, bullets from wounds are packaged with identifying information and placed in an evidence box that is located in the hospital's operating room." The recovered bullets are then retrieved and entered into NIBIN by ATF (U.S. Department of Justice, Office of Inspector General, 2005:13–14).

5–B  NIBIN Content and Structure

5–B.1  Regions and Partitions

As of December 2005, the NIBIN program included 228 partner sites representing 182 agencies. At least one NIBIN site is located in each state with the exception of Kentucky. The sites are grouped into 12 geographic regions, each of which is linked to servers in one of ATF's three national laboratories. Servers at the ATF laboratory in Ammendale, Maryland, are the central hub for NIBIN sites in the northeast and north central states; servers in Atlanta, Georgia, link the southeast United States and Puerto Rico; and Walnut Creek, California, is the focal point for NIBIN sites in Texas, the western United States, Alaska, Hawaii, and Guam. The geographic distribution of NIBIN sites and servers is illustrated in Figure 5-1.

The regional servers are central to the operation of NIBIN. They are not only the central data repository for the region—combining and archiving data from the distributed sites—but also the "correlation" servers for the region. That is, an exhibit entered into NIBIN in Idaho is uploaded to the Walnut Creek servers for comparison with other NIBIN exhibits; the correlation results are then sent back to Idaho for review. Batches of exhibits are transferred from the local sites to the regional servers at least once a day; the exhibits are compiled, comparison scores are generated, and results and images are sent back to the local sites.

Each of the regional servers is divided into several partitions; these partitions are important because they define the range of automatic comparisons (versus those comparisons requested manually). For example, NIBIN region 1B covers central and southern California, and it is divided into three partitions (roughly, northern, central, and southern). The southern partition contains two NIBIN installations: the San Diego Police Department and the San Diego County Sheriff's Department. Hence, an exhibit entered into NIBIN by the San Diego County Sheriff's Department is automatically correlated against exhibits from both San Diego-area NIBIN sites (after uploading to Walnut Creek). However, searches against data from other sites—Los Angeles or Orange Counties, for example, or Yuma, Arizona—must be specially requested by a NIBIN operator. The Inspector General audit of NIBIN (U.S. Department of Justice, Office of Inspector General, 2005:110) notes:

Although regional and national searches can be performed, they must be manually selected. To perform a regional search, the requestor must designate where to search from a map of the NIBIN regions. The requestor is then presented with a list of all the partner agencies in that region, and can either search against all the partner agencies shown or de-select those partner agencies that the requestor does not want included in the search.

FIGURE 5-1  Geographic distribution of NIBIN sites. [The figure, a two-page listing of NIBIN partner sites grouped by region and partition, is not reproduced here.]
NOTES: * indicates presence of more than one piece of equipment (e.g., multiple RDAS stations or a combination of RDAS and MatchPoint viewers); see Chapter 4 for a description of IBIS equipment. Region numbers are indicated at the top of boxes; partitions are indicated in bold type; portable Rapid Brass Identification (RBI) units are indicated in italic, nested beneath their RDAS partner site.
SOURCE: U.S. Department of Justice, Office of Inspector General (2005:App. VI).


In addition, the "national" scope of NIBIN—like the scope of a national reference ballistic image database (RBID)—suggests an ease in requesting a search against the entire nation that is not the case under the current NIBIN search. "To perform a national search, the requestor must repeat the regional search for each NIBIN region"—12 separate searches—"as the system will not search all regions at once" (U.S. Department of Justice, Office of Inspector General, 2005:110).

About 25 of the NIBIN installations may be considered satellite sites in that they possess only one or more Rapid Brass Identification (RBI) units, portable units for acquiring images from cartridge evidence. These RBI units must be connected with another site's full Remote Data Acquisition Station (RDAS) in order to transmit collected images to the regional server; "afterwards, the results are transmitted back through the RDAS unit to the RBI unit" (U.S. Department of Justice, Office of Inspector General, 2005:10). Some departments have experienced major problems with RBI units, including overheating and data transmission flaws; notably, after continued problems, the (non-NIBIN) Maryland RBID program ultimately returned the RBI unit with which it had planned to let the Baltimore Police Department directly submit database entries (Maryland State Police Forensic Sciences Division, 2003, 2004).

NIBIN sites are meant to provide regional access to ballistic imaging technology, and so individual law enforcement agencies within a region may partner with a NIBIN site to enter evidence as needed. Individual agencies in states with only one NIBIN installation (e.g., Iowa, Montana, and Wyoming) may route evidence through that site as they see fit. Several states have NIBIN sites at regional laboratories maintained by state police, which may be used by individual city departments; Virginia, for example, distributes its NIBIN equipment among three regional state labs. Even some major city police departments, such as those of Chicago, Atlanta, Milwaukee, Memphis, and Seattle, do not have their own NIBIN sites and work through state or county NIBIN sites. Prominent among the law enforcement agencies that are not NIBIN participants is the New York City Police Department (NYPD). While NYPD does follow NIBIN program protocols for the entry of ballistics evidence, the department purchased and maintains its own IBIS equipment; it has not linked directly to NIBIN due to the desire to maintain the integrity of its own database (McCarthy, 2004). However, NIBIN and NYPD continue to work on limited ties between the two databases (e.g., mounting archive data tapes off-site from NYPD for comparison under NIBIN).

5–B.2  Legal Limitations on NIBIN Content

ATF maintains tight control over the content of the NIBIN database, limiting it to pieces of evidence recovered at crime scenes or test fired from weapons recovered by the police. This prohibition on the entry of

noncrime gun exhibits derives from the Firearm Owners' Protection Act of 1986 (18 U.S.C. 926), which prohibits the establishment of "any system of registration of firearms, firearms owners, or firearms transactions or dispositions." It also derives from ATF interpretation of language that is regularly applied to the agency's appropriations. For instance, the 2006 Science, State, Justice, and Commerce and Related Agencies Appropriations Act (P.L. 109-108) included 12 conditional clauses on the appropriated funds. First among these is the proviso that "no funds appropriated herein shall be available for salaries or administrative expenses in connection with consolidating or centralizing, within the Department of Justice, the records, or any portion thereof, of acquisition and disposition of firearms maintained by Federal firearms licensees." ATF has interpreted the acquisition of an image from a specimen fired from a gun for sale as such a "record," and hence excluded new guns from consideration in the database. (Other clauses in the appropriations act limit the type of information that can be transferred or maintained in the standard gun tracing process and prohibit rules requiring a physical inventory of the stock maintained by firearms licensees.)

5–C  NIBIN Usage

5–C.1  Deployment

One metric by which utilization of NIBIN can be assessed is the number of participating agencies relative to the number of eligible law enforcement agencies. Each piece of evidence entered in NIBIN is associated with its source agency through specification of an Originating Agency Identifier (ORI) code; ORIs are assigned by the FBI and are principally used to identify reporting agencies for the Uniform Crime Reports. In its audit, the U.S. Department of Justice, Office of Inspector General (2005:140), used the total number of ORIs predefined in NIBIN software (for selection by evidence-entry operators) as its measure of eligible agencies. Hence, it concluded that 231 of 38,717 agencies/ORIs were NIBIN partner sites. Responding to a draft report, ATF argued that the total number of ORIs is an inappropriate benchmark (U.S. Department of Justice, Office of Inspector General, 2005:130):

ATF believes that it is misleading to use the number of ORIs as the statistical basis to evaluate technology allocation, program utilization, and performance because one single agency can have numerous ORIs assigned to it. By way of example, ATF alone has over 362 ORIs or about fifteen per field division. Similarly, many of the larger NIBIN State and local law enforcement partners have multiple ORIs within an agency, and all local law enforcement jurisdictions have at least one ORI number, regardless of size.

OCR for page 133
CURRENT BALLISTIC IMAGE DATABASES 151 BOX 5-3 NIBIN Definition of “Hit” Definition of a Hit: A linkage of two different crime investigations by the user of the NIBIN technology, where previously there had been no known connection between the investigations. A hit is a linkage between cases, not individual pieces of evidence. Multiple bullets and/or casings may be entered as part of the same case record, in this event, each discovered linkage to an additional case constitutes a hit. A hit must be confirmed by a firearms examiner examining the actual speci- mens under a microscope. Other NIBIN linkages derived by investigative leads, hunches, or previously identified laboratory examinations, are not “hits” according to this definition. There- fore, other linkages previously termed “warm hits” should not be counted as hits. When an interagency hit occurs, the agency initiating and confirming the m ­ icroscopic comparison will be credited for the hit. Marking Hits in IBIS: Hits meeting the definition above should be linked in IBIS, using the procedures provided in instructional materials from Forensic Technol- ogy WAI, Inc. (FTI). Remember that if a link is confirmed between two cases, it is necessary to note this in each IBIS case record. Linkages derived by investigative leads, hunches, or previously identified labo- ratory examinations should only be noted in the comments section of the IBIS screen. These linkages are not to be designated as hits. When an interagency hit is confirmed, each involved site should mark the hit in IBIS, using procedures provided in instructional materials from FTI. Statistical Reporting: For interagency hits, only the agency initiating and confirm- ing the comparison should include the hit in its statistics reported to ATF NIBIN. Please note in the current version of IBIS, the Crystal Reports function for generating hit statistics may not yield entirely accurate results and should not be used. SOURCE: Reproduced from NIBIN Branch (2003). 
hit. Hits are calculated in aggregate, so it is not possible to determine empirically how evidence and nonevidence entries compare in their propensity to generate hits.

A basic summary of the operational data is given in Table 5-1. Cartridge casings make up about 71 percent of the database entries; in turn, about 72 percent of those casings are "nonevidence" test fires (as opposed to "evidence" casings directly recovered at the crime scene). An even larger fraction, 81 percent, of the bullets in the database are test fires. In this

TABLE 5-1  NIBIN Usage Data, May 2003–April 2004

                      Casings                                        Bullets
       Evidence  Test Fire            Hits     Hits    Evidence  Test Fire            Hits    Hits
Month   (month)    (month)    Total  (month)  (total)   (month)    (month)    Total  (month) (total)
 5/03     2,682      7,410  441,636     166    6,331       546      2,765  195,382       3     293
 6/03     3,206      6,471  451,006     160    6,514       637      2,387  198,355       6     299
 7/03     3,435      7,423  461,669     190    6,828       727      2,506  201,525       1     300
 8/03     2,757      7,169  471,009     160    7,026       554      2,209  204,172       2     302
 9/03     3,030      7,672  481,465     250    7,372       607      2,095  206,842       1     298
10/03     2,998      7,069  491,182     204    7,592       506      2,188  209,415       2     300
11/03     2,747      6,990  500,786     156    7,776       543      2,275  212,169       0     300
12/03     2,544      7,206  510,383     201    7,977       507      2,221  214,869       2     302
 1/04     2,544      7,129  520,447     207    8,184       680      2,156  217,627       7     305
 2/04     3,167      7,549  530,978     147    8,332       543      2,579  220,676       1     306
 3/04     3,465      9,662  520,257     249    8,570       625      2,880  223,930      14     316
 4/04     3,062      8,606  554,578     228    8,910       510      2,694  226,830      72     388

NOTES: As of April 2004, 397,349 of the 554,578 total cartridge casings in the NIBIN database were from test fires, and 183,756 of the 226,830 bullets in the database were test fires. The apparent reason for the decline in the total number of casings in the NIBIN database between February and March 2004 is transcription error in the raw spreadsheets for several police departments in Michigan, particularly the Detroit Police Department (for which the reported total numbers of casings in February, March, and April 2004 are 19,342, 3,956, and 19,927, respectively). Returns for five Michigan State Police laboratories and the Oakland, Michigan, Police Department show similar one-time drops in total number of casings (albeit each with smaller counts than Detroit). Because the monthly totals can reflect deletions of exhibits as well as additions, we have not attempted to reconstruct the totals by adding from a May 2003 base and instead use the reported (albeit, for at least several agencies in March 2004, flawed) totals from the source spreadsheets.
SOURCE: Data from the Bureau of Alcohol, Tobacco, Firearms, and Explosives (personal communication).
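As a quick consistency check, the headline shares quoted in the text (about 71 percent casings, 72 and 81 percent test fires, and roughly 193 casing hits per month) can be recomputed directly from the Table 5-1 figures; a minimal sketch:

```python
# Recompute the summary shares quoted in the text from the Table 5-1 totals
# (April 2004 cumulative figures and the May 2003-April 2004 monthly casing hits).
total_casings = 554_578
test_fire_casings = 397_349
total_bullets = 226_830
test_fire_bullets = 183_756

# Share of database entries that are cartridge casings (~71 percent).
casing_share = total_casings / (total_casings + total_bullets)

# Shares of each exhibit type that are nonevidence test fires (~72 and ~81 percent).
casing_test_share = test_fire_casings / total_casings
bullet_test_share = test_fire_bullets / total_bullets

# Monthly casing hits, May 2003 through April 2004 (~193 per month on average).
casing_hits = [166, 160, 190, 160, 250, 204, 156, 201, 207, 147, 249, 228]
avg_casing_hits = sum(casing_hits) / len(casing_hits)

print(round(casing_share * 100))       # 71
print(round(casing_test_share * 100))  # 72
print(round(bullet_test_share * 100))  # 81
print(round(avg_casing_hits))          # 193
```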

sample of months, March 2004 was the peak month for entering both bullets and casings; for both types of evidence, entry was generally lower in November–January than in March–July. Absent information on the number of queries performed and more specifics on the nature of evidence entered, it is not clear why the number of hits on bullet evidence jumped from a seemingly steady state of fewer than 10 per month to 14 in March 2004 and then to 72 in April 2004; casing hits were generated at an average of 193 per month.

The U.S. Department of Justice, Office of Inspector General (2005:25–26), audit of NIBIN had access to a snapshot of the NIBIN database as of October 2004, including 888,447 records of bullet and cartridge case evidence, 514,731 records of cases (groupings of exhibits), and 254,187 records of firearms. Just as analysis showed that a small percentage of sites accounted for a large share of evidence entered into NIBIN, high-entry sites also enjoyed the largest percentage of the hits made using the database. In all, 72 percent of the hits (on both bullets and cartridge casings) were realized by the 20 percent of NIBIN partners who had input the most entries; the bottom 55 percent of entry-producing partners achieved only 9 percent of the hits (U.S. Department of Justice, Office of Inspector General, 2005:31).

Both our set of aggregate administrative data and the Inspector General's snapshot of the entire database provide some inkling as to the structure and composition of the database; still lacking is any ability to describe how the system is actually used and how it performs. The Inspector General audit attempted to get a basic sense of the system's utilization by comparing the number of evidence entries put into NIBIN by individual sites with the level of firearms-related crimes those agencies reported under the FBI's Uniform Crime Reporting (UCR) program.
That analysis did not progress far; "meaningful comparisons were not possible based on the available data because of variables such as population size, population density, geographic location, and other demographic factors" (U.S. Department of Justice, Office of Inspector General, 2005:30). However, the report suggests that the major deployment of IBIS equipment to complete the planned NIBIN network had worked to narrow a broader gap between the number of NIBIN entries and the number of gun crimes.

We pursued a similar line of analysis in order to study whether a connection exists between success in generating hits and the level of crime in an area. This requires a linkage between the NIBIN usage data and the UCR data, and such a connection is fraught with complications more fundamental than the quote from the Inspector General audit admits. The Bureau of Justice Statistics (BJS) has compiled a "crosswalk" dataset, linking UCR ORI codes with BJS' Directory of Law Enforcement Agencies and data from the Census Bureau's Governments Integrated Directory (Lindgren

and Zawitz, 2001) in order to estimate the service population of agencies reported in the UCR. However, defining the service population for a NIBIN site is complicated (as described in Section 5–B.1): NIBIN partner agencies process evidence submitted by other agencies because the number of NIBIN installations is relatively low. In states where NIBIN sites are located in laboratories of the state police (e.g., Wisconsin or Virginia), comparing NIBIN entries with crimes in the city where the NIBIN site is located is certainly inadequate, yet trying to associate each site with a proportionate share of the crimes in the entire state is likely inaccurate as well.

Due to these difficulties, we have treated our own attempts to link the NIBIN operational data with UCR figures as merely suggestive and in no way definitive; yet we judged it important to try to get some sense of whether high-crime areas are likely to benefit from hits achieved by ballistic image comparisons. We erred on the side of simplicity by using crime data from the NIBIN site's home city as a proxy for the number of crimes committed in the site's service area, combining NIBIN entry counts in cases where multiple installations are located in the same city. We also filtered cases to look only at NIBIN host cities with populations above 10,000. In this way, we augmented the NIBIN dataset with three variables for those sites for which we believed we could establish a pairing: population of the city in which the NIBIN site is located in 2003, average number of murders and non-negligent manslaughter incidents in 2002 and 2003, and total number of violent crimes (including, e.g., assault and forcible rape) in 2003.

After aggregating the information from sites located in the same city or town and deleting sites with missing information, our analysis dataset included 105 cases with complete NIBIN, population, and crime information.
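The city-level pairing described above can be sketched as follows. All site names and figures here are invented for illustration; the real dataset paired NIBIN operational counts with UCR-derived population and crime variables.

```python
# Sketch of the city-level pairing: aggregate NIBIN entry/hit counts for sites
# sharing a host city, then attach that city's population and crime counts.
# All site names and numbers below are hypothetical.
from collections import defaultdict

nibin_sites = [
    # (host_city, evidence_casings, test_fire_casings, casing_hits)
    ("Springfield", 1200, 3400, 18),
    ("Springfield", 800, 2100, 7),   # second installation, same city
    ("Riverton", 450, 900, 2),
]

city_crime = {
    # host_city: (population_2003, avg_murders_2002_03, violent_crimes_2003)
    "Springfield": (154_000, 21.5, 1_830),
    "Riverton": (48_000, 3.0, 260),
}

# Aggregate multiple installations located in the same city.
by_city = defaultdict(lambda: [0, 0, 0])
for city, ev, tf, hits in nibin_sites:
    by_city[city][0] += ev
    by_city[city][1] += tf
    by_city[city][2] += hits

# Keep only cities with population above 10,000 and complete crime data.
analysis = {
    city: (*counts, *city_crime[city])
    for city, counts in by_city.items()
    if city in city_crime and city_crime[city][0] > 10_000
}
print(analysis["Springfield"][0])  # 2000 evidence casings after aggregation
```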
Of these, there were 33 NIBIN sites/localities at which at least one bullet hit had been obtained and 72 localities with no hits on bullets. The number of hits when using casings was significantly larger: at 85 localities there was at least one hit on casings, and there were only 20 localities at which no hits on casings were reported. We analyzed bullets and casings separately. For each type of evidence,

We also emphasize that this analysis is intended merely to be suggestive due to the limitations of the original operational dataset. As a record only of aggregate database additions and "hits," it is not as complete a resource for studying NIBIN system performance as would be desirable. In addition, as the note for Table 5-1 suggests, the operational data spreadsheets appear to be manual updates rather than system-generated tallies. A discrepancy in the cumulative count of cartridge casings from month to month led to the discovery of a significant error in reporting by Michigan agencies (particularly the Detroit Police Department), whose totals for 1 month were more than halved. Our analysis uses the year-end total entry and hit counts and so should not be affected by month-to-month discrepancies, but recording errors do apparently exist in the raw underlying data.

we constructed a binary variable: 0 indicated no hits, and 1 indicated at least one hit. We performed two basic analyses. First, we used logistic regression to model this binary response variable as a function of the number of NIBIN entries (four variables, covering both evidence and test-fired bullets and casings), population size, and the two crime count variables. Second, we restricted attention to only those localities where at least one hit was reported (for bullets or casings) and modeled the number of hits as the same function of potential predictors. Because the number of positive hits is not nearly normally distributed and because the number of sites reporting hits on bullets is very low (only 32 sites), we focused on modeling the number of hits on casings and used a log transformation to improve the distribution of the outcome variable.

Taking into account the contributions of the other predictors in the model, we found that the probability of a hit on cartridge casings increased as a function of the number of violent crimes in the NIBIN site locality during 2003 as well as of the number of evidence casings entered in the system. Likewise, the probability of a hit decreased with the number of murders and non-negligent manslaughters in the locality and with the total number of bullets entered into the NIBIN system. The negative association between the probability of a hit on casings and the number of bullets in the NIBIN system at the site might be spurious and is likely attributable to correlation between the number of bullets and the number of casings, the latter of which makes a stronger contribution to the model. Alternately, it might be due to an unobserved underlying variable correlated with both the number of bullets in the system and the probability of a hit on casings.
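The first of these analyses can be sketched as follows. This is a minimal pure-NumPy illustration on synthetic data, not the committee's actual model or data: one standardized predictor stands in for the full set of entry, population, and crime variables, and the coefficient values are invented.

```python
# Sketch of the first analysis: model hit/no-hit at each locality as a logistic
# function of entry counts. Newton-Raphson fit of the logistic MLE on synthetic
# data; the real analysis used the 105-locality NIBIN/UCR dataset.
import numpy as np

rng = np.random.default_rng(0)
n = 105  # number of localities with complete data

# Synthetic predictor: evidence casings entered (standardized), plus intercept.
evidence_casings = rng.normal(size=n)
X = np.column_stack([np.ones(n), evidence_casings])

# Simulate the finding that more evidence entries raise the hit probability.
true_beta = np.array([-0.5, 1.2])
p_true = 1.0 / (1.0 + np.exp(-X @ true_beta))
y = rng.binomial(1, p_true)

# Newton-Raphson iterations for the logistic maximum-likelihood estimate.
beta = np.zeros(2)
for _ in range(25):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)
    grad = X.T @ (y - p)
    hess = X.T @ (X * W[:, None])
    beta = beta + np.linalg.solve(hess, grad)

# Proportion of concordant pairs: among (hit, no-hit) locality pairs, how often
# the fitted probability is higher for the hit locality.
p_hat = 1.0 / (1.0 + np.exp(-X @ beta))
ones, zeros = p_hat[y == 1], p_hat[y == 0]
concordant = (ones[:, None] > zeros[None, :]).mean()

print(beta[1] > 0)  # evidence-casing coefficient recovered as positive
```

The proportion of concordant pairs computed at the end is the same fit measure the text cites for the committee's models.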
However, the probability of a hit on bullet evidence increased only as a function of the number of bullets collected as evidence that were entered into the NIBIN system; no other factor appears to be significantly associated with the probability that a bullet match will be found.

We checked the fit of the logistic regression models by considering the proportion of concordant pairs and by examining the chi-squared residuals (to identify influential observations and outliers). For bullets, the percentage of concordant pairs exceeded 90 percent, and for casings it was approximately 89 percent, indicating that the predictors in the model explain a significant portion of the between-locality variability in the probability of a hit.

Looking next at the 85 localities reporting at least one hit on casings, we fit a linear regression model to the rate of hits (computed as the number of hits divided by the total number of casings in the system) on the log scale. Predictors in the model included population of the locality, average number of murders and non-negligent manslaughters in the locality in 2002 and 2003, total number of violent crimes in 2003 in the locality, the total number of evidence casings in NIBIN, the total number of nonevidence (test fire) casings in NIBIN, and the total number of bullets entered in the NIBIN

system at the locality. The log of the rate of casing hits was significantly associated with the number of evidence casings entered into NIBIN at the locality but was not associated with any other predictor. In particular, the association between the log rate of hits and the number of nonevidence casings in the system was negative, albeit not statistically significant.

We examined the fit of the linear model by inspecting residuals and estimating the degree of multicollinearity among predictors. All of the predictors in the model were positively correlated, which might partially explain the lack of statistically significant associations between the response variable and the predictors. A reasonable proportion (43 percent) of the total variability observed in the log probability of a casing hit was explained by the predictors in the model, and no outliers were detected when inspecting the standardized residuals from the regression. However, patterns in the standardized residuals plotted against the observed log probabilities of hits on casings do suggest that other, potentially important predictors are missing from the model.

The most basic interpretation we draw from this analysis, despite its limits, is the same reached by the Inspector General audit: the probability of getting a hit on either bullets or casings depends vitally on the number of entries made in the NIBIN system at each locality. We observed the strongest connection to be with the counts of bullets or casings entered as evidence, whereas hit probabilities were negatively (but not significantly) associated with the number of nonevidence (test fire) samples entered into the system. This finding suggests that agencies might be better served by prioritizing entries so that evidence samples are entered into NIBIN most promptly.
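The second analysis, ordinary least squares on the log casing-hit rate, can be sketched similarly. Again the data, coefficients, and the particular pair of predictors are invented for illustration; the sketch mimics one feature the text reports, namely strong positive correlation among predictors.

```python
# Sketch of the second analysis: OLS on log(hits / total casings) at the
# hit-reporting localities, with deliberately correlated predictors to mirror
# the multicollinearity noted in the text. Synthetic data throughout.
import numpy as np

rng = np.random.default_rng(1)
n = 85  # localities reporting at least one casing hit

evidence_casings = rng.lognormal(mean=6.0, sigma=0.8, size=n)
# Predictors were positively correlated in practice; mimic that here.
test_fire_casings = evidence_casings * rng.lognormal(0.5, 0.3, size=n)

# Simulated outcome: log hit rate driven mainly by evidence entries.
log_rate = -6.0 + 0.4 * np.log(evidence_casings) + rng.normal(0, 0.3, size=n)

X = np.column_stack([np.ones(n),
                     np.log(evidence_casings),
                     np.log(test_fire_casings)])
beta, *_ = np.linalg.lstsq(X, log_rate, rcond=None)

fitted = X @ beta
resid = log_rate - fitted
r_squared = 1.0 - resid.var() / log_rate.var()

# Correlation between the two (logged) predictors: the multicollinearity that
# can mask individual predictors' significance.
corr = np.corrcoef(np.log(evidence_casings), np.log(test_fire_casings))[0, 1]

print(r_squared > 0.3)
print(corr > 0.5)
```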
5–E  State Reference Ballistic Imaging Databases

The existing state reference ballistic image databases in New York and Maryland operate using the same IBIS computer and microscope imaging equipment; their networks and correlation servers are entirely distinct, however, thus complying with the current prohibition on noncrime-gun evidence in the NIBIN database.

5–E.1  Maryland: MD-IBIS

As part of a larger gun legislation package, the Maryland Responsible Gun Safety Act of 2000 established a statewide database of images of cartridge cases test fired from every handgun sold, rented, or transferred by manufacturers in the state or whose products are sold in the state, under the premise that handguns are the firearms most frequently used in crimes. The database, known as Maryland-IBIS (MD-IBIS), was established under the Maryland

State Police, effective September 1, 2000. The database is not connected to the NIBIN network and is not networked to any other law enforcement agency in the state (though an unsuccessful attempt was made to permit entry of casings by the Baltimore city police; see Section 5–B.1). Casings for entry and comparison with MD-IBIS must be taken to the state police facility in Pikesville and processed there.

In September 2003 and September 2004 the Maryland State Police Forensic Sciences Division (2003, 2004) issued progress reports on MD-IBIS; the two reports were diametrically opposed on the question of supporting continued collection of image data. The first report (Maryland State Police Forensic Sciences Division, 2003:i) took an optimistic stance, arguing that the "time to crime" window (the length of time between the sale of a gun and its appearance as a crime gun) is roughly 3–6 years. Hence, the report argued that MD-IBIS was just entering this period for guns sold in 2000 and that additional investigative "hits" would be forthcoming. The analysis drew from the example of the state's DNA database, which "was started in 1994 and obtained its first hit in November 1998." However, one year later, the Maryland State Police Forensic Sciences Division (2004:i) reversed course, citing "the failure of the MD-IBIS to provide any meaningful hits." The report found that the program "has not met expectations and does not aid in the Mission statement of the Department of State Police." It recommended that the data collection be suspended and that MD-IBIS staff be transferred to the DNA database unit. Both reports also commented on other problems involved with collection of the data, including detected cases where the cartridge case sample packaged with a new firearm did not in fact correspond to that firearm; these arguments are discussed elsewhere in this report, particularly Section 9–C.2.
The 2003 report estimated the average annual cost of MD-IBIS operations over its then 3-year lifetime at $460,700, which included four staff members (technicians or firearms examiners), supplies, and service costs for the equipment. The 2004 report, which included initial capital to purchase the IBIS equipment, placed the cumulative cost of the database over 4 years at $2.6 million. Based on handgun sales data prior to the enabling law's passage, it was projected that cartridge casings for approximately 30,000 handguns would be entered into the system annually; actual entry had been only about one-third that amount, with 43,729 handguns in the database through 2004 (Maryland State Police Forensic Sciences Division, 2004:2), and only 49 of 215 handgun manufacturers had submitted casings for inclusion (Maryland State Police Forensic Sciences Division, 2003:2). Both reports attribute part of this shortfall to a general decrease in handgun sales in the state due to the full set of provisions in the 2000 act.
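The figures quoted from the two Maryland reports can be cross-checked with simple arithmetic. The per-year breakdown below is our inference, not a split stated in the source reports:

```python
# Rough consistency checks on the MD-IBIS figures quoted above. The per-year
# division and the implied capital figure are inferences, not reported values.
projected_per_year = 30_000        # projected handgun entries per year
entered_through_2004 = 43_729      # actual handguns in the database
years_of_operation = 4             # 2000 act through the 2004 report

actual_per_year = entered_through_2004 / years_of_operation
print(round(actual_per_year / projected_per_year, 2))  # 0.36, "about one-third"

# Cumulative cost implied by the two reports (the operating/capital split
# assumes the $460,700 annual operating figure held for all four years).
avg_annual_operating = 460_700        # 2003 report, operations only
cumulative_with_capital = 2_600_000   # 2004 report, includes IBIS purchase
implied_capital = cumulative_with_capital - years_of_operation * avg_annual_operating
print(implied_capital)  # 757200
```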

Of 208 police queries on the MD-IBIS database, six produced matches that were later determined to be hits. Two hits were produced in 2002 when ATF submitted for comparison two .45 caliber Taurus semiautomatic pistols whose serial numbers had been obliterated; matches were found to two pistols that had been stolen from the same dealership in December 2001. Two later hits also traced to two pistols (from different manufacturers) that had been stolen from a common dealership; investigative leads were generated in a robbery and "a major burglary and assault case" (Maryland State Police Forensic Sciences Division, 2003:7).

However, the Maryland State Police Forensic Sciences Division (2004:2) raised an important criticism of the MD-IBIS hits, which is that they generally "did not work according to the manner in which the system was designed." The underlying goal of an RBID is to generate an investigative lead to a point of sale without the need for the actual crime gun to be recovered; however, the gun in question was recovered and in police custody in five of the six confirmed hits. At the time of its publication, the Maryland State Police Forensic Sciences Division (2004:2) also critiqued the database's cost-effectiveness because "none of the 'hits' have been used in a criminal trial."

Following the 2004 review, the database's future appeared uncertain. In March 2005, the Maryland House Judiciary Committee held hearings on a bill to repeal the law establishing MD-IBIS (Butler, 2005). However, within a few weeks, one confirmed MD-IBIS hit that ran counter to both Maryland State Police Forensic Sciences Division (2004) criticisms (that the few hits being produced were in cases where the gun was already recovered and that none were being used in criminal proceedings) produced tangible results.
On April 1, 2005, Oxon Hill, Maryland, resident Robert Garner was convicted of first-degree murder, in a case in which the critical investigative spark was provided by an MD-IBIS hit. As Castaneda and Snyder (2005) report:

Although the [murder] weapon, a .40-caliber handgun, never was found, county police and prosecutors connected the firearm to Garner through 10 shell casings found at the scene. . . . The casings recovered at the murder scene matched a casing that was on file with Maryland State Police, showing that the weapon was purchased by Garner's then-girlfriend (now his wife) in a Forestville store about three weeks before the killing, according to trial testimony. "That evidence was the cornerstone of our case," said Glenn F. Ivey, the Prince George's [County] state's attorney. "It was powerful evidence. I hope this verdict helps our efforts to have the [MD-IBIS database] continued and expanded."

We note that this relatively quick 3-week span from sale to use in crime is inconsistent with the argument that RBIDs produce low numbers of hits due to a lengthy "time to crime."

This case appears to have won the system a temporary reprieve. The then-pending bill to scrap the MD-IBIS enabling legislation was not passed during the remainder of the 2005 legislative session; a similar bill was introduced during the 2006 regular session but also was not enacted.

Along with the proposals to stop work on the database, recent sessions of the Maryland legislature have also raised possible modifications to MD-IBIS; none have moved beyond referral to committee. In the 2006 session, HB 1369 would have waived the requirement of image entry into MD-IBIS if a firearm's manufacturer certified that microstamping was used on the gun's parts to impart markings on shell casings.

5–E.2  New York: CoBIS

The Combined Ballistic Identification System (CoBIS) is the state of New York's reference ballistic image database and is maintained by the New York State Police (NYSP) at its Forensic Investigation Center in Albany. The database began operation in March 2001, following the 2000 enactment of state legislation creating a "pistol and revolver ballistic identification databank." The law required that any manufacturer shipping or delivering a pistol or revolver within the state include a shell casing from a round fired through that weapon. Firearms dealers are then required, when the gun is sold, to forward the sample casing to the state police or, alternatively, to submit the weapon to be fired at a state police facility in order to collect a sample casing. Vendors at gun shows are dealers under this definition, and they are expected to comply with the law.

Many manufacturers or dealers comply with the law by performing test fires at their facilities and including the ballistic sample in an envelope. At the time of sale, this envelope containing the sample is sent to the Albany Forensic Investigation Center; the appropriate permit or license information is included on a slip of paper stapled to the envelope.
CoBIS operators detach the permit slip and forward it to the appropriate branch of the NYSP; no information from the slip (e.g., name of buyer) is processed or entered in CoBIS. The envelope is checked in and given a bar code or ID sticker and put into the queue for acquisition; the NYSP runs a backlog in acquiring these exhibits.

For the convenience of dealers in cases in which a ballistic sample is not included with the firearm, the NYSP maintains regional CoBIS centers at its six troop headquarters as well as a mobile center. Guns may be taken to any of these regional centers for test firing (using a water tank) and recovery of a sample cartridge. At the regional centers, casings are collected and checked in to the system (labeled and assigned a number), but ballistic images are captured only at the Albany headquarters.

The enacting legislation is codified as New York General Business Law, Article 26, Section 396-ff; the New York State Police subsequently published regulations on specific database operations as 9 NYCRR Section 493.1, Rule 18.

The Albany Forensic Investigation Center includes four IBIS Data Acquisition Stations, purchased from FTI, dedicated to entry of CoBIS samples. An FTI correlation server is connected to these four stations, as is an IBIS hub/Signature Analysis Station for use in querying the database. The Forensic Investigation Center is also a NIBIN location; a separate IBIS hub, in a separate room from the CoBIS DAS stations, is used for NIBIN entries. Any law enforcement agency in the state can submit exhibits for entry and comparison against CoBIS at no charge.

At the time of the database's creation, plans suggested an average of 25–50 comparison requests per day, including requests against crime-scene evidence as entered in NIBIN; actual usage was much lower than expected. In a letter to this committee in December 2004, Zeosky (2005) wrote:

Since its inception in March 2001, cartridge cases from more than 85,000 new handguns sold in New York have been submitted to CoBIS (14,590 from weapons test fired by the State Police and 71,346 provided by the manufacturers). To date there have been approximately 276 direct queries against CoBIS, with no "hits."

As of a 2005 visit by a subgroup of committee members to the CoBIS center, 400 internal NYSP queries had been made against the database, also with no hits.

Agreements have been made between the U.S. Department of Justice and the NYSP to arrange a one-way transfer of information. Specifically, ATF has provided the NYSP with NIBIN exhibits for other New York state law enforcement agencies, in the form of data tapes. With FTI assistance, these tapes have been used to run batches of requests on exhibits dating back to CoBIS' inception in March 2001.
As of the subgroup visit, about 2,400 such queries had been run using NIBIN exhibits in CoBIS; one cold hit (unconfirmed by a firearms examiner) was found between a CoBIS exhibit and a NIBIN entry from Rochester.

Just as NYSP has arranged an indirect, loose tie between CoBIS and NIBIN, so too has a limited connection been made between CoBIS and New York City's NIBIN-independent ballistic image database. As with NIBIN, the connection with New York City is one way and accomplished through transfer of tape archives. NYSP possesses a tape of all NYPD images; these remain to be sorted and filtered to limit the focus to post-March 2001 entries. CoBIS and the NYPD ballistic image database did have a previous connection; NYPD was linked on a one-time setup for a preliminary test of 900 queries.
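The CoBIS submission counts quoted earlier from Zeosky (2005) are internally consistent, as a one-line check confirms:

```python
# Cross-check the Zeosky (2005) CoBIS submission figures quoted above.
test_fired_by_nysp = 14_590
provided_by_manufacturers = 71_346
total = test_fired_by_nysp + provided_by_manufacturers
print(total)  # 85936, consistent with "more than 85,000 new handguns"
```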

New York has not attempted any kind of audit of the manufacturer-supplied samples and exhibits from guns, though this is an acknowledged source of mix-ups. The basic logistical problem with such a hypothetical audit is that it would require voluntary cooperation by gun owners in turning over their firearms for new test firings.

Likewise, CoBIS is limited in its capability to answer research questions due to limitations on the data collected. CoBIS personnel do not know how many guns recovered by the State Police were actually sold in New York State, since those data go through separate clearinghouses. Their rough impression is that the guns tend not to be "imports" from other states but rather older guns already in circulation. Moreover, although information on the gun is recorded in the database, ammunition brand is not, though some technicians will enter that information in IBIS as comments if it is known. All that is generally recorded in CoBIS is make, model, serial number, and caliber; any other information is gathered by the permits office.

In the 2005 legislative session, bills offered in the New York State Assembly suggested a range of legislative responses to the CoBIS database, from a complete repeal of the enacting law (A05093) to multiple-phase expansions in scope to include additional classes of firearms and ballistic images of bullets for those weapons that do not eject shell casings (A00968, A06462). None of these was enacted. In the 2007 session, A07477 would expand CoBIS to include rifles and shotguns.