Improving Breast Imaging Quality Standards

4 Ensuring an Adequate Workforce for Breast Cancer Screening and Diagnosis

Recent media reports suggest that shortages of radiologic technologists (RTs) and interpreting physicians (see Box 4-1) have contributed to the closure of some mammography facilities (Martinez, 2000; Gorman, 2001), and articles in trade publications refer to a current or looming "crisis" in access to mammography (Maguire, 2003; Brice, 2004; Hayes, 2004). Although these reports depict alarming situations, they are largely anecdotal or impressionistic. It is clear, however, that demand for breast imaging services is increasing and is likely to continue to do so over the coming decades, while there is little to suggest that the numbers of interpreting physicians and RTs will rise accordingly. Demand for mammography could potentially decrease in the future, for example if longer screening intervals were recommended for some portion of the population, but such changes are difficult to predict. These concerns highlight the lack of data on the national mammography workforce, the volume of services it delivers, and its capacity for expansion: measures that are essential to determining whether, and where, workforce shortages occur and what impact such shortages have, or potentially could have, on the delivery of mammography and other breast imaging services. In the absence of such data, the Committee relied on several sources of information to assess the current and future state of the mammography workforce. These included facility inspection reports submitted to the Food and Drug Administration (FDA), as well as survey data from the American College of Radiology (ACR; see Appendix A), the Society of Breast Imaging (SBI; see Appendix B), and the American Society of Radiologic Technologists (ASRT).
Although FDA does not collect data on individual practitioners, the Committee was able to estimate the total number of physicians who interpreted mammograms in each year since 1997. However, it should be noted that these estimates are still likely to be inflated.1 It is also important to recognize the limitations of opinion surveys. Although they are helpful in gaining the perspective of current or potential members of the mammography workforce, survey methods are also prone to subject bias and error. Motivational factors may influence the results of surveys that address sensitive subjects such as employment; respondents may be unwilling to provide accurate information for reasons of self-protection or personal gain (Wentland, 1993). In addition, experiments in social psychology suggest that responses to survey questions regarding attitudes are influenced by environment, survey type, and the context in which the question is presented (Tourangeau et al., 2000). The Committee's intention in presenting findings from opinion surveys, including those conducted by the ACR and SBI, is to shed light on a variety of attitudes

1 Aggregate data obtained from FDA contained many duplicates because an interpreting physician is counted each time his or her name is recorded at a facility inspection, and many radiologists read at multiple facilities. A series of queries was used to remove most of the duplicate names, but approximately 10 percent of the entries are still likely to be duplicates, largely due to name misspellings or other variations in data entry.
BOX 4-1 The Mammography Workforce

Interpreting physician: Interprets mammograms.
Initial training and qualifications: Must be a physician with a state license to practice medicine and must be board certified in diagnostic radiology by a Food and Drug Administration- (FDA-) approved body or have 3 months of formal training in mammography (physicians who qualified under the interim regulations needed only 2 months). Must have a minimum of 60 hours of documented Continuing Medical Education (CME) in mammography (physicians who qualified under the interim regulations needed only 40 hours), and must have interpreted at least 240 mammograms under the direct supervision of an interpreting physician in the 6 months prior to qualifying.
Continuing education: Must teach or complete at least 15 Category I CME hours in mammography every 36 months.
Continuing experience: Must interpret a minimum of 960 mammograms every 24 months.

The lead interpreting physician in a mammography facility has general responsibility for ensuring that the facility's quality assurance program meets all of the requirements of Section 900.12(d) through (f). Each facility must also designate at least one audit interpreting physician to review and analyze the medical outcomes audit data. This individual is responsible for documenting the results, for notifying other interpreting physicians of their results and the facility aggregate results, and for documenting the nature of any follow-up actions.

Radiologic technologist (RT): Performs mammographic examinations and prepares films or digitized images for interpretation.
Initial training and qualifications: Must be state licensed to perform general radiographic procedures, or have a general certification from an FDA-approved body to perform radiologic examinations. Must complete 40 hours of training specific to mammography, including performance of a minimum of 25 examinations under direct supervision (technologists who qualified under the interim regulations did not need to perform 25 examinations, and the number of hours of mammography training was not specified).
Continuing education: Must obtain 15 continuing education units every 36 months.
Continuing experience: Must perform 200 mammograms every 24 months.

Medical physicist: Surveys mammography equipment and oversees the equipment-related quality assurance practices of the facility.
Initial training and qualifications: Must be board certified in an appropriate specialty area by an FDA-approved body, or be state licensed or approved for medical physics surveys of mammography facilities. Must have a master's degree or higher in a physical science with no less than 20 semester hours or equivalent of college undergraduate- or graduate-level physics. Must complete 20 hours of specialized training in conducting surveys of mammographic facilities. Must have surveyed at least 1 mammography facility and a total of 10 mammography units. (Medical physicists who qualified under the interim regulations could continue to perform surveys under the final regulations with a bachelor's degree in a physical science and 10 semester hours of physics, provided they had 40 hours of training in surveys and had surveyed at least 1 facility and 20 mammography units.)
Continuing education: Must obtain 15 continuing education units every 36 months.
Continuing experience: Must conduct surveys of two facilities and six units every 24 months.
Breast imaging specialist: Specializes in interpreting the results of mammographic and nonmammographic imaging examinations of the breast and performs interventional procedures, including image-guided biopsies of the breast. Training and qualifications are not defined by MQSA, but are generally considered to include some or all of the following: fellowship training in breast imaging, spending a majority of time on the interpretation of breast images, and conducting a high volume of breast imaging.

Radiologist assistant (RA): A recently created physician extender position, the RA is an advanced-level radiologic technologist who works under the supervision of a radiologist. Experience as an RT is a prerequisite for admission to ACR- and ASRT-approved RA training programs at four U.S. universities (additional programs are under development). The RA is an ARRT-certified radiographer who has successfully completed an advanced academic program encompassing a nationally recognized curriculum and a radiologist-directed clinical preceptorship. Under a radiologist's supervision, the RA performs patient assessment, patient management, and selected examinations. The roles and responsibilities of the RA, as agreed on by the ACR and ASRT, will not include interpretation (preliminary, final, or otherwise) of any radiological examination or the transmission of observations to anyone other than the supervising radiologist. The RA may make initial observations of diagnostic images and forward them to the supervising radiologist.

SOURCES: 21 C.F.R. § 900.1 (2003); IOM (2004); Williams and Short (2004).

that may influence the present and future breast imaging workforce; it is not an attempt to determine or predict the magnitude of influence associated with a specific attitude or opinion.
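The continuing-experience and continuing-education minimums in Box 4-1 amount to a simple set of numeric thresholds. The sketch below is purely illustrative (the function, the role names, and the record format are hypothetical and are not part of MQSA or any FDA system); only the numeric minimums are taken from the box:

```python
# Hypothetical compliance sketch for the continuing-requirement minimums
# summarized in Box 4-1; data structures are illustrative, not regulatory.

# role -> (minimum exams per 24 months, minimum CME/CEU hours per 36 months)
MINIMUMS = {
    "interpreting_physician": (960, 15),   # interpret >= 960 mammograms / 24 mo
    "radiologic_technologist": (200, 15),  # perform >= 200 mammograms / 24 mo
}


def meets_continuing_requirements(role: str, exams_24mo: int, credits_36mo: int) -> bool:
    """Return True only if both the experience and education minimums are met."""
    min_exams, min_credits = MINIMUMS[role]
    return exams_24mo >= min_exams and credits_36mo >= min_credits


# A physician at the median volume reported in Box 4-2 (1,670 reads/year)
# comfortably clears the 960-per-24-months floor.
print(meets_continuing_requirements("interpreting_physician", 1670, 15))  # True
print(meets_continuing_requirements("radiologic_technologist", 150, 15))  # False
```

Note that these are floors, not targets; as discussed later in the chapter, most interpreting physicians read well above the regulatory minimum.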
Although predicting future workforce trends is fraught with uncertainty, the Committee commissioned Paul Wing, of the Center for Workforce Studies at the State University of New York School of Public Health in Albany, to model the possible effects of current trends and potential regulatory changes on the supply of and demand for RTs who perform mammograms and the physicians who interpret them (see Appendix C). Key statistics, derived from the three surveys and the workforce modeling study, are summarized in Box 4-2. The Committee examined a variety of factors that could limit the future supply of interpreting physicians, including concerns that reading mammograms, as compared with other areas of radiology, is less lucrative, more heavily regulated, and carries greater medicolegal risk. It was also noted that the expanded use of nonmammographic imaging technologies for breast cancer detection and diagnosis is likely to increase future demand for breast imaging, and thereby the workload of some interpreting physicians. These issues are considered in proposing strategies to increase the number of new entrants to the field of breast imaging, retain the current mammography workforce, and enhance the productivity of new and existing practitioners.
BOX 4-2 Key U.S. Mammography Workforce Statistics

In 2003-2004:
- Approximately 62 percent of all radiologists interpreted mammograms.
- The supply of interpreting physicians was approximately 14,400 full-time equivalent (FTE) radiologists, which translates to approximately 2.4 FTE radiologists interpreting mammograms per 10,000 women aged 40 and older. However, an FTE radiologist is not an FTE interpreting physician, as most are general radiologists who spend a significant portion of their time interpreting other radiologic exams.
- The median interpreting physician (50th percentile) read 1,670 mammograms per year.

Among interpreting physicians:
- The 25 percent who interpreted fewer than 1,000 mammograms per year accounted for 6 percent of all mammograms.
- The 54 percent who interpreted fewer than 2,000 mammograms per year read less than 25 percent of all mammograms.
- The 12 percent who interpreted 5,000 or more mammograms per year read more than 33 percent of all mammograms.

Radiologic technologists (RTs):
- The effective workforce of RTs in mammography is approximately 26,000 FTEs.
- Less than half of all members of the American Registry of Radiologic Technologists who are certified in mammography work primarily in mammography.
- On an average hourly wage basis, RTs working primarily in mammography earned significantly less than those working primarily in nuclear medicine (by 26 percent), magnetic resonance imaging (by 12 percent), sonography (by 10 percent), and computerized tomography (by 6 percent).

Future projections based on current trends (assuming no change in the numbers of physicians or RTs entering or exiting the field):
- The population of women over age 40 will increase by nearly 30 percent by 2025.
- The number of interpreting physicians per 10,000 women over age 40 will decline by 14 percent by 2015 and by 23 percent by 2025. The shortfall could be overcome by increasing the number of interpreting physicians or by increasing the volume of mammograms read by the available pool of interpreting physicians.
- The supply of RTs will decline by approximately 22 percent by 2025; the number of RTs per 10,000 women over age 40 will decline by 23 percent by 2015 and by 40 percent by 2025.

SOURCES: Wing (2005); American College of Radiology (2004); ASRT (2004a).
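The per-capita projections in Box 4-2 are internally consistent with its stated population and supply assumptions, which a short back-of-the-envelope check confirms. The sketch below is a simple ratio calculation under those assumptions, not a reproduction of the Wing model:

```python
# Check that Box 4-2's per-10,000-women projections follow from its
# population and supply assumptions (simple ratios, not the Wing model).

pop_growth_2025 = 0.30  # women over age 40: up nearly 30 percent by 2025

# Interpreting physicians: supply assumed flat, so the per-capita ratio
# falls by the factor 1/(1 + population growth).
physician_ratio_change = 1 / (1 + pop_growth_2025) - 1
print(f"physician ratio change by 2025: {physician_ratio_change:.0%}")  # about -23%

# RTs: supply falls ~22 percent, compounding the population growth.
rt_supply_change = -0.22
rt_ratio_change = (1 + rt_supply_change) / (1 + pop_growth_2025) - 1
print(f"RT ratio change by 2025: {rt_ratio_change:.0%}")  # about -40%
```

Both results match the declines stated in the box (23 percent and 40 percent by 2025), showing that the headline per-capita figures are driven almost entirely by population growth plus the assumed attrition.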
TABLE 4-1 Number of Interpreting Physicians by Year

Year (FY)   FDA Aggregate Number   Number of Individuals (most duplicates removed)
1997        56,421                 20,604
1998        59,747                 20,981
1999        61,225                 21,636
2000        62,316                 21,625
2001        61,971                 21,562
2002        60,559                 21,345
2003        59,265                 21,029

NOTE: Food and Drug Administration (FDA) aggregate numbers include all interpreting physicians listed on all inspections for the year; hence there are many duplicates. Also, FDA does not inspect 100 percent of facilities during the course of each fiscal year (FY) because facilities can be inspected within a range of 10 to 14 months from their prior inspection date. FDA's actual FY inspection percentage is 98 percent.

FDA Database Deduplication Process

Two files obtained from FDA were first imported into a single Microsoft Access database. This database was then split by inspection year into separate tables, from 1996 to 2004. Each table then went through the following steps. Four new columns were added to the table to hold revised first- and last-name strings: Physician Last Name Fix, Physician First Name Fix, Physician Last Name Best, and Physician First Name Best. Two queries targeted entries with contaminated Physician Last Name or Physician First Name data fields, which contained extraneous commas, spaces, and other values. The first query copied the Physician Last Name string preceding an embedded comma into the corresponding Physician Last Name Fix field. The second copied the Physician First Name string following an embedded space into the corresponding Physician First Name Fix field. This ensured that only the first- and last-name text strings, and not extraneous data, were copied into the Fix columns. Data fields that did not require the above cleaning step were merged with the corresponding Fix column into a new Best column (e.g., Physician Last Name and Physician Last Name Fix into Physician Last Name Best).
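The name cleanup and deduplication just described can be sketched roughly as follows. This is a hypothetical re-implementation of the Committee's Microsoft Access queries (field names and sample records are illustrative, not actual FDA data):

```python
# Hypothetical re-implementation of the Committee's Access-query cleanup;
# field names and sample records are illustrative, not actual FDA data.

def best_last(last: str) -> str:
    """Keep only the text preceding an embedded comma (mirrors the 'Fix' query)."""
    return last.split(",")[0].strip()


def best_first(first: str) -> str:
    """Keep only the text following an embedded space, if any."""
    first = first.strip()
    return first.split(" ", 1)[1] if " " in first else first


def deduplicate(records):
    """Return the unique (last, first) name pairs after cleaning."""
    return {(best_last(r["last"]), best_first(r["first"])) for r in records}


# Toy inspection records: the same physician listed at two facilities,
# once with contaminated name fields.
records = [
    {"last": "Smith, MD", "first": "Dr. Jane"},
    {"last": "Smith",     "first": "Jane"},
    {"last": "Jones",     "first": "Robert"},
]
print(len(deduplicate(records)))  # 2 distinct physicians
```

As the text notes, exact matching of this kind cannot catch misspellings or other name variations, which is why roughly 10 percent of the entries are estimated to remain duplicates.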
This merging process was carried out for first names as well. A final query then combined the Physician Last Name Best and Physician First Name Best columns and removed any duplicate name pairs. The resulting nonduplicate entries were saved as a new table, and the number of entries in this new table was taken to equal the number of interpreting physicians for that year. Approximately 10 percent of the entries are still likely to be duplicates, largely due to name misspellings or other variations introduced during data entry. However, the percentage of duplication appears to be relatively consistent over time, suggesting that the trend in the number of interpreting physicians is real.

CURRENT STATUS: IS ACCESS TO MAMMOGRAPHY ENDANGERED?

As shown in Table 4-1, the number of interpreting physicians increased by 5 percent between 1997 and 1999, and then declined by about 3 percent by 2003. Although the absolute numbers are inflated due to redundancy in the data source, FDA estimates are
useful in that they reflect year-to-year trends. The ACR estimated a smaller number of interpreting physicians from the results of a 2003 survey2 of radiologists and nuclear medicine specialists with major ties to radiology (for a detailed account of the ACR's survey methods, see Appendix A). The discrepancy between the FDA and ACR estimates probably results from differences in the processes that produced them, exacerbated by variability in name entry that eluded efforts to factor out redundancy. Because the data collection and analysis methods used to produce the ACR estimates were also used to address other workforce questions, the ACR results were used, for consistency, to model future workforce supply and demand (see below). For 2003, the ACR estimated that about 16,000 radiologists (60 percent of the U.S. total) interpreted mammograms, and that the equivalent of 2.4 full-time radiologists interpreted mammograms for every 10,000 women in the U.S. population aged 40 years and older. However, a full-time equivalent (FTE) radiologist is not necessarily an FTE interpreting physician. In fact, the vast majority of radiologists who interpret mammograms spend a significant portion of their time reading other types of images, so the actual number of FTE interpreting physicians is much lower. Because no attempt has been made to determine the optimal ratio of radiologists (or interpreting physicians) to women aged 40 and older, the ratio calculated by the ACR is meaningful only as a basis of comparison from year to year. Moreover, if such an optimal ratio could be determined, it would need to reflect technological innovation and screening intervals, and therefore would probably change over time. Although the volume of mammograms read by individual practitioners cannot be determined from FDA data, the ACR has collected such information.
Figure 4-1, which classifies interpreting physicians in the United States according to the volume of mammograms they read during 2003, shows that 75 percent of the estimated 16,000 interpreting physicians read at least 1,000 mammograms, and 46 percent read at least 2,000. Seventy to 80 percent of radiologists in small and medium-sized practices (2 to 10 radiologists) interpreted mammograms, as compared with less than 40 percent in large, and apparently more specialized, practices (30 or more radiologists). The ASRT's 2004 data indicate that approximately 26,000 full-time RTs work primarily in mammography in the United States (ASRT, 2004a). They comprise less than half of all technologists certified in mammography by the American Registry of Radiologic Technologists (ARRT).

Unfilled Positions

One frequently cited indicator of workforce supply relative to demand in mammography is the number of unfilled job openings for interpreting physicians and RTs who perform mammography. In the SBI's October 2003 survey,3 29 percent of nearly 570 breast imaging practices reported unfilled positions for physicians. More than a third of

2 The 2003 ACR survey (see Appendix A) was sent to a "stratified random sample" of 3,090 physicians derived primarily from the American Medical Association's Physician Masterfile, representing vascular/interventional radiologists, all other types of allopathic radiologists, osteopathic radiologists, and nuclear medicine specialists with major ties to radiology (e.g., those holding American Board of Radiology [ABR] certification and/or membership in the ACR). The sample included residents, fellows, and retirees; 1,924 usable responses were received, yielding a response rate of 63 percent.

3 The October 2003 SBI survey was sent to every breast imaging practice (one survey per practice) in the organization's database. Surveys were received from 575 practices (64 percent of the study population).
FIGURE 4-1 Estimated radiologists interpreting mammograms and percentage of total mammograms, by volume, United States, 2003. According to the figure, 5,474 radiologists read between 2,000 and 5,000 mammograms per year, accounting for 40.4 percent of all mammograms read each year. SOURCE: Derived from Sunshine et al. (2004a) and Wing (2005).

these practices had two or more such openings, and nearly one-quarter had been attempting to hire an interpreting physician for more than 2 years (Farria et al., in press). There were many more interpreting physician and RT openings in academic practices than in private and government practices. Nearly two-thirds of the 12 percent of practices surveyed that offered breast imaging fellowships reported that these positions were unfilled. A survey of 53 community-based mammography facilities in three states (Washington, New Hampshire, and Colorado), conducted by D'Orsi and coworkers (2005) in 2001-2002, found shortages of interpreting physicians relative to mammography volume in 44 percent of these facilities. Job vacancies in mammography do not appear to reflect an overall trend within radiology, which in 2003 saw multiple signs that a severe shortage of radiologists had eased (Sunshine et al., 2004b). Demand for all types of specialist physicians, and particularly for radiologists, rose between 2002 and 2003 (Merritt, Hawkins & Associates, 2003). The New York State Resident Exit Survey found that starting salaries for radiologists in general (both diagnostic and therapeutic) who had completed training within that state rose more than 37 percent between 1999 and 2003, and more than 45 percent between 1998 and 2003 (Center for Health Workforce Studies, 1999-2004). This survey indicates a slight drop (about 2 percent) in median salary between 2002 and 2003, but that came on the heels of a more than 17 percent increase between 2001 and 2002.
Inasmuch as academic radiologists’ salaries reflect general trends in the field, it can
also be noted that the median compensation4 for a full-time assistant professor of diagnostic radiology rose by about 23 percent between 1999 and 2003, according to the Association of American Medical Colleges (1999-2003). The same survey reported a 9 percent median increase for assistant professors in all clinical departments and a 14 percent salary rise for those in therapeutic radiology over the same period. Year-to-year trends in these figures show a slight slowdown in annual salary increases for academic diagnostic radiologists after 2001-2002.

Trends in vacancies for mammography RT positions appear to mirror those for interpreting physicians. Thirty percent of the breast imaging practices that responded to the SBI survey reported unfilled mammography technologist positions; of these, 45 percent had two or more openings in 2003 (Farria et al., in press). Similarly, 20 percent of the community mammography facilities that responded to the aforementioned survey by D'Orsi and coworkers reported a shortage of MQSA-qualified RTs, and 46 percent reported some difficulty in maintaining an adequate staff of qualified technologists (D'Orsi et al., 2005). Data from the ARRT indicated a steady and substantial decline in examinees for certification in mammography between 1996 and 2000 (American Registry of Radiologic Technologists, 2001). However, because the exam was first offered in 1992, this decline may in part reflect the fact that many people taking the exam in its first few years were already working in the field (IOM, 2001). Since 2000, the total number of registrants has remained essentially flat, although 2003 saw the first increase in first-time examinees in recent years (a nearly 6 percent increase over 2002). A key barrier to filling RT positions in mammography is their low pay in comparison to RT positions in other subspecialties.
In the 2003 SBI survey, wages for RTs working primarily in mammography ranked third out of four radiology subspecialties (Farria et al., in press). On an average hourly wage basis, mammography RTs earned significantly less than those working primarily in nuclear medicine (by 26 percent), magnetic resonance imaging (MRI) (by 12 percent), sonography (by 10 percent), and computerized tomography (CT) (by 6 percent), according to the ASRT (2004b).5 The average salaries of RTs who worked primarily in nuclear medicine in 2004 were 28 percent higher than those of RTs who worked primarily in mammography (ASRT, 2004b).

Medical physicists (see Box 4-1) also play an essential role in the breast imaging workforce, but supply and demand issues for these professionals are less well understood than those for interpreting physicians and RTs. A 1993 report written by the National Mammography Quality Assurance Advisory Committee showed that there were 511 medical physicists qualified under the interim rules to perform mammography surveys, and concluded that this number was sufficient to support mammography across the United States (National Mammography Quality Assurance Advisory Committee, 1996). However, concerns were subsequently raised that there would not be enough physicists to perform MQSA evaluations unless physicists substantially increased the number of mammography units they evaluated each year (Rothenberg et al., 1995). Moreover, a 2001 survey of

4 Compensation includes salary, practice supplement, bonus/incentive pay, and uncontrolled outside earnings.

5 Personal communication, R. Harris, ASRT Director of Research, November 10, 2004.
TABLE 4-2 American College of Radiology (ACR) Mammography Accreditation Program: Reasons for Facility Closures Since April 2001 (as of October 2004)

Reason                                 Number of Facilities Closed   % of Total
Financial                              523                           33.5
Moved to sister site                   370                           23.7
Equipment                              173                           11.1
Staffing                               161                           10.3
Unknown                                159                           10.2
Other                                  84                            5.4
Bankruptcy                             34                            2.2
Change in ownership                    30                            1.9
Mobile unit merged with another site   29                            1.9
Total                                  1,563

SOURCE: Destouet et al. (in press). Reprinted from the Journal of the American College of Radiology, in press, Destouet JM, Bassett LW, Yaffe MJ, Butler PF, Wilcox PA, The American College of Radiology Mammography Accreditation Program: 10 years of experience since MQSA, with permission from The American College of Radiology.

850 medical physicists revealed that clinical activities in breast imaging were among the most time-consuming activities they performed (Cypel and Sunshine, 2004). Given the dearth of recent data on medical physicists in general, let alone on those who evaluate mammographic equipment, the possibility of a present or future shortage of medical physicists active in breast imaging cannot be determined. Nonetheless, the absence of even anecdotal reports of shortages of medical physicists suggests that their supply is currently adequate.

Facility Closures

The ACR documented the closure of 1,563 (out of 8,325, or about 19 percent of) facilities accredited by that organization between April 2001 and October 2004 (Destouet et al., in press). Although partially offset by the opening of hundreds of new facilities, these closures contributed to a net loss of 752 ACR-accredited facilities, or more than 8 percent, over that time period. Financial factors, cited by about one-third of respondents, were the most frequent reason for facility closures, as shown in Table 4-2. The second most common reason for the closure of ACR-accredited facilities, "moved to sister site," was cited in nearly one-quarter of these cases.
This response may reflect consolidation that could provide more efficient delivery of services, but the prevalence of such closures suggests that access to mammography may have declined in many communities. According to FDA, the number of mammography units operated by hospitals and clinics rose 5.4 percent between 2000 and 2004. As a result of concerns about the increasing number of mammography facility closures, the U.S. Government Accountability Office (GAO) is currently conducting a
study to evaluate the factors that contributed to the closing of facilities nationwide since 2001. The study, to be completed by July 2005, will attempt to determine whether these facilities closed due to consolidation or whether they represent a true reduction in mammography availability. It will also explore the relationship between certified units and facility capacity, evaluate capacity issues, and determine the effect facility closings have had on public access to mammography services (including access for underserved populations) since the April 2002 GAO report on access to mammography.6

Wait Times for Screening and Diagnosis

A national survey of 9,908 mammography facilities conducted in 1999-2000 found that 64 percent could schedule a patient for a screening mammogram within 7 days (IMV Medical Information Division, 2002). Similar results were obtained in a statewide survey representing 89 percent of licensed mammography practices, conducted by the Florida Department of Health in July 2004 in conjunction with a study of the accessibility of mammography services in that state (The Workgroup on Mammography Accessibility, 2004). Survey results indicated that wait times for screening mammograms in the nation's fourth most populous state were highly variable, ranging from less than 24 hours to several months, but that 50 percent of appointment wait times were less than 3 days. The median wait time for a diagnostic mammogram scheduled by the patient was 2 days; if scheduled by a physician, 1 day. Seventeen percent of mammography practices reported appointment wait times exceeding 28 days for screening mammograms (as compared with 8 percent in the national survey) (Eastern Research Group and U.S. Food and Drug Administration, 2001), 24 percent had patient-scheduled diagnostic appointment wait times longer than 7 days, and 21 percent had physician-scheduled appointment wait times longer than 7 days.
Reports of lengthy wait times for mammograms indicate that some breast cancer screening facilities are operating at or near full capacity (IOM, 2001). In New York City, patients waited an average of more than 40 days in 2003 for first-time screening mammograms, as compared with 14 days in 1998 (Maguire, 2003). In 2004, waits for screening mammograms in Jacksonville, Florida, where four breast imaging centers had closed within 2 years, reportedly ranged from 10 weeks to more than 5 months. The aforementioned three-state survey of community-based mammography facilities reported wait times for screening mammograms of up to 8 weeks (D'Orsi et al., 2005). Mammography facilities with staff vacancies are likely to have longer wait times for appointments. The 2003 SBI survey found a strong association between the percentage of unfilled radiologist or RT positions in breast imaging practices and the length of time symptomatic women had to wait for a mammogram (Farria et al., in press). In facilities where at least 80 percent of either radiologist or RT positions were filled, average wait times for symptomatic women were less than 24 hours. Where only 40 percent of either radiologist or technologist positions were filled, symptomatic patients waited an average of at least 2 weeks. The Florida accessibility study identified several additional factors contributing to longer wait times for mammography appointments. These included reports by

6 Letter from Arlen Specter, Tom Harkin, and Barbara A. Mikulski, U.S. Senate, to David M. Walker, Comptroller General, GAO, June 15, 2004.
TABLE 4-3 Fees for Screening Mammograms Vary by Insured Status

Insurance Status        Amount (2004)
Private insurance (a)   $167.00
Uninsured (b)           $106.39
Medicare (c)            $88.54
Medicaid (d)            $45.48

a. The amount reported for private insurance identifies the fee considered fair and reasonable, as reported by the Florida Department of Health.
b. Based on survey results from the American Cancer Society's Mammography in Florida: A Consumer's Guide, July 2004. This amount represents the average fee for a screening mammogram among the reporting facilities.
c. For Medicare, the amount is the maximum authorized for screening mammograms. The reimbursement rate is 50 percent higher for screening mammograms when digital equipment is used. Medicare patients pay 20 percent of the Medicare-approved amount.
d. Medicaid patients pay an additional $3 co-payment.

NOTE: Fees listed are for the state of Florida.
SOURCES: The Florida Legislature: Office of Program Policy Analysis & Government Accountability (2004); American Cancer Society (2004).

interpreting physicians at facilities with long wait times that they limited the number of mammograms they read in order to limit their exposure to medical malpractice lawsuits (The Florida Legislature: Office of Program Policy Analysis & Government Accountability, 2004). Women with private health insurance and/or who are members of health maintenance organizations may also face extended wait times because their primary physicians are contractually obliged to refer patients to designated, and therefore high-volume, mammography facilities. Most importantly, however, the Florida study found that low-income women face a variety of barriers to access to mammography, as described below. There is no consensus regarding optimal or acceptable wait times for screening or diagnostic appointments, and, as with other measures of workforce capacity, there are no national data with which to systematically assess wait times.
There are different ways to measure this parameter, but consistently recording the time to the third next available appointment (a standard measure of access in the health care industry [National Quality Measures Clearinghouse, 2004]) for both screening and diagnostic exams would be a useful start.

Low Income Limits Access

Many studies have identified a link between socioeconomic factors and limited access to mammography (reviewed by Lawson et al., 2000; Lannin et al., 2002; Ward et al., 2004). In Florida, the cost of services and the stipulation by most facilities that a woman obtain a referral for a mammogram from a primary care provider were found to limit access to mammography for low-income women without insurance (The Florida Legislature: Office of Program Policy Analysis & Government Accountability, 2004). For women in Florida's Medicaid Program, reimbursement rates and facility admission criteria can serve as barriers to obtaining mammography services. More than 20
Should MQSA eventually be amended to require double reading for mammograms, the resulting increase in workload for radiologists could potentially be eased, and the cost-effectiveness of double reading increased, by permitting nonphysician clinicians (e.g., radiologist assistants, radiologic technologists, nurse practitioners, physician assistants) to serve as second readers under the direct supervision of interpreting physicians. Based on evidence from several studies evaluating the interpretation of screening mammograms by RTs working under the supervision of board-certified radiologists (Sumkin et al., 2003; Wivell et al., 2003; Casey, 2003), the IOM report Saving Women's Lives (2005) recommended that mammography facilities "enlist specially trained non-physician personnel to prescreen mammograms for abnormalities or double-read mammograms to expand the capacity of breast imaging specialists."

Immediately following the June 2004 release of Saving Women's Lives, the ACR expressed strong opposition to technologists reading screening mammograms (Brice and Kaiser, 2004). It should be stressed, however, that the previous IOM Committee did not recommend that technologists serve as the sole readers of any mammograms, but rather as second readers; all mammograms would thus still be read by a physician.

A precedent for according comparable responsibility to nonphysicians already exists in the United States: the routine interpretation of cervical cancer screening tests. Papanicolaou slide interpretation is carried out largely by nonphysician cytotechnologists under the supervision of a physician. Physicians are required to be onsite to provide technical oversight of the testing staff, and all gynecologic slide preparations positive for cell abnormalities must be confirmed by physicians before patient results are released (42 C.F.R. § 493).
Analogous to mammography facilities, laboratories and personnel that perform Pap tests must adhere to quality assurance regulations stipulated by the Clinical Laboratory Improvement Amendments (CLIA), passed by Congress in 1988. CLIA established standards for all clinical laboratories to "ensure the accuracy, reliability, and timeliness" of clinical test results (Box 4-4). CLIA is user-fee funded; laboratories are responsible for the costs of registration, compliance, and surveys. The Centers for Medicare and Medicaid Services oversees registration, fee collection, surveys, enforcement, accreditation, and proficiency testing for all laboratories under CLIA (CMS, 2004b).

The current IOM Committee concurs that the potential benefits provided by a second reading of screening mammograms by an experienced and well-trained nonphysician clinician, supervised by a licensed, MQSA-qualified interpreting physician, could be significant. In order to better characterize the potential benefits and risks, the Committee recommends the implementation of demonstration programs to evaluate the potential contribution of nonphysicians to the double reading of screening mammograms. These programs will require careful design in order to ensure women's participation.

Improving Workplace Design and Organization

The incorporation of key elements of successful breast cancer screening programs in other countries, including centralized expert interpretation of all breast imaging modalities and a thorough quality assurance process, could increase the quality and effectiveness of breast cancer detection in the United States. Such improvements are discussed throughout this report, and are collected in the description of Breast Imaging Centers of
BOX 4–4 CLIA Regulation of Pap Testing

The Clinical Laboratory Improvement Amendments (CLIA) regulations define standards for laboratories performing clinical tests. Quality standards for cytology laboratories performing Pap testing are described below.

Certificates

Registration Certificate: Required initially for all laboratories performing nonwaived tests; requires a paid fee; valid for 2 years, or until the Department of Health and Human Services (HHS) compliance inspection, whichever is shorter.

Certificate of Compliance: Issued after successful completion of an HHS compliance inspection. Certification requires a paid fee, is valid for 2 years, and can be renewed. Laboratories undergo announced or unannounced HHS inspections every 2 years to ensure compliance.

Certificate of Accreditation: Issued in lieu of a Certificate of Compliance to laboratories certified through a private, not-for-profit accrediting program approved by HHS. Laboratories undergo random sample validation inspections conducted by HHS to validate the accrediting process, and HHS monitors inspection and proficiency testing data from accredited laboratories. The certificate is valid for 2 years, requires a paid fee, and is renewable. Failure to meet accreditation standards results in a full compliance review by HHS.

Proficiency testing

Overview: A laboratory must enroll in an HHS-approved proficiency testing (PT) program for each specialty for which it seeks certification. HHS uses PT data to measure laboratory compliance; PT data are available to the public.

General requirements: PT samples must be tested in the same manner, by the same personnel, as patient samples. Communication between laboratories about PT samples is forbidden.

Pap cytology: Personnel (cytotechnologists and technical supervisors) are tested once per year via announced or unannounced testing events.

Test overview: Sample slides provided by the PT program are distributed to cytology laboratories.
Individual responses are collected and compared with the predetermined consensus agreement of at least three physicians certified in anatomic pathology.

Scoring: Slides are graded individually. Scoring rewards or penalizes participants in proportion to the distance of their answers from the correct response, weighted in proportion to the severity of the sample lesion. Personnel must achieve a PT score of at least 90 to pass.

Compliance: Two hours are allowed to complete the basic 10-slide proficiency test.

Failure: First failure: Retested with an additional 10 slides. Second failure: Mandatory remedial training and education, followed by a 4-hour, 20-slide test. All gynecologic slides evaluated subsequent to notice of failure must be reviewed and documented until the 20-slide retest is taken. Third failure: Personnel must cease examination of gynecologic slides until they complete 35 hours of formal continuing education and successfully complete a 20-slide retest.
Quality systems

Preanalytic, analytic, and postanalytic systems: Laboratories must adhere to standards for all phases of the testing process.

Cytology analytic systems:

Written policies: Laboratories must establish written policies for detecting errors in performance, including review of slides determined to be negative for cell abnormalities, comparison of clinical information with prior cytology reports, statistical laboratory evaluation, and evaluation of each individual who interprets slides against the laboratory's overall performance.

Workload limits: Technical supervisors establish limits for laboratory personnel, not to exceed examination of 100 slides in 24 hours.

Oversight: Technical supervisors must confirm each gynecologic examination interpreted to exhibit cell abnormalities (e.g., malignancy).

Personnel

Laboratory director: Must be a doctor of medicine, osteopathy, or podiatry, licensed in the state, with certification in anatomic or clinical pathology and significant laboratory experience. Responsible for the overall operation and administration of the laboratory.

Technical supervisor: Must be a doctor of medicine or osteopathy licensed by the state. Responsible for the technical and scientific oversight of the laboratory.

Clinical consultant: Must qualify as a laboratory director. Provides consultation on the appropriateness of tests ordered and the interpretation of results.

Cytology general supervisor: Must be qualified as a technical supervisor or have 3 years of full-time experience in the preceding 10 years. Responsible for daily oversight of laboratory operation and must be accessible to provide onsite assistance. Must document all cytology cases he or she examines or reviews.

Cytotechnologist: Must be state licensed. Responsible for the interpretation results of each gynecologic cytology case examined or reviewed.

Testing personnel: Must be state licensed.
Responsible for specimen processing, test performance, and reporting test results.

Inspection

Basic inspections: The Centers for Medicare and Medicaid Services (CMS) or a CMS agent may interview personnel, require the facility to analyze test samples, observe personnel performing all phases of the testing process, or examine records and data.

Compliance inspections: Laboratories issued a Certificate of Registration are subject to an initial compliance inspection. Subsequent inspections are conducted on a biennial or more frequent basis as necessary to ensure compliance.

Certificate of Accreditation inspections: CMS conducts validation and complaint inspections at laboratories operating under a Certificate of Accreditation. CMS may conduct a full review if there is evidence of noncompliance.

Enforcement/sanctions

Sanctions against laboratories with noncompliance violations include suspension, limitation, or revocation of the CLIA certificate; cancellation of Medicare payment approval; directed plans of correction; civil money penalties; and onsite monitoring.

SOURCE: 42 C.F.R. § 493 (2003).
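The escalating consequences of PT failure described in Box 4-4 amount to a small piece of decision logic. The sketch below models only that failure ladder; the function names are our own, and the assumption that a passing score resets the failure count is illustrative rather than taken from the regulation text:

```python
# Minimal sketch of the CLIA PT failure ladder summarized in Box 4-4.
# Names and the reset-on-pass assumption are illustrative, not regulatory text.

PASS_SCORE = 90  # minimum PT score required to pass (per Box 4-4)

def next_action(failures: int) -> str:
    """Return the required follow-up after a given number of consecutive failures."""
    if failures == 0:
        return "continue routine slide examination"
    if failures == 1:
        return "retest with an additional 10 slides"
    if failures == 2:
        return ("remedial training and education, then a 4-hour, 20-slide test; "
                "all gynecologic slides reviewed and documented until the retest")
    return ("cease gynecologic slide examination; complete 35 hours of formal "
            "continuing education before the 20-slide retest")

def evaluate(scores):
    """Walk a sequence of PT scores, reporting the required action after each event."""
    failures = 0
    actions = []
    for score in scores:
        # Assumption: a passing score resets the consecutive-failure count.
        failures = 0 if score >= PASS_SCORE else failures + 1
        actions.append(next_action(failures))
    return actions
```

A run such as `evaluate([85, 80, 70])` walks the full ladder from first failure through cessation of slide examination.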
Excellence in Chapter 2; similar recommendations were also made in Saving Women's Lives. These same attributes (centralized, high-volume interpretation and expertise in nonmammographic imaging) may also make Breast Imaging Centers of Excellence attractive places to work. Such centers could offer breast imagers the opportunity to use diverse skills, rather than focusing solely on mammography. These multidisciplinary environments should foster networking and feedback among practitioners with a common interest in breast health, practices that are not only likely to increase job satisfaction for radiologists and radiologic technologists, but that also appear to encourage accuracy in mammogram interpretation (Beam et al., 2003; Maguire, 2003). The continuity among screening, diagnosis, and treatment at breast health centers could also facilitate quality assurance, allowing it to become a more natural part of the workflow and less of a burden.

A structure for organizing such multidisciplinary breast units throughout Europe was proposed in a 2000 position paper by the European Society of Mastology (EUSOMA, 2000). This document established guidelines intended to convert existing, heterogeneous practices into a unified system with strong standards for the diagnosis and treatment of breast cancer (Mansel, 2000). While breast cancer experts in both Europe and the United States applauded the proposal's overall aim of ensuring prompt and efficient diagnosis of breast cancer by specialists, they also expressed concern with the inflexibility of the proposed requirements, including the roles and protocols prescribed for the core members of the breast care teams (Silverstein, 2000; Mansel, 2000).
These objections make clear that specialization in and of itself is not a prescription for job satisfaction, particularly in the United States, where physicians highly prize their right to individual judgment (Silverstein, 2000).

Increasing Administrative Efficiency

If RTs and interpreting physicians are to maximize their productivity, they should be able to focus their efforts on image interpretation and on performing interventional breast imaging procedures, undistracted by administrative tasks. Administrative personnel, data entry personnel, and others could make an important contribution by taking on nontechnical responsibilities in quality control and administration. The Committee therefore recommends support for demonstrations to evaluate the roles of such nontechnical personnel in mammography, and to assess the costs and benefits of alternative staffing configurations for the efficiency, productivity, and quality of breast imaging services.

SUMMARY AND CONCLUSIONS

Because early detection of occult breast cancer is a key element in reducing breast cancer morbidity and mortality, it is important to accurately monitor the capacity of mammography services and to ensure adequate access for women. The paucity of robust national and regional data on the supply of and demand for mammography services necessitated an assessment of the mammography workforce based on estimates and projections, informed by anecdotal and regional reports of unfilled positions, facility closures, wait times, and barriers to access. Barring changes that would decrease demand, demographic projections predict that access to mammography is likely to become increasingly limited, particularly in light of trends in training and employment for both interpreting physicians and RTs. The most severe restrictions in access will probably occur among currently underserved populations, including low-income women.

Clearly, data on the national mammography workforce, the volume of services, and capacity should be routinely collected and analyzed, both to determine the status quo and to plan for the future. The Committee recommends that FDA address this need by collecting the relevant data during the annual inspection and by using unique identifiers for all certified physicians, technologists, and medical physicists. The Health Resources and Services Administration should analyze these data to produce routine reports on the volume of mammography services by region, state, and type of service. There is an urgent need to begin data collection immediately because it will take several years to identify trends. If the fragile stability of the breast imaging workforce moves toward crisis, data will be needed to react swiftly and effectively.

Tracking mammography capacity will also be very important for monitoring the impact of the new regulations and voluntary programs recommended in this report. There is always potential for unintended consequences of changes designed to improve quality. For example, mammography facilities that lack the resources to participate in the voluntary advanced audit program or to seek designation as a Center of Excellence, as described in Chapter 2, might unfairly be viewed by patients and insurers as providing substandard care, and thus could see their patient base and income decrease. This could lead to facility closures and reduce access, especially among women who lack the means to travel and pay for services.
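The routine volume reports recommended above are, at bottom, an aggregation over inspection records keyed to unique personnel identifiers. A minimal sketch of that aggregation follows; the record fields (region, state, service_type, physician_id, volume) are hypothetical, not an existing FDA or HRSA schema:

```python
# Illustrative aggregation for a regional mammography volume report.
# Field names are hypothetical; no actual FDA/HRSA data schema is implied.
from collections import defaultdict

def volume_report(records):
    """Tally mammogram volume and distinct interpreting physicians
    by (region, state, service_type)."""
    totals = defaultdict(int)
    readers = defaultdict(set)
    for r in records:
        key = (r["region"], r["state"], r["service_type"])
        totals[key] += r["volume"]
        # Unique identifiers prevent double counting physicians
        # who read at more than one facility in the same stratum.
        readers[key].add(r["physician_id"])
    return {
        key: {"volume": totals[key], "interpreting_physicians": len(readers[key])}
        for key in totals
    }
```

Because physicians are identified by unique IDs rather than counted per facility, a reader working at several sites contributes to each site's volume but is counted once per stratum, which is what a workforce-capacity report needs.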
Likewise, the added costs of the proposed new medical audit procedures, whether covered by increased reimbursements or not, could disproportionately affect access for low-income women.

Initiatives to expand the mammography workforce face a spectrum of factors that discourage today's radiology residents from choosing breast imaging as a subspecialization in radiology, and general radiologists from interpreting mammograms. Strategies proposed here to ease these problems focus on increasing the number of entrants to the field of breast imaging and their employment in underserved communities, and on retaining skilled breast imagers. The first of these aims could be advanced through existing loan repayment and J-1 visa waiver programs. The second could be achieved by encouraging federal and state agencies and health care payers to develop incentives to recruit and retain skilled breast imagers, for example through support for part-time interpretation of mammograms. Establishing reimbursement rates for mammography that reflect the workload and expense of adhering to the requirements of MQSA, as recommended in Chapter 2, would also have a positive impact on the workforce. Improvements in workplace organization and effectiveness could act in a variety of ways to increase access to mammography, by simultaneously boosting recruitment, retention, and productivity in the breast imaging workforce.

REFERENCES

Aben GR, Bryson HA, Bryson TC. 2004 (June 21). Digital vs. Conventional Mammography, An Opportunity for Process Improvement. Presentation at the meeting of the Seventh International Workshop on Digital Mammography, Chapel Hill, NC.
Adams D. 2003. amednews.com: Doctors Resigned to Public Web Profiles. [Online]. Available: http://www.ama-assn.org/amednews/2003/05/05/prsa0505.htm [accessed November 17, 2004].

Advanced Practice Advisory Panel. 2002. The Radiologist Assistant: Improving Patient Care while Providing Work Force Solutions. Consensus Statements from the Advanced Practice Advisory Panel, March 9–10, 2002. Washington, DC: American Society of Radiologic Technologists.

Aiello E, Buist DS, White E, Porter PL. 2005. The association between mammographic breast density and breast cancer tumor characteristics. Cancer Epidemiology, Biomarkers & Prevention 14(3):662–668.

American Board of Nuclear Medicine. 2004. Certification and Requirements. [Online]. Available: http://www.abnm.org/frameset-certreq.html [accessed September 15, 2004].

American Board of Radiology. 2004. Training Requirements. [Online]. Available: http://www.theabr.org/DRAppAndFeesinFrame.htm [accessed December 6, 2004].

American Cancer Society. 2004. Mammography in Florida: A Consumer's Guide. [Online]. Available: http://www.cancer.org/floridamammogram [accessed January 5, 2005].

American College of Radiology. 2004. Analysis and Reports on Radiologists Performing Mammography. Provided to the Institute of Medicine by the American College of Radiology. Reston, VA: American College of Radiology.

American Heart Association. 2000. Women, Heart Disease & Stroke Survey Highlights. [Online]. Available: http://www.americanheart.org/presenter.jhtml?identifier=10382 [accessed February 9, 2005].

American Registry of Radiologic Technologists. 2001. Annual Report of Examinations: Result of the 2001 Examination in Radiography, Nuclear Medicine Technology, and Radiation Therapy. St. Paul, MN: American Registry of Radiologic Technologists.

Arenson R. 2004. National Radiology Fellowship Match Program: Success or failure? Journal of the American College of Radiology 1(3):188–191.
ASRT (American Society of Radiologic Technologists). 2004a. Mammography Data for MQSA Reauthorization Effort. Albuquerque, NM: ASRT.

ASRT. 2004b. Radiologic Technologist Wage and Salary Survey. Albuquerque, NM: ASRT.

Association of American Medical Colleges. 1999–2003. AAMC Report on Medical School Faculty Salaries. Washington, DC: Association of American Medical Colleges.

Bassett LW, Monsees BS, Smith RA, Wang L, Hooshi P, Farria DM, Sayre JW, Feig SA, Jackson VP. 2003. Survey of radiology residents: Breast imaging training and attitudes. Radiology 227(3):862–869.

Beam CA, Conant EF, Sickles EA. 2003. Association of volume and volume-independent factors with accuracy in screening mammogram interpretation. Journal of the National Cancer Institute 95(4):282–290.

Berlin L, Chair, Department of Radiology, Rush North Shore Medical Center, Professor of Radiology, Rush Medical College. 2003. Mammography Quality Standards Act Reauthorization. Statement at the April 8, 2003, hearing of the Subcommittee on Aging, Committee on Health, Education, Labor, and Pensions, U.S. Senate.

Brewin B. 2003. Bio-IT Bulletin: DOD System Brings Medical Expertise Closer. [Online]. Available: http://www.bio-itworld.com/news/021203_report2006.html [accessed April 11, 2003].

Brice J. 2004. Closing doors in mammography threaten continued access to care. Diagnostic Imaging (Sept.):25–35.
Brice J, Kaiser JP. 2004. ACR Pans Proposal to Allow Physician Assistants to Read Mammography. Diagnostic Imaging Online. [Online]. Available: http://www.diagnosticimaging.com/dinews/2004061401.shtml [accessed August 19, 2004].

California Department of Health Services, State Office of Rural Health. 2004. CalSORH: J-1 Visa Waiver Request Guidelines and Documents. [Online]. Available: http://www.dhs.ca.gov/pcfh/prhcs/Programs/CalSORH/WaiverRequest.htm [accessed December 6, 2004].

Casey B. 2003. Breast Center Enlists Radiographers for First Look at Mammograms. [Online]. Available: http://www.auntminnie.com/default.asp?Sec=sup&Sub=wom&Pag=dis&ItemId=57614&stm=radiographers [accessed February 19, 2004].

Center for Health Workforce Studies. 1999–2004. Residency Training Outcomes by Specialty in New York State: Annual Summaries of Responses to the NYS Resident Exit Survey, 1998–2003. Rensselaer, NY: University at Albany, State University of New York, School of Public Health, Center for Health Workforce Studies.

Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion. 2002. Behavioral Risk Factor Survey. Atlanta, GA: Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion.

CMS (Centers for Medicare and Medicaid Services). 2004a. Medicare program: Revisions to payment policies under the Physician Fee Schedule for calendar year 2005. Final Rule. Federal Register 69(219):66235–66915.

CMS. 2004b. CLIA Program: Clinical Laboratory Improvement Amendments. [Online]. Available: http://www.cms.hhs.gov/clia/ [accessed December 14, 2004].

Congressional Research Service. 2003. Medicare: Payments to Physicians. RL31199. Washington, DC: Congressional Research Service.

Cooper RA, Stoflet SJ, Wartman SA. 2003. Perceptions of medical school deans and state medical society executives about physician supply. JAMA 290(22):2992–2995.
Cronin KA, Yu B, Krapcho M, Miglioretti DL, Fay MP, Izmirlian G, Ballard-Barbash R, Geller BM, Feuer EJ. In press. Modeling the dissemination of mammography in the United States. Cancer Causes and Control.

Cypel YS, Sunshine JH. 2004. Diagnostic medical physicists and their clinical activities. Journal of the American College of Radiology 1(2):120–126.

Destouet JM, Bassett LW, Yaffe MJ, Butler PF, Wilcox PA. In press. The American College of Radiology Mammography Accreditation Program: 10 years of experience since MQSA. Journal of the American College of Radiology.

D'Orsi C, Tu SP, Nakano C, Carney PA, Abraham LA, Taplin SH, Hendrick RE, Cutter GR, Berns E, Barlow WE, Elmore JG. 2005. Current realities of delivering mammography in the community: Do challenges with staffing and scheduling exist? Radiology 235(2):391–395.

Duffy SW, Day NE, Tabar L, Chen HH, Smith TC. 1997. Markov models of breast tumor progression: Some age-specific results. Journal of the National Cancer Institute Monographs (22):93–97.

Dunnick NR. 2004. ACR Intersociety Conference 2003 summary: Radiologist assistants and other radiologist extenders. Journal of the American College of Radiology 1(6):386–391.

Eastern Research Group and U.S. Food and Drug Administration. 2001. Availability of Mammography Services. Contract ID 223–94–8031. Lexington, MA: Eastern Research Group.

Elmore JG, Taplin S, Barlow WE, Cutter G, D'Orsi C, Hendrick RE, Abraham L, Fosse J, Carney PA. In press. Community radiologists' medical malpractice experience, concerns, and interpretive performance. Radiology.
Enzmann DR, Anglada PM, Haviley C, Venta LA. 2001. Providing professional mammography services: Financial analysis. Radiology 219(2):467–473.

European Society of Mastology (EUSOMA). 2000. The requirements of a specialist breast unit. European Journal of Cancer 36(18):2288–2293.

Farria D, Schmidt ME, Monsees BS, Smith RA, Hildebolt C, Yoffie R, Monticciolo DL, Feig SA, Bassett LW. In press. Professional and economic factors affecting access to mammography: A crisis today, or tomorrow? Results from a national survey. Cancer.

Feig SA, Hall FM, Ikeda DM, Mendelson EB, Rubin EC, Segel MC, Watson AB, Eklund GW, Stelling CB, Jackson VP. 2000. Society of Breast Imaging residency and fellowship training curriculum. Radiologic Clinics of North America 38(4):xi, 915–920.

Gairard B, Renaud R, Haehnel P, Dale G, Schaffer P. 1992. How to organize a non centralized screening programme. In: Ioannidou-Mouzaka L, Agnantis NJ, Karydas I, eds. Senology. Vol. 1005. International Congress Series. Pp. 139–142.

Gairard B, Renaud R, Schaffer P, Guldenfels C, Kleitz C. 1997. Breast cancer screening in France: An update. Journal of Medical Screening 4(1):5.

Gorman C. 2001. Need a mammogram? It could take a while. TIME 157(10):78–81.

Hagopian A, Thompson MJ, Kaltenbach E, Hart LG. 2003. Health departments' use of international medical graduates in physician shortage areas. Health Affairs 22(5):241–249.

Hayes JC. 2004. Mammography crisis continues as experts struggle to find solution. Diagnostic Imaging (Sept.):5.

Helvie MA. 2004. Image analysis. In: Harris JR, Lippman ME, Morrow M, Osborne CK, eds. Diseases of the Breast. New York: Lippincott Williams & Wilkins. Pp. 131–148.

Hulka CA, Slanetz PJ, Halpern EF, Hall DA, McCarthy KA, Moore R, Boutin S, Kopans DB. 1997. Patients' opinion of mammography screening services: Immediate results versus delayed results due to interpretation by two observers. American Journal of Roentgenology 168(4):1085–1089.
IMV Medical Information Division. 2002. Benchmark Report: Mammography 2000. Des Plaines, IL: IMV Limited.

IOM (Institute of Medicine). 2001. Mammography and Beyond: Developing Technologies for the Early Detection of Breast Cancer. Washington, DC: National Academy Press.

IOM. 2005. Saving Women's Lives: Strategies for Improving Breast Cancer Detection and Diagnosis. Washington, DC: The National Academies Press.

Irwig L, Houssami N, van Vliet C. 2004. New technologies in screening for breast cancer: A systematic review of their accuracy. British Journal of Cancer 90(11):2118–2122.

Jansen JT, Zoetelief J. 1997. Assessment of lifetime gained as a result of mammographic breast cancer screening using a computer model. British Journal of Radiology 70(834):619–628.

Kopans DB. 2004. Sonography should not be used for breast cancer screening until its efficacy has been proven scientifically. American Journal of Roentgenology 182(2):489–491.

Kriege M, Brekelmans CT, Boetes C, Besnard PE, Zonderland HM, Obdeijn IM, Manoliu RA, Kok T, Peterse H, Tilanus-Linthorst MM, Muller SH, Meijer S, Oosterwijk JC, Beex LV, Tollenaar RA, de Koning HJ, Rutgers EJ, Klijn JG, Magnetic Resonance Imaging Screening Study Group. 2004. Efficacy of MRI and mammography for breast-cancer screening in women with a familial or genetic predisposition. New England Journal of Medicine 351(5):427–437.

Lannin DR, Mathews HF, Mitchell J, Swanson MS. 2002. Impacting cultural attitudes in African-American women to decrease breast cancer mortality. American Journal of Surgery 184(5):418–423.
Lawson HW, Henson R, Bobo JK, Kaeser MK. 2000. Implementing recommendations for the early detection of breast and cervical cancer among low-income women. Morbidity & Mortality Weekly Report 49(RR-2):37–55.

Lee CH. 2004. Problem solving MR imaging of the breast. Radiologic Clinics of North America 42(5):vii, 919–934.

Liberman L, Morris EA, Benton CL, Abramson AF, Dershaw DD. 2003. Probably benign lesions at breast magnetic resonance imaging: Preliminary experience in high-risk women. Cancer 98(2):377–388.

Lindfors KK, O'Connor J, Parker RA. 2001. False-positive screening mammograms: Effect of immediate versus later work-up on patient stress. Radiology 218(1):247–253.

Linver MN. 2002. Coding and billing in breast imaging. Decisions in Imaging Economics (April).

Maccia C, Nadeau X, Renaud R, Castellano S, Schaffer P, Wahl R, Haehnel P, Dale G, Gairard B. 1995. Quality control in mammography: The pilot campaign of breast screening in the Bas-Rhin region. Radiation Protection Dosimetry 57(1–4):323–328.

Maguire P. 2003. Is an access crisis on the horizon in mammography? ACP Observer (Oct.).

Mansel RE. 2000. Should specialist breast units be adopted in Europe? A comment from Europe. European Journal of Cancer 36(18):2286–2287.

Martinez B. 2000 (October 30). Mammography centers shut down as reimbursement feud rages on. The Wall Street Journal. P. A1.

McCann J, Wait S, Seradour B, Day N. 1997. A comparison of the performance and impact of breast cancer screening programmes in East Anglia, U.K. and Bouches Du Rhone, France. European Journal of Cancer 33(3):429–435.

Merritt, Hawkins & Associates. 2003. 2003 Review of Physician Recruiting Incentives. Irving, TX: The MHA Group.

Michalowski J. 2003. Telemedicine: Transporting cancer expertise to all corners of the world. Benchmarks 3(6):1.

Mosca L, Ferris A, Fabunmi R, Robertson RM. 2004. Tracking women's awareness of heart disease: An American Heart Association national study. Circulation 109(5):573–579.

National Mammography Quality Assurance Advisory Committee, Subcommittee on Physicist Availability. 1996. Medical Physicist Availability Report. Unpublished.

National Quality Measures Clearinghouse. 2004. Access: Time to Third Next Available Long Appointment. [Online]. Available: http://www.qualitymeasures.ahrq.gov/summary/summary.aspx?doc_id=5743 [accessed February 22, 2005].

Odle TG. 2003. Mammography coding and reimbursement. Radiologic Technology 74(5):385–404; quiz 405–412.

Perry NM. 2004 (September 2). Mammography Quality and Performance in the National Health Service Breast Screening Programme. Presentation at the meeting of the Institute of Medicine Committee on Improving Mammography Quality Standards, Washington, DC.

Physician Insurers Association of America. 2002. Breast cancer study. 3rd ed. Rockville, MD: Physician Insurers Association of America.

Renaud R, Gairard B, Schaffer P, Guldenfels C, Haehnel P, Dale G, Bellocq JP. 1994. Europe Against Cancer Breast Cancer Screening Programme in France: The ADEMAS Programme in Bas-Rhin. European Journal of Cancer Prevention 3(Suppl 1):13–19.

Rothenberg LN, Deye JA, High MD, Jessop NW, Sternick ES. 1995. Demographic characteristics of physicists who evaluate mammographic units. Radiology 194(2):373–377.

RSNA (Radiological Society of North America). 2004a. Radiology assistants will share workload in diagnostic imaging. Radiological Society of North America News 14(2):5–6.
RSNA. 2004b. Radiologist shortage easing, physician shortage growing. Radiological Society of North America News 14(4):6–7.

Shtern F, Winfield D, eds. 1999. Report of the Joint Working Group on Telemammography/Teleradiology and Information Management: March 15–17, 1999. Washington, DC: U.S. Public Health Service, Office on Women's Health.

Sickles EA, Miglioretti DL, Ballard-Barbash R, Geller BM, Leung JW, Rosenberg RD, Smith-Bindman R, Yankaskas BC. In press. Performance benchmarks for diagnostic mammography. Radiology.

Silverstein MJ. 2000. State-of-the-art breast units: A possibility or a fantasy? A comment from the U.S. European Journal of Cancer 36(18):2283–2285.

Smith RA, Cokkinides V, Eyre HJ. 2003. American Cancer Society guidelines for the early detection of cancer, 2003. CA: A Cancer Journal for Clinicians 53(1):27–43.

Smith-Bindman R, Chu P, Miglioretti D, Quale C, Rosenberg RD, Cutter G, Geller B, Bacchetti P, Sickles EA, Kerlikowske K. 2005. Physician predictors of mammographic accuracy. Journal of the National Cancer Institute 97(5):358–367.

Sumkin JH, Klaman HM, Graham M, Ruskauff T, Gennari RC, King JL, Klym AH, Ganott MA, Gur D. 2003. Prescreening mammography by technologists: A preliminary assessment. American Journal of Roentgenology 180(1):253–256.

Sunshine J, Bhargavan M, Lewis R. 2004a (September 3). Information on Radiologists Who Interpret Mammograms. Presentation at the meeting of the Institute of Medicine Committee on Improving Mammography Quality Standards, Washington, DC.

Sunshine JH, Maynard CD, Paros J, Forman HP. 2004b. Update on the diagnostic radiologist shortage. American Journal of Roentgenology 182(2):301–305.

Taplin SH, Rutter CM, Lehman C. Submitted. Testing the effect of computer assisted detection upon interpretive performance in screening mammography.

The Florida Legislature: Office of Program Policy Analysis & Government Accountability (OPPAGA). 2004. OPPAGA Report: Access to Mammography Services in Florida Is More Limited for Low-Income Women. 04–79. Tallahassee, FL: OPPAGA Report Production.

The Society of Breast Imaging. 1999–2005. Breast Imaging Residency Training Curriculum. [Online]. Available: http://www.sbi-online.org/residentfellow.htm [accessed December 2, 2004].

The Workgroup on Mammography Accessibility. 2004. Report of The Workgroup on Mammography Accessibility. Tallahassee, FL: Florida Department of Health.

Thorwarth WT, Borgstede JP. 2001. Mammography reimbursement: Components and strategies for change. Reston, VA: American College of Radiology.

Tourangeau R, Rips LJ, Rasinski K. 2000. The Psychology of Survey Response. Cambridge, UK: Cambridge University Press. Pp. 165–229.

Trevino M. 2003. Air Force Teleradiology Project Aims to Alleviate Staff Shortages: Plan Could Even Out Workflow and Improve Access to Subspecialty Reads. [Online]. Available: http://www.diagnosticimaging.com/pacsweb/cover/cover05090202.shtml [accessed May 13, 2003].

U.S. Census Bureau. 2004. U.S. Interim Projections by Age, Sex, Race, and Hispanic Origin. [Online]. Available: http://www.census.gov/ipc/www/usinterimproj/ [accessed August 26, 2004].

U.S. Department of Health and Human Services, Health Resources and Services Administration, National Health Service Corps. 2004a. Career Information: Loan Repayment Program. [Online]. Available: http://www.wphca.org/nhsc.html [accessed December 30, 2004].
OCR for page 163
Improving Breast Imaging Quality Standards U.S. Department of Health and Human Services, Health Resources and Services Administration, National Health Service Corps. 2004b. Fiscal Year 2005 Loan Repayment Program Application Information Bulletin. [Online]. Available: http://nhsc.bhpr.hrsa.gov/applications/lrp_05/e.cfm [accessed December 30, 2004]. U.S. Department of State, Bureau of Educational and Cultural Affairs. 2004. Waivers. [Online]. Available: http://exchanges.state.gov/education/jexchanges/participation/waivers.htm [acessed December 6, 2004]. U.S. Government Accountability Office. 2002. Mammography: Capacity Generally Exists to Deliver Services. GAO-02–532. Washington, DC: U.S. Government Accountability Office. Wait S, Schaffer P, Seradour B, Guldenfels C, Gairard B, Morin F, Piana L. 2000. The cost of breast cancer screening in France. Journal de Radiologie 81(7):799–806. Ward E, Jemal A, Cokkinides V, Singh GK, Cardinez C, Ghafoor A, Thun M. 2004. Cancer disparities by race/ethnicity and socioeconomic status. CA: A Cancer Journal for Clinicians 54(2):78–93. Warner E, Plewes DB, Hill KA, Causer PA, Zubovits JT, Jong RA, Cutrara MR, DeBoer G, Yaffe MJ, Messner SJ, Meschino WS, Piron CA, Narod SA. 2004. Surveillance of BRCA1 and BRCA2 mutation carriers with magnetic resonance imaging, ultrasound, mammography, and clinical breast examination. JAMA 292(11):1317–1325. Wentland EJ. 1993. Survey Responses: An Evaluation of their Validity. San Diego, CA: Academic Press. Pp. 71–93. White E, Miglioretti DL, Yankaskas BC, Geller BM, Rosenberg RD, Kerlikowske K, Saba L, Vacek PM, Carney PA, Buist DS, Oestreicher N, Barlow W, Ballard-Barbash R, Taplin SH. 2004. Biennial versus annual mammography and the risk of late-stage breast cancer. Journal of the National Cancer Institute 96(24):1832–1839. Williams CD, Short B. 2004. ACR and ASRT development of the radiologist assistant: Concept, roles, and responsibilities. 
Journal of the American College of Radiology 1(6):392–397. Wing P. 2005. IOM Mammography Projects and Related Background Information. Rensselaer, NY: University at Albany, State University of New York, School of Public Health, Center for Health Workforce Studies. Wisconsin Department of Health and Family Services. 2003. Wisconsin J-1 Visa Waiver Program. [Online]. Available: http://dhfs.wisconsin.gov/DPH_BCDHP/J_1VISA/ [accessed December 6, 2004]. Wivell G, Denton ER, Eve CB, Inglis JC, Harvey I. 2003. Can radiographers read screening mammograms? Clinical Radiology 58(1):63–67. Yasmeen S, Romano PS, Pettinger M, Chlebowski RT, Robbins JA, Lane DS, Hendrix SL. 2003. Frequency and predictive value of a mammographic recommendation for short-interval follow-up. Journal of the National Cancer Institute 95(6):429–436.