Appendixes
Appendix A
Site Visit Summaries

As part of its data-gathering activities, the Committee on Enhancing the Internet for Health Applications visited eight sites that were either developing health-related applications of the Internet or engaged in health-related activities that could be transferred to the Internet in the future. These visits, conducted between December 1998 and February 1999, spanned half a day to one full day each. They provided committee members with a snapshot of the state of deployment of the Internet in the health community at that point in time and an opportunity to better understand the technical and other challenges associated with use of the Internet in support of health care. This appendix briefly summarizes the committee's eight site visits, including four in California, two in North Carolina, and two in Washington State. The sites were the Laboratory for Radiological Informatics at the University of California at San Francisco (UCSF); Kaiser-Permanente of Northern California, Oakland; Stanford Center for Professional Development, Stanford University; the National Aeronautics and Space Administration (NASA) Ames Research Center, Mountain View, California; the Center for Health Sciences Communication at East Carolina University (ECU), Greenville, North Carolina; the University of North Carolina (UNC), Chapel Hill; and the University of Washington (UW) and Regence BlueShield in Seattle.

Laboratory for Radiological Informatics

Members of the study committee spent half a day at the UCSF Laboratory for Radiological Informatics (LRI) on December 16, 1998. H.K. (Bernie) Huang and his colleagues at LRI demonstrated systems designed to share three different types of telemedical images: digital mammograms; cardiograms; and neurological images made by magnetic resonance imaging (MRI) and computerized tomography (CT).1 For the most part, these systems make use of a picture archiving and communications system (PACS) at UCSF that stores digital medical images in several terabytes of optical disk storage and an asynchronous transfer mode (ATM) wide-area network (WAN) that connects UCSF with nearby Mount Zion Hospital. Originally, the laboratory used a dedicated WAN, but the university later installed a synchronous optical network (SONET) ring, which now supports the network. A T1 connection, which supports data rates of 1.544 megabits per second (Mbps), links UCSF with Stanford University Medical Center, an hour's drive away.

Telemammography

Steve Frankel of the Breast Imaging Section at UCSF and Andrew Lou of LRI provided an overview of the teleimaging system. UCSF now has two full-field digital telemammography systems that produce images of 40 to 60 megabytes (MB) compressed (using an acceptably lossless compression scheme). A typical study generates four such images, two of each breast, but also requires a historical set of equal size for comparison. Images can be transmitted across the WAN for remote interpretation and diagnosis or real-time reading by expert mammographers. In the demonstration, staff physicians at UCSF sent images to Mount Zion for interpretation. Physicians at both UCSF and Mount Zion used high-resolution (2,000 × 2,000 pixel) monitors to view the images.
Using custom software developed at UCSF, the referring and consulting physicians could use an on-screen dual-pointer to identify objects of interest and change the brightness and contrast of the images to aid in interpretation. An electronic magnifying glass enabled mammographers to examine portions of the image in greater detail. Expert reading of mammograms in real time is not necessarily needed for regular screenings, but it can be useful if potential abnormalities are discovered. In such cases, remote experts can provide faster diagnoses or request additional images before the patient leaves the mammography center. Busy centers may examine 80 to 100 women per day (20 per day is more typical for an average center), with the expectation that images will be read the following day. Longer delays are not uncommon; many mobile mammogram centers have 2-week turnaround times. Telemammography is viewed as a means of supporting increased demand for mammograms. The National Cancer Institute (NCI) recommendation that all women over age 40 have an annual mammogram could, if widely heeded, greatly increase the rate of mammography, UCSF system developers noted. They also said the nation has too few expert mammographers to handle the increased volume and, moreover, that many rural areas have no local experts. To provide teleradiology interpretation services for other health care organizations, UCSF has established a consortium with Emory University in Atlanta; Wake Forest University in Winston-Salem, North Carolina; Brigham and Women's Hospital in Boston; and the University of Pennsylvania. The consortium, named Telequest, accepts images over a dedicated T1 line and provides transcribed diagnoses. Participating physicians must be cross-licensed in the examination sites.

Cardiology Teleconferencing

Tony Chou, director of the catheterization labs, described a cardiology teleconferencing system linking UCSF and Stanford University. This application, which runs on a T1 line, is used as an educational tool to enable physicians to present cases to colleagues for postmortem reviews. It is not yet used for diagnostic purposes, but it could be once the institutions install digitized catheterization labs. The system has been used to review angiography and intravascular ultrasound images, which are currently captured as analog video and digitized. Digitized angiogram files are roughly 60 MB in size; intravascular ultrasounds are about 50 MB. The size of these files is expected to grow by a factor of 10 once the fully digital systems are installed. Videos are displayed at a rate of 25 frames per second.
In applications tested to date, digitized videos of angiograms and intravascular ultrasound have been transmitted across the T1 network and replicated on both sides of the connection, a process that takes approximately 5 minutes. In one case, images were transferred to a medical center in Germany overnight. Participants in the teleconferences can examine images simultaneously, using on-screen pointers to note items of interest and zooming in on particular portions of the video. Physicians report that the quality of the video is already sufficient for most diagnoses, but the system has been used only for reviewing outcomes and medical decision making.
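The 5-minute figure is consistent with simple arithmetic on the link rate. The sketch below is illustrative only (the helper function is not part of any system described here); it ignores protocol overhead and assumes the T1 line is fully available:

```python
# Back-of-envelope transfer times over a T1 line (1.544 Mbps), using the
# file sizes reported during the site visit. Illustrative only: real
# transfers incur protocol overhead and contention on the shared link.

T1_MBPS = 1.544  # T1 data rate in megabits per second

def transfer_seconds(megabytes: float, link_mbps: float = T1_MBPS) -> float:
    """Idealized transfer time: size in megabits divided by link rate."""
    return megabytes * 8 / link_mbps

angiogram = transfer_seconds(60)   # ~311 s, roughly 5 minutes
mammogram = transfer_seconds(40)   # ~207 s for a 40-MB compressed image
print(f"60 MB angiogram: {angiogram / 60:.1f} min")
print(f"40 MB mammogram: {mammogram / 60:.1f} min")
```

The 60-MB angiogram case works out to about 5.2 minutes, matching the observed replication time; the same arithmetic shows why the committee heard repeatedly that file sizes growing "by a factor of 10" would strain T1-class links.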

Neuroimaging

Bill Dillon, chief of the Neuroimaging Department, provided an overview of the neuroimaging teleconsultation system linking UCSF and Mount Zion. The system is used primarily for MRI and CT scans, which usually are digitally captured and stored in the PACS. Before the PACS was established, some 15 to 20 percent of films were lost and hundreds went unread, meaning that the hospital could not bill for the service. The PACS allows immediate access to images and was accepted rapidly by physicians, who found that the system greatly increased the ease of locating needed studies before meeting with a patient. The WAN has enabled UCSF to offer image interpretation services. In fact, UCSF now has a resident on call to read neuroimages taken at Mount Zion. Such centralization of interpretation skills helps accommodate the interests of the state of California, which is pressuring California hospitals and medical schools to reduce the number of specialists and trainees in specialty areas. Similar pressures are being felt at the national level as health care providers merge and attempt to consolidate services and eliminate duplicative capabilities across large health care delivery systems.

Kaiser-Permanente of Northern California

The study committee visited Kaiser-Permanente of Northern California on the afternoon of December 16, 1998. Kaiser-Permanente is the nation's largest health maintenance organization (HMO), with more than 9.4 million members in 20 states. It has approximately 100,000 employees, including 10,000 physicians and 30,000 nurses and pharmacists, and it operates some 30 hospitals with more than 8,000 beds. The HMO is not a single entity but, rather, an affiliation of two separate organizations: the Kaiser Foundation Health Plan, which handles administrative and management functions, and the Permanente Medical Group, which consists of 12 separate groups of practitioners.
Kaiser-Permanente has experimented with network-based telemedicine applications, particularly in teledermatology and teleradiology. Its most notable success was with a system for retinal screening of diabetic patients, which began when a clinic bought a camera and began sending images electronically to an expert reader for interpretation. This simple procedure increased initial screening rates from 30 to 93 percent of all patients who met the criteria in the risk guidelines. Kaiser-Permanente staff began evaluating different Internet applications at the request of its chief executive officer. The result was a unified, three-pronged strategy consisting of a provider-oriented system; a customer-focused system; and a common, shared database. Development of the provider system and shared database, the Permanente Knowledge Connection (PKC), has proceeded under the auspices of Kaiser-Permanente's Care Management Institute. Development of the consumer component, the KPOnline system, has proceeded separately. The discussions during the site visit focused both on the institute's efforts to develop internal applications for use by care providers and on KPOnline, the customer-focused system.

Permanente Knowledge Connection

Peter Juhn, executive director, described the activities of the Care Management Institute (CMI), a national entity within Kaiser-Permanente that operates on behalf of both the Kaiser Foundation Health Plan and the Permanente Medical Group. CMI was established in 1997 and employs approximately 80 people, 30 of whom work in Oakland and the rest of whom are distributed throughout the Kaiser-Permanente system. Its principal function is to develop evidence-based approaches to care management. This work has three areas of emphasis: content, measurement, and implementation. Content work includes the development of management programs for conditions such as diabetes, asthma, and depression, as well as an overall compendium of clinical best practices. Measurement work encompasses large-scale national studies of health outcomes and has included studies of 200,000 diabetic patients, 320,000 cardiovascular patients, and 90,000 asthma patients in the Kaiser-Permanente system. The implementation work builds on the content foundation, using the collected information as a basis for clinical systems that can influence care at the point of delivery. The objective of these activities is to change care providers' behavior to coincide with best practices developed throughout the Kaiser-Permanente system. Such systems can have dramatic effects on care.
Over the previous year and a half, Kaiser-Permanente found a 10 to 15 percent gain in the use of practice guidelines in some areas where it had developed content. The PKC is a network-based application that was developed to support the CMI's objective of improving care. Its primary function is to allow care providers to access current CMI content on best practices. The CMI staff realized that each Kaiser-Permanente site was gathering useful information that could benefit other local and national care providers, but little of it was shared. Using Kaiser-Permanente's national intranet, the PKC now has national and regional databases of best-practice information that is vetted before it is entered into the databases. A board of directors that represents all executive and physician groups approves information for inclusion in the national database. Twelve regional groups establish approval processes for their own regions, and local offices establish processes for local approvals. This structure balances local freedom against the need for greater structure at the highest levels of the organization. The databases are linked together through the national office's intranet Web site, which makes the information searchable and reduces duplication of effort. In addition to linking care management information, the PKC provides a centralized outlet for other information resources of interest to care providers. It contains a section for continuing medical education (CME) that allows users to look up their CME credits. It also provides access to online textbooks and journals and supports discussion groups with threaded discussions. Its workgroup functions enable users to post material and conduct national meetings in a virtual manner. Workgroups can be designated for members only, with membership defined by the chair. The CMI staff plan to build on the PKC's current capabilities to make the PKC more useful to care providers. The staff will address topics such as improving search capabilities, tailoring information to provider needs, and expanding access to affiliated care providers. The PKC can support searches across the national intranet using a commercial search engine, but the CMI is looking into ways to improve search capabilities and help different types of users find information they need, perhaps by adding "metadata" capabilities. The staff also plan to use "push" (as opposed to pull) technologies and customization to provide users with information relevant to their immediate and long-term needs. Plans for this enhancement are tied to efforts to place a computer on every physician's desktop and provide access to a clinical information system (more than 40 percent of Kaiser physicians had desktop computers at the time of the site visit).
Overall, however, the Kaiser-Permanente staff do not consider technology a limiting factor. They have modest aims for now, and technology is often a distraction from reaching other goals. They prefer to take known technology and find ways of using it to support their business, rather than pushing the technological envelope. Nevertheless, determining how best to expand the PKC to affiliated providers will entail both technical and policy considerations. Kaiser-Permanente has agreements with approximately 40,000 affiliate providers in physician groups outside of California. How can Kaiser convince the affiliates to care for Kaiser patients according to Kaiser practices? Should the affiliates have access to all the available knowledge, or should some information be considered proprietary? How can information be layered and filtered easily to accommodate restrictions placed on affiliate access? How can firewalls be extended to include specific partners while still providing adequate protection for Kaiser's information systems? At present, the system does not contain patient-level data, which alleviates some concerns regarding security, but Kaiser needs to evaluate alternative ways of providing security on an extranet and/or the Internet. It also needs to determine how best to blend public and proprietary information to benefit its providers. Kaiser officials do not consider their practice guidelines proprietary, and would even like to make them public, but the tools for implementing these practices are unique and will be kept proprietary. Kaiser would like to translate the guidelines into lay terms and make them available on its consumer-oriented Web site.

The Kaiser staff anticipate three types of outcomes from the PKC. First are financial benefits: they want to leverage the size of the organization while avoiding duplication of effort, something the PKC can facilitate. The staff will try to determine how much money was saved by not starting new programs, or not sustaining existing ones, because the PKC indicated that similar work was already under way. Second, the PKC may prompt changes in clinical decision making that yield improved clinical outcomes and better use of facilities. Third, benefits will accrue from improved knowledge management, which should succeed in educating care providers about new diagnostic approaches and new techniques. The Kaiser staff hope to make the link between successful patient-provider interactions and the PKC system evident, to demonstrate the system's ability to support corporate objectives. If the system is to be successful, then usage rates must increase. At the time of the site visit, 2,700 of Kaiser-Permanente's 10,000 physicians used the system, but all were expected to use it by the end of 1999. To attain and maintain that level of usage, the system will need to prove itself capable of educating physicians about new diagnostic approaches and clinical techniques. The system will also need to demonstrate its capability to enhance the goals of the organization.
Management will need to see a direct linkage between improved patient-provider interactions and the tools that support that interaction. Privacy issues will also need to be addressed so that providers know whether information will be collected on their searches of medical literature and whether such searches will be viewed as positive (i.e., the provider is engaging in continuous learning) or negative (i.e., searches indicate gaps in a provider's knowledge).

KPOnline

Anna-Lisa Silvestre and Richard Leopold described Kaiser's consumer-oriented Web site, the primary component of KPOnline. KPOnline is a three-tiered system with a Web page interface that interacts with legacy systems through an intermediate object layer. Ms. Silvestre views the Web site as a service for interacting with Kaiser-Permanente members. It is not a marketing tool or a mechanism for providing content. Rather, it is intended to provide members with an alternative to telephone calls and office visits. The site provides members with capabilities for messaging, scheduling appointments, and checking prescriptions 24 hours a day, 7 days a week. Eventually, the Kaiser staff would like to integrate KPOnline with the PKC so they can take the information gathered from patients through KPOnline and incorporate it into practice guidelines and, conversely, incorporate practice guidelines into chat rooms and e-mail discussions with patients. The goal is to help patients better understand health information, tailor it to them, connect them with providers, and help them make sound, coordinated decisions regarding their care. At the time of the site visit, approximately 16,000 registered users had logged on to KPOnline more than once. This number is just a small fraction of Kaiser-Permanente's 9.4 million members, but the staff believe usage will increase as more of them gain Internet access. A recent biannual survey reported that 72 percent of members are adults, 53 percent of whom have Internet access at home, at work, or both. Ten percent of all members have requested a personal identification number (PIN) to use with the system. Hence, Kaiser expects hundreds of thousands of members to access the system simultaneously in the near future (the target was 350,000 active users in 1999) and is in the process of procuring additional servers to handle the load. The organization is still trying to learn how members use the site and what types of services they seek. Kaiser is starting to collect data on site usage (it does not track an individual's movements within the site) but does not yet have adequate volume to examine usage by demographic category. Consumers are coming online quickly, and KPOnline is expected to become a basic utility that will benefit both consumers and Kaiser-Permanente. A basic evaluation will be performed to determine who uses the system, their level of satisfaction, and the overall utility of the system.
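The three-tier structure described above (Web page interface, intermediate object layer, legacy systems) can be sketched in miniature. Every class, method, and record format below is invented for illustration; KPOnline's actual code is not shown in this chapter:

```python
# Miniature sketch of the three-tier pattern described for KPOnline:
# the Web tier never touches legacy systems directly; an intermediate
# object layer translates legacy records into clean domain values.
# All names and the record format here are hypothetical.

class LegacyPharmacySystem:
    """Stands in for an older backend with its own flat record format."""
    def lookup(self, member_number: str) -> str:
        return f"RX-RECORD|{member_number}|refills=2"

class MemberServices:
    """Object layer: the only component that parses legacy records."""
    def __init__(self, backend: LegacyPharmacySystem):
        self.backend = backend

    def refills_remaining(self, member_number: str) -> int:
        raw = self.backend.lookup(member_number)
        return int(raw.rsplit("refills=", 1)[1])

class WebTier:
    """Presentation: renders what the object layer returns, nothing more."""
    def __init__(self, services: MemberServices):
        self.services = services

    def prescription_page(self, member_number: str) -> str:
        n = self.services.refills_remaining(member_number)
        return f"You have {n} refill(s) remaining."

web = WebTier(MemberServices(LegacyPharmacySystem()))
print(web.prescription_page("12345"))  # "You have 2 refill(s) remaining."
```

The design point is that only the middle layer understands the legacy record format, so the Web interface can evolve without changes to the backend systems.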
More formal cost-benefit analyses will also be conducted. A tangible cost-benefit analysis will evaluate processes such as online pharmacy refills and automated appointment scheduling and compare them to more traditional, manual processes. The Kaiser staff expect that online pharmaceutical services have the highest benefits per unit cost because filling prescriptions becomes much less expensive when done in high volumes. Other tangible benefits not associated with cost reductions, such as helping members make good decisions, will also be considered. Some of the benefits will be difficult to quantify, but there may be ways to determine if online material helped to prevent an unnecessary visit to a Kaiser facility, improve the appropriateness of a subsequently scheduled visit, or increase membership retention rates. Early experience with KPOnline has uncovered numerous issues that need to be resolved. One of the main ones is the need for standards for measuring the quality of online transactions. For example, with respect to online questions to nurses, it might be desirable to track the time of receipt, the time at which the question was answered, the time at which the member retrieved the response, and whether the answer given was valid. Similar standards are now in place to help train nurses who provide advice over the telephone, but such situations involve near-real-time feedback. In addition, nurses need training in how to provide care based strictly on text input (which creates a record of the interaction), with no voice or personal interaction with patients. At the time of the site visit, all of Kaiser-Permanente's care providers had e-mail accounts, but patients were not yet using them. The use of clinical e-mail raises several questions, as yet unanswered, regarding medical records. Other issues include the determination of rules for intervening in sponsored chat groups. Kaiser-Permanente had a case in which a member's postings to a chat group suggested suicidal tendencies. The organization had to decide whether and how to intervene. Should it link the user's anonymous login with the medical record database to check the medical history and find contact information? In this case, the Kaiser staff did just that, and an advice nurse called the patient and arranged an appointment, which revealed that the patient was indeed suicidal. Events such as this prompt policy reviews. Kaiser has established a set of technical measures and administrative processes to provide security on the Web site. The site uses Secure Socket Layer (SSL) encryption to protect messages between members and Kaiser, and members need a PIN to access chat groups. Members are required to provide their membership number and address to obtain a PIN. In addition, the manager of the business unit is a security trustee and has to ensure that policies and procedures are in place.
These policies are reviewed regularly and upgraded as needed, and attempts are made to achieve consistency between online and off-line policies. For example, Kaiser dropped the authentication requirement for scheduling an appointment online because such authentication is not performed when scheduling appointments via the telephone. Members of the Kaiser-Permanente staff identified several technical capabilities they would like to be able to incorporate into KPOnline:

• Authentication technologies that would allow multiple users within a single household to use the same computer but keep their information separate. In the current KPOnline system, if users forget to log out of a session, then other family members can see what they did and what information they retrieved. The Kaiser staff would like to obtain improved technologies to identify users and authenticate their identities. One possible solution is biometrics, but this approach would be costly to implement across many computers.
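One conventional mitigation for the shared-computer problem described in the bullet above is an inactivity timeout, so that an abandoned session expires before another household member sits down. The sketch below is a generic illustration of that idea; the timeout value and class name are assumptions, not KPOnline's actual design:

```python
import time

# Minimal inactivity-timeout sketch: a session that sits idle too long
# expires, so an abandoned login cannot be reused by the next person at
# the keyboard. The 10-minute limit is an assumed value for illustration.

IDLE_LIMIT_SECONDS = 10 * 60  # assume sessions expire after 10 idle minutes

class Session:
    def __init__(self, member_id):
        self.member_id = member_id
        self.last_activity = time.monotonic()

    def touch(self):
        """Record activity, resetting the idle clock."""
        self.last_activity = time.monotonic()

    def is_expired(self, now=None):
        """True once the session has been idle longer than the limit."""
        if now is None:
            now = time.monotonic()
        return now - self.last_activity > IDLE_LIMIT_SECONDS

s = Session("member-123")
print(s.is_expired())                               # False: just created
print(s.is_expired(now=s.last_activity + 11 * 60))  # True: 11 idle minutes
```

A timeout only narrows the exposure window; it does not solve the identification problem the Kaiser staff raise, which is why they were looking toward stronger per-user authentication such as biometrics.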

The local physicians are satisfied with the program; they use the library's resources and learn to like the Internet. But several issues must be resolved before the telemedicine program can be expanded. First, the program crosses state lines and requires licensed physicians at either end, so beginning students cannot present patients. When students do clinical rotations in rural areas (as half of the students must do), they must get licensed in both states, unless they are participating in telemedicine programs operated by the federal government (such as the Veterans Administration), which are exempt from state licensure requirements. Second, insurance is an issue; some patients will not use this medium for consultations because their insurance will not cover it. Third, telemedicine can affect local referral patterns. Finally, reimbursement is a major problem. Consulting doctors are currently paid from the grant money. The HCFA and the Rural Health Policy Agency support only 80 percent of a normal payment, and the doctor has to split the fee with the referrer. Montana approved telemedicine for Medicaid because it saves money normally spent on patient travel.

Medical Projects at the Human-Interface Technology Lab

The Human-Interface Technology (HIT) laboratory is a research unit in the UW College of Engineering. It has a roster of 108 people; 18 are regular staff members, and the rest are faculty associates, visiting scholars, graduate students, and so on. The income of the laboratory is about $17 million, two-thirds of which comes from grants and contracts. Forty companies participate as members of the Virtual Worlds Consortium, providing a little less than one-sixth of the revenue. Spinoffs from the laboratory include 13 different companies; 10 of these are still active, including 3 created recently.
Suzanne Weghorst, assistant director, introduced the HIT laboratory, which has a goal of developing and demonstrating mission-transferable technology. Tom Furness, the director, elaborated on the laboratory's history. He worked for many years at Wright-Patterson Air Force Base, designing human-machine interfaces in airplane cockpits. He came to Seattle in 1989 to found the laboratory, through which he hoped to extend the scope of his work, with the general goal of providing better coupling of humans to advanced machines. Furness emphasizes his view of the future as technology that simulates "being there," improving humans' ability to transport themselves by moving their eyes to different places and times. Such experiences range from teleconferencing to "transport" through an endoscope to view hidden portions of the body. For instance, a videotape made 5 years ago showed an interactive teleconference in which participants in the United States and Japan wore virtual reality (VR) helmets and cooperated in the task of herding virtual creatures across a conference table with paddles. The telecommunications link consisted of four ISDN lines. The images displayed were somewhat flat and cartoonish, but the interaction was successful.

The committee was shown three active laboratory efforts. In the virtual operating room, the participant wears a VR helmet and holds a control stick. The helmet's location and orientation are sensed by a device mounted on a fixed stand over the space, and this information is used to drive the displays to the video helmet (much in the manner of current VR games). The environment is based on photographs and equipment from Harborview Medical Center. There is a patient on an operating table; by using the control stick, the participant can manipulate displays, such as the electrocardiogram output, and instruments, such as an endoscope inserted in the patient's lower abdomen. The displays can be controlled so that they are visible regardless of the participant's perspective, or they can be fixed at various positions in the virtual room. There were some noticeable lags in the following speed of the display if the participant turned quickly, and the overall precision of the location sensors appeared to be on the order of inches rather than millimeters.

The second active effort involved simulation of surgical suturing through a computer-controlled force-feedback device. The user interacts with the environment through a pair of scissors holding a virtual needle. A standard video monitor displays the position of the needle relative to a wound in a small area of skin. A finite-element model of the skin—it has about 200 nodes, with a relatively higher concentration of nodes near the wound—simulates the restoring force of the skin against the needle and controls the force feedback.
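The qualitative behavior of such a model can be illustrated with something far cruder than a 200-node finite-element mesh: a single Hooke's-law-style force that grows with penetration depth and with the needle's angle from the surface normal. The force law, stiffness value, and function name below are invented for illustration; the HIT laboratory's actual model is not shown in this chapter:

```python
import math

# Illustrative restoring-force calculation in the spirit of the suturing
# simulator: a needle feels little resistance when orthogonal to the skin
# and more at oblique angles. The force law and stiffness are invented.

def needle_force(depth_mm: float, angle_deg: float, stiffness: float = 0.5) -> float:
    """Restoring force (arbitrary units) on a needle at a given
    penetration depth and angle from the surface normal."""
    # Hooke-like term for penetration, scaled up as the needle tilts
    # away from the surface normal (orthogonal insertion = 0 degrees).
    obliquity = 1 / max(math.cos(math.radians(angle_deg)), 1e-6)
    return stiffness * depth_mm * obliquity

print(needle_force(2.0, 0))   # orthogonal insertion: baseline force
print(needle_force(2.0, 60))  # oblique insertion: noticeably larger force
```

In this toy model the force at 60 degrees is twice the orthogonal force, mirroring the angle dependence the demonstration exhibited; the real simulator must evaluate its full mesh fast enough to feed the 1,000-updates-per-second haptic loop described below.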
If the user inserts the virtual needle into the skin orthogonally to the skin surface, for example, then little force is felt, but if he or she holds it at an oblique angle, then considerable force is needed to pierce the skin. The force-feedback device requires updates about 1,000 times per second; the visual display runs at a standard 30-cycles-per-second refresh rate.

The third effort is the Virtual Retinal Display. The idea is to paint an image directly on the retina with photons instead of projecting the display elsewhere and requiring the person to follow it visually. The image appears only on the participant's retina. The lab bench setup included low-power red, green, and blue laser light sources fed through an optical fiber, with the fiber's output scanned mechanically over the retina by a moving mirror. Although the lab setup seemed cumbersome, a company called Microvision was spun off in 1993 to commercialize this technology and evidently has had some success in reducing it to a practical size and weight for portable use. Because the image can be focused so that it passes through only a small part of the user's cornea and can potentially be focused on a specific part of the retina, the technology holds promise for assisting persons with impaired vision and providing bright, high-precision displays for VR applications. A number of surprising results have been found in experimental use. For instance, users do not perceive flicker in the static images even at relatively low refresh rates (e.g., 15 frames per second), meaning that the system offers twice the resolution of other displays at the same bandwidth.

Seattle/Pacific Northwest GigaPOP

Jim Corbato discussed the history, current architecture, and expected future evolution of high-bandwidth IP network infrastructures for UW and its Seattle-area health care partners, which include the Harborview Medical Center, Children's Hospital, and the Fred Hutchinson Cancer Research Center. Network interconnections in Seattle are simplified by the physical proximity of data networks from various interexchange carriers (e.g., Qwest and US West) and ISPs at the gigaPOP facility in downtown Seattle. The network connections have relatively poor site security (because they are in a general-use office building) and present a potential single point of failure for Internet access for the entire Pacific Northwest.

Next Generation Internet Projects

UW received a phase 1 award from the NLM to examine biomedical applications that would benefit from the NGI and has since received a phase 2 award from NLM to further this work. This project is titled Patient-Centric Tools for Regional Collaborative Cancer Care Using the NGI. Brent Stewart outlined the project, in which a high-performance metropolitan area network is being designed to transmit clinical data, including radiology images (with a capacity to deliver eight 10-MB image sets simultaneously, using a 622-Mbps channel), to a planned cancer care facility on the south shore of Lake Union.
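The project's bandwidth target, stated later in its technical requirements as 640 Mbps, follows from a one-line unit conversion on this workload: eight 10-MB image sets delivered within one second is 8 × 10 MB × 8 bits/byte = 640 megabits per second. A quick check (the function name is illustrative):

```python
# Derive the aggregate bandwidth implied by the stated image workload:
# eight 10-MB image sets delivered within a one-second window.
# 1 byte = 8 bits, so 8 sets x 10 MB x 8 = 640 megabits per second.

def required_mbps(n_sets: int, mb_per_set: float, window_s: float) -> float:
    """Aggregate bandwidth (Mbps) to move n_sets image sets of
    mb_per_set megabytes each within window_s seconds."""
    return n_sets * mb_per_set * 8 / window_s

print(required_mbps(8, 10, 1.0))  # 640.0
```

The same arithmetic shows why a single T1 line is hopeless for this workload and why the project is built around gigaPOP-class connectivity.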
The center is scheduled to be completed within a year. The bandwidth requirement is based on fully digital radiology, with interpretation of images by radiologists at UW and a digital archive at UW rather than at the clinic site. The development of this center is based on three hypotheses: that health care is becoming highly distributed and differentiated; that health care is operating in a resource-limited environment; and that the NGI will enable more collaborative practice, regardless of where patients are located at a given time. The NGI will enable the formation of the cancer care alliance; facilitate teaching and research; enable a fully integrated team approach to diagnosis, treatment, and management of cases; and accelerate the discovery and dissemination of knowledge. The underlying technology will consist of the local gigaPOP as well as a virtual, enterprise-wide multimedia electronic medical record based on MINDSCAPE. There will be a backup line in place, perhaps a leased DS-3 line. The system will transmit real-time video (e.g., ultrasound, fluoroscopy, and synchronous telemedicine consultations), store-and-forward video, and interactive radiation oncology treatment planning (e.g., graphics, images, and video). Additional multimedia knowledge resources, such as the Digital Anatomist and streaming video for patient education, also will be available. Technical requirements are based largely on the needs of remote radiological image archiving and display. To allow the simultaneous downloading of eight different 10-MB images within one second, the system needs a bandwidth of 640 Mbps. The UW and Harborview radiology departments are all digital now, but they use computed radiography (in which an imaging plate is scanned by a laser) rather than flat-panel digital detectors. The centers will become fully digital once the technology comes down in price. Stewart envisions that a radiologist covering at a remote site might use the system to perform work that he or she would have done at the home site, downloading images remotely. There are no formal plans for evaluating the technical infrastructure. Because the Internet today provides no quality-of-service (QOS) guarantees, UW will put in place as much bandwidth as possible and use whatever service level results. Dr. Stewart viewed medicine's demand for bandwidth as similar to that of other industries; he compared multisite telemedical collaboration to automobile companies linking together their remote research and development sites.

Biomedical Library Services

The UW Health Sciences Library is moving all of its services to Web-based delivery (see <healthlinks.washington.edu>).
Geographic issues related to supporting the WWAMI program (i.e., mountains, small towns, long distances between towns) make this transformation necessary. In addition to the WWAMI medical education program, the pharmacy, nursing, public health, and social work programs offer distance education. Faculty want to deliver digital video to their WWAMI-based students and provide them with access to course materials and the clinical digital library resources. This program will require high bandwidth and many servers. At present, the schools use scanned PDF files to deliver interlibrary loan and course materials (e.g., course notes and reserve materials) to distant students. The documents are an estimated 300 KB in size each. The Health Sciences Library offers over 1,400 full-text online electronic journals to UW faculty, staff, and students. A major issue is compliance with licensing agreements. UW librarians negotiate aggressively for licenses allowing digital materials to be available to faculty, staff, and students at any location. They control access with user IDs and passwords and UW IDs and are increasingly making materials available through a proxy server. Access is also a problem, because not all materials are locally loaded, and network latency for materials stored at remote sites (NLM, journal publishers, etc.) is an issue. The plan is to support nomadic computing because students, faculty, and residents move around constantly. Network latency is a serious problem for accessing remotely stored full-text journals and other resources. UW was part of the NLM test on access time for PubMed over the Internet that was published recently in the Journal of the American Medical Informatics Association. NLM has tried to track the latency problem and 3 years ago performed a minor test of the Utah link for the online journals (because it intended to provide all resources remotely). Users see real degradation of performance from 11 a.m. to 3 p.m. Pacific Standard Time, but that latency is due in part to issues within the UW network (in other words, the latency formerly observed in NLM's MEDLINE for that time period now is observed on the Internet). However, the latency depends on the connection. UW is upgrading the network to 10-Mbps Ethernet (10Base-T) to improve throughput; moving to 100-Mbps Ethernet (100Base-T) on every public library machine would be a much more costly venture (about $400 per station).
The UW Health Sciences Library has many public stations, including about 150 in the microcomputer lab it manages for the Health Sciences Center and over 100 public workstations in the three Health Sciences Library sites. Remote users may have trouble with commercial connections to library online resources (e.g., through MSN or AOL), which can impose latency problems during certain time periods.

Digital Anatomist

The Digital Anatomist is an NLM-sponsored project to develop an electronic repository of anatomical images. Anatomy is, of course, fundamental to health sciences education, and it provides a framework for organizing other biomedical information. Jim Brinkley presented the work of the UW Structural Informatics Group. The group works in three areas: representations of structural anatomical information, from the level of individual cells to gross anatomy; methods for accessing and using structural information; and practical applications of its tools for research, education, and clinical work. It tries to exploit opportunities for online systems dealing with anatomical information. The key data structure for its work is an ontology of anatomy, developed by Cornelius Rosse, that serves as a common data structure for most of the applications; the group calls this the foundational model of anatomy. The system demonstrated for the committee provides authoring tools using both symbolic information (e.g., names, semantics, structures) and spatial images (typically three-dimensional images) of anatomical structures. It consists of a symbolic information database and a separate three-dimensional image database accessed through a single server. A number of intelligent agents have been developed to assist in retrieving and assembling data sets and images. The agents have knowledge of both the information available on the system and the user's level of sophistication. In response to the command "Show me the structures of the left lung," for example, the system will check the symbolic database to find out what structures are in the left lung, then go to the image database to determine what images are available, and then use a scene generator to assemble the pieces properly. The user then can highlight particular elements of interest, rotate or zoom in on the image, and remove objects that block the view of other objects of interest. All processing is done on the server. On the authoring side, the system contains a knowledge builder for adding information to the symbolic database. It is a relational database containing 25,000 terms describing all structures with dimensions of 1 mm or greater in a particular set of organisms, including humans. For the spatial database, the system can create volumetric, three-dimensional models of anatomical structures from two-dimensional images. It also can create and play animations.
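The two-stage query flow described above (a symbolic lookup that feeds an image lookup, followed by scene assembly) can be illustrated with a minimal sketch. All data, structure names, and function names here are invented for illustration and are not the actual Digital Anatomist API:

```python
# Hypothetical sketch of the Digital Anatomist query flow described
# above: the symbolic (ontology) database is consulted first, then the
# image database, and a scene generator would assemble the results.
# The toy data below is invented for illustration only.

# Symbolic database: structure -> substructures (toy subset)
SYMBOLIC_DB = {
    "left lung": ["superior lobe", "inferior lobe", "oblique fissure"],
}

# Image database: structure -> available 3-D image asset (toy subset)
IMAGE_DB = {
    "superior lobe": "superior_lobe.mesh",
    "inferior lobe": "inferior_lobe.mesh",
}

def show_structures(query: str) -> list[str]:
    """Resolve a 'show me the structures of X' request."""
    # 1. Consult the symbolic database for the substructures.
    parts = SYMBOLIC_DB.get(query, [])
    # 2. Consult the image database for the images that exist.
    assets = [IMAGE_DB[p] for p in parts if p in IMAGE_DB]
    # 3. A scene generator would assemble these assets into one view;
    #    here we simply return the asset list.
    return assets

print(show_structures("left lung"))
```

The point of the sketch is the separation of concerns noted in the text: the ontology answers "what is there," the image database answers "what can be shown," and only then is a scene assembled.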
A brain image demonstrated for the committee superimposed vascular structures onto the brain and contained some 100,000 polygons. Images can be annotated for clinical and educational purposes. On the user side, the system contains an annotated image server. Users can call up images and click on individual structures within an image; the system outlines the selected structure and generates its name. This system is used in anatomy education, but the images are so large and the networks so slow that students tend to use a CD-ROM rather than access the database through either campus or remote networks. The system's quiz mode can test students' knowledge of anatomy, and a tutorial system can embed images from the atlas into other documents. The Digital Anatomist and interactive atlas get an estimated 10,000 hits per day. In a separate effort called the Brain Project, a digital neuroscientist system is being built that will overlay neurological images on images of the brain. The system will allow cutaways of volume-rendered images. These data are collected operatively by neurosurgeons who do real-time stimulation and mapping of critical regions for various neurological functions, such as speech. NGI technology could streamline this process by enabling real-time superposition of neurological data on an open brain; the system then could sense the surgeon's probes and automatically maintain information about the location of each probe and the result. This would require capabilities similar to telesurgery, but the system would also be linked to databases for documentation and postmortem analysis. The NGI offers several other opportunities for applying this technology. One is anatomical education, in the form of virtual dissections and intelligent scene generation. Another is brain mapping for either research (e.g., language mapping to identify correlations between brain structures and language skills and development) or clinical purposes, such as surgical planning. The technology also could provide structure-based visual access to biomedical information. Indeed, some have suggested that the ideal user interface to biomedical information resources is a model of human structure, which can serve as the organizing principle for this information.

Regence BlueShield

The visit to the corporate offices of Regence BlueShield, the largest health care insurer in the Pacific Northwest with annual revenues of approximately $2 billion, took place on February 11, 1999. The committee heard presentations on Internet-related activities within Regence as well as related activities in the Seattle area. The latter include programs operated by the Foundation for Health Care Quality (FHCQ); the Washington State Department of Public Health; and the Community Health Information Technology Alliance (CHITA), working with a group called Agora.
Regence Web-based Services

Steve Moe, manager of electronic business practices for the Regence Group, presented a Web-based interface application called Network Data Express (NDEX) for determining beneficiary eligibility and making referrals. The Web-based system offers claim status inquiries, provider directories, reference materials (such as the formulary), e-mail, and managed care data and reports. It processes about 20,000 transactions per month (peak times are early in the day and during lunch), doing the work of two or three full-time employees who otherwise would give out the same information by phone (Regence processes millions of claims a month). Regence has deployed 1,500 workstations (including 800 intranet and 700 dial-up systems linked to a private Web server), of which about 30 percent are in use on a regular basis. All users are assigned an ID and password and sign a confidentiality agreement. Patient and customer information is indexed by Social Security number. Kirk Bailey, manager of security policy, stated unequivocally that the Internet is considered unsafe and will not be used for Regence's electronic commerce (e-commerce) transactions until it has sufficient security measures and functionality to meet the company's business requirements. The Internet raises concerns about security, privacy, and reliability. Other factors have slowed the adoption and use of NDEX: a lack of content sponsors, especially among payers; the lack of Web browsers in many provider offices (providers will get browsers during their next hardware upgrades, but Regence does not fund such upgrades); and user behavior (e.g., administrative workers are accustomed to using the phone instead of the computer to get information).

Foundation for Health Care Quality

Rick Rubin, president of the FHCQ, gave an overview of the foundation, a not-for-profit entity created in 1988 to meet the shared health information needs of the Seattle region. The foundation serves as a neutral meeting ground for providers, payers, plan purchasers, consumers, and others involved in health care. It participates in or sponsors programs in three areas. The first is e-commerce pilot projects, including a multistate effort funded by the Robert Wood Johnson Foundation to define eligibility and referrals, and CHITA, described in further detail below. The second is performance measurement for health plans and providers, and the third is consumer affairs. The FHCQ, which views itself as an economic development agency for the region, has learned a number of lessons about operating in the highly competitive health care marketplace.
These lessons emphasize the importance of (1) enabling rather than mandating standards, because mandates may change willingness but do not affect capabilities; (2) making a business case that differentiates needs (i.e., things that stakeholders are willing to pay for) from wants (i.e., things they are not willing to pay for); (3) the Internet, which is widely viewed in the region as a plausible means of achieving long-held visions of seamless integration of information across organizations and which allows organizations to assume that networking capabilities will be in place so they can concentrate on higher-order functionality; (4) information security and privacy, which can be either a barrier or an enabler, depending on the circumstance; (5) the widespread sharing of expertise and information; (6) education as a means of facilitating the migration of information technology into health care, especially through efforts to reengineer the way organizations operate (a process that can be more important than the technology itself); (7) working to refine national standards and develop implementation manuals; and (8) balancing competition and cooperation (firms can cooperate on some subsets of issues but not on others that are seen as having greater proprietary value). Current or recent projects include an effort to standardize eligibility information, for which there is agreement on data items but not on presentation. Other regional projects are aimed at exchanging data on pediatric immunizations, referrals, claims, and lab transactions.

Community Health Information Technology Alliance Project with Agora

Peter B. Summerville, director of CHITA, and Kirk Bailey, manager of security policy for the Regence Group and founder of Agora, presented an overview of the Three-State Model Security Prototype. CHITA was chartered in 1997 and has 60 member organizations, including providers, payers, and state agencies. It is part of the FHCQ but has a separate board of directors. Agora, a local group interested in computer security, has about 450 members representing 120 Pacific Northwest corporations. It was formed by chief information officers and security officers who became increasingly concerned about network vulnerabilities as their companies began to move online. CHITA's early work focused on eligibility and referral transactions, negotiating agreements on data fields and standards to facilitate the electronic interchange of information. CHITA and the FHCQ worked with organizations in Massachusetts and Minnesota on a three-state project focusing on electronic security. The goals were to determine how electronic security could be implemented affordably and to develop a business case for a community-wide, secure infrastructure for electronic business. The group worked with Science Applications International Corporation (SAIC) to develop a security and risk management plan for business-to-business health information networks.
The plan identifies seven levels of increasing health care security. Together with Agora, CHITA is working to implement health security level 6 (HSL 6) within participating organizations. HSL 6 includes specifications for three network-based information services: authenticated, secure messaging; authenticated, secure file escrow and transfer; and authenticated, role-based access. The security model has been developed and published, and CHITA is in the process of identifying a bridge operator organization that will function as a trusted intermediary to oversee a prototype implementation, followed by a wider pilot project in the region. Issues to be addressed include the identification of a certificate authority, which might be a nonprofit organization, the government, or a private corporation such as VeriSign. CHITA has no plans to attempt to change the Internet or its directions but rather will attempt to accommodate whatever weaknesses it exhibits with respect to information security. While asserting that businesses need to move too quickly to wait for the NGI, Mr. Bailey wondered whether it would be possible to allocate part of Internet2 (perhaps one or two frequencies) for health care. He also would consider the formation of a separate health information network as a means of avoiding some of the security concerns associated with the Internet. According to him, security officers in health care have responsibilities that differ from those of their counterparts in other industries: the applicable state and federal laws are different, the privacy and security concerns are greater, and health care organizations must meet requirements for successful electronic data interchange. At the same time, the health care industry is driven by economics, not privacy.

Washington State Laboratory Reporting Project

Jac Davies, representing the Washington State Department of Health, described the Electronic Laboratory Reporting System (ELBRS) project, which involves the electronic submission and tabulation of reportable events within the state, of which there are fewer than 100,000 every year. (Physicians and testing laboratories are required to report certain conditions to their county health department.) Such reports generally are sent by regular mail, fax, or voice mail. Public health officials then are required to follow up with the doctor and patient to further investigate possible causes, paths of contagion, and so on. Often, reports are sent to the wrong county and/or are not subsequently forwarded to the state.
Furthermore, different states and counties tend to have their own lists of reportable conditions, which are tied closely to local concerns (the conditions vary, for example, between urban and agricultural counties), and they have different rules for where to send the information. As laboratories (and health organizations generally) consolidate into national entities, tracking these different reporting requirements has become time-consuming. SmithKline Beecham, for example, operates a number of clinical laboratories and has three or four people dedicated to tracking different reporting requirements. Under Washington's planned system, lab reports would be sent directly to the state rather than to local health departments. The state then would process the reports and forward information down to local communities and up to the Centers for Disease Control and Prevention (CDC), as necessary. Such centralization would allow the state to better track incidents across county lines. Planners hope that the system will encourage greater communication between the state and local communities or the CDC and that it will improve compliance with reporting requirements. Several issues have informed the planning for this proposed system. One is the use of the Internet, which is not only a logical choice but also the only viable option. Another is the sensitive nature of the data; there is, for example, a state requirement for reporting AIDS cases. A third is privacy, which is a major concern of the governor and residents of Washington. A pilot program is under way with Group Health of Puget Sound. Labs encrypt their test reports and send them to the state health department's file transfer protocol (FTP) server, which sits outside a firewall. State personnel move each file behind the firewall, check it for errors, run it through an HL-7 formatter, put the data on an SQL database server, and send them to the county. The labs use a public key cryptography system (Pretty Good Privacy) described as minimal; there is no formal program in place for changing keys. According to a preliminary evaluation, the pilot program improved the completeness and timeliness of reports. The time required to send information to the local health office improved modestly (to less than 1 day), and the time required to send information to the state improved by an average of 40 days (to about 1 day).

Note

1. Bernie H.K. Huang relocated to the Children's Hospital of Los Angeles and the University of Southern California as professor and director of informatics effective January 1, 2000.
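The ELBRS pilot's data flow, as described in the Washington State Laboratory Reporting Project section above, can be sketched as a simple staged pipeline. Every function name, field name, and the sample record below are invented for illustration; this is not the actual state system, and the real pilot's decryption step used Pretty Good Privacy rather than the stand-in shown here:

```python
# Hypothetical sketch of the ELBRS pilot pipeline described above:
# decrypt the report received on the FTP server, check it for errors,
# format it as HL-7, load it into the database, and forward it to the
# county. All names and data are invented for illustration.
import json

def decrypt(blob: bytes) -> dict:
    """Stand-in for PGP decryption of a report moved behind the firewall."""
    return json.loads(blob.decode("utf-8"))

def validate(report: dict) -> bool:
    """Error check: the fields this sketch requires must be present."""
    return {"patient", "condition", "county"} <= report.keys()

def to_hl7(report: dict) -> str:
    """Stand-in for the HL-7 formatter: a minimal pipe-delimited segment."""
    return f"OBR|{report['patient']}|{report['condition']}|{report['county']}"

def process(blob: bytes, database: list, county_queue: list) -> None:
    """Move one encrypted lab report through the pilot's stages in order."""
    report = decrypt(blob)        # file moved behind the firewall, decrypted
    if not validate(report):      # error check before accepting the record
        raise ValueError("report failed validation")
    message = to_hl7(report)      # normalize into an HL-7 message
    database.append(message)      # load onto the SQL server (stubbed as a list)
    county_queue.append(message)  # forward to the county

db, queue = [], []
raw = json.dumps({"patient": "12345", "condition": "pertussis",
                  "county": "King"}).encode("utf-8")
process(raw, db, queue)
print(db[0])
```

The staged structure mirrors the centralization argument in the text: once reports flow through one state pipeline, forwarding to counties and to the CDC becomes a final fan-out step rather than a per-lab routing problem.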