Mental Models in Human-Computer Interaction: Research Issues About What the User of Software Knows

John M. Carroll and Judith Reitman Olson, Editors

Workshop on Software Human Factors: Users' Mental Models
Nancy Anderson, Chair

Committee on Human Factors
Commission on Behavioral and Social Sciences and Education
National Research Council

NATIONAL ACADEMY PRESS
Washington, D.C. 1987

NOTICE: The project that is the subject of this report was approved by the Governing Board of the National Research Council, whose members are drawn from the councils of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine. The members of the committee responsible for the report were chosen for their special competences and with regard for appropriate balance. This report has been reviewed by a group other than the authors according to procedures approved by a Report Review Committee consisting of members of the National Academy of Sciences, the National Academy of Engineering, and the Institute of Medicine.

The National Academy of Sciences is a private, nonprofit, self-perpetuating society of distinguished scholars engaged in scientific and engineering research, dedicated to the furtherance of science and technology and to their use for the general welfare. Upon the authority of the charter granted to it by the Congress in 1863, the Academy has a mandate that requires it to advise the federal government on scientific and technical matters. Dr. Frank Press is president of the National Academy of Sciences.

The National Academy of Engineering was established in 1964, under the charter of the National Academy of Sciences, as a parallel organization of outstanding engineers. It is autonomous in its administration and in the selection of its members, sharing with the National Academy of Sciences the responsibility for advising the federal government. The National Academy of Engineering also sponsors engineering programs aimed at meeting national needs, encourages education and research, and recognizes the superior achievements of engineers. Dr. Robert M. White is president of the National Academy of Engineering.

The Institute of Medicine was established in 1970 by the National Academy of Sciences to secure the services of eminent members of appropriate professions in the examination of policy matters pertaining to the health of the public. The Institute acts under the responsibility given to the National Academy of Sciences by its congressional charter to be an adviser to the federal government and, upon its own initiative, to identify issues of medical care, research, and education. Dr. Samuel O. Thier is president of the Institute of Medicine.

The National Research Council was organized by the National Academy of Sciences in 1916 to associate the broad community of science and technology with the Academy's purposes of furthering knowledge and advising the federal government. Functioning in accordance with general policies determined by the Academy, the Council has become the principal operating agency of both the National Academy of Sciences and the National Academy of Engineering in providing services to the government, the public, and the scientific and engineering communities. The Council is administered jointly by both Academies and the Institute of Medicine. Dr. Frank Press and Dr. Robert M. White are chairman and vice chairman, respectively, of the National Research Council.

The United States government has at least a royalty-free, nonexclusive, and irrevocable license throughout the world for government purposes to publish, translate, reproduce, deliver, perform, dispose of, and to authorize others so to do, all or any portion of this work.

Available from: Committee on Human Factors, Commission on Behavioral and Social Sciences and Education, National Research Council, 2101 Constitution Ave., N.W., Washington, D.C. 20418

Printed in the United States of America

WORKSHOP ON SOFTWARE HUMAN FACTORS: USERS' MENTAL MODELS

NANCY ANDERSON (Chair), Department of Psychology, University of Maryland
ELIZABETH K. BAILEY, Consultant, Falls Church, Virginia
JOHN M. CARROLL, Watson Research Center, IBM Corporation, Yorktown Heights, New York
RICHARD J. JAGACINSKI, Department of Psychology, Ohio State University
DAVID R. LENOROVITZ, Computer Technology Associates, Inc., Englewood, Colorado
MARILYN MANTEI, Center for Machine Intelligence, Ann Arbor, Michigan
PHYLLIS REISNER, Almaden Research Center, IBM Research, San Jose, California
JUDITH REITMAN OLSON, Department of Computer and Information Systems, Graduate School of Business Administration, University of Michigan
JANET WALKER, Symbolics, Inc., Cambridge, Massachusetts
JOHN WHITESIDE, Digital Equipment Corporation, Nashua, New Hampshire

STANLEY DEUTSCH, Study Director

COMMITTEE ON HUMAN FACTORS, 1986-1987

THOMAS B. SHERIDAN (Chair), Department of Mechanical Engineering, Massachusetts Institute of Technology
NANCY S. ANDERSON, Department of Psychology, University of Maryland
CLYDE H. COOMBS, Department of Psychology, University of Michigan
JEROME I. ELKIND, Information Systems, Xerox Corporation, Palo Alto
BARUCH B. FISCHHOFF, Decision Research (a branch of Perceptronics, Inc.), Eugene, Oregon
OSCAR GRUSKY, Department of Sociology, University of California, Los Angeles
ROBERT M. GUION, Department of Psychology, Bowling Green State University
DOUGLAS H. HARRIS, Anacapa Sciences, Santa Barbara, California
JULIAN HOCHBERG, Department of Psychology, Columbia University
THOMAS K. LANDAUER, Information Sciences Division, Bell Communications Research, Morristown, New Jersey
JUDITH REITMAN OLSON, Department of Computer and Information Systems, Graduate School of Business Administration, University of Michigan
RICHARD W. PEW (Past Chair), Computer and Information Sciences Division, Bolt Beranek and Newman Laboratories, Cambridge, Massachusetts
STOVER H. SNOOK, Liberty Mutual Research Center, Hopkinton, Massachusetts
ROBERT C. WILLIGES, Department of Industrial Engineering and Operations Research, Virginia Polytechnic Institute and State University

STANLEY DEUTSCH, Study Director (1984-1987)
HAROLD VAN COTT, Study Director

ERRATUM (p. vii): The Air Force Office of Scientific Research should have been included as one of the sponsors of the Committee on Human Factors.

Foreword

The Committee on Human Factors was established in October 1980 by the Commission on Behavioral and Social Sciences and Education of the National Research Council. The committee is sponsored by the Office of Naval Research, the Army Research Institute for the Behavioral and Social Sciences, the National Aeronautics and Space Administration, and the National Science Foundation. The principal objectives of the committee are to provide new perspectives on theoretical and methodological issues, to identify basic research needed to expand and strengthen the scientific basis of human factors, and to attract scientists both within and outside the field for interactive communication and to perform needed research. The goal of the committee is to provide a solid foundation of research as a base on which effective human factors practices can build.

Human factors issues arise in every domain in which humans interact with the products of a technological society. In order to perform its role effectively, the committee draws on experts from a wide range of scientific and engineering disciplines. Members of the committee include specialists in such fields as psychology, engineering, biomechanics, physiology, medicine, cognitive sciences, machine intelligence, computer sciences, sociology, education, and human factors engineering. Other disciplines are represented in the working groups, workshops, and symposia. Each of these disciplines contributes to the basic data, theory, and methods required to improve the scientific basis of human factors.

Contents

Abstract, xv
Introduction
Models of What, Held by Whom?
Types of Representations of Users' Knowledge
  Simple Sequences, 6
  Methods and Ways to Choose Among Them, 8
  Mental Models, 12
    Surrogates, 13
    Metaphor Models, 13
    Glass Box Models, 14
    Network Representations of the System, 15
    Comparisons, 17
How Users' Knowledge Affects Their Performance, 19
  Chaos and Misconception in Both Novices and Experts, 20
  Skilled Performance, 21
Applying What We Know of the User's Knowledge to Practical Problems, 23
  Designing Interfaces, 24
  User Training, 26
Research Recommendations, 29
References, 34

Preface

There has been a long-standing problem with inferring the causes of complex behavior. Mental events are not directly observable; they must be inferred from overt behavior. Behaviorists reject mental events as legitimate scientific concepts. More recently, however, developments in cognitive science and artificial intelligence, in which mental events are specifically modeled and found to have measurable correlates in behavior, have brought the concepts back into fashion. These mental events, their description and postulated interrelationships, are the subject of this report. We focus specifically on the mental events that are postulated to occur as someone learns or performs complex tasks on computer software.

From the point of view of cognitive science, users of computer software systems base their behavior on stored knowledge about particular sequences of actions, on general rules about how to accomplish certain tasks, or on a mental model (an underlying understanding of how the system works). Knowing what the user knows about or expects from a system has implications for both design and training purposes. From a design point of view, the system could be designed to fit the user's goals in accomplishing tasks or could display enough of how it works to make accomplishing a task easy to understand. From the training point of view, users could be given instructions and exercises that clearly present sequences, rules, and/or a model in order to make learning and performing easy.

At present, there is no satisfactory way of describing what the user knows. There is no way to characterize the differences among users of various systems as they go through the process of developing an awareness and understanding of how the system works or how a given task is to be performed. Consequently, the Committee on Human Factors conducted a two-day workshop on May 15 and 16, 1984, to determine means for achieving a better understanding of what users know and its implications for system and software design as well as user training. This workshop was a continuation of the committee's efforts to define research needs in the area of software human factors. Ten nationally known researchers on software design, cognitive psychology, and human factors met to discuss the issues having to do with what a user of software knows.

As background for this workshop, John M. Carroll wrote an invited paper entitled "Mental Models and Software Human Factors: An Overview." This was distributed to all participants in advance of the meeting. In turn, the workshop members prepared short two- to three-page position papers addressing additional topics and issues that they believed were important and warranted discussion at the workshop. Much of the discussion at the workshop centered on sifting through the many definitions of the term mental model, gathering ideas from among the variety of methods used to represent users' knowledge about software systems.

This report was prepared by merging the ideas generated by the workshop members with those in Carroll's paper. It includes his central organization and literature review, adds more recent information, and clarifies the distinction between mental models and task representations. The report was then distributed to workshop participants for changes and additions.

This report is written for the researcher concerned with the psychology of performance of complex tasks and for the practitioner who would like to use information about how the user thinks about both the task and the system in the design of computer software, its documentation, or training for its use. Most of the research on these questions has used software-based text-editing tasks as a domain and looked at the mental models people are purported to build of only simple devices. The results should be generalized to even more complex tasks, such as process control, tactical decision making, project planning, and graphics design; but their scope has not been tested. The exclusion of these kinds of tasks is not to be taken as an indication that the research reported cannot cover these more complex tasks. But their scope is an important research need.

Judith Reitman Olson

Abstract

Users of software systems acquire knowledge about the system and how to use it through experience, training, and imitation. Currently, there is a great deal of debate about exactly what users know about software. This knowledge may include one or more of the following: simple rules that prescribe a sequence of actions that apply under certain conditions; general methods that fit certain general situations and goals; and mental models, knowledge of the components of a system, their interconnection, and the processes that change the components; knowledge that forms the basis for users being able to construct reasonable actions; and explanations about why a set of actions is appropriate.

Discovering what users know and how these different forms of knowledge fit together in learning and performance is important. It applies to the problem of designing systems and training programs so that the systems are easy to use and the learning is efficient.

Research on the effects of different representations on ultimate performance is mixed. Research on exactly what users