Measuring What Counts: A Conceptual Guide for Mathematics Assessment (1993)

Suggested Citation:"4 Assessing to Support Mathematics Learning." National Research Council. 1993. Measuring What Counts: A Conceptual Guide for Mathematics Assessment. Washington, DC: The National Academies Press. doi: 10.17226/2235.

4
ASSESSING TO SUPPORT MATHEMATICS LEARNING

High-quality mathematics assessment must focus on the interaction of assessment with learning and teaching. This fundamental concept is embodied in the second educational principle of mathematics assessment.

THE LEARNING PRINCIPLE

Assessment should enhance mathematics learning and support good instructional practice.

This principle has important implications for the nature of assessment. Primary among them is that assessment should be seen as an integral part of teaching and learning rather than as the culmination of the process.1 As an integral part, assessment provides an opportunity for teachers and students alike to identify areas of understanding and misunderstanding. With this knowledge, students and teachers can build on the understanding and seek to transform misunderstanding into significant learning. Time spent on assessment will then contribute to the goal of improving the mathematics learning of all students.

The applicability of the learning principle to assessments created and used by teachers and others directly involved in classrooms is relatively straightforward. Less obvious is the applicability of the principle to assessments created and imposed by parties outside the classroom. Tradition has allowed and even encouraged some assessments to serve accountability or monitoring purposes without sufficient regard for their impact on student learning.

A portion of assessment in schools today is mandated by external authorities for the general purpose of school accountability. In 1990, 46 states had mandated testing programs, as compared with 20 in 1980.2 Such assessments have usually been multiple-choice norm-referenced tests. Several researchers have studied these testing programs and judged them to be inconsistent with the current goals of mathematics education.3 Making mandated assessments consonant with the content, learning, and equity principles will require much effort.

Instruction and assessment—from whatever source and for whatever purpose—must support one another.

Studies have documented a further complication as teachers are caught between the conflicting demands of mandated testing programs and instructional practices they consider more appropriate. Some have resorted to "double-entry" lessons in which they supplement regular course instruction with efforts to teach the objectives required by the mandated test.4 During a period of change there will undoubtedly be awkward and difficult examples of discontinuities between newer and older directions and procedures. Instructional practices may move ahead of assessment practices in some situations, whereas in other situations assessment practices could outpace instruction. Neither situation is desirable although both will almost surely occur. However, still worse than such periods of conflict would be to continue either old instructional forms or old assessment forms in the name of synchrony, thus stalling movement of either toward improving important mathematics learning.

From the perspective of the learning principle, the question of who mandated the assessment and for what purpose is not the primary issue. Instruction and assessment—from whatever source and for whatever purpose—must be integrated so that they support one another.

To satisfy the learning principle, assessment must change in ways consonant with the current changes in teaching, learning, and curriculum. In the past, student learning was often viewed as a passive process whereby students remembered what teachers told them to remember. Consistent with this view, assessment was often thought of as the end of learning. The student was assessed on something taught previously to see if he or she remembered it. Similarly, the mathematics curriculum was seen as a fragmented collection of information given meaning by the teacher.

This view led to assessment that reinforced memorization as a principal learning strategy. As a result, students had scant opportunity to bring their intuitive knowledge to bear on new concepts and tended to memorize rules rather than understand symbols and procedures.5 This passive view of learning is not appropriate for the mathematics students need to master today. To develop mathematical competence, students must be involved in a dynamic process of thinking mathematically, creating and exploring methods of solution, solving problems, communicating their understanding—not simply remembering things. Assessment, therefore, must reflect and reinforce this view of the learning process.

This chapter examines three ways of making assessment compatible with the learning principle: ensuring that assessment directly supports student learning; ensuring that assessment is consonant with good instructional practice; and enabling teachers to become better facilitators of student learning.

ASSESSMENT IN SUPPORT OF LEARNING

Mathematics assessments can make the goals for learning real to students, teachers, parents, and the public.

Assessment can play a key role in exemplifying the new types of mathematics learning students must achieve. Assessments indicate to students what they should learn. They specify and give concrete meaning to valued learning goals. If students need to learn to perform mathematical operations, they should be assessed on mathematical operations. If they should learn to use those mathematical operations along with mathematical reasoning in solving mathematical problems, they must be assessed on using mathematical operations along with reasoning to solve mathematical problems. In this way the nature of the assessments themselves makes the goals for mathematics learning real to students, teachers, parents, and the public.

Mathematics assessments can help both students and teachers improve the work the students are doing in mathematics. Students need to learn to monitor and evaluate their progress. When students are encouraged to assess their own learning, they become more aware of what they know, how they learn, and what resources they are using when they do mathematics. "Conscious knowledge about the resources available to them and the ability to engage in self-monitoring and self-regulation are important characteristics of self-assessment that successful learners use to promote ownership of learning and independence of thought."6


In the emerging view of mathematics education, students make their own mathematics learning individually meaningful. Important mathematics is not limited to specific facts and skills students can be trained to remember but rather involves the intellectual structures and processes students develop as they engage in activities they have endowed with meaning.

The assessment challenge we face is to give up old assessment methods to determine what students know, which are based on behavioral theories of learning and develop authentic assessment procedures that reflect current epistemological beliefs both about what it means to know mathematics and how students come to know.7

Current research indicates that acquired knowledge is not simply a collection of concepts and procedural skills filed in long-term memory. Rather the knowledge is structured by individuals in meaningful ways, which grow and change over time.8

A close consideration of recent research on mathematical cognition suggests that in mathematics, as in reading, successful learners understand the task to be one of constructing meaning, of doing interpretive work rather than routine manipulations. In mathematics the problem of imposing meaning takes a special form: making sense of formal symbols and rules that are often taught as if they were arbitrary conventions rather than expressions of fundamental regularities and relationships among quantities and physical entities.9

LEARNING FROM ASSESSMENT

Modern learning theory and experience with new forms of assessment suggest several characteristics assessments should have if they are to serve effectively as learning activities. Of particular interest is the need to provide opportunities for students to construct their own mathematical knowledge and the need to determine where students are in their acquisition of mathematical understanding.10 One focuses more on the content of mathematics, the other on the process of doing mathematics. In both, the assessment must elicit important mathematics.

Constructing Mathematical Knowledge Learning is a process of continually restructuring one's prior knowledge, not just adding to it. Good education provides opportunities for students to connect what is being learned to their prior knowledge. One knows mathematics best if one has developed the structures and meanings of the content for oneself.11 For assessment to support learning, it must enable students to construct new knowledge from what they know.

Assessment must reflect the value of group interaction for learning mathematics.

One way to provide opportunities for the construction of mathematical knowledge is through assessment tasks that resemble learning tasks12 in that they promote strategies such as analyzing data, drawing contrasts, and making connections. It is not enough, however, to expand mathematics assessment to take in a broader spectrum of an individual student's competence. In real-world settings, knowledge is sometimes constructed in group settings rather than in individual exploration. Learning mathematics is frequently optimized in group settings, and assessment of that learning must reflect the value of group interaction.13

Some mathematics teachers are using group work in instruction to model problem solving in the real world. They are looking for ways to assess what goes on in groups, trying to find out not only what mathematics has been learned, but also how the students have been working together. A critical issue is how to use assessments of group work in the grades they give to individual students. A recent study of a teacher who was using groups in class but not assessing the work done in groups found that her students apparently did not see such work as important.14 Asked in interviews about mathematics courses in which they had done group work, the students did not mention this teacher's course. Group work, if it is to become an integral and valued part of mathematics instruction, must be assessed in some fashion. A challenge to developers is to construct some high-quality assessment tasks that can be conducted in groups and subsequently scored fairly.

Part of the construction of knowledge depends on the availability of appropriate tools, whether in instruction or assessment. Recent experimental National Assessment of Educational Progress (NAEP) tasks in science use physical materials for a miniexperiment students are asked to perform by themselves. Rulers, calculators, computers, and various manipulatives are examples from mathematics of some instructional tools that should be a part of assessment. If students have been using graphing calculators to explore trigonometric functions, giving them tests on which calculators are banned greatly limits the questions they can be asked and consequently yields an incomplete picture of their learning. Similarly, asking students to find a function that best fits a set of data by using a computer program can reveal aspects of what they know about functions that cannot be assessed by other means. Using physical materials and technology appropriately and effectively in instruction is a critical part of learning today's mathematics and, therefore, must be part of today's assessment.

Since the use of manipulatives is a critical part of today's mathematical instruction, such tools must be part of today's assessment.

Reflecting Development of Competence As students progress through their schooling, it is obvious that the content of their assessments must change to reflect their growing mathematical sophistication. When students encounter new topics in mathematics, they often cannot see how the unfamiliar ideas are connected to anything they have seen before. They resort to primitive strategies of memorization, grasping at isolated and superficial aspects of the topic. As learning proceeds, they begin to see how the new ideas are connected to each other and to what they already know. They see regularities and uncover hidden relationships. Eventually, they learn to monitor their thinking and can choose different ways to tackle a problem or verify a solution.15 This scenario is repeated throughout schooling as students encounter new mathematics. The example below contains a description of this growth in competence that is derived from research in cognition and that suggests the types of evidence that assessment should seek.16

Indicators of Competence

  • Coherent knowledge. Beginners' knowledge is spotty and shallow, but as proficiency develops, it becomes structured and integrated into prior knowledge.

  • Principled problem solving. Novices look at the surface features of a task; proficient learners see the structure of problems as they represent and solve them.

  • Usable knowledge. Experts have knowledge that is connected to the conditions in which it can be applied effectively. They know not only what to do but when to do it.

  • Attention-free and efficient performance. Experts are not simply faster than novices, they are able to coordinate their automated skills with thinking processes that demand their attention.

  • Self-regulatory skills. As people develop competence, they also develop skills for monitoring and directing their performance.

A full portrayal of competence in mathematics demands much more than measuring how well students can perform automated skills, although that is part of the picture. Assessment should also examine whether students have managed to connect the concepts they have learned, how well they can recognize underlying principles and patterns amid superficial differences, their sense of when to use processes and strategies, their grasp and command of their own understanding, and whether they can bring these skills and abilities together to produce smooth, proficient performance.

PROVIDING FEEDBACK AND OPPORTUNITIES TO REVISE WORK

An example of how assessment results can be used to support learning comes from the Netherlands.17 Eleventh-grade students were given regular 45-minute tests containing both short-answer and essay questions. One test for a unit on matrices contained questions about harvesting Christmas trees of various sizes in a forest. The students completed a growth matrix to portray how the sizes changed each year and were asked how the forest could be managed most profitably, given the costs of planting and cutting and the prices at which the trees were to be sold. They also had to answer the questions when the number of sizes changed from three to five and to analyze a situation in which the forester wanted to recapture the original distribution of sizes each year.
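The kind of growth-matrix model this task describes can be sketched with invented numbers (the actual test data are not reproduced in the text): each column of a matrix gives the fraction of trees in one size class that stays put or advances a size class in a year, and multiplying the matrix by the current size-distribution vector advances the forest one year.

```python
# Hypothetical yearly growth matrix for three size classes
# (small, medium, large); all entries are invented for illustration.
# Column j gives the fate of trees currently in size class j.
growth = [
    [0.4, 0.0, 0.0],  # small:  40% of small trees stay small
    [0.6, 0.7, 0.0],  # medium: 60% of small grow; 70% of medium stay
    [0.0, 0.3, 1.0],  # large:  30% of medium grow; large trees stay large
]

def advance_one_year(sizes):
    """Apply the growth matrix to the size-distribution vector."""
    return [sum(row[j] * sizes[j] for j in range(3)) for row in growth]

stock = [1000.0, 0.0, 0.0]  # plant 1000 small trees, then let them grow
for year in range(3):
    stock = advance_one_year(stock)
print([round(n) for n in stock])  # distribution after three years
```

Because each column here sums to 1, no trees are lost; the harvesting and replanting questions the students faced amount to modifying this matrix and the planting vector and asking what management policy is most profitable.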

After the students handed in their solutions, the teacher scored them, noting the major errors. Given this information, the students retook the test. They had several weeks to work on it at home and were free to answer the questions however they chose, separately or in essays that combined the answers to several questions. The second chance gave students the opportunity not simply to redo the questions on which they were unsuccessful in the first stage but, more importantly, to give greater attention to the essay questions they had little time to address. Such two-stage testing essentially formalizes what many teachers of writing do in their courses, giving students an opportunity to revise their work (often more than once) after the teacher or other students have read it and offered suggestions. The extensive experience that writing teachers have been accumulating in teaching and assessing writing through extended projects can be of considerable assistance to mathematics teachers seeking to do similar work.18

During the two-stage testing in the Netherlands, students reflected on their work, talked with others about it, and got information from the library. Many students who had not scored well under time pressure—including many of the females—did much better under the more open conditions. The teachers could grade the students on both the objective scores from the first stage and the subjective scores from the second. The students welcomed the opportunity to show what they knew. As one put it:

Usually when a timed written test is returned to us, we just look at our grade and see whether it fits the number of mistakes we made. In the two-stage test, we learn from doing the task. We have to study the first stage carefully in order to do well on the second one.19

In the Netherlands, such two-stage tasks are not currently part of the national examination given at the end of secondary school, but some teachers use them in their own assessments as part of the final grade each year. In the last year of secondary school, the teacher's assessment is merged with the score on the national examination to yield a grade for each student that is used for graduation, university admission, and job qualification.

LEARNING FROM THE SCORING OF ASSESSMENTS

Teachers can use scoring guides to communicate the goals of improved mathematical performance.

Assessment tasks that call for complex responses require scoring rubrics. Such rubrics describe what is most important about a response, what distinguishes a stronger response from a weaker one, and often what characteristics distinguish a beginning learner from one with more advanced understanding and performance. Such information, when shared between teacher and student, has critically important implications for the learning process.

Teachers can appropriately communicate the features of scoring rubrics to students as part of the learning process to illustrate the types of performance students are striving for. Students often express mystification about what they did inadequately or what type of change would make their work stronger. Teachers can use rubrics and sample work marked according to the rubric to communicate the goals of improved mathematical explication. When applied to actual student work, rubrics illustrate the next level of learning toward which a student may move. For example, a teacher may use a scoring rubric on a student's work and then give the student an opportunity to improve the work. In such a case, the student may use the rubric directly as a guide in the improvement process.

The example below illustrates how a scoring rubric can be incorporated into the student material in an assessment.20 The benefits to instruction and learning could be twofold. The student not only can develop a clearer sense of quality mathematics on the task at hand but can develop more facility at self-assessment. It is hoped that students can, over time, develop an inner sense of excellent performance so that they can correct their own work before submitting it to the teacher.

Incorporating a Scoring Rubric

Directions for students

Today you will take part in a mathematics problem-solving assessment. This means that you will be given one problem to solve. You will have thirty (30) minutes to work on this problem. Please show all your work. Your paper will be read and scored by another person—someone other than your teacher. Please be sure to make it clear to the reader of your paper how you solved the problem and what you were thinking. The person who will read your paper will be looking mainly for these things:

  1. How well you understand the problem and the kind of math you use.

  2. How well you can correctly use mathematics.

  3. How well you can use problem-solving strategies and good reasoning.

  4. How well you can communicate your mathematical ideas and your solution.

Your paper will receive a score for each of these. You will do all your work here in class on the paper provided and you may use manipulatives or a calculator to work on your problem.

Guide to Completing the Problem

  1. Conceptual understanding of the problem

  • I used diagrams, pictures, and symbols to explain my work.

  • I used all the important information to correctly solve the problem.

  • I have thought about the problem carefully and feel as if I know what I'm talking about.

  2. Procedural knowledge

  • I used appropriate mathematical computations, terms, and formulas.

  • I correctly solved and checked my solution to the problem.

  • I used mathematical ideas and language precisely.

  • I checked my answer for correctness.

  3. Problem-solving skills & strategies

  • I looked for other possible ways to solve the problem.

  • I used problem-solving skills/strategies that showed some good reasoning.

  • My work is clear and organized.

  4. Communication

  • I communicated clearly and effectively to the reader.

  • In my solution, one step seems to flow to the next.

  • I clearly used mathematics vocabulary and terminology.

  • My sentences make sense and there are no words left out.


The rubrics can be used to inform the student about the scoring criteria before he or she works on a task. The rubric can also be used to structure a classroom discussion, possibly even asking the students to grade some (perhaps fictional) answers to the questions. In this way, the students can see some examples of how responses are evaluated. Such discussions would be a purely instructional use of an assessment device before the formal administration of the assessment.

STIMULATING MOTIVATION, INTEREST, AND ATTENTION

Intrinsic sources of motivation offer a fruitful approach to encourage students to perform well.

Because assessment has the potential to affect the learning process substantially, it is important that students do their best when being assessed. Students' motivation to perform well on assessments has usually been tied to the stakes involved. Knowing that an assessment has a direct bearing on a semester grade or on placement in the next class—that is, high personal stakes—has encouraged many students to display their best work. Conversely, assessments to judge the effectiveness of an educational program where results are often not reported on an individual basis carry low stakes for the student and may not inspire students to excel. These extrinsic sources of motivation, although real, are not always consonant with the principle that assessment should support good instructional practice and enhance mathematics learning. Intrinsic sources of motivation, such as interest in the task, offer a more fruitful approach.

Students develop an interest in mathematical tasks that they understand, see as relevant to their own concerns, and can manage. Recent studies of students' emotional responses to mathematics suggest that both their positive and their negative responses diminish as tasks become familiar and increase when the tasks are novel.21 Because facility at problem solving includes facility with unfamiliar tasks, the regular use of nonroutine problems must become a part of instruction and assessment.

In some school districts, educational leaders are experimenting with nonroutine assessment tasks that have instructional value in themselves and that seem to have considerable interest for the students. Such a problem was successfully tried out with fifth-grade students in the San Diego City School District in 1990 and has subsequently been used by other districts across the country to assess instruction in the fifth, sixth, and seventh grades. The task is to help the owner of an orange grove decide how many trees to plant on each acre of new land to maximize the harvest.22 The yield of each tree and the number of trees per acre in the existing grove are explained and illustrated. An agronomist consultant explains that increasing the number of trees per acre decreases the yield of each tree and gives data the students can use. The students construct a chart and see that the total yield per acre forms a quadratic pattern. They investigate the properties of the function and answer a variety of questions, including questions about extreme cases.
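The quadratic pattern the students discover can be sketched with invented figures (the actual task data are not given in the text): if each additional tree per acre lowers every tree's yield by a fixed amount, then total yield per acre is a quadratic function of the number of trees, and tabulating it reveals the maximum.

```python
# Invented figures for illustration: suppose the existing grove has 50
# trees per acre yielding 600 oranges each, and each extra tree per acre
# is assumed to cost every tree 10 oranges of yield.
BASE_TREES, BASE_YIELD, DROP_PER_TREE = 50, 600, 10

def yield_per_acre(trees):
    """Total oranges per acre; quadratic in `trees` until yield hits zero."""
    per_tree = BASE_YIELD - DROP_PER_TREE * (trees - BASE_TREES)
    return trees * max(per_tree, 0)

# Tabulate as the students did, then locate the vertex of the parabola.
best = max(range(1, 111), key=yield_per_acre)
print(best, yield_per_acre(best))  # 55 trees/acre gives 30250 oranges
```

The extreme cases the students probed show up naturally: under these invented numbers, planting 110 or more trees per acre drives the per-tree yield to zero and the model predicts no harvest at all.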

The assessment can serve to introduce a unit on quadratic functions in which the students explore other task situations. For example, one group of sixth-grade students interviewed an elementary school principal who said that when cafeteria lunch prices went up, fewer students bought their lunches in the cafeteria. The students used a quadratic function to model the data, orally reported to their classmates, and wrote a report for their portfolios.

Sixth-grade students can be successful in investigating and solving interesting, relevant problems that lead to quadratic and other types of functions. They need only be given the opportunity. Do they enjoy and learn from these kinds of assessment activities and their instructional extensions? Below are some of their comments.


It is worth noting that the level of creativity allowable in a response is not necessarily tied to the student's level of enjoyment of the task. In particular, students do not necessarily value assessment tasks in which they have to produce responses over tasks in which they have to choose among alternatives. A survey in Israel of junior high students' attitudes toward different types of tests showed that although they thought essay tests reflected their knowledge of subject matter better than multiple-choice tests did, they preferred the multiple-choice tests.23 The multiple-choice tests were perceived as being easier and simpler; the students felt more comfortable taking them.

ASSESSMENT IN SUPPORT OF INSTRUCTION

If mathematics assessment is to help students develop their powers of reasoning, problem solving, communicating, and connecting mathematics to situations in which it can be used, both mathematics assessment and mathematics instruction will need to change in tandem. Mathematics instruction will need to make better use of assessment activities than is common today.

Too often a sharp line is drawn between assessment and instruction. Teachers teach, then instruction stops and assessment occurs. Results of the assessment may not be available in a timely or useful way to students and teachers. The learning principle implies that "even when certain tasks are used as part of a formal, external assessment, there should be some kind of instructional follow-up. As a routine part of classroom discourse, interesting problems should be revisited, extended, and generalized, whatever their original sources."24

When the line between assessment and instruction is blurred, students can engage in mathematical tasks that not only are meaningful and contribute to learning, but also yield information the student, the teacher, and perhaps others can use. In fact, an oft-stated goal of reform efforts in mathematics education is that visitors to classrooms will be unable to distinguish instructional activities from assessment activities.


INTEGRATING INSTRUCTION AND ASSESSMENT

An oft-stated goal of reform is that visitors to classrooms will be unable to distinguish instructional activities from assessment activities.

The new Pacesetter™ mathematics project illustrates how instruction and assessment can be fully integrated by design.25 Pacesetter is an advanced high school mathematics course being developed by the College Board. The course, which emphasizes mathematical modeling and is meant as a capstone to the mathematics studied in high school, integrates assessment activities with instruction. Teachers help the students undertake case studies of applications of mathematics to problems in fields, such as industrial design, inventions, economics, and demographics. In one activity, for example, students are provided with data on the population of several countries at different times and asked to develop mathematical models to make various predictions. Students answer questions about the models they have devised and tackle more extended tasks that are written up for a portfolio. The activity allows the students to apply their knowledge of linear, quadratic, and exponential functions to real data. Notes for the teacher's guidance help direct attention to opportunities for discussion and the interpretations of the data that students might make under various assumptions.
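One way a prediction task of this kind might be carried out, sketched here with invented data (not the actual Pacesetter data set), is to fit an exponential model P(t) = P0·e^(rt) by least squares on the logarithms of the populations, then extrapolate.

```python
import math

# Invented illustrative data: population (millions) of a hypothetical
# country at ten-year intervals; not the actual Pacesetter data.
years = [0, 10, 20, 30, 40]
pops = [50.0, 61.0, 74.5, 91.0, 111.0]

# Fit P(t) = P0 * exp(r * t) by ordinary least squares on log P:
# log P is linear in t, so a straight-line fit recovers r and log P0.
logs = [math.log(p) for p in pops]
n = len(years)
t_bar = sum(years) / n
l_bar = sum(logs) / n
r = sum((t - t_bar) * (l - l_bar) for t, l in zip(years, logs)) / sum(
    (t - t_bar) ** 2 for t in years
)
p0 = math.exp(l_bar - r * t_bar)

# Extrapolate ten years past the data, as a prediction question might ask.
prediction_50 = p0 * math.exp(r * 50)
print(round(r, 3), round(prediction_50, 1))
```

Comparing such a fit against linear and quadratic models on the same data, and discussing when each assumption is reasonable, is exactly the kind of interpretive work the teacher's notes are meant to prompt.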

Portfolios are sometimes used as the method of assessment; a sample of a student's mathematical work is gathered to be graded by the teacher or an outside evaluator.

This form of assessment involves assembling a portfolio containing samples of students' work that have been chosen by the students themselves, perhaps with the help of their teacher, on the basis of certain focused criteria. Among other things, a mathematics portfolio might contain samples of analyses of mathematical problems or investigations, responses to open-ended problems, examples of work chosen to reflect growth in knowledge over time, or self-reports of problem-solving processes learned and employed. In addition to providing good records of individual student work, portfolios might also be useful in providing formative evaluation information for program development. Before they can be used as components of large-scale assessment efforts, however, consistent methods for evaluating portfolios will need to be developed.26

Of course, the quality of student work in a portfolio depends largely on the quality of the assignments given, as well as on the level of instruction. At a minimum, teachers play a pivotal role in helping students decide what to put into the portfolio and in informing them about the evaluation criteria.

The state of Vermont, for example, has been devising a program in which the mathematics portfolios of fourth- and eighth-grade students are assessed;27 other states and districts are experimenting with similar programs. Some problems have been reported in the portfolio assessment process in Vermont.28 The program appears to hold sufficient merit, however, to justify efforts under way to determine how information from portfolios can be communicated outside the classroom in authoritative and credible ways.29

The trend worldwide is to use students' work on instructional activities directly as assessment. An example from England and Wales appears below.30

Assessment can be integrated with classroom discourse and activity in a variety of other ways as well: through observation, questioning, written discourse, projects, open-ended problems, classroom tests, homework, and other assignments.31 Teachers need to be alert to techniques they can use to assess their students' mathematical understanding in all settings.

Coursework Assessment

As part of a new course in England and Wales, students aged 16 to 19 years are assessed through an externally marked final examination, tests given at the end of each unit of approximately 1 month's duration, and work done during the course. Each unit of coursework consists of a practical investigation extending throughout the unit and two short investigations of about 2 hours each. At the end of the course, 20 percent of each student's grade is based on the coursework and 80 percent is based on unit test and final examination scores. The coursework investigations are chosen from a bank provided by the examination board. Certain investigations are discussed in the text materials and are not used for assessment. Students usually work in groups during an investigation, but then each student writes an individual report to be marked by the teacher according to a set of criteria previously explained to the students.

For example, students in one class worked on the problem of finding a model for the motion of a ball rolling along an inclined plane. The data were collected and discussed in groups. Some students contributed greatly to the discussion; others did not. Although all those in the group had the benefit of the common work, the written reports clearly showed who had understood the problem and who had not.

The most effective ways to identify students' methods are to watch students solve problems, to listen to them explain how the problems were solved, and to read their written explanations. Students should regularly be asked to explain their solution to a problem. Not every student can be asked every day, but over time the teacher can get a reading on each student's understanding and proficiency. The teacher needs to keep some


From a Teacher-Constructed Seventh-Grade Japanese Semester Examination

Number game

  1. Choose a positive number and add 3.

  2. Multiply the result of (1) by 2.

  3. Subtract 3 from the result of (2).

  4. Multiply the result of (3) by 5.

If the result of (4) is 155, what is the original number? How did you find it? Explain how to find it.

Analysis

The teacher analyzed students' explanations and found seven types of meaningful responses concerning the use of letters, as follows:

  • Uses a literal expression (roughly) and tries to explain by transformation

  • Explains by using numbers but does not depend on the numbers from the viewpoint of content

  • Explains by depending on numbers, but cannot detach from numbers

  • Finds a relation inductively

  • Explains by figures

  • Explains by language

  • Finds by the reverse process

The teacher evaluated each student according to these categories. Usually, it is difficult to carry out this type of analysis on a semester examination, since there is too little time. But if it is carried out, the result is useful not only for assigning a grade but also for obtaining instructional feedback.
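For reference, the game's structure can be made explicit algebraically. This worked solution is added here for the reader; it is not part of the reproduced examination. With $x$ as the chosen number, the four steps give

```latex
\bigl((x + 3)\cdot 2 - 3\bigr)\cdot 5 = (2x + 3)\cdot 5 = 10x + 15,
\qquad 10x + 15 = 155 \;\Rightarrow\; x = 14.
```

The teacher's last category, "finds by the reverse process," corresponds to undoing the steps in reverse order: 155 ÷ 5 = 31, 31 + 3 = 34, 34 ÷ 2 = 17, 17 − 3 = 14.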

record of students' responses. Sunburst/Wings for Learning,32 for example, recently produced the Learner Profile™, a hand-held optical scanner with a list of assessment codes that can be defined by the teacher. The device is useful in informal assessment: a teacher can record comments about the progress of individual students while walking around the classroom.

Elaborate schemes are not necessary, but some system is needed. A few carefully selected tasks can give a reasonably accurate picture of a student's ability to solve a range of tasks.33 An example of a task constructed for this purpose appears above.34

USING ASSESSMENT RESULTS FOR INSTRUCTION

For decades, the most typical assessment results, particularly in mandated assessment, have been rankings of performance. Performances have most typically been scored by counting the number of questions answered correctly and comparing the score of one individual with that of another through their relative percentile ranks. Such norm-referenced scores have concerned educators for many years. Although various criticisms of norm referencing have been advanced, the central educational concern is that such information is not sufficiently helpful for improving instruction and learning and may, in fact, be educationally counterproductive. In the classroom setting, teachers and students need to know what students understand well, what they understand less well, and what the next learning steps need to be. The relative rankings of students tested may have uses outside the classroom context, but within that context, the need is for forms of results helpful to the teaching and learning process.

Assessment programs must inform teachers and students about what the students have learned, how they learn, and how they think about mathematics.

To plan their instruction, for example, teachers should know about each student's current understanding of what will be taught. Thus, assessment programs must inform teachers and students about what the students have learned, how they learn, and how they think about mathematics. For that information to be useful to teachers, it will have to include an analysis of specific strengths and weaknesses of the student's understanding and not just scores out of context.

To be effective in instruction, assessment results need to be timely.35 Students' learning is not promoted by computer printouts sent to teachers once classes have ended for the year and the students have gone, nor by teachers who take an inordinate amount of time to grade assessments. In particular, new ways must be found to give teachers and students alike more immediate knowledge of the students' performance on assessments mandated by outside authorities so that those assessments—as well as the teacher's own assessments—can be used to improve learning. Even when the central purpose of an assessment is to determine the accomplishments of a school, state, or nation, the assessment should provide reports about their performance to the students and teachers involved. School time is precious. When students are not informed of their errors and misconceptions, let alone helped to correct them, the assessment may have both reinforced misunderstandings and wasted valuable instructional time.

When the form of assessment is unfamiliar, teachers have a particular responsibility to their students to tell them in advance how their responses will be evaluated and what criteria will be used. Students need to see, in advance, examples of work that does or does not meet the criteria, and teachers should discuss sample responses with them. When the California Assessment Program first tried out some open-ended questions with twelfth-grade students in its 1987-1988 Survey of Academic Skills, from half to three-fourths of the students offered either an inadequate response or none at all. The Mathematics Assessment Advisory Committee concluded that the students lacked experience in expressing mathematical ideas in writing.36 Rather than reject the assessment, the committee recommended that teachers discuss with students, before the assessment was administered, what was expected of them. On the two-stage tests in the Netherlands, there were many fewer problems in scoring the essays when the students knew beforehand what the teacher expected of them.37 The teacher and students had negotiated a kind of contract that allowed the students to concentrate on the mathematics in the assessment without being distracted by uncertainties about scoring.

ASSESSMENT IN SUPPORT OF TEACHERS

Teachers will require assistance in using assessments consonant with today's vision of mathematics instruction.

The new vision of mathematics education requires teachers to use strategies in which they function as learning coaches and facilitators. Teachers will require support in several ways to adopt these new roles. First, they will need to become better diagnosticians. For this, they will need "… simple, valid procedures that enable [them] to access and use relevant information in making instructional decisions"; "assessment systems [that] take into account the conceptualizations of learning, teaching, and the curriculum that are held by teachers"; and systems that "enable teachers to share assessment data with students and to involve students in making instructional decisions."38 Assessments developed by others should come with materials that enable teachers to use the tasks productively in their instruction. Teachers should also be helped to use assessment results to encourage students to reflect on their work, and to reflect on their own teaching as well.

Teachers will require assistance in using assessments consonant with today's vision of mathematics instruction. The Classroom Assessment in Mathematics (CAM) Network, for example, is an electronic network of middle school teachers in seven urban centers who are designing assessment tasks and sharing them with one another.39 They are experimenting with a variety of new techniques and revising tasks to fit their teaching situations. They are finding that they face common problems in making the new tasks accessible to their students. Collaborations among teachers, whether through networks or other means, can assist mathematics teachers who want to change their assessment practice. These collaborations can start locally or be developed through and sponsored by professional organizations. Publications are beginning to appear that can help teachers assess mathematics learning more thoroughly and productively.40

Collaborations with others can assist mathematics teachers who want to change their assessment practice.

There are indications that using assessments in professional development can help teachers improve instruction. As one example, Gerald Kulm and his colleagues recently reported a study of the effects of improved assessment on classroom teaching:41

We found that when teachers used alternative approaches to assessment, they also changed their teaching. Teachers increased their use of strategies that have been found by research to promote students' higher-order thinking. They did activities that enhanced meaning and understanding, developed student autonomy and independence, and helped students learn problem-solving strategies.42

This improvement in assessment, however, came through a substantial intervention: the teachers' enrollment in a three-credit graduate course. Preliminary reports from a number of professional development projects such as CAM suggest, though, that improved teaching practice may also result from more limited interventions.

Scoring rubrics can also be a powerful tool for professional development. In a small agricultural county in Florida, 30 teachers have been meeting on alternate weekends, attempting to improve their assessment practice.43 The county has a large population of migrant workers, and the students are primarily of Mexican-American descent. The teachers, who teach mathematics at levels from second-grade arithmetic to calculus, are attempting to spend less time preparing the students to pass multiple-choice standardized tests. Working with a consultant, they have tried a variety of new tasks and procedures. They have developed a much greater respect for how assessments may not always tap learning. They found, for example, that language was the biggest barrier. For students who were just learning English, requests such as "discuss" or "explain" often yielded little information. The teacher may need, instead, to ask a sequence of questions: "What did you do first?" "Why did you do that?" "What did you do next?" "Why?" and so on. Working with various tasks, along with the corresponding scoring rubrics, the teachers developed a new appreciation for the quality of their students' mathematical thinking.

Advanced Placement teachers have reported on the value of the training in assessment they get from the sessions conducted by the College Board for scoring Advanced Placement Tests.44 These tests include open-ended responses that must be scored by judges. Teachers have found that the training for the scoring and the scoring itself are useful for their subsequent teaching of the courses because they focus attention on the most important features and lead to more direct instruction on crucial areas of performance that were perhaps ignored in the past.

Assessment tasks and rubrics can be devices for communicating with parents and the larger community.

Assessment tasks and rubrics can be devices that teachers use to communicate with parents and the larger community to obtain their support for changes in mathematics education. Abridged versions of the rubrics—accompanied by a range of student responses—might accomplish this purpose best. Particularly when fairly complex tasks have been used, the wider audience will benefit more from a few samples of actual student work than they will from detailed descriptions and analyses of anticipated student responses.

Teachers are also playing an active role in creating and using assessment results. In an increasing number of localities, assessments incorporate the teacher as a central component in evaluating results. Teachers are being recognized as rich sources of information about what students know and can do, especially when they have been helped to learn ways to evaluate student performance. Many students' anxiety about mathematics interferes with their test performance; a teacher can assess students informally and unobtrusively during regular instruction. Teachers know, in ways that test constructors in distant offices cannot, whether students have had an opportunity to learn the mathematics being assessed and whether they are taking an assessment seriously. A teacher can talk with students during or after an assessment to find out how they interpreted the mathematics and what strategies they pursued. Developers of external assessment systems should explore ways of taking the information teachers can provide into account as part of the system.

Teachers are rich sources of information about what students know and can do.

In summary, the learning principle aims to ensure that assessments are constructed and used to help students learn more and better mathematics. The consensus among mathematics educators is that assessments can fulfill this expectation to the extent that tasks provide students opportunities to extend their knowledge, are consonant with good instruction, and provide teachers with an additional tool that can help them to become better facilitators of student learning. These are new requirements for assessment. Some will argue that they are burdensome, particularly the requirement that assessments function as learning tasks. Recent experience—described below and elsewhere in this chapter—indicates this need not be so, even when an assessment must serve an accountability function.

The Pittsburgh schools, for example, recently piloted an auditing process through which portfolios developed for instructional uses provided "publicly acceptable accountability information."45 Audit teams composed of teachers, university-based researchers, content experts, and representatives of the business community evaluated samples of portfolios and sent a letter to the Board of Education certifying, among other things, that the portfolio process was well defined and well implemented, that it aimed at success for all learners, that it challenged teachers to do a more effective job of supporting student learning, and that it increased overall system accountability.

There is reason to believe, therefore, that the learning principle can be honored to a satisfactory degree for both internal and external assessments.


ENDNOTES

1  

National Council of Teachers of Mathematics, Curriculum and Evaluation Standards for School Mathematics (Reston, VA: Author, 1989), 196.

2  

This statistic was compiled by using information from Edward D. Roeber, "Association of State Assessment Programs: Annual Survey of America's Large-Scale Assessment Programs" (Unpublished document, Fall 1991).

3  

Edward A. Silver and Patricia A. Kenney, "Sources of Assessment Information for Instructional Guidance in Mathematics" in Thomas A. Romberg, ed., Reform in School Mathematics and Authentic Assessment, in press; Edward A. Silver, Jeremy Kilpatrick, and S. Schlesinger, Thinking Through Mathematics (New York, NY: College Entrance Examination Board, 1990); Thomas A. Romberg, E. Anne Zarinnia, and Kevin F. Collis, "A New World View of Assessment in Mathematics," in Gerald Kulm, ed., Assessing Higher Order Thinking in Mathematics (Washington, D.C.: American Association for the Advancement of Science, 1990), 21-38; Thomas A. Romberg, "Evaluation: A Coat of Many Colors" (A paper presented at the Sixth International Congress on Mathematical Education, Budapest, Hungary, July 27-August 3, 1988), Division of Science, Technical and Environmental Education, UNESCO.

4  

Linda M. McNeil, "Contradictions of Control: Part 3, Contradictions of Reform," Phi Delta Kappan 69 (1988): 478-485.

5  

Lauren B. Resnick, National Research Council, Committee on Mathematics, Science, and Technology Education, Education and Learning to Think (Washington, D.C.: National Academy Press, 1987).

6  

Patricia Ann Kenney and Edward A. Silver, "Student Self-Assessment in Mathematics," in Norman L. Webb and Arthur Coxford, eds., Assessment in the Mathematics Classroom, 1993 NCTM Yearbook (Reston, VA: National Council of Teachers of Mathematics, 1993), 230.

7  

Thomas A. Romberg, "How One Comes to Know: Models and Theories of the Learning of Mathematics," in Mogens Niss, ed., Investigations into Assessment in Mathematics Education: An ICMI Study (Dordrecht, The Netherlands: Kluwer Academic Publishers, 1993), 109.

8  

Thomas A. Romberg and Thomas P. Carpenter, "Research on Teaching and Learning Mathematics: Two Disciplines of Scientific Inquiry," in Merlin C. Wittrock, ed., Handbook of Research on Teaching, 3rd ed. (New York, NY: Macmillan, 1986), 851.

9  

Education and Learning to Think, 12.

10  

Nancy S. Cole, "Changing Assessment Practice in Mathematics Education: Reclaiming Assessment for Teaching and Learning" (Paper presented at the Conference on Partnerships for Systemic Change in Mathematics, Science, and Technology Education, Washington, D.C., 7 December 1992).

11  

This constructivist view of learning is becoming increasingly prevalent. Analyses of learning from a cognitive perspective point to the centrality of the learner's activity in acquiring understanding [see, for example, John R. Anderson, "Acquisition of Cognitive Skill," Psychological Review 89 (1982): 396-406; and Y. Anzai and Herbert A. Simon, "The Theory of Learning by Doing," Psychological Review 86 (1979): 124-140]. Classroom-based studies such as the ones cited earlier [Paul Cobb, Terry Wood, and Erna Yackel, "Classrooms as Learning Environments for Teachers and Researchers," in Robert Davis, Carolyn Maher, and Nel Noddings, eds., Constructivist Views on the Teaching and Learning of Mathematics, monograph no. 4 (Reston, VA: National Council of Teachers of Mathematics, 1990), 125-146; and Elizabeth Fennema, Thomas Carpenter, and Penelope Peterson, "Learning Mathematics with Understanding: Cognitively Guided Instruction," in J. Brophy, ed., Advances in Research on Teaching (Greenwich, CT: JAI Press, 1989), 195-221] and purely epistemological analyses [e.g., Ernst von Glasersfeld, "Learning as a Constructive Activity," in Claude Janvier, ed., Problems of Representation in the Teaching and Learning of Mathematics (Hillsdale, NJ: Lawrence Erlbaum Associates, 1987)] lend credence to the conception of learners as constructors of their own knowledge.

12  

Lorrie A. Shepard, "Why We Need Better Assessments," Educational Leadership, 46:7 (1989), 7.

13  

There have been several reviews of the literature in this area, including Neil Davidson, "Small-Group Learning and Teaching in Mathematics: A Selective Review of the Literature," in R. Slavin et al., eds., Learning to Cooperate, Cooperating to Learn (New York, NY: Plenum, 1985), 211-230; Thomas L. Good, Catherine Mulryan, and Mary McCaslin, "Grouping for Instruction in Mathematics: A Call for Programmatic Research on Small-Group Processes," in Douglas Grouws, ed., Handbook of Research on Mathematics Teaching and Learning (New York, NY: Macmillan, 1992); S. Sharan, "Cooperative Learning in Small Groups: Recent Methods and Effects on Achievement, Attitudes, and Ethnic Relations," Review of Educational Research 50 (1980), 241-271; and R. Slavin, ed., School and Classroom Organization (Hillsdale, NJ: Lawrence Erlbaum Associates, 1989). See also Yvette Solomon, The Practice of Mathematics (London, England: Routledge, 1989), 179-187.

14  

Linda D. Wilson, "Assessment in a Secondary Mathematics Classroom" (Ph.D. diss., University of Wisconsin-Madison, 1993).

15  

Dedre Gentner and Albert L. Stevens, eds., Mental Models (Hillsdale, NJ: Lawrence Erlbaum Associates, 1981); Lauren Resnick and Wendy Ford, The Psychology of Mathematics for Instruction (Hillsdale, NJ: Lawrence Erlbaum Associates, 1981); Joseph C. Campione, Ann L. Brown, and Michael L. Connell, "Metacognition: On the Importance of Understanding What You Are Doing," in Randall I. Charles and Edward A. Silver, eds., The Teaching and Assessing of Mathematical Problem Solving (Reston, VA: Lawrence Erlbaum and National Council of Teachers of Mathematics, 1988), 93-114.

16  

Robert Glaser, "Cognitive and Environmental Perspectives on Assessing Achievement," in Assessment in the Service of Learning: Proceedings of the 1987 ETS Invitational Conference (Princeton, NJ: Educational Testing Service, 1988), 38-40.

17  

Jan de Lange, Mathematics, Insight and Meaning: Teaching, Learning and Testing of Mathematics for the Life and Social Sciences (Utrecht, The Netherlands: Rijksuniversiteit Utrecht, Vakgroep Onderzoek Wiskundeonderwijs en Onderwijscomputercentrum, 1987), 184-222.

18  

Vermont Department of Education, Looking Beyond 'the Answer': The Report of Vermont's Mathematics Portfolio Assessment Program (Montpelier, VT: Author, 1991); Jean Kerr Stenmark, Assessment Alternatives in Mathematics: An Overview of Assessment Techniques that Promote Learning (Berkeley, CA: University of California, EQUALS, 1989).


19  

Mathematics, Insight and Meaning: Teaching, Learning and Testing of Mathematics for the Life and Social Sciences, 207.

20  

Oregon Department of Education, Student Directions: Guide to Completing the Problem (Salem, OR: Author, 1991).

21  

Douglas B. McLeod, "Research on Affect in Mathematics Education: A Reconceptualization," in Douglas A. Grouws, ed., Handbook of Research on Mathematics Teaching and Learning (New York, NY: Macmillan, 1992), 578.

22  

Marilyn Rindfuss, ed., "Mr. Clay's Orange Orchard," Mathematics Performance Assessment, Form I, Integrated Assessment System Mathematics Performance Assessment Tasks (San Antonio, TX: The Psychological Corporation, 1991).

23  

Moshe Zeidner, "Essay Versus Multiple-Choice Type Classroom Exams: The Student's Perspective," Journal of Educational Research 80:6 (1987), 352-358.

24  

National Research Council, Mathematical Sciences Education Board, Measuring Up: Prototypes for Mathematics Assessment (Washington, D.C.: National Academy Press, 1993), 11.

25  

The College Board, Pacesetter: An Integrated Program of Standards, Teaching, and Assessment (New York, NY: Author, 1992).

26  

Edward A. Silver, "Assessment and Mathematics Education Reform in the United States," International Journal of Educational Research 17:5 (1992), 497.

27  

Looking Beyond 'The Answer'.

28  

Daniel Koretz et al., The Reliability of Scores from the 1992 Vermont Portfolio Assessment Program, CSE Technical Report 355 (Los Angeles, CA: University of California, National Center for Research on Evaluation, Standards, and Student Testing, 1993).

29  

Pamela A. Moss et al., "Portfolios, Accountability, and an Interpretive Approach to Validity," Educational Measurement: Issues and Practice 11:3 (1992), 12-21.

30  

Adapted from A. England, A. Kitchen, and J. S. Williams, Mathematics in Action at Alton Towers (Manchester, England: University of Manchester, Mechanics in Action Project, 1989).

31  

"Sources of Assessment Information for Instructional Guidance in Mathematics."

32  

Sunburst/Wings for Learning, Learner Profile (Pleasantville, New York: Author, 1993).

33  

In a sense this relates to the notion of generalizability, the extent to which inferences about performance on a totality of tasks can be inferred from performance on a subset. In the relatively informal milieu of internal assessment, of course, it is fairly easy for teachers to supplement an assessment with additional tasks if they are not convinced that they have sufficient data from which to make judgments. Nonetheless, the effectiveness of internal assessment is heavily dependent on the teacher's skill and acumen in task selection.


34  

Shinya Ohta, "Cognitive Development of a Letter Formula" (in Japanese), Journal of Japan Society of Mathematical Education 72 (1990): 242-251, cited in Eizo Nagasaki and Jerry P. Becker, "Classroom Assessment in Japanese Mathematics Education," in Norman L. Webb and Arthur F. Coxford, eds., Assessment in the Mathematics Classroom, 1993 NCTM Yearbook (Reston, VA: National Council of Teachers of Mathematics, 1993), 40-53.

35  

R. L. Bangert-Drowns et al., "The Instructional Effect of Feedback in Test-Like Events," Review of Educational Research 61:2 (1991), 213-238. This study reported a meta-analysis of 40 studies showing that (a) immediate feedback is more effective than feedback delayed a day or more after a test, and (b) providing guidance about correct answers is more effective than feedback that merely informs students whether their answers were correct.

36  

California Assessment Program, A Question of Thinking: A First Look at Students' Performance on Open-Ended Questions in Mathematics (Sacramento, CA: California State Department of Education, 1989), 6.

37  

Mathematics, Insight and Meaning, 218.

38  

Margaret C. Wang, "The Wedding of Instruction and Assessment in the Classroom," in Assessment in the Service of Learning: Proceedings of the 1987 ETS Invitational Conference (Princeton, NJ: Educational Testing Service, 1988), 75.

39  

Maria Santos, Mark Driscoll, and Diane Briars, "The Classroom Assessment in Mathematics Network," in Norman L. Webb and Arthur Coxford, eds., Assessment in the Mathematics Classroom, 1993 NCTM Yearbook (Reston, VA: National Council of Teachers of Mathematics, 1993), 220-228.

40  

Examples include J. K. Stenmark, Mathematics Assessment: Myths, Models, Good Questions, and Practical Suggestions (Reston, VA: National Council of Teachers of Mathematics, 1991); Assessment in the Mathematics Classroom; Measuring Up; Assessing Higher Order Thinking in Mathematics; California Assessment Program, A Sampler of Mathematics Assessment (Sacramento, CA: California Department of Education, 1991); Judy Mumme, Portfolio Assessment in Mathematics (Santa Barbara, CA: California Mathematics Project, University of California, Santa Barbara, 1990).

41  

Gerald Kulm, "A Theory of Classroom Assessment and Teacher Practice in Mathematics" (Symposium paper presented at the annual meeting of the American Educational Research Association, Atlanta, GA, 17 April 1993). Related papers at the same symposium were Bonita Gibson McMullen, "Quantitative Analysis of Effects in the Classroom"; Diane Scott, "A Teacher's Case of New Assessment"; and James A. Telese, "Effects of Alternative Assessment from the Student's View."

42  

"A Theory of Classroom Assessment," 12.

43  

Gilbert Cuevas, personal communication, April 1993.

44  

The College Board, An Invitation to Serve as a Faculty Consultant to the Advanced Placement Reading (New York, NY: Author, 1993).

45  

Paul LeMahieu, "What We Know about Performance Assessments" (Session presented at the annual conference of the National Center for Research on Evaluation, Standards, and Student Testing, Los Angeles, CA, 10 September 1992).
