On Evaluating Curricular Effectiveness: Judging the Quality of K-12 Mathematics Evaluations
be a valuable means of feedback to writers and a source of professional development to the teachers.
The Selection of Standards or Comparative Curricula
At the most general level, conducting a content analysis requires identifying either a set of standards against which a curriculum is compared or an explicitly contrasting curriculum; the analysis should not rely on an imprecise characterization of what should be included. Common choices include the original or the revised NCTM Standards, state or other standards, or a comparative curriculum used as a contrast. The strongest evaluations, in our opinion, used a combination of these approaches along with a rationale for their decisions.
The choice of comparison can have a crucial impact on the review. For example, the unpublished Adams report succinctly showed how conclusions from a content analysis of a curriculum can vary with changes in the adopted measures, goals, and philosophies. This report, prepared for the NSF, stood out as particularly complete, carefully researched, and carefully analyzed. To appraise the NSF curricula, the authors evaluated the Connected Mathematics Project (CMP) and Mathematics in Context in terms of the 2000 NCTM Principles and Standards for School Mathematics. These two programs were chosen, in part, because evidence from AAAS’s Project 2061 suggested they were among the strongest of the 13 NSF-sponsored projects studied.
An interesting and valuable feature of the Adams report was that these programs were compared with the “traditional” Singapore mathematics textbooks “under the authority of a traditional teacher” (Adams et al., 2000, p. 1). To explain why Adams selected the Singapore program as the “traditional approach” measure of comparison, recall that the performance of U.S. students on TIMSS “dropped from mediocre at the elementary level through lackluster at the middle school level and down to truly distressing at the high school level.” By contrast, of the 41 nations whose students were tested, those from Singapore “scored at the very top” (Adams et al., 2000, p. 1). We found that this comparison with a top-ranked traditional program provides a valuable new dimension absent from most other studies. If the United States is to remain at the forefront of scientific and technological advances, the Singapore comparison cannot be ignored: content analysis studies that make comparisons across a variety of types of curricular material must be encouraged and supported.
The Adams report demonstrated how strongly the selection of standards and comparative curricula shaped its reported results. When