This information will be an important resource in efforts to improve mathematics and science education because it will enable better understanding of our own educational system.
International studies are sometimes viewed with suspicion.14,15 Skeptics argue that cultural and social factors vary so much across countries as to make comparisons impossible. They point also to the difficulties of creating instruments and conceptual categories that are usable across cultural contexts. In SMSO, substantial time and intellectual resources were invested in instrument development and research design to deal with these serious challenges. TIMSS researchers wanted to avoid the assumption that the critical educational elements were identical across contexts or that they meant the same thing. But they also sought to avoid tailoring the methods of research to each country on an individual basis because such tailoring would make comparative analyses virtually impossible.
What is the Survey of Mathematics and Science Opportunities (SMSO)?
Gathering cross-national data of this complexity and scale called for the development and piloting of innovative survey instruments. The Survey of Mathematics and Science Opportunities (SMSO) was originally designed to develop instruments and methods of analysis that would be suitable for the complexity of the TIMSS research. In SMSO, an international collaborative research team undertook the task of conducting close observations of classrooms in six of the TIMSS countries: France, Japan, Norway, Spain, Switzerland, and the United States.16 The central questions of SMSO were, "What does a typical mathematics lesson look like?" and "What does a typical science lesson look like?" The goal was to understand enough about teaching and curriculum within each country that the design of survey instruments for TIMSS would allow for comparisons and also be sufficiently sensitive to cultural differences. This pilot work was done to increase the potential of providing valid information for interpreting variations in student achievement. SMSO researchers observed classroom lessons in their own countries, read one another's written classroom observations, and discussed meanings of practices, terms, assumptions, and norms. They also developed, piloted, and revised survey instruments based on teachers' interpretations and responses. In studying the implemented curriculum—teaching practice—researchers observed 127 classrooms across six countries and three levels. These classrooms were never intended to constitute a probability sample of classrooms in the countries but, rather, to provide sufficient variation overall to ensure good instrument development for TIMSS. This report draws primarily on the results of SMSO as a means to introduce and set the stage for discussion of TIMSS.
A fundamental premise for the SMSO effort was that the variation in learning conditions across countries would be critical to development of TIMSS survey items and to sensible interpretation of the TIMSS student achievement data. The meaning of this premise became more complicated as the
international team of SMSO researchers discovered differences in taken-for-granted operational definitions for common educational expressions. For example, "seatwork" is a common part of U.S. classes, especially in mathematics, and in the U.S., the researchers found, the term was used to refer to independent practice or review work. In some other countries, although students did independent work at their seats, such work was intermittent and was woven into an ongoing whole-group lesson. Researchers learned that periods of independent student class work, and how this work fit into the larger class agenda, varied enough to make the term "seatwork" as understood in the U.S. almost meaningless in the international context. Other examples of fundamental differences included the concept of a "lesson"—in terms of length, structure, and frequency—as well as the length of time that teachers work with the same group of students. In some countries, teachers stay with the same students for as many as six years, which may well affect what teachers know about their students. In yet another example, the researchers learned during a data collection planning meeting that primary schools in Switzerland do not have principals. The team realized that this fact might represent an important element of Swiss educational practice and that, indeed, the very concept of a "school" might differ across countries, with the relevant organizational unit varying from one context to another.
These and other discoveries were important to the ultimate design of survey instruments for the larger TIMSS study. SMSO work proceeded through a series of iterations: qualitative observations and interviews were discussed at length, and the insights reached by the team in turn shaped revisions of the quantitative survey instruments. The SMSO research team found the process challenging and worthwhile, well beyond its contributions to instrument development for the larger study. In a section of their report titled "Value of International Discourse," the researchers explained that the "discourse process . . . led to insights not possible with a more traditional model."17 "Confusion and surprise" emerged often, and through discussions, research team participants clarified their ideas. They found that they had learned a great deal about educational practice and comparison in their efforts to reach shared understandings about instrument items and ideas. The group noted the subtleties of difference in practice underlying surface similarities of language. The SMSO researchers concluded that the SMSO methodology and results would be of interest and importance in their own right. The SMSO report will be released with the TIMSS curriculum studies.18