3 Committee Member Perspectives

3.1 ALFRED AHO

Committee member Alfred Aho, a professor of computer science at Columbia University, commented on several topics: motivations for computational thinking in education, potential pitfalls in ineffectively teaching computational thinking, the need for investment in infrastructure and tools to facilitate learning of computational thinking, and the role of assessment.

Motivation for Computational Thinking

Echoing the sentiments of Matthew Stone, Aho described three common motivations for explicitly introducing computational thinking into education. First, he argued, computational thinking has an impact on virtually every area of human endeavor, as illustrated by the first workshop report's discussion of computational thinking applications in fields as diverse as law, medicine, archeology, journalism, and biology.

Second, he noted dangers in computational thinking done badly. He recounted a story—"A number of years ago when I was doing some consulting for NASA, I came to Washington and noticed an article in the Washington Post that said global warming wasn't as bad as scientists feared because the empirical measure of the rate of rise of Earth's oceans wasn't as bad as the computer models had predicted. It turned out to be a software error. So if we're going into this world of modeling and simulation, I would like to put in a plea for good software engineering
practices and making sure not only that the data are correct but also that the programs are correct." To underscore this point, Aho cited an article in Nature about bad software in computational science.[1]

[1] Zeeya Merali, 2010, "Computational Science: . . . Error: . . . Why Scientific Programming Does Not Compute," Nature 467(7317):775-777. Available at http://www.nature.com/news/2010/101013/full/467775a.html.

Third, Aho suggested that computational thinking plays an important role in developing new and improved ways of creating, understanding, and manipulating representations—representations that can change, sometimes dramatically, the way in which people see problems.

Humanization of Computational Thinking

Aho observed that a number of workshop participants pointed to the humanizing effect of computational thinking. Recalling Idit Caperton's thoughts that using information technology in an appropriate manner "engages people, engages their souls, their passion, and their productivity, and people care," Aho described similar experiences in working with undergraduates. He found that using creative programming projects to hone and develop computational thinking skills motivated students to pursue further education in computer science. Aho described classes in which students work in small teams to create their own innovative programming language and then to build a compiler for it, and he reported that "often the students say the most important things that they learned from this course are not principles of programming languages or compiler design but the interactions that they had with the other students and the fun they had in doing the projects."

Aho also suggested that this kind of response to the use of technology was an effective rebuttal to those who argue that computers and information technology are dehumanizing, as illustrated by Jaron Lanier's arguments in You Are Not a Gadget.

Computational Thinking as a Moving Target

Aho acknowledged the community's need for a common definition of computational thinking, development of which is inherently difficult given the rapidly changing world to which computational thinking is often applied. Any static definition of computational thinking would likely be obsolete 10 or 20 years from now, he argued, and thus, "The real challenge for the entire community is to define computational thinking and also to keep it current."

With that thought in mind, Aho stated that he was particularly taken
with a point made during Deanna Kuhn's presentation, that the computer science and education communities should use computational thinking not just to teach old things but also to teach new things, both new methods and new ideas, to solve new problems, because that's what the people we will be educating are going to be doing in the future.

Need to Apply Learning Science to the Problem of Teaching/Learning Computational Thinking

Echoing Jeannette Wing's original charge for the workshop series, Aho said he also believed that educational theory and developmental psychology would help to inform the teaching of computational thinking regarding what particular content to teach and when to teach it. For example, developmental psychology could help identify the specific concepts of computational thinking that would be most appropriate for young children. More generally, he argued that for computational thinking to be taught effectively, any curriculum for computational thinking should be phased according to a developmental sequence characteristic of the students engaged with that curriculum. Finally, he also suggested that developmental psychology might have value in contributing to different pedagogical models for learners with different cognitive styles and in shaping the infrastructure and tools needed to teach computational thinking.

Infrastructure for Computational Thinking

Addressing the issue of the infrastructure needed to support a serious educational effort to promote computational thinking broadly, Aho noted that such an infrastructure did not consist only of hardware but also necessarily included continuing funding streams, instruments for gathering data needed to analyze outcomes, and an ongoing data collection effort. He added that the infrastructure would also require ongoing maintenance and the development of new tools to support computational thinking. A key element of infrastructure, Aho argued, is the ability to integrate applications. Aho warned that "unless these issues get resolved, we are going to find ourselves in a world of the future which may resemble the software world that we're currently in, which is largely a Tower of Babel, [with] lots of incompatible infrastructures and a lot of expense." This comment prompted Stephen Uzzo to argue that interoperability, access, usability, and portability of data are problems that can be explored through collaboration.

How Do You Know What Students Are Learning?

Aho reflected concerns shared by a number of workshop participants that determining what students are learning in computational thinking activities may be difficult. He noted that assessing how a student has internalized the abstractions of computational thinking may be challenging, and even assessing programming skills can be difficult. For example, he indicated that although program correctness is an essential goal of good programming, a student who writes a correct program (i.e., one that exhibits the appropriate behavior) nevertheless may not have made the conceptual connections that one might expect from someone who has written a correct program. He illustrated this point in commenting on Walter Allan's presentation, in which he observed that "[in thinking about] the kind of thought process that a student is following to get the bunny to eat these carrots, I am not sure what the student is actually learning about some of these much deeper issues that a serious programmer would have to face."

3.2 URI WILENSKY

Committee member Uri Wilensky, a Northwestern University professor and director of the Center for Connected Learning and Computer-Based Modeling, shared his observations on a number of key issues discussed at the workshop, including the motivation and value in teaching computational thinking, the challenges arising from the continuing non-convergence on one definition of computational thinking, and identification of the best environment and tools for conveying computational thinking to different audiences.

Motivation

Wilensky noted that in recent years, many branches of science and engineering have changed in ways that require researchers to be facile with computational thinking. Disciplines such as biology, physics, mathematics, and so on utilize computational methods to analyze problems and model phenomena.[2] Computational thinking in many ways offers a new way to interact and learn about the world and scientific phenomena. According to Wilensky, in order to effectively engage and contribute to modern science and engineering, future scientists and engineers must be
able to do computational thinking. Thus key computational thinking concepts should be introduced and mastered early in students' academic careers.

[2] National Research Council, 2010, Report of a Workshop on the Scope and Nature of Computational Thinking, Washington, D.C.: The National Academies Press. Available at http://www.nap.edu/catalog.php?record_id=12840. Last accessed February 7, 2011.

A second reason for encouraging computational thinking is the power that it affords for greater automation of tedious tasks and the ability to manage more complexity for all types of learning and discovery. Mechanical automation allows one to delegate certain tedious tasks and simple problem solving in favor of more complex tasks and problem solving. Indeed, as the problems at hand become more complex from a process and computational perspective, computational tools and abstractions are increasingly needed to analyze and understand them.

A third reason is that computational thinking supports the capacity for complex design and simulation. It enables one to naturally create designs within a specific context that do not require access to different kinds of materials because the materials are represented computationally in the form of data and bits. Wilensky also noted that such simulations could be used to inform public debate and discourse about issues of public policy—simulations could be used as modeling tools to explore alternative scenarios for situations in which the interactions and feedback loops among the relevant elements (e.g., resources) are tightly coupled.

Fourth, computational thinking (and computational tools) can enhance self-expression and collaboration, supporting the use of many different forms of expression and the easy sharing of those expressions. The potential for expression and collaboration can be very motivating to many individuals, especially children. Wilensky suggested that the use of computation in art, music, and other kinds of expressive media is underexplored in much of the available research.

As a fifth reason to motivate computational thinking, Wilensky recalled Caperton's argument that educators do not always have to start with kids but rather can focus on those in positions of leadership. That is, Wilensky paraphrased, "If we are thinking about the citizenship value of computational thinking, then it is shortsighted to not pay attention to the people who are actually empowered to make a difference and to try to change the discourse among that group so that they are computationally literate enough to be able to understand this complex world they are being asked to lead."

Last, Wilensky argued that computational thinking, much like the use of Arabic numerals, democratizes access to knowledge. He noted that the significance of Arabic numerals was not that they were essential to multiplication and division (indeed, there were algorithms for multiplying and dividing Roman numerals), but rather that because they were so much less cumbersome, Arabic numerals enabled many more people to perform multiplications and divisions. Wilensky then said, "The claim I was making is that we can now use computational representation—
which similarly affords greater access to knowledge and new knowledge development."

As an example, Wilensky pointed to work he has done with seventh-grade students to use computational thinking and computer modeling to study segregation patterns in Chicago.

They started by using some of the NetLogo variation models that were based on the work of the Harvard economist Thomas Schelling, who actually won a Nobel Prize for that . . . last year. Schelling, who was a very learned and skilled economist, took many months to build these segregation models by using lots of checkered boards and moving coins around and flipping them back and forth according to determinant rules. He had the basic thinking that was needed to do those models. What he didn't have was tools that could actually do it quickly enough so that he could consider all kinds of alternative scenarios. Now these seventh graders were doing that and they were asking all kinds of questions that pushed well beyond Schelling, like what would happen if there were some Asians that desired only integrated neighborhoods or what would happen if you had many more sets of groups that had different criteria. All those things could be easily explored within the foundational framework—[but] really [were] pretty much impossible without computational thinking and related tools.
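
To make the kind of model being described concrete, the following is a minimal sketch of a Schelling-style neighborhood model in Python. It is not the NetLogo model the students used; the grid size, the two groups, and the 30 percent similarity threshold are illustrative assumptions. The core rule is the one Schelling worked out with coins on checkerboards: an agent that has too few neighbors of its own group moves, and the moves repeat until everyone is satisfied.

    import random

    # A minimal Schelling-style segregation sketch (illustrative; not the NetLogo
    # model described above). Agents of two groups sit on a grid; an agent is
    # unhappy if too few of its neighbors share its group, and unhappy agents
    # move to random empty cells until everyone is satisfied.

    SIZE = 20                  # grid is SIZE x SIZE (assumed)
    EMPTY_FRACTION = 0.1       # share of cells left empty (assumed)
    SIMILARITY_WANTED = 0.3    # each agent wants at least 30% like neighbors (assumed)
    GROUPS = ("red", "blue")

    def make_grid():
        cells = []
        for _ in range(SIZE * SIZE):
            cells.append(None if random.random() < EMPTY_FRACTION else random.choice(GROUPS))
        return [cells[r * SIZE:(r + 1) * SIZE] for r in range(SIZE)]

    def neighbor_groups(grid, r, c):
        """Yield the group of each occupied cell in the 8-cell neighborhood (wrapping at edges)."""
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                occupant = grid[(r + dr) % SIZE][(c + dc) % SIZE]
                if occupant is not None:
                    yield occupant

    def unhappy(grid, r, c):
        group = grid[r][c]
        if group is None:
            return False
        around = list(neighbor_groups(grid, r, c))
        same = sum(1 for g in around if g == group)
        return len(around) > 0 and same / len(around) < SIMILARITY_WANTED

    def step(grid):
        """Move every agent that starts the step unhappy to a random empty cell; return the move count."""
        movers = [(r, c) for r in range(SIZE) for c in range(SIZE) if unhappy(grid, r, c)]
        empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is None]
        random.shuffle(movers)
        moves = 0
        for r, c in movers:
            if not empties:
                break
            er, ec = empties.pop(random.randrange(len(empties)))
            grid[er][ec], grid[r][c] = grid[r][c], None
            empties.append((r, c))
            moves += 1
        return moves

    random.seed(0)
    grid = make_grid()
    for t in range(100):
        if step(grid) == 0:    # no one wants to move: segregated clusters have emerged
            print("settled after", t, "steps")
            break

Changing SIMILARITY_WANTED, or adding a third group with its own threshold, is the kind of "what if" question the quoted passage says the seventh graders could pose and answer in minutes.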

Epistemological Diversity Regarding Computational Thinking

Wilensky said that although he saw a lot of common ground on certain aspects of computational thinking among educators and researchers, there were a number of significant areas where workshop participants saw things differently. (This diversity of perspective was also reflected in the first workshop report.) Specifically, he thought that the different ways of understanding computational thinking discussed in the second workshop fell into several categories: ways of seeing and knowing, ways of doing or capacities, a method of inquiry, and ways of collaborating.

He noted that some panelists talked about computational thinking as ways of seeing and knowing. For example, he pointed to Robert Tinker's presentation in which Tinker talked about breaking up the world into different simple processes or pieces as a way of seeing the world not just as objects but rather as various informational pieces that can be attached to objects and processes and manipulated. Wilensky said that in this view, computational thinking as ways of seeing and knowing really represents a different way of understanding the world.

Ways of doing and capacities was another conception of computational thinking present in many of the different presentations. According to Wilensky, this view emphasizes the importance of building, designing, and going through various "constructionist" kinds of activities, as Seymour Papert would call them. In this sense, ways of doing includes issues of modeling and thinking using computation as a way of representing the world and being able to experiment and explore alternative scenarios within the simulated world.

Computation as a method of inquiry was interesting to Wilensky because the computer is protean enough to afford users the ability to explore and manipulate all kinds of processes within a small space. Illustrating this view, Wilensky cited John Jungck's presentation in which Jungck talked about the very small range in the evolutionary possibility scale that is actually represented in real creatures. Through computational thinking researchers can create simulated worlds in which one can explore evolutionary trajectories that never actually came into being, creatures that could have evolved but did not.

Last, Wilensky noted that a number of speakers described computational thinking as though it was a way of collaborating. These presenters focused on the ways in which collaboration can be extended as a result of computation and computational tools. New ways to connect and form different groups are no longer necessarily limited by geography. Wilensky held that "instead of a spatial model of collaboration, we have this kind of network model of collaboration where there are many different opportunities for synching up, and that capacity is becoming more and more important in our society, and computation is another way to facilitate that."

A Diversity of Venues for Computational Thinking

Represented at the workshop were a number of different perspectives regarding the most effective environment and tools for teaching computational thinking. Wilensky distilled the points of view as those favoring formal curricular learning versus extracurricular learning and those favoring lab-based learning versus in-the-field learning.

The case for making computational thinking a part of the formal school curriculum was made by several speakers. Wilensky pointed to arguments made by Tinker that the right place for computational thinking is in schools, specifically within the science curriculum, because science already uses computational thinking and computers in major ways. With computational thinking, educators can facilitate all kinds of modeling activities in science that really represent ways of actually doing real-world science as opposed to just sort of learning about science. Wilensky argued that social science research may also be a fertile ground for computational thinking, saying that "social science is another very fertile area to integrate computational thinking because tools now enable us to be able to mine large data sets or to create models that were not possible." The
constructs of new representational infrastructure and meta-representational capacities in computational thinking offer possibilities for substantial advances in social science.

Others argued that computational thinking should be its own subject within the formal curriculum. Wilensky pointed to Caperton's presentation, which demonstrated that educators can actually design a curriculum around computational thinking. Tinker and others did note a concern that already packed school curricula may generate some pushback against the idea of computational thinking taught as a separate subject. To this point, Wilensky responded that this may be more a strategic discussion than a pedagogical one. Work presented by Paulo Blikstein and others shows that there is room even in current science curricula to introduce computational thinking concepts in a way that fits, and also mutually supports learning of other complex concepts. Others further argue that computational thinking fits best in an extracurricular context. Wilensky argued that each option should be explored.

Lab-based approaches were discussed, as were in-the-field approaches. Wilensky argued that this theme is an important one because it reflects the fact that the public in general and educators in particular "tend to think of computing as these kinds of things that are built into our computers and we tend to do them inside. But there were at least some hints of capacities beyond that." Wilensky pointed to presenters Tinker, Jungck, and Uzzo, whose presentations discussed the use of various probes and sensors in the field to collect data for computational learners to analyze and manipulate. Wilensky stated that these options illustrate that "we are not necessarily limited by this [indoor] model of what computation is. Instead we can think of ubiquitous computation and all the different kinds of ways in which we can do things." Thus in-the-field approaches to computational thinking education must be explored, just as lab-based approaches must be developed.

Speaking for himself, Wilensky argued that computational thinking is important enough that it should not have to be squeezed in on the margins or sneaked in on the side. He acknowledged the pragmatic benefits of such an approach but noted that it is perhaps inconsistent with a serious view of computational thinking as a major new mode of thinking that can be powerful for everybody, not just for an elite few.

Wilensky also believed that it is sometimes a red herring to assert that there is no room in the standard curriculum to accommodate a serious examination of computational thinking. Indeed, he argued, sometimes important ideas in computational thinking can be introduced incrementally along with standard content in a way that makes the standard content easier to learn (and vice versa).

Different Tools for Computational Thinking

Wilensky indicated that another theme emphasized in the discussion was what kinds of tools are being used to enable computational thinking. The workshop revealed a range of approaches in use, many of which were related, although Wilensky noted that the main distinguishing factor in these approaches was the question of whether the tools were developed with a target audience in mind. Thus the question is whether, even with children, educators want to use these specially designed learning tools or whether professional tools should be used. For example, Wilensky proposed that maybe tools such as Scratch are designed more with an audience of children in mind—for a target audience especially tuned to learning and motivation through things such as games. That is, using games can be a major motivational tool for an audience of children learning computational thinking.

Other tools may have been designed specifically to target a professional community. Caperton argued for use of professional-level tools such as Flash, one of the most widely used animation programs in the world, in computational thinking activities because this use of authentic tools can be a kind of motivation for students to continue learning. Wilensky pointed to modeling tools such as NetLogo, AgentSheets, and many others that particularly help in science. He reiterated that presenter John Jungck demonstrated the "extent [to which] biology has changed dramatically as a result of computation and all the different kinds of tools that are now in the regular toolbox of biologists that just were not there several decades ago." He went on to say that "these tools have changed what the discipline is and made the science of biology much less a natural descriptive one and much more one that involves modeling and analysis with very large sets of data, for example," suggesting that students interested in pursuing careers in biology may be motivated to learn computational thinking as a result of having access to authentic professional biology research tools.

Create and Modify as Complementary Approaches

Wilensky noted the difference between students writing programs starting from a blank screen versus students modifying existing programs, but argued that both approaches have value in conveying concepts of computational thinking. However, he did caution that the canonical "use-modify-create" sequence is not the only viable approach to teaching the skills of computational thinking. In his words,

It could be in the very first class that kids might create something, like it might be in a biology class where we might say, "Start with some kind
of creature and give some rule of birthing and dying and see what happens to the system." That's a very small creation, but nonetheless it's a creation, and there will be a diversity of different possible choices that people will come up with, and a comparison of those can lead to lots of insights. So we can think of "creating" in small bites as well, and sometimes creation is a lot easier than modifying as a different kind of entry point, and all of the outcomes are ones that we want.

An Affective Dimension

Wilensky also noted an affective dimension to some of the presentations. Specifically, many of the participants in the activities that were reported in the workshop had done sophisticated programming work in developing genuinely useful applications but nevertheless did not believe that they were, in fact, programming. Wilensky saw this disconnect between their capabilities and their self-reported assessments as a problem worth addressing, and pointed to the importance of boosting the students' confidence that in fact they can master complex topics. He further drew an analogy to the teaching of reading—"I am struck by how much effort we as a school system put into reading. It is a really difficult process, yet we think it is so valuable that we invest enormous amounts of resources in it in the schools. I want to think of us as being bold enough to try to make the claim that computational thinking and computational literacies are becoming important enough that we ought to be investing major resources into it."

3.3 YASMIN KAFAI

Committee member Yasmin Kafai is a professor at the University of Pennsylvania Graduate School of Education. Her research focuses on the design and study of tools for learning programming.

Her comments at the workshop focused on ways to articulate and teach computational thinking more effectively. They included a discipline-oriented approach for identifying key facets of computational thinking, a developmental progression approach for teaching, a real-world problem-solving approach for identifying concepts and teaching, and a cycle approach (use-modify-create) for teaching and assessing learning.

A discipline-oriented approach, Kafai said, means starting from individual disciplines to identify important and useful aspects of computational thinking. This approach may allow the community to articulate more clearly what computational thinking is and what it is not. Kafai noted, "It is within the disciplines that aspects such as programming, visualization, data management, and manipulation can actually help us illuminate and understand processes."

According to Kafai, the computer science and education communities have not developed what presenter Jill Denner termed a "developmentally appropriate definition of computational thinking." Kafai acknowledged that

[w]e all have examples of kids of many ages and adults who are being very courageous and interested in doing computational thinking, but we also know from prior experience in mathematics and science education that we really do need a more profound understanding of what kids' engagement with computational thinking at different ages is, and then how we can kind of build pedagogies, examples, on it. I think we are far away from that point. These presentations here gave us some ideas about where to start looking and where the examples are.

Kafai found the approach presented by Danny Edelson and Robert Panoff helpful. They focused on how computational thinking can help ask interesting questions and solve real-world problems, rather than simply develop algorithms. They used computational thinking to help students answer questions such as, What's real? Where are the issues? Where are the anomalies and what do they mean? Kafai argued that these questions point to a "kind of social aspect of computational thinking which we don't talk enough about but which would be really important in the social relevance of bringing computational thinking into the disciplines and judging what the value is."

Kafai is a fan of the cycle approach to learning and teaching computational thinking; she argued that the workshop's presentations seemed to come together in favor of this approach as well. "I think we have some convergence here on a kind of cycle approach," Kafai said, "and I know other presenters before us also alluded to this kind of use-modify-create as a kind of pedagogy to introduce students into approaches to computational thinking."

Kafai added, "I don't think it's so bad that the kids get some pieces of code to start with, rather than . . . a blank screen and . . . [the expectation that they] develop all the programming on their own, especially if they don't have any prior competencies in it." She argued that learning to use the code and manipulate it is a good way to try out strategies before designing one's own programs. In addition, the cycle approach works across the disciplines and can be used to facilitate computational learning based in data analysis, visualization, and game design approaches to teaching computational thinking. Kafai felt that the next step was to articulate some extensions and caveats to the cycle approach in order to build better assessment tools.

cognitive-type AI, are really good at breaking down problems to a set of functional components—the pieces each play useful roles as part of a process, and they can be fit together in a variety of ways to create other processes that perform bigger functions. It's where we feel comfortable, and we can never understand why someone else would break a problem down any other way (but people do—how odd)." Kolodner found this characterization effective and useful in describing what computational thinkers do.

Kolodner pointed out that this definition of computational thinking is far more constrained than simply thinking about computational thinking as problem solving. Rather, this definition regards computational thinking as "a certain kind of problem solving that computer scientists are pretty good at," in particular, thinking in terms of processes to be carried out, imagining the functional pieces of those processes, and identifying which of those pieces have been used in solving previous problems and which might be used in solving later ones. Notice that this approach is not synonymous with programming. In fact, Kolodner pointed to the work of Richard Lipton of the Georgia Institute of Technology, in which he and several colleagues figured out a treatment for the AIDS virus in patients by mapping out the biological processes within a person's body, the substances those processes use and create, the conditions under which they work that way, and how the processes are sequenced, and then identifying ways in which the sequence of processes might be changed or disrupted. In this way, he used a computational approach to address the problem, but without programming.

This view of computational thinking is consistent with systems thinking and with model-based reasoning, both of which play a huge role in scientific reasoning and in engaging in computational sciences. Indeed, both Tinker and Panoff proposed integrating model building, simulation, and model-based reasoning into math and science classes as a way to engage kids in computational thinking as they gain greater understanding and raise and solve problems in mathematical and scientific domains.

Kolodner added that she believes that computational thinking is a set of skills that transfers across disciplinary domains. She compared computational thinking to the processes involved in inquiry, noting that just as inquiry is not one specific skill but rather a collection of relevant skills specialized for different disciplines, so too is computational thinking a collection of skills that may be applied differently to different disciplines. As an example, Kolodner stated, "If you are a chemist, you are paying attention to different things than if you are a physicist or a biologist, and you answer questions by different means. You might use experimental methods or modeling methods or simulation methods or data-mining methods
as you investigate. But in all sciences, you are, in general, attempting to explain phenomena and collecting evidence to help you answer questions about those phenomena and develop well-formed explanations." She believes computational thinking may or may not include quantitative elements, but it always includes, in some way, manipulation of variables, decisions about selecting "the right" representations, and decomposition of complex tasks into manageable subtasks, to name a few.

Although Kolodner is partial to the problem-solving view of computational thinking just described, she was also drawn to a second view of computational thinking put forth by Mitch Resnick. In his view, computational thinking is not simply for problem solving. Rather, he believes that for most people, computational thinking means expressing oneself by utilizing computation fluently. For Resnick, computation's power is in allowing people (everybody, not just those who are good problem solvers) to express themselves through a variety of media. In this view, computational thinking means being able to create, build, and invent presentations and representations using computation. This requires fluency with computational media.

Relationship Between Two Views of Computational Thinking

Kolodner argued that a deep understanding of computational thinking may encompass a synthesis of these two views. She synthesized the Tinker/Edelson view and the Resnick view as follows:

Computational thinking is a kind of reasoning in which one breaks problems/goals/challenges into smaller pieces that are doable by a stupid computational device. This, in general, means thinking in terms of functions that need to be carried out to achieve a goal or solve a problem (not functions in the mathematical sense, but rather in terms of how things work) and pulling apart those problems/goals/challenges into smaller pieces that are functionally separate from each other and where the functions that are pulled out tend to repeat over many different situations. Computational thinkers tend to break problems into functional pieces that have meaning beyond the particular situation in which they are being used. These functional pieces can then be called on repeatedly in solving the problem or combined in new ways to solve new problems and achieve new goals and challenges.

Resnick's view of computational thinking comes into play when one thinks about the role the computer might play in helping to break problems into pieces and compose the pieces in new ways. To the extent that the computer can help with this kind of thinking, we become capable of achieving bigger goals or solving more complex problems. But this requires two things: (1) that we develop tools to help people think
computationally (e.g., one could think about Scratch in this way) and (2) that we be able to use those tools fluently. A computational thinker is fluent in this kind of thinking and in using some set of tools that help with this kind of thinking.

With respect to computational thinking for everyone, the implication is that all individuals should get as far as being able to use these types of tools well to help them solve problems, meet challenges, or express themselves. Some will become more proficient, being able to manipulate these tools and solutions to create, build, or invent better solutions or creations. At the highest level are those who will be able to use computational media and thinking in the most sophisticated ways—as scientists, computationalists, and even artists.

Yet, the relationship between these two views of computational thinking is not entirely clear, and there may be a certain tension between the two. Certainly, Kolodner argued, there is overlap, for example, for those whose expression is of sophisticated complex systems. Those learning to be computational biologists and computational physicists and so on might need to have capabilities in both domains of computational thinking: problem solving/modeling and expression. But beyond this point, the relationship between the two characterizations of computational thinking is not clear. It is not clear that beginning with developing capabilities within the realm of View 2 (expression) is necessarily the way to get students to develop capabilities within the realm of View 1 (problem solving/modeling). Similarly, it is also not clear whether those who are facile at the skills and practices of View 1 will automatically be facile at the skills and practices of View 2. Kolodner believes this blurred relationship is "a really interesting conundrum that needs more attention from the research community."

Helping People Learn to Be Computational Thinkers

Presenter Derek Briggs of the University of Colorado, Boulder, put forth a question during one of the panel discussions that Kolodner found helpful in articulating how to promote computational thinking. Briggs questioned the goals sought with respect to learning computational thinking. He wondered whether the goal is solely to build tools that will help people reason better computationally, or whether we believe that computational thinking is something we want everybody to learn. He pointed out that if the latter is the case, then we seem to be going against the grain, because we know from the learning sciences and from education best practice that it is hard to learn skills disembodied from the contexts in which they are used.

Kolodner argued that the community has both goals—tool building
for better computational thinking and computational thinking as a core skill for everyone—and that Briggs's warning about teaching computational thinking in context is a key reminder of best practice. She went on to say that the education community should most definitely be aiming toward helping everybody learn computational thinking and that, yes, the community does seek to promote computational thinking as a set of necessary general-purpose skills.

Kolodner believes it is important not to fall prey to the mistaken notion that if one learns computational thinking skills in one context, one will automatically be able to use them in another context. Rather, it will be important to remember that one can learn to use computational thinking skills across contexts only if (1) the skills are practiced across contexts, (2) their use is identified and articulated in each context, (3) their use is compared and contrasted across situations, and (4) learners are pushed to anticipate other situations in which they might use the same skills (and how they would). These four guidelines come from the transfer literature—the chapter on transfer in How People Learn[3] makes them clear. Kolodner pointed out that following these guidelines is absolutely necessary in designing instruction—otherwise, we are only helping kids learn to program or learn to use some set of skills in some particular contexts. This is analogous, she added, to what we now understand about learning to be a scientific reasoner. Scientific reasoning, or inquiry, is not a simple skill that one learns in one domain and applies in a bunch of others. Rather, scientific reasoning is a set of complex cognitive skills that one must learn to carry out flexibly over a variety of domains, and the way to help kids learn that is to help them carry out scientific reasoning over a variety of situations, help them recognize what they are doing, and help them recognize how their reasoning is similar and different over a variety of situations. The workshop touched on these issues in the discussion, but the four guidelines were not entirely articulated.

[3] NRC, 2000, "Learning and Transfer," in How People Learn: Brain, Mind, Experience, and School: Expanded Edition, Washington, D.C.: The National Academies Press. Available at http://www.nap.edu/catalog.php?record_id=9853. Last accessed May 20, 2011.

This set of guidelines is really important for educators to remember with respect to computational thinking; if kids are introduced to computational thinking only in the context of programming and never think about how to use computational thinking, or never have opportunities to use computational thinking in other situations, then they may not develop computational thinking. Mike Clancy's cases are interesting with respect to this—they make the computational thinking of experts visible as a way to illustrate computational thinking applied to a domain. Kolodner wondered to what extent students who use those cases take their
computational thinking outside the computer science class, and what it would take to promote that type of cross-domain application.

Several people, across both views of what computational thinking is, talked about teaching computational thinking concepts and skills through a learning progression paradigm of use, modify, and create. Kolodner thought that many of the examples of computational thinking learning discussed in the workshop reflected adoption of this approach to teaching computational thinking, with varying levels of success. One example was Tinker's learning progression for learning computational thinking in a science class, learning that involved the following:

• Numbers are associated with things and their interactions (e.g., temperature),
• Values change over time,
• Changes can be modeled,
• Models involve lots of little steps defined by simple rules (e.g., molecular dance),
• Models can be tested to find a range of applicability,
• You can make models, and
• Many applications of computers share these features.
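
The following is a minimal sketch, not an example from the workshop, of the kind of model this progression describes: a number attached to a thing, a value that changes over time, and change produced by many little steps that follow one simple rule. Here the thing is a cup of coffee cooling toward room temperature; the rate constant and the small random "jitter" standing in for the molecular dance are illustrative assumptions.

    import random

    ROOM_TEMP = 20.0      # degrees C (assumed)
    COOLING_RATE = 0.03   # fraction of the gap to room temperature closed per step (assumed)
    JITTER = 0.05         # tiny random fluctuation standing in for the molecular dance

    def step(temp):
        """One little step of the simple rule: drift toward room temperature, plus noise."""
        drift = -COOLING_RATE * (temp - ROOM_TEMP)
        noise = random.uniform(-JITTER, JITTER)
        return temp + drift + noise

    temp = 90.0                   # the number associated with the thing (hot coffee)
    history = [temp]
    for minute in range(120):     # the value changes over time, one step per minute
        temp = step(temp)
        history.append(temp)

    # Testing the model for a range of applicability: after enough steps the
    # temperature should approach room temperature, as the physics says it must.
    print(f"after 2 hours: {history[-1]:.1f} C (room is {ROOM_TEMP} C)")

Swapping in a different rule, or comparing the model's output against a real data-logger trace, is the kind of using, modifying, and testing that the list above describes.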

If using models is done repeatedly in science classes, and if kids gradually move from using to modifying to creating their own models, and if they discuss the features behind the models—why they are the way they are, why and how one might want to change them, and how they went about making changes and creating new models—then there is a good chance that kids will learn to think fluently about running, trusting the results of, revising, and maybe, designing computational models. If, in addition, they discuss how what they are doing is similar to what computer programmers do and/or how it is similar to other problem solving and design, they will broaden their understanding and capabilities with respect to computational thinking. Kolodner added, "The deal is that one develops the ability to broadly use cognitive skills to the extent that one has experiences using them in a variety of situations, considers how one was using them, and anticipates their use in other situations."

So, for example, one could start from science class and broaden out from there. Edelson's analogy between computational thinking and geographic computational reasoning illustrated this point. If one helps kids reason geographically, helps them see that process as computational reasoning, and helps them anticipate other ways that reasoning might be useful, one can use that as a base and broaden knowledge and use of computational thinking from there.

Kolodner was very interested in perspectives on learning progressions associated with older children. Specifically she wanted to understand at what point students were capable of creating their own computational models using computer programs rather than just using existing models and manipulating them. She noted that around middle school age, students seem able to grasp increasingly sophisticated computational and programming concepts. This observation seems consistent with a point made by presenter Tinker that fourth grade seems to be roughly when a number of factors such as student development, teaching resources, and opportunity converge to make computational modeling more likely. Tinker also added that creating a computational model from scratch on a computer can require a great deal of time learning programming to realize that model. On the other hand, systems like NetLogo and AgentSheets allow students to manipulate models someone else built without necessarily having to master a whole lot of detail themselves, and then allow looking inside those models and changing them before beginning from scratch to build one's own models.

Presenter Christina Schwarz added some warnings to this discussion, pointing out again that one cannot just assume that students will learn computational thinking through model building. She pointed out that instructors have to be realistic about students and their motivations to build models. When projects have them focus on concepts that they already understand based on outside or prior knowledge, students may be more likely to explore and try more complex models. If concepts are brand new, however, students need to explore before they can do complex model building. And they certainly won't be able to learn new computational thinking skills or concepts while they are struggling with some new science concept.

Kolodner agreed and emphasized that those creating curricula should be sure to think longitudinally—the focus should be on creating more opportunities to model year after year, helping learners to gradually build up their ability to model and their computational thinking capabilities. Their progress on both should be tracked over time. She also highlighted one more important caveat about the use and promotion of computational thinking in the classroom: simply programming, or even simply teaching students to program, is not necessarily promoting computational thinking.

Kolodner expressed concern over a thread of discussion running through some of the presentations that seemed to presume that as a part of the process of learning to program, students would learn computational thinking. For Kolodner, a big question is how an instructor can be sure that students engaging in programming activities are actually learning computational thinking. Similarly, do students themselves realize they are learning thinking skills that can be applied outside the constraints
of the particular activity they are engaging in? Or are the students just becoming better programmers or model builders or game players? To get a clear picture of what is happening in a computational activity in terms of assessment and evaluation, one has to apply an entire toolbox of assessment and evaluation tools, according to Kolodner. One tool or method is not enough. Kolodner believes that a student's reflecting on a computational activity, being able to teach or help someone else learn the concepts, or being able to effectively articulate the relevant computational process at issue can be seen as likely indications that the student is learning computational thinking. As students are able to use increasingly elegant, efficient, and sophisticated approaches to tackle computational thinking tasks, this ability can also demonstrate learning and improvement in computational thinking, Kolodner believes.

Another important point is that one cannot presume that just because one is programming, one is learning to be a computational thinker. Kolodner pointed out the importance of remembering that separating out the abstract processes from the specifics of what one is doing does not come easily to everyone. Referring back to points from How People Learn,[4] she stated that to learn computational thinking from programming experiences, learners need to engage in thinking about what they are doing and under what other circumstances they might use the same type of thinking.

[4] NRC, 2000, How People Learn: Brain, Mind, Experience, and School: Expanded Edition, Washington, D.C.: The National Academies Press. Available at http://www.nap.edu/catalog.php?record_id=9853. Last accessed May 20, 2011.

Also, she was concerned that perhaps this assumption (that learning to be a computational thinker would arise simply from learning to program) reflected confusion over what computational thinking is. Although programming may be one tool that is used to teach or highlight computational concepts, it is not synonymous with computational thinking, and Kolodner again warned that a good definition of computational thinking is needed—both so that curricula will be designed to promote computational thinking and so that achieving capability in computational thinking can be measured well.

Pedagogy as a Criterion for Assessment: An Elegant Relationship

Kolodner believes that assessment and pedagogy can be rather elegantly related to each other. She pointed to arguments from Clancy and Blikstein, who both talked about pedagogy as a lead-in to assessment. Clancy talked about how the case studies in his lab-centric approach, as well as the derivative pedagogy, provided lots of criteria for assessing how well learners are actually doing computational thinking. In Clancy's
approach, learners are learning to program (and could be learning computational thinking) through the use of case studies that show how others have solved similar programming problems. He pointed out that the decisions about what content to put into cases, and then how to evaluate and assess learners' computational thinking, go hand in hand with each other. Blikstein talked about the animated representations students develop in his activity and how, when combined with the underlying pedagogy of the activity, analysis of the drawings allows certain kinds of assessments and ways of interpreting what the kids are saying and doing.

Goals of Assessment

In addition to knowing what one wants to assess, one must consider the purpose of the assessment, because the reason for any assessment plays a critical role in determining the data and process necessary to perform it. Kolodner identified three reasons for assessing computational thinking: (1) to judge the curriculum and related materials and pedagogy, (2) to judge the progress of individuals, e.g., for giving grades, and (3) to manage instructor training and support. Kolodner noted that the kinds of data relevant to each reason would not necessarily be identical.

Assessment versus Evaluation

Kolodner explained that assessment is not the same as evaluation, although the terms are often used interchangeably. According to her, assessment is about measuring what people have learned, how they feel about something, or their capabilities. Formative assessments deal with discovering what has been learned along the way to inform what comes next. Presenters Jim Slotta and Mike Clancy both noted the importance of capturing some of the reasoning learners are doing that otherwise would be invisible in a formative assessment in order to explore when and how one might change instruction along the way to improve learning. Summative assessment occurs at the end of a module or semester or project to determine how much knowledge was gained overall. Evaluation, on the other hand, speaks more to how well a curriculum or a software tool is working—its efficacy, its costs, its usability, and so on. Kolodner agreed with presenter Cathy Lachapelle of the Museum of Science, Boston, who also discussed evaluation, specifically with respect to the need for usability in a computational thinking project in order to incorporate computational thinking effectively into a curriculum and make it widely available.

In response to discussion from Lachapelle, Kolodner said that the computational thinking community should consider at some point
creating its own assessment framework. The National Assessment of Educational Progress currently looks at subjects like science, math, technology education, pre-engineering, and so on, but does not assess computational thinking.

Standards and Tactics for Assessment and Evaluation

Kolodner echoed the sentiments of several presenters (Briggs, Clancy, and Schwarz) that assessment and evaluation are more than just collecting data points. They are about doing comparisons and analyzing outcomes. Sometimes those comparisons are as simple as what a researcher hypothesized versus what actually resulted. Presenter Derek Briggs argued that there must be some standard or baseline against which researchers compare results. Briggs focused on learning progressions and constructs as one example of a standard or baseline for comparison. Kolodner called the process by which a researcher decides what standards and baselines to use, and embeds those standards in a computational thinking project, the "tactics" of assessment. In some cases, the researcher does not select his or her own baselines or learning progression but instead adopts them from an external source. Kolodner pointed to presenter Christina Schwarz's experience dealing with her local school district's biology learning progression guidelines for middle school students as an example of an external baseline.

Repetition and Reflection as an Assessment Tactic

One tactic Kolodner endorsed was repetition across disciplines combined with reflection. She argued that scientific reasoning and computational thinking should be done in a number of different subjects and repeated over and over in order to help learners understand both the similarities and the differences in the ways in which scientific reasoning and computational thinking are done as well as develop general skills in computational thinking. To cross disciplines effectively, Kolodner argued, there should be some sort of reflection on what it is that has been done as well as some anticipation of other circumstances in which skills and lessons learned would be useful.

Kolodner also felt that reflection on pedagogical content knowledge with respect to computational thinking is important for instructors of computational thinking. In response to Michelle Williams's presentation, Kolodner asked for more information about how the reflection questions posed to teachers after they had completed a teacher development computational thinking learning project were developed. In essence, if the purpose of having the teachers complete the same project that their
students would do later was to provide scaffolding in a systematic way, Kolodner wanted to understand the underlying system better.

Embedded Assessments and Tracking/Logging Data

Embedded assessments, especially those that capture online the thinking of learners, allow assessment of student understanding that a researcher may not have access to otherwise. Kolodner noted that Briggs talked about collecting performance data and Slotta mentioned the value of real-time reflections on threads of collaborative discussions among the students. They argued, and Kolodner agreed, that these embedded real-time assessments allow "getting in there and really dealing with the issues that the learners are having at the moment that they are having them. Maybe at the moment they are having them, maybe later, but the talking uncovers things that you might not see otherwise."

Kolodner believes that tracking of activities seems particularly important to analyzing computational thinking. Whether through Blikstein's log files, Schwarz's interviews used to help track thinking, or Clancy's notes on the details of collaborative discussions, such tracking enables particularly important and informative project assessment and evaluation.

Kolodner finds that it is hard to tell whom to go to concerning community building in the education community and the various disciplines. She stated that "people seek environments that align to their ways of thinking and working. We all do it, and this self-sorting process tends to create silos." Kolodner argued that such silos will not help computational thinking have a wide impact.

3.7 BRIAN BLAKE

Brian Blake is a professor of computer science at the University of Notre Dame and is associate dean for engineering. His research areas include software engineering and, more recently, methods to make advanced computer science techniques digestible for those who are not in the same specialty. The latter effort is intended to attract underrepresented minorities into computing.

In his comments to the workshop, Blake described the evolution of his thoughts on computational thinking through dialog and interaction with various scholars over the course of the two NRC computational thinking workshops. In the first workshop, he explained, the committee sought to characterize computational thinking by first attempting to look for the existence of computational thinking in other fields, in other ways of thinking. From there the committee could then classify and describe it as computational thinking in a way that would enable researchers and
educators to re-embed it into training students or retooling teachers or professionals.

Blake went on to explain that his experience over the past year, based on the first workshop and his own personal observations of his son's learning progression from kindergarten to first grade, had caused his thinking to evolve. Now, the notion of developmental milestones is very important to him. He believes that the understanding of computational thinking should be thought of in terms of decomposing computational thinking "elements" into developmental milestones. Blake noted that Peter Henderson's example during his presentation on the efforts underway at Shodor, featuring Thomas the Train solving a routing challenge, demonstrated that there seems to be an opportunity to start to understand computational thinking at the lowest levels, and then, as students move from K-12 into postsecondary education, to explore increasing complexity within the milestones.

Blake summarized several main points he had gathered from the second workshop's presentations. There may be an opportunity very early in a child's learning progression to identify significant computational thinking talent. This might be done by looking at specific instances where computational thinking might fold into a learning activity, and then assessing a student's competency with respect to these computational elements. To illustrate, Blake pointed back to Henderson's Thomas the Train example and suggested that a simple activity with embedded computational thinking challenges might be a means of identifying talent. Concerning the idea of training, Blake argued that by taking opportunities to identify and assess computational thinking talent in individual students, and to start to enumerate indicators of such talent, a researcher or an educator might be able to recognize when a student either is demonstrating a significant talent in computational thinking or is at least at the appropriate learning progression level for that age range.

Blake argued that the next step would be to use this process of embedding, assessing, and identifying at the macro level over a longer period of time to identify learning progression baselines. This technique utilizes assessment and evaluation to determine where in learning development a particular baseline is situated. From the perspective of learning progression at the macro level, the types of concepts to be enumerated so as to identify potentially talented computational thinkers at young ages are not limited solely to concepts related obviously to computer science thinking, math thinking, or even scientific thinking. Instead, these concepts are likely to span all of these types of thinking and analysis. As the emerging computation community moves forward, scholars should perhaps target these sorts of concepts to specify them more clearly and possibly re-embed them for identification of talent and for determination of learning progression.