NSF Awards: 1502882
2017 (see original presentation & discussion)
Grades K-6, Grades 6-8
Computational Thinking, the set of ideas and practices considered vital for computer science skills, has been attracting increased attention over the past several years in K-12 education. Zoombinis, an award-winning adventure game, engages players in guiding little blue Zoombinis on a "fun but perilous" journey featuring 12 puzzles, each with 4 levels of difficulty, designed around logic and computational thinking. Our research involves educational data mining techniques to assess students' learning in conjunction with pre-post computational thinking assessments (external to the game), teacher interviews, classroom observations, and case studies of classroom use. The goal is to understand both students' learning of computational thinking and how to bridge formal and informal learning via classroom implementation of the Zoombinis game.
Jodi Asbell-Clarke
Director, EdGE at TERC
Cyrus Shaoul
This is great stuff! Here are my questions: how can a game like Zoombinis be used as a formative assessment in a classroom? Are you looking at the meta-cognitive aspects of this game? How do you think the affect of the learner changes learning in Zoombinis?
Thanks so much for this wonderful video.
Jodi Asbell-Clarke
Director, EdGE at TERC
Hey Cyrus. We are looking at the behaviors that players exhibit during gameplay as a way to formatively assess their learning. This allows us to measure implicit learning rather than relying only on the explicit knowledge that learners can express on a test. In our previous studies we have looked at how the challenge in GBL impacts learning, but we have not connected that research to Zoombinis yet. thanks!
Cyrus Shaoul
Great! Thanks for the link. I will check them out.
Daniel Heck
Thank you for sharing your work. My kids and I have been playing the Zoombinis games for many years -- we love them!
I've always thought of the games as tests of logic and deductive reasoning, not computational thinking. Could you say more about the relationship and distinctions between CT and logic/deduction?
Jodi Asbell-Clarke
Director, EdGE at TERC
Great question. I think they are related but looking with a slightly different lens. CT is about the process of breaking down problems and abstracting general solutions that can then be designed into algorithms for a computer program (or any procedural representation). Boolean logic (AND, OR, NOT) is part of those solutions, and so is deductive reasoning. If deductive reasoning is the process of putting evidence together into a solution, CT is the process of breaking down the problems to solve and then generalizing that process. Does that make sense?
Neil Plotnick
Teacher
Always nice to see how games can be used to enhance learning. I would like to know more about the teachers using explicit strategies in their classroom to track computational thinking with students. During the video, you briefly showed a classroom where students were navigating a grid pattern on the floor.
Jodi Asbell-Clarke
Director, EdGE at TERC
thanks Neil. We have teachers use a variety of bridge activities, including video clips from the game with discussion points, acting out a puzzle, and using data tables and eventually pseudocode to represent the game's algorithms and their own algorithms for solving the puzzles. We also have created a few Scratch activities using the Zoombinis characters. In Fall 2017 we begin our implementation study to examine this in more detail.
Shuchi Grover
So cool, Jodi! I'm so keen on educators having ways of providing kids non-programming avenues for building these skills (as you've seen in our VELA STEM+C project).
Very interested in learning about what you find in your studies about which aspects of CT kids are learning. I'm particularly interested in how you define abstraction in the context of Zoombinis and how you measure it.
All the best to you all in your project!
Nicole Reitz-Larsen
Educator
I love the explanation you gave earlier of computational thinking. You've got some great discussion going on around the activities, students and thinking. I'm curious to know how you are working with teachers to help build their understanding of computational thinking so that they can facilitate those discussions with students as they are working on and off the computer to build their CT skills.
Jodi Asbell-Clarke
Director, EdGE at TERC
thanks Nicole. You tapped right into one of the trickiest parts of our work these days. I think teachers are doing cool CT stuff in class, but they call it different things. There is a lot of discussion of variables in Math and Science, and we have similar problem-solving strategies in many areas...but it is figuring out how to help teachers see those connections that takes a lot of thought and time. We are developing Bridge materials to help teachers make these connections between the CT concepts developed in the game and what they are teaching in Math, Science, TechEd, and even ELA and other subjects.
Nicole Reitz-Larsen
Educator
As you work on those resources, I'd be interested to hear how you connect CT concepts, the game and Math, Science, TechEd, ELA, etc.
Jodi Asbell-Clarke
Director, EdGE at TERC
thanks Nicole - we will publish our findings and our curriculum after our 2017-2018 study.
Jeremy Roschelle
Hi Jodi,
Thanks for the clear discussion of computational thinking and great example of how to use the "classic" Zoombinis to measure it. I was wishing for a closing "punchline" in the video -- suppose your program of research plays out (say in a couple of years) and Zoombinis is a valid assessment of CT. What's the biggest contribution of this work that you can imagine? What's the headline news 2-3 years from now?
best,
jeremy
Jodi Asbell-Clarke
Director, EdGE at TERC
thanks Jeremy - I think that line was left on the editing floor :) Implicit game-based learning assessments, like what we are building with Zoombinis, can include all types of learners, even those who may face barriers to traditional assessments. By measuring what people DO rather than just what they SAY or WRITE, we may be able to unleash the potential of all learners - and nowhere is that more important than in Computational Thinking. Our new line of work is looking at the connections between CT and Autism, ADD, and Dyslexia so that we can leverage the strengths of all learners.
Neil Plotnick
Teacher
I would be VERY interested in what your work on children with learning differences is showing. I am a licensed Special Education teacher (as well as a CS teacher) and have found that computers are a great tool for several reasons. They have infinite patience, some children will have greater focus with a screen and keyboard compared to a book or teacher, and some students naturally gravitate to computers over other learning scenarios.
Jodi Asbell-Clarke
Director, EdGE at TERC
thanks Neil - we just applied for new grants for this work....so stay tuned!
Katharine Sawrey
Jodi, I'd like to build on this thread. I love the idea of assessments being grounded in students' "doing" rather than their "reporting." I am curious about how you qualify computational thinking? The video mentions that your group used the results of 70 players to operationalize CT in terms of click actions in Zoombinis. What do you feel are the strengths and shortcomings of your analysis?
Mary Dussault
Chris Dede
We need performance-based ways to measure computational thinking, so this is an important topic
Mary Dussault
Hi Jodi,
It's great to dip back into your work these many years on! I'm also really interested in Katharine's and Chris's comments -- can you give an example of how your analysis of the "big data" of user actions has helped you develop a predictive(?) model of CT?
Jodi Asbell-Clarke
Director, EdGE at TERC
Hi Mary (great to hear from you!!), and Chris, and Katharine too,
We use the video analysis on our sample of 70ish players just to provide ground truth (human analysis) of what strategies players exhibit while they are playing. We are looking at just 5 of the puzzles right now, several levels each, for all these players.
While they solve the Zoombinis puzzles, we watch for indicators of systematic testing (such as holding one variable constant or trying one of each thing to lay out the domain space) and within those events of systematic testing we label evidence of problem decomposition, pattern recognition, abstraction, and algorithm design.
So if they are holding one thing constant they are in problem decomposition; when they see the particular value of the Zoombini (e.g. red noses) that works for the puzzle that might be pattern recognition; when they realize that the general rule is that the puzzle is sorting on noses that is abstraction; and when they repeat the same type of strategy in multiple applications that is consistent with algorithm design.
Once we can get good reliability among ourselves as human coders of these behaviors, we build data mining models to recognize those patterns within huge sets of play data (far more than we could ever human code). The more data we use, the more we can refine and train our models to be better predictors of those behaviors. Meanwhile we will study students playing Zoombinis under a variety of conditions to see if/how the behaviors they exhibit in their gameplay predict how they improve on pre/post assessments of the same CT facets. Make sense?
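The "holding one variable constant" indicator described above can be sketched as a simple heuristic over logged test actions. This is only a minimal illustration under assumed names (the attribute list, the dictionary-per-test log format, and the `min_run` threshold are all assumptions), not the project's actual detectors or data-mining models:

```python
# Illustrative sketch: flag a run of player "tests" as systematic when at
# least one Zoombini attribute is held constant while at least one varies.
# Field names and thresholds are assumptions, not the project's real schema.

ATTRIBUTES = ["nose", "eyes", "hair", "feet"]

def held_constant(tests):
    """Return the attributes whose value never changes across a run of tests."""
    constant = []
    for attr in ATTRIBUTES:
        values = {t[attr] for t in tests}
        if len(values) == 1:
            constant.append(attr)
    return constant

def looks_systematic(tests, min_run=3):
    """Heuristic evidence of systematic testing: some (but not all)
    attributes held constant across a long-enough run of tests.
    Holding everything constant is just repeating the same test,
    so it is excluded."""
    if len(tests) < min_run:
        return False
    constant = held_constant(tests)
    return 0 < len(constant) < len(ATTRIBUTES)

# Example run: the player holds nose, eyes, and hair fixed and varies feet.
run = [
    {"nose": "red", "eyes": "one", "hair": "spiky", "feet": "wheels"},
    {"nose": "red", "eyes": "one", "hair": "spiky", "feet": "springs"},
    {"nose": "red", "eyes": "one", "hair": "spiky", "feet": "shoes"},
]
print(looks_systematic(run))  # True
```

A real detector would of course be trained against the human-coded video labels rather than hand-set thresholds; this only shows the shape of the feature being coded.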
Nicole Reitz-Larsen
Educator
I like the description you give about students exploring concepts as they are learning. Are they journaling what concepts they are using during the game to identify how and when they were used and changes they may make after that?
Jodi Asbell-Clarke
Director, EdGE at TERC
Hi Nicole - we are not doing journaling with students (but we do have teachers in the study complete logs about their bridging activities). We are looking at implicit learning that is demonstrated by behaviors rather than by formal expressions (writing)...we will tie these findings to other, more explicit assessments as the research goes on. thanks!
Katharine Sawrey
Yes, thanks.
Lien Diaz
Sr. Director
What a great project! It's interesting to me to hear about the different ways that computational thinking is being measured in this project. Can you share more about how computational thinking is defined in game-based learning? In other words, what are some of the "skills" that are elicited/observed/being researched that are considered computational thinking?
Kathy Perkins
Thanks for sharing, Jodi! Are you finding signature patterns in the back end data that help you identify those engaged in productive computational thinking? and those engaging in ways that are not productive?
Jodi Asbell-Clarke
Director, EdGE at TERC
Thanks Lien and Kathy. Zoombinis is about characters that have four different attributes (nose, eyes, hair, and feet), each with five different values. We are able to watch as players choose strategies to discern which values of which attributes are salient to the puzzles in the game. In some cases it is not the Zoombini attributes that the players are testing, as in the case of Pizza Pass. Each pizza troll wants a specific set of toppings for their pizza and ice cream sundaes. Players often start by making their own favorite pizza or by random trial and error, but we can watch as they learn to be systematic in their testing, choosing one topping at a time and keeping track of the results. We can also see as they abstract their individual test results into generalized rules. The game also allows players to create their own packs of Zoombinis, so they can use overall algorithms to help them solve puzzles, like always keeping one attribute of all Zoombinis the same - that reduces the ambiguity of the puzzles for them overall. I hope that makes sense. It is hard to get more specific than that outside the context of each puzzle, but play the game and let me know if you want to know how we are analyzing any of the puzzles of interest. We are looking at Allergic Cliffs, Pizza Pass, Mudball Wall, Fleens, and Bubble Wonder. thanks, Jodi
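The one-topping-at-a-time strategy described for Pizza Pass can be sketched as follows. The feedback model here (a "yuck" / "more" / "accept" response) is an assumption made for illustration, and the topping names are made up; this is not the game's actual code:

```python
# Illustrative sketch of systematic one-topping-at-a-time testing in a
# Pizza Pass-like puzzle. The troll's feedback model is an assumption:
#   "yuck"   - the pizza contains a topping the troll dislikes
#   "more"   - every topping is liked, but some wanted ones are missing
#   "accept" - the pizza is exactly what the troll wants

TOPPINGS = ["cheese", "pepperoni", "mushrooms", "peppers", "pineapple"]

def troll_feedback(pizza, desired):
    """Simulated troll response to a served pizza (both are sets of toppings)."""
    if pizza - desired:
        return "yuck"
    if pizza == desired:
        return "accept"
    return "more"

def systematic_solve(desired):
    """Test each topping alone, keep every topping the troll doesn't reject,
    then serve the combination -- the systematic strategy players converge on."""
    wanted = set()
    for topping in TOPPINGS:
        if troll_feedback({topping}, desired) != "yuck":
            wanted.add(topping)
    return wanted

desired = {"cheese", "mushrooms"}
print(systematic_solve(desired))  # exactly the troll's desired toppings
```

Under this model the systematic strategy needs only one test per topping, whereas favorite-pizza or random guessing can take many more - which is roughly the efficiency gain players exhibit as they learn to be systematic.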