NSF Awards: 1503481
2018 (see original presentation & discussion)
Grades 6-8
This project embeds formative assessment into two games that engage students in applying scientific practices and the crosscutting concept of energy and matter flows to their understanding of ecosystem components, interactions, and population dynamics. The games are part of a suite of simulation-based formative assessments for middle school science, and include real-time feedback and coaching from virtual scientists, as well as opportunities to collaborate with virtual peers. Dialogs between the player, scientists, and peers generate a game narrative while also assessing the player’s scientific explanations and providing helpful feedback.
In order to gather a wider range of data on players’ understanding, the game employs “stealth assessment” to track the natural actions that players make during gameplay. These data, along with players’ responses to chats from the virtual scientist and peer, are used in an evidence model for interpreting students' progress in mastering NGSS practices such as Developing and Using Models and Engaging in Argument from Evidence. Along with the other SimScientists modules, assessment data are provided to teachers to inform classroom instruction.
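For readers curious what this data flow can look like in practice, the sketch below is a rough, hypothetical illustration only: the project's actual telemetry, action names, weights, and scoring approach are not described here, so every identifier in the example is an assumption. It simply shows how raw log events might be accumulated as evidence tied to specific NGSS practices before a more sophisticated statistical evidence model is applied.

```python
from collections import defaultdict

# Hypothetical mapping from logged gameplay actions to NGSS practices.
# Action names and weights are illustrative assumptions, not the project's actual telemetry.
EVIDENCE_RULES = {
    "adjust_population_slider":    ("Developing and Using Models", 1.0),
    "run_simulation":              ("Developing and Using Models", 0.5),
    "cite_data_in_chat":           ("Engaging in Argument from Evidence", 2.0),
    "revise_claim_after_feedback": ("Engaging in Argument from Evidence", 1.5),
}

def score_practices(action_log):
    """Accumulate simple evidence scores per NGSS practice from a list of logged actions."""
    scores = defaultdict(float)
    for action_type in action_log:
        if action_type in EVIDENCE_RULES:
            practice, weight = EVIDENCE_RULES[action_type]
            scores[practice] += weight
    return dict(scores)

if __name__ == "__main__":
    log = ["run_simulation", "adjust_population_slider",
           "cite_data_in_chat", "revise_claim_after_feedback"]
    print(score_practices(log))
```

In practice, an evidence model of the kind described in the abstract would typically be probabilistic rather than a flat weighted sum; the sketch is only meant to make concrete how gameplay actions become evidence about specific practices.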
In the current phase of development, the project is using evidence from playtesting to refine the interface, dialogs, and evidence models in preparation for pilot testing in middle school science classrooms. Data from think-aloud studies, classroom observations, and logs of students' use of the games will be used to further improve the games and our understanding of how games can be used as formative assessments for science learning.
Matt Silberglitt
Senior Research Associate
Thank you for taking time to learn about SimScientists Games. We are investigating how digital games can be part of a middle school curriculum enhancement that includes simulation-based and game-based activities teachers can use to review concepts, integrate practices, and gather data from student actions for use in a formative assessment process. We are also exploring ways to build collaboration with a virtual peer into the game. We're developing two games; we are conducting extensive playtesting with individual students and feasibility testing in classrooms, and we plan to conduct two pilot tests at increasing scales.
Scot Osterweil
Research Scientist
You do a very nice job of laying out the nature of your research, and I also appreciate your sharing of the challenges you are currently wrestling with in the development of this project. There is a lot of story to tell in only three minutes, but it would be interesting to learn a little more about what impact you're seeing on student engagement. The video does an excellent job of describing your research design, and leaves me wanting to know more about how it's looking on the ground where you've been implementing it.
Matt Silberglitt
Senior Research Associate
In interviews, individual students have been generally positive about playing the game. Students say the game is fun and that they like many of the game mechanics, and they also indicate that they "learn more" [than a comparable activity with a more structured simulation]. They also tell us when there are aspects of the game that are confusing or frustrating. This feedback is helpful for revising the game.
Charlie Mahoney
To piggyback on Matt's comment, many of the students have been very comfortable talking about the game and offering opinions, either to the students next to them while they play or to our research team. Many of them likely bring a lot of prior experience with video games, and their feedback often relates to what they expect to see in a game. During the development process, we've tried to keep this prior experience, and students' notions of what a game is or should include, in mind in order to foster student engagement.
Jessica Hammer
Assistant Professor
I especially appreciated your point about how student goals may not always align with the designers' goals. I'd like to hear more about how you are creating the evidence model for your game, and in particular how you are taking student goal-setting into account. Do you think you will be able to tell the difference between a student who has set a playfully different goal (such as unbalancing the ecology as much as possible) and a student who is groping and confused? Or will you try to reduce or avoid students setting their own goals? If so, how do you think you might accomplish that?
Matt Silberglitt
Senior Research Associate
The games are currently providing more information about students' actions than we're interpreting from our evidence models. We will be iteratively revising the evidence model as we gather and review data from gameplay, along with other sources of evidence, such as simulation-based and static forms of assessment. We're looking forward to these challenges, but we're not there yet!
Robert Zisk
Graduate Student
Building off of Jessica's comment, in games like this we don't always see alignment between our goals and the goals that the students set. This becomes even more apparent when students are working on something engaging like this game. In your work, have you found evidence that students understand or sense that they are using science practices and developing those skills?
Bess Caplan
Thank you for sharing your project. Can you explain how the simulations are embedded in the middle school curriculum? How are they introduced, how long do students engage with the simulations, and what happens in the lesson sequence after the simulations are over?
Naomi Thompson
Hello! Thanks for sharing this great video. These activities seem quite engaging and informative. What larger issues in middle school science do these games address? How do you see them fitting into pre-existing classroom structures?