8163 Views (as of 05/2023)
  1. James Diamond
  2. http://ltd.edc.org/people/jim-diamond
  3. Senior Research Associate
  4. Presenter's NSF Resource Centers: Education Development Center, GlassLab
  1. Heather Kim
  2. Research Associate
  3. Presenter's NSF Resource Centers: Education Development Center

Playing with the data: Developing digital supports for middle school science ...

NSF Awards: 1503255

2017 (see original presentation & discussion)

Grades 6-8

In this video, we present an overview of the Playing with Data project, which is designing a set of interactive "educative materials" to help middle grade science teachers use data from gameplay to make decisions about formative assessment and differentiation. The video highlights the "teacher data dashboard" and "Report Helper," and includes commentary from a middle school science teacher who piloted the materials in December 2016.

This video has had approximately 817 visits by 523 visitors from 150 unique locations. It has been played 377 times as of 05/2023.
Discussion from the 2017 STEM for All Video Showcase (10 posts)
  • Vivian Guilfoy

    Senior Advisor
    May 14, 2017 | 08:59 p.m.

    Great to see how the use of data contributes to differentiation of student learning.  Can you give examples of some of the ways pilot teachers have translated data into differentiated instruction?  Also, are there opportunities for teachers to talk with other pilot teachers about their experiences and contribute to or enrich the Report Helper that you mention?    

  • James Diamond

    Lead Presenter
    Senior Research Associate
    May 14, 2017 | 09:56 p.m.

    Hi Vivian. Thanks for the great questions. We've seen a few examples of pilot teachers differentiating based on gameplay data analysis. One example comes from the use of gameplay data about "supporting claims with evidence that is related to and in support of the claim." Students seem to struggle with this argumentation skill because the differences in the language of the evidence can be subtle (especially as players also need to relate evidence to one of four different types of "argument schemes"). Teachers have tended to use one of two sets of materials/questions depending on their diagnosis of why students are struggling: one set of materials that helps students focus on specific words to determine whether the evidence is irrelevant or contradictory; and one set of materials that gives students prompts they can use to think back to the point of their claim and then determine whether their evidence supports it. Teachers have then used these materials with small groups or with the whole class. It's too early to tell whether these materials have an impact on student outcomes.

    Interestingly, most of the pilot teachers (middle school science teachers) have tended not to differentiate using small groups—rather, they differentiate by classes (e.g., period 1 vs. period 2). That is, they make decisions about changes to instruction at the class level, rather than at the individual student level.

    As for your second question about pilot teachers talking with each other: no, but it's an absolutely fantastic idea and I'm going to see if we can try it in the next few weeks (we're just finishing up a pilot now and then heading into an impact study in the fall). Individually, pilot teachers have many opportunities to influence the design of the Report Helper, as we do three think-aloud sessions with them as they use the data dashboard and Report Helper, as well as follow-up interviews.

    Thanks again for the great questions, and the excellent suggestion about having the pilot teachers talk with each other.

     
  • Babette Moeller

    Researcher
    May 19, 2017 | 02:56 p.m.

    Your observations about how middle school science teachers differentiate instruction/gameplay are VERY interesting! Do you have any ideas about how to shift their decision-making from the classroom level to that of individual students?

    In our work with K-5 teachers as part of Math for All, we are asking teachers to pick a focal student (a student whom they have questions about regarding math) and to use a neurodevelopmental framework to observe this student throughout the school year, so they get a better sense of what this student's strengths and needs are and can use this information to provide alternative means to help this student achieve high-quality learning outcomes. Teachers find that focusing their planning in this way often helps many students in their classroom to succeed. We are also having general and special education teachers who serve the same student collaborate on planning instruction, which also helps to focus them on individual students. You can find more information about our work here: http://stemforall2017.videohall.com/presentations/1011

  • James Diamond

    Lead Presenter
    Senior Research Associate
    May 14, 2017 | 10:08 p.m.

    Thanks for taking the time to watch our video!

    Playing with Data is concerned with helping teachers use data from video gameplay (which is often a "black box") for formative assessment and differentiation. By creating "educative materials" (in the form of a "Report Helper"), we hope to provide teachers with insight into how gameplay operationalizes targeted learning skills (in the case of this study, a set of skills for argumentation). We hypothesize that an improved understanding of how gameplay operationalizes those skills should lead to better-informed formative assessment and differentiated instruction. In our impact study, we will look to see whether the use of these educative materials is associated with greater gains in student argumentation skills (as compared to peers whose teachers do not have access to the Report Helper).

    Some questions that we'd be especially interested in exploring with you are:

    1. Have you had any experience using games (or watching teachers use games) for formative assessment? If so, how did you (or the teachers) use the gameplay to make inferences about student learning?
    2. It may be challenging for middle school teachers to differentiate instruction because of the large numbers of students (as compared to elementary school teachers, typically) they have across classes. What suggestions might you have for helping them do so?
    3. Are you familiar with any instruments that have been used to assess teachers' data-driven decision making or formative assessment practices? If so, can you share the titles?
    4. If you have any experience with using or designing "educative curriculum," what have you found to be most useful for teachers?

    Thanks for your interest and we look forward to engaging with you this week!!

    Jim

  • Wendy Smith

    Facilitator
    Associate Director
    May 15, 2017 | 06:09 p.m.

    I find myself curious about the game itself--if the game is supposed to help students develop argumentation skills, is this mostly students choosing among various arguments and then seeing the result? Do students have to generate original arguments? What kind of other data does your project collect on students' science achievement (e.g., does successful completion of game levels correlate with student achievement overall)? Does the data game have multiple science content "levels", or would this be technology teachers would learn and use for a single unit, and then move to other games with other dashboards for other content? To what extent do teachers seem to find "space" for games like this in their existing curricula? What kind of training do you offer teachers to orient them both to the game (so they can troubleshoot with their classes) and the back side of the game where they can see student results?

  • James Diamond

    Lead Presenter
    Senior Research Associate
    May 15, 2017 | 11:05 p.m.

    Hi Wendy,

    Lots of questions here! Thanks. I won't do most of these justice, as you're asking some very good questions.

    For starters, you can learn more about the game here: https://www.glasslabgames.org/games/AA-1

    The game wasn't developed to help students "master" argumentation. Rather, it's designed to introduce students to the "parts" of an argument and to have them evaluate the strength of arguments by asking critical questions and determining whether evidence is well-backed. Students do not generate original arguments, but they can choose from multiple "pre-packaged" arguments. (This certainly requires more discussion than is possible here.)

    Our study is not actually about gameplay and student achievement. SRI conducted studies on the relationship between gameplay and argumentation skills (as measured by ETS's CBAL, though I can't remember the specifics now). What we are looking at is the relationship between teacher content knowledge, pedagogical content knowledge, and how well they use the gameplay data for formative assessment. We are ALSO looking at student impact measures to see if there are differences between teacher groupings: those teachers who receive the "educative materials," which are designed to help teachers make use of the gameplay data for assessment and tying it to other classroom-based activities, and those who do not have those materials.

    The game does not include "science levels." As originally designed, the game was intended to be a stand-alone exercise. In fact, it wasn't developed specifically for science classrooms. But we have developed an associated "mini-unit" on energy and argumentation that enables teachers to tie gameplay to argumentation in the classroom, using specific content objectives.

    Your question about other games is an excellent one. Should we find meaningful differences in this work, we hope to scale it out to other games, where the questions would relate to whether the presence of a "data dashboard with educative materials" helps teachers integrate the use of gameplay data into their formative assessment practices.

    As for "space in the curriculum," we have designed the intervention such that the game is tied (though not completely prescribed) to a three-week, supplemental mini-unit that helps teachers integrate argumentation practice into their regular materials for a middle school energy unit.

    Lastly, the "educative materials" that I noted above are the "training." That is, through iterative design, we seek to create materials that should help teachers become better at using the specific data from this game, as well as help them understand how games CAN operationalize targeted learning objectives (which isn't to say that they always do; that's for a teacher to evaluate, we hope).

    Thanks again for these good questions, Wendy.

    Jim

  • Miriam Sherin

    Facilitator
    Professor, Associate Dean of Teacher Education
    May 16, 2017 | 04:26 p.m.

    I enjoyed learning about your project! I think your approach of putting information in the hands of teachers and seeing what the teachers do with that information is incredibly important. Of course we want to design materials that teachers will find useful, but that does not mean teachers will use the materials exactly as a designer intended. Making that an explicit focus of your research is significant, I think. Coming from that perspective, I'm interested to learn more about the revisions you made that you discuss in the video - adding information about student progress on specific skills as well as the Report Helper. How did that process unfold? Are you continuing to use similar methods to get feedback from teachers? I often think of "educative curriculum" as curriculum that is designed for students, but that provides, for example, information for teachers on likely student areas of struggle or explicit information on why and how a unit is organized. You seem to be taking a slightly different approach - almost like the development of "educative resources" for teachers, resources that teachers can use to explore their students' learning.

     
  • Noah Goodman

    Researcher
    May 17, 2017 | 06:11 p.m.

    Hello Miriam. Thank you for your comment and your questions. I am one of the researchers working specifically on the dashboard revisions and the educative materials. In response to your question about the process of making the revisions, we have used a design-based research approach for this project. We've conducted several cycles of data collection and iterations on the dashboard and accompanying materials, during which we have interviewed teachers, observed them in the classroom, and conducted think-alouds where we asked them to use the dashboard and Report Helper to plan for upcoming instruction. This process has been very useful, though it can be challenging to keep up with the timeline required by a research AND development project while ensuring that our changes are driven by the data we've collected.

    The game we're using for this study already had a data dashboard prior to our project, so we started by interviewing teachers, asking them to explain how they understood each aspect of the dashboard. From these interviews, we generated three major findings: 1) although all the teachers we interviewed had already used the game with students, many only had vague notions of what gameplay entailed; 2) teachers struggled to make connections between data on the dashboard (which was organized around gameplay events) and student performance on the targeted learning objectives; and 3) teachers wanted to know how students were progressing more than whether they had struggled along the way.

    So we wanted to redesign the dashboard to make it more evident how gameplay relates to argumentation skills. The game was built to operationalize four aspects of the Toulmin model of argumentation (using evidence of different argument schemes, supporting claims with evidence, querying others' arguments with critical questions, and using backing to strengthen your argument), and so we decided to use these skills to organize student performance. This led us to articulate a learning sequence featuring these four skills, to create a new set of reporting rules that approximate student performance on these skills, and to disaggregate performance by game mission so teachers could better target students who were consistently struggling (one possible shape for such rules is sketched just below this post).

    As for the educative curriculum, we have several goals around helping teachers to use gameplay data to inform instruction. Data from gameplay have specific characteristics; one of them is that, because the data are built on gameplay moves, teachers often don't understand what student actions the data represent. This makes it difficult for teachers to assess the extent to which they trust the data and to develop plans to support student success.

    So in the educative curriculum, one goal we have is to help teachers understand what it is that students are doing in the game, how the game operationalizes the four argumentation skills, and ways that students are likely to struggle. We've tried to organize this material in a way that makes the information actionable and helps teachers use the gameplay as a launching point for a more in-depth exploration of argumentation within the context of their content and curriculum.
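
    To make the "reporting rules" idea concrete, below is a minimal, hypothetical sketch of how raw gameplay events might be rolled up into per-skill, per-mission success rates and then used to flag students who are consistently struggling. The event fields, skill names, thresholds, and function names are illustrative assumptions for this sketch, not the project's actual rules or data model.

        # Minimal, hypothetical sketch (not the project's actual reporting rules):
        # roll raw gameplay events up into per-skill, per-mission success rates,
        # then flag students who are consistently struggling on a given skill.
        from collections import defaultdict
        from dataclasses import dataclass

        # The four argumentation skills the redesigned dashboard is organized around
        # (names here are placeholders).
        SKILLS = [
            "evidence_for_argument_schemes",
            "supporting_claims_with_evidence",
            "asking_critical_questions",
            "using_backing",
        ]

        @dataclass
        class GameplayEvent:
            student_id: str
            mission: int      # game mission in which the attempt occurred
            skill: str        # argumentation skill the attempt exercises
            success: bool     # whether the attempt met the game's criteria

        def summarize(events):
            """Return success rates keyed by (student_id, skill, mission)."""
            attempts = defaultdict(int)
            successes = defaultdict(int)
            for e in events:
                if e.skill not in SKILLS:
                    continue  # ignore events not tied to a tracked skill
                key = (e.student_id, e.skill, e.mission)
                attempts[key] += 1
                successes[key] += int(e.success)
            return {k: successes[k] / attempts[k] for k in attempts}

        def consistently_struggling(summary, skill, threshold=0.5, min_missions=2):
            """Flag students whose success rate on `skill` falls below `threshold`
            in at least `min_missions` different missions (both cutoffs are assumptions)."""
            low_missions = defaultdict(set)
            for (student, s, mission), rate in summary.items():
                if s == skill and rate < threshold:
                    low_missions[student].add(mission)
            return sorted(s for s, m in low_missions.items() if len(m) >= min_missions)

    Under these assumptions, a report view could call, for example, consistently_struggling(summarize(events), "supporting_claims_with_evidence") to surface a small group of students a teacher might target with the kind of differentiated materials described earlier in this discussion.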
     
  • James Diamond

    Lead Presenter
    Senior Research Associate
    May 18, 2017 | 10:00 a.m.

    Miriam—

    Thanks for these fantastic comments. With respect to your comments about "educative resources"—that's exactly right! (Only we've been calling them "educative materials.") As Noah said, the idea is to present those materials in a way that teachers will find immediately actionable. As we are trying to help teachers become better at "using gameplay for formative assessment," we present the material in a three-step process that broadly mirrors one framing of formative assessment: 1. What are the learning targets? 2. Where are my students now? 3. How do we get from here to there?

    So, these materials are less linked to specific content (as is usually the case with educative curriculum, to your point) than they are to specific teaching practices. We'll see whether and in what ways they influence practice in the impact study...

    Thanks again for these great comments.

    Jim

  • Heather Kim

    Co-Presenter
    Research Associate
    May 22, 2017 | 05:01 p.m.

    Thanks again for viewing our video and learning about Playing with Data! We are very excited about our upcoming national impact study and learning how teachers use gameplay data and educative materials to make decisions about instruction. Although this showcase is coming to an end, we hope to be able to continue the conversation with you! We invite you to share questions or comments, or to learn more about our upcoming research, by visiting us at http://playingwithdata.edc.org

  • Further posting is closed as the event has ended.