NSF Awards: 1417456
2017
Undergraduate
Many students with LD, ADHD, and ASD tend to struggle in STEM classrooms. Often their struggles are rooted in Executive Function (EF) and language processing difficulties. This project is exploring advances in data mining techniques that allow for nonintrusive measures of implicit learning, in order to develop a prototype adaptive version of a science learning game that can help diverse learners improve their understanding of core science concepts.
Our collaborative research team, including researchers at TERC and MIT, is collecting click-stream and eye-movement data for students playing an existing particle simulator game that has been shown to be predictive of learning outcomes. Previous work done by the EdGE team at TERC shows that students who show sound particle differentiation in their gameplay patterns tend to perform better on standard Newtonian physics items. Students who do not exhibit this implicit knowledge in their gameplay patterns correspondingly do not exhibit similar performance gains. By analyzing the game log and the eye movement patterns, synchronized down to the millisecond level, we are able to look for differences in attention allocation that are indicative of different student learning trajectories. This in turn will be used to create an adaptive prototype of the game able to strategically guide student attention in order to maximize learning.
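As a rough illustration (not the project's actual pipeline), the millisecond-level synchronization step might look something like the sketch below: each game-log event is paired with the eye-tracking sample nearest to it in time. The field names (`t_ms`, `event`, `x`, `y`) are invented for the example.

```python
# Hypothetical sketch: aligning game-log events with eye-tracking samples
# by timestamp. Record formats here are assumptions, not the real log schema.
import bisect

def align(events, gaze_samples):
    """Pair each game event with the gaze sample closest in time (ms)."""
    times = [s["t_ms"] for s in gaze_samples]  # assumed sorted ascending
    aligned = []
    for ev in events:
        i = bisect.bisect_left(times, ev["t_ms"])
        # consider the samples just before and just after the event
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        j = min(candidates, key=lambda k: abs(times[k] - ev["t_ms"]))
        aligned.append((ev, gaze_samples[j]))
    return aligned

events = [{"t_ms": 1003, "event": "click"}]
gaze = [{"t_ms": 1000, "x": 120, "y": 80}, {"t_ms": 1017, "x": 130, "y": 82}]
pairs = align(events, gaze)
# the click at t=1003 pairs with the nearer gaze sample at t=1000
```

In practice a tolerance window would also be needed so events with no nearby gaze sample are flagged rather than force-matched.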
Brian Drayton
Very interesting, and what a challenge to take on!
I'd love to understand a bit more about how you are recognizing learning when you see it -- I am assuming that the data streams should (perhaps eventually) help you recognize learning that is starting to emerge -- maybe along the lines of microgenesis, in Vygotsky-speak.
Cyrus Shaoul
Senior Academic Researcher
Thanks for stopping by! As you surmised, we are very interested in the development of implicit processing, which occurs during the course of a learning session and leads to rapid change in the processing of information, which is very similar to the idea of microgenesis.
Some of the analyses that we intend to perform on our dataset include looking at the amount of time that the student is fixating on the particles and seeing if there is a relationship between particle density and fixation time. We will also be looking at the development of strategies: some students may start looking for particles that are on a collision course, which is a great strategy and also demonstrates implicit understanding of Newton's laws. We have many other ideas that we are working on, and we will soon see how these gaze analyses align with the existing work on mouse-click analysis in Impulse.
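To make the density/fixation analysis concrete, here is a minimal sketch of one way it could be run; the numbers are invented and the per-level aggregation is an assumption, not our actual analysis plan.

```python
# Hypothetical sketch: is mean fixation duration related to particle density?
# Data values below are made up purely for illustration.
def pearson_r(xs, ys):
    """Plain-Python Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# one (particle density, mean fixation duration in ms) pair per game level
densities = [5, 10, 15, 20, 25]
fixation_ms = [210, 240, 260, 300, 320]
r = pearson_r(densities, fixation_ms)  # strongly positive for this toy data
```

A real analysis would of course use per-student data and an appropriate statistical model rather than a single correlation over level means.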
Thanks for your question, and please feel free to ask a follow-up question.
Michael Stone
Director of Innovative Learning
What an amazing challenge to tackle! Once the baseline student learning trajectories are identified, what are some of the anticipated benefits of the adaptive prototype of the game?
Additionally, does your team expect to be able to eventually bridge the eye-tracking technology with non-digital interactions (i.e., a more traditional classroom setting)? Please pardon my naivety if this is a ridiculous projection.
Cyrus Shaoul
Senior Academic Researcher
Thanks for your questions, which are excellent ones.
First, let me describe some of the anticipated benefits of a future adaptive version:
1) In the current version of the software, there is no "hint" feature to help a student who is stuck. In the adaptive version, the software would analyze the eye movements or other in-game data and detect the ineffective deployment of attentional resources. It would then start providing hints (perhaps with a glowing ring around a particle) that would direct the student's attention to particles that are on a collision course. This would reduce student frustration and enable more efficient learning.
2) To help students assess themselves during play, an adaptive version of this game could give students "brain points" each time their behavior and eye movements show that they are attending to the critical items in the game. This would mean that even if a student was not jumping to the next level, they would see that their implicit learning was progressing.
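The hint logic in point 1 could be sketched roughly as follows, assuming simple straight-line particle motion; every name and threshold here is illustrative, not part of the actual game.

```python
# A minimal sketch of the adaptive-hint idea: if the student has dwelt too
# long on particles that will never collide, pick a colliding pair to
# highlight with the "glowing ring". All thresholds are made up.
def on_collision_course(p, q, radius=1.0, horizon=100):
    """Step both particles along their velocities; True if they ever overlap."""
    for t in range(horizon):
        dx = (p["x"] + p["vx"] * t) - (q["x"] + q["vx"] * t)
        dy = (p["y"] + p["vy"] * t) - (q["y"] + q["vy"] * t)
        if (dx * dx + dy * dy) ** 0.5 < 2 * radius:
            return True
    return False

def hint_target(particles, dwell_on_inert_ms, threshold_ms=2000):
    """Return a colliding pair to highlight once unproductive dwell time
    (gaze on particles that never collide) passes the threshold."""
    if dwell_on_inert_ms < threshold_ms:
        return None  # attention may still be productively deployed
    for i, p in enumerate(particles):
        for q in particles[i + 1:]:
            if on_collision_course(p, q):
                return (p["id"], q["id"])
    return None

particles = [
    {"id": "a", "x": 0, "y": 0, "vx": 1, "vy": 0},
    {"id": "b", "x": 10, "y": 0, "vx": -1, "vy": 0},   # heading toward "a"
    {"id": "c", "x": 0, "y": 50, "vx": 0, "vy": 1},    # drifting away
]
hint = hint_target(particles, dwell_on_inert_ms=2500)
# hint == ("a", "b"): the pair worth attending to
```

A production version would obviously use the game's real physics and tuned dwell thresholds, but the shape of the decision is the same.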
To answer your second question, we are hoping that the data we are collecting and analyzing will help other experimenters and learning theorists to better understand the power of implicit learning. We see great potential for non-digital experiences that are similar to our software in their avoidance of textual content and explicit instruction. Mixing digital and non-digital implicit learning may be the most effective method of all, but first more research needs to be done.
I hope I was able to answer your questions. If you have a follow-up question, please ask away!
Jodi Asbell-Clarke
great job you guys....the best partners ever!!! :)
Cyrus Shaoul
Senior Academic Researcher
Thanks, Jodi! We could not have done it without you and the other folks at TERC and MIT.
Chris Thorn
Director of Knowledge Management
So, if I understand where this is heading, one could imagine this work helping students struggling with online homework by detecting the use of good strategies (students are looking at the relevant supporting information on the page, etc.). I suppose more adaptive versions of homework could actually deploy additional supports in response to students' perceived struggle, based on eye gaze and cursor movement. Thinking about the "bridging" to non-digital environments: this learning could be leveraged to improve the design of offline representations as well.
Ibrahim Dahlstrom-Hakki
Senior Academic Researcher
Thanks for your comment, Chris. Yes, we see this work leading to a number of potential research directions. On the one hand, we are interested in bringing additional neurocognitive tools online, given that we now have the infrastructure to easily do so. On the other, we are hoping to match the highly instrumented patterns with more easily gathered data that can be collected out in the field (e.g., mouse stream, webcam), allowing for the detection of non-productive play patterns and the introduction of non-obtrusive prompts that put students back on track without explicitly guiding their play behavior.
Janet Kolodner
Regents' Professor Emerita
Very nice project -- important. I have some clarification questions:
1. What does it mean to track "implicit learning"? What do you mean by "implicit learning"? (I was thinking it means the same as tacit learning; what they are grasping in an intuitive way.)
2. How does eye tracking help you track it? I can see how eye tracking can help you track their attention and figure out what help they need to direct their attention in productive ways.
3. What do you mean by "student learning trajectories"?
My naive understanding of what you are aiming to do is to help game players learn to direct their attention to the important stuff, whatever it might be for the domain (in this case, particle behavior). I don't entirely understand the details of what you are studying in that context and how you are studying it.
Janet
Ibrahim Dahlstrom-Hakki
Senior Academic Researcher
Thanks for the questions, Janet, and for your interest in our project. As you know, it is hard to provide details in a short three-minute clip, but I will try to provide a bit more detail for you here. Please let me know if you'd like me to expand on any of my answers below:
1. Implicit learning is similar to tacit learning, and the terms are often used interchangeably. However, there is a distinction between the two: tacit knowledge is knowledge that is almost impossible to verbalize by its very nature, whereas implicit knowledge is knowledge that a given individual may not yet be able to verbalize but that can, with instruction, be verbalized.
2. By looking at a learner's game behaviors and eye movements, you can deduce the type of information guiding those events. For example, if a student implicitly understands that a particle will continue along its trajectory unless acted upon by a force, then I would expect to see that student looking along that trajectory for potential collisions. If a student does not understand this at an implicit level then we would not expect them to be able to predict the behavior of particles in the game.
3. The game our students play goes through multiple levels of increasing complexity that help students slowly develop their understanding of particle behavior in a particle simulator. As they interact with different particle types with different masses, their behavior changes. We are interested in seeing if there is a predictable progression in the game behaviors and eye movement patterns observed for successful students, and if the patterns of students who do not seem to develop behaviors consistent with this implicit knowledge differ in some uniform way.
So at the end of the day, we are interested in a paradigm that: 1. allows us to measure whether learners are developing knowledge that is guiding their eyes and game behaviors, and 2. when they are not, allows us to intervene by non-intrusively guiding their attention to help them develop that knowledge.
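The test in point 2 above -- is the student looking ahead along a particle's path? -- could be operationalized along these lines. This is only a sketch under the assumption of straight-line motion; the function name, tolerance, and lookahead are invented.

```python
# Hypothetical check: does a fixation fall near a particle's predicted
# straight-line trajectory (i.e., the student is looking ahead for
# collisions, consistent with implicit knowledge of Newton's first law)?
def gaze_on_trajectory(particle, gaze, tolerance=2.0, lookahead=200):
    """True if the gaze point lies within `tolerance` of any point on the
    particle's future path over the next `lookahead` time steps."""
    for t in range(1, lookahead):
        px = particle["x"] + particle["vx"] * t
        py = particle["y"] + particle["vy"] * t
        if ((gaze[0] - px) ** 2 + (gaze[1] - py) ** 2) ** 0.5 < tolerance:
            return True
    return False

p = {"x": 0, "y": 0, "vx": 2, "vy": 1}
ahead = gaze_on_trajectory(p, (40, 20))    # fixation on the future path
elsewhere = gaze_on_trajectory(p, (40, 200))  # fixation off the path
```

Aggregating such per-fixation checks over a play session would give one candidate measure of whether implicit knowledge is guiding the eyes.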
Janet Kolodner
Regents' Professor Emerita
Thanks, Ibrahim. Excellent answers -- because they are succinct, because I now understand better what you are doing and why, and because it seems you are on a really good track. Good work! I look forward to hearing more as the project progresses.
Henry Minsky
I wonder if you could make a system that helped with learning to read a foreign language. If the eye tracker could tell when the student was lingering on a phrase, it could display the translation, and adapt to the amount of time the student tended to look at the same place. It would want to avoid training the student to just wait until a translation appeared, so it might need to wait longer and longer to display the translation if the student was tending to pause too often.
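Henry's escalating-delay idea could be prototyped with a very small piece of state, something like the sketch below; the class name, base delay, and growth factor are all invented for illustration.

```python
# Sketch of a gaze-contingent translation trigger whose dwell threshold
# grows after each reveal, so the student is not trained to simply wait.
class TranslationGate:
    def __init__(self, base_ms=800, growth=1.5):
        self.threshold_ms = base_ms
        self.growth = growth

    def should_translate(self, dwell_ms):
        """Reveal the translation only once dwell time passes the current
        threshold; each reveal raises the bar for the next one."""
        if dwell_ms >= self.threshold_ms:
            self.threshold_ms *= self.growth
            return True
        return False

gate = TranslationGate()
first = gate.should_translate(900)   # 900 >= 800, so the translation shows
second = gate.should_translate(900)  # threshold is now 1200, so it does not
third = gate.should_translate(1300)  # 1300 >= 1200, shows again
```

A real system would presumably also decay the threshold over time and track it per word or phrase rather than globally.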
Cyrus Shaoul
Senior Academic Researcher
Henry,
Thanks so much for your question. This type of adaptive learning system is where we think the research should be headed. Learning foreign languages is definitely an area where the interaction between implicit and explicit learning is not yet well understood. There has been a deep and broad body of research using eye-tracking to look at how language is learned, and there are some theories that would predict a benefit of providing a "just-in-time" translation.
From other theoretical perspectives, having the native version of the word appear while reading the foreign language might inhibit learning, possibly due to "blocking" caused by the extremely well-learned native word appearing and preventing learning of the unfamiliar foreign word. See this paper on which I am a co-author for more thoughts on this.
In any case, it would be an idea worth trying out! Thanks for the comment.
Ibrahim Dahlstrom-Hakki
Senior Academic Researcher
Interesting idea, Henry. It is certainly something that is possible, and there is in fact a large body of research on reading that includes fixation-contingent display changes. As to whether it could be designed in such a way as to improve the efficiency of learning a foreign language, I'm not sure... but it would certainly be an interesting pilot study.
Janet Kolodner
Regents' Professor Emerita
I was also thinking learning to read (for those who are having trouble). Jack Mostow (at CMU) used eye tracking and other means to figure out how to help young kids learn to read better; I wonder if your technology would add to what they have been able to do.
Ibrahim Dahlstrom-Hakki
Senior Academic Researcher
I would actually say that the vast majority of eye-tracking research, particularly in the field of cognitive psychology, is on reading and understanding the reading process. Unfortunately, most of that work is on neurotypical readers, and there is relatively little work on readers with dyslexia. I certainly believe that more eye-tracking work on reading and reading interventions for students with dyslexia is needed.
Janet Kolodner
Regents' Professor Emerita
I'm sure at least some of Mostow's kids were dyslexic, and others, I'm sure, had other learning and reading difficulties. My guess is that a good percentage of them were not neurotypical, but maybe I am wrong.
Good work!!
Jenna Welsh
Wow, what an amazing project! I personally don't know much about using eye-tracking to track and help with learning, so this is very interesting to me.
Ibrahim Dahlstrom-Hakki
Senior Academic Researcher
Thanks for posting, Jenna. I'd be happy to share more information if you are interested in learning more about our work.