NSF Awards: 1724889
2018 (see original presentation & discussion)
Grades 6-8, Grades 9-12, Adult learners
Professor Heffernan explains his vision of how crowdsourcing will change the educational landscape. Computer-enabled crowdsourcing is a recent innovation: Wikipedia, started in 2001, is built upon the crowdsourcing of information, and the Mechanical Turk platform was launched in 2005 to crowdsource tasks that require human judgment. Another example of crowdsourcing, CAPTCHA (von Ahn, Blum, Hopper & Langford, 2003), protects websites from bots while at the same time tagging images (a task at which humans excel but computers struggle). The premier venue for academics to discuss and publish crowdsourcing research, the Conference on Human Computation and Crowdsourcing (HCOMP), held its first independent conference in 2013. That conference consists largely of work on machine learning (studying reinforcement learning issues) and on design (studying how to design these systems).
Heffernan believes crowdsourcing is powerful yet underrepresented in the design of educational systems. This video explains this vision and shows the small step our lab has taken to leverage the 50,000 students already using the ASSISTments system.
Daniel Damelin
Senior Scientist
What kind of supports does this new tool provide for researchers interested in using ASSISTments as part of a study?
Neil Heffernan
Professor and Director of Learning Sciences and Technologies
Thanks for your interest, Daniel! I think you'll find our ASSISTments Research Testbed site is helpful: http://www.assistmentstestbed.org/
Daniel Damelin
Senior Scientist
Thanks. That was very helpful.
Betsy Stefany
Hi Neil,
This piece that supports learning beyond the classroom sounds like a neat extension to what we show in our video (http://stemforall2018.videohall.com/map/1321), which was also developed at WPI. Once students are set up with 1-to-1 computers that they are allowed and expected to use beyond the school day, this tool seems critical to offer them. Also, the idea of internal publishing and the use of a "commons" is a step we are watching for as it evolves for teachers and researchers in a Community of Practice structure. Thanks for sharing... and how do we prepare to be more involved?
Neil Heffernan
Professor and Director of Learning Sciences and Technologies
Carrie Willis
Technology Director and Teacher
Sounds like a great idea. How do teachers utilize the service? Is it a paid subscription? Have you seen great results from classrooms that have used it?
Neil Heffernan
Professor and Director of Learning Sciences and Technologies
ASSISTments is a tool teachers use for formative assessment (like homework) and is a great way to cultivate personalized learning for students. Here are some ways in which teachers use ASSISTments. It is a free public service, so there is no paid subscription. We have seen incredible results so far: in our study conducted in Maine, we saw a 75% increase in learning. You can check out our recent feature in U.S. News & World Report to read more about this. I also invite you to explore for yourself by creating an account.
Jim Hammerman
Co-Director
I read your author blurb, but I don't understand how the idea of crowdsourcing is playing out in the ASSISTments system that you describe in the video. Can you explain? Is it needed to give feedback for answers that are more complex than multiple-choice options? If not, how do you score such responses?
Neil Heffernan
Professor and Director of Learning Sciences and Technologies
Jim,
We have many answer types in ASSISTments, but they mainly fall into three categories: 1) multiple choice and choose-all-that-apply, 2) fill-in, which is graded by the system, and 3) open response, which is not graded by the system. We are currently crowdsourcing support for students who cannot get the correct fill-in answer. We call this TeacherASSIST (see a video about it here). Teachers can easily create hints or explanations to support struggling students; then, if a teacher is designated an expert, we share their support with all users. We have other systems for teachers to grade open-response questions (see more about that here), but we have not yet incorporated crowdsourced feedback for that. It is not for lack of ideas on how to do it; we just have not gotten there yet.
Neil
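To make the TeacherASSIST idea above concrete, here is a minimal, hypothetical sketch in Python. None of these class or field names come from the actual ASSISTments codebase; it only illustrates the three answer types and the sharing rule Neil describes, where a teacher-authored hint is shown beyond that teacher's own students only if the author has been designated an expert.

```python
# Hypothetical sketch of the TeacherASSIST sharing rule (not ASSISTments code).
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import List


class AnswerType(Enum):
    MULTIPLE_CHOICE = auto()   # includes choose-all-that-apply
    FILL_IN = auto()           # graded automatically by the system
    OPEN_RESPONSE = auto()     # not graded automatically


@dataclass
class Hint:
    author: str                # teacher who wrote the hint or explanation
    text: str
    author_is_expert: bool = False  # expert-designated hints are shared with everyone


@dataclass
class Problem:
    problem_id: str
    answer_type: AnswerType
    hints: List[Hint] = field(default_factory=list)

    def hints_for(self, requesting_teacher: str) -> List[Hint]:
        """Hints a student of `requesting_teacher` would see:
        their own teacher's hints plus any expert-authored hints."""
        return [h for h in self.hints
                if h.author == requesting_teacher or h.author_is_expert]


# Example: a fill-in problem with one local hint and one expert-authored hint.
p = Problem("PR-123", AnswerType.FILL_IN, hints=[
    Hint("ms_jones", "Try isolating x before dividing."),
    Hint("mr_smith", "Apply the same operation to both sides.", author_is_expert=True),
])
print([h.text for h in p.hints_for("ms_jones")])       # sees both hints
print([h.text for h in p.hints_for("other_teacher")])  # sees only the expert hint
```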
James Diamond
Research Scientist
Thanks for sharing your work, Neil. Did the study that SRI conducted include the type of "explanation feedback" used in the Fyfe study? I'm curious what you mean when you say "we vastly improved student learning." What were the feedback conditions, and what was the content?
Thanks again.
Neil Heffernan
Professor and Director of Learning Sciences and Technologies