NSF Awards: 1441358
2015 (see original presentation & discussion)
Grades 6-8, Grades 9-12, Undergraduate, Graduate, Adult learners
Stanford Professor Allison Okamura is using Hapkit, a build-it-yourself programmable electronic kit that allows students to create physical sensations. Today she is using Hapkit to teach a class in haptics, the science of touch. Eventually she thinks Hapkit will be useful in online education for a variety of topics.
Video URL: https://youtu.be/20Y_OelYBIA
Vivian Guilfoy
Senior Advisor
Haptics is a fascinating topic (and new to me). Application to many different fields and education holds great promise. Can you say more or give examples of what it would be like to use haptics for frog dissection—or to interact with students around the globe using haptics? Who are the students currently using the Hapkit in your course? What physical sensations have they created? How do you plan to help more teachers and students learn about this area of work?
Allison Okamura
Associate Professor
Thanks for your questions!
Haptics can allow users to experience a virtual world through touch. So when you do a hands-on lab (any lab, from frog dissection to pouring chemicals) you have to ask: is it important to do it hands-on vs. in a computer simulation? Haptics can be used to bridge the physical and virtual worlds, both to (1) test hypotheses about whether it is important to be hands-on and (2) provide a safe, ethical, and long-term cost-effective way to do virtual experiments that feel real.
In our first online class (2013), there was an application process and the students ranged from middle school students to professionals. That class had 100 people, and we delivered the devices to the students by mail. In our current “self-paced” online haptics class (started Fall 2014), students need to purchase the parts and assemble the devices themselves. Because of the cost and required resources, making the device is optional. So while thousands of people are registered, only a fraction are making the device. We’ll know by the end of the year how many actually made the device, and what difference it made in their experience of the course material.
Physical sensations created include springs, virtual walls, dampers, and textures. (The device has only one degree of freedom.) We are also developing software to allow haptic display of functions, as well as haptic video games.
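As a rough illustration (a minimal sketch, not the actual Hapkit course code; the pin numbers, calibration constant, and helper functions here are placeholders I am assuming for the example), a one-degree-of-freedom virtual spring plus virtual wall can be rendered with an Arduino-style loop like this:

```cpp
// Minimal 1-DoF haptic rendering loop (illustrative sketch only, not the Hapkit course code).
// The pins and position calibration are placeholders; a real device needs its own
// sensor calibration and motor-amplifier interface.

const float K_SPRING = 200.0;   // virtual spring stiffness
const float K_WALL   = 800.0;   // virtual wall stiffness (much stiffer than the spring)
const float X_WALL   = 0.005;   // wall location along the handle's travel [m]

float readHandlePosition() {
  // Placeholder: map an analog sensor reading to meters (calibration is device-specific).
  return (analogRead(A0) - 512) * 0.0001;
}

void commandMotorForce(float force) {
  // Placeholder: convert the signed force to a direction bit plus a PWM magnitude.
  digitalWrite(8, force >= 0 ? HIGH : LOW);
  analogWrite(9, constrain((int)(fabs(force) * 10.0), 0, 255));
}

void setup() {
  pinMode(8, OUTPUT);   // motor direction pin (assumed)
  pinMode(9, OUTPUT);   // motor PWM pin (assumed)
}

void loop() {
  float x = readHandlePosition();

  // Virtual spring: always pulls the handle back toward x = 0.
  float force = -K_SPRING * x;

  // Virtual wall: a stiff one-sided spring that only pushes back
  // once the handle penetrates past X_WALL.
  if (x > X_WALL) {
    force += -K_WALL * (x - X_WALL);
  }

  commandMotorForce(force);
}
```

A damper would add a term proportional to the handle's estimated velocity, and a texture can be created by varying the force with position.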
We currently have an NSF Cyberlearning grant to help us understand how to make the device and its interface better for dissemination. In fact, this week we are doing haptic labs in a local middle school. We are iteratively designing and testing.
Anyone interested in learning more can go to:
http://hapticsonline.class.stanford.edu
http://hapkit.stanford.edu
Vivian Guilfoy
Senior Advisor
The results of your course and the analyses of the impact of actually making the device should be very interesting, as should what happens with the middle school student labs. I’m wondering whether businesses could be recruited to support the cost of constructing the devices for classrooms, as a way for businesses to demonstrate their interest not only in learning, but in cutting-edge studies for young students and future workers.
Debra Bernstein
Senior Researcher
I am also new to haptics, although certainly interested to learn more! Can you say more about the types of feedback the Hapkit can provide to students, and the role you think that feedback can play in science learning (for example, since you mentioned the frog dissection)?
Allison Okamura
Associate Professor
The feeling of material properties (okay, frog dissection might sound gross!), from simple springs and dampers to complex viscoelasticity, can give students insight into physics, biology, etc. In another example, students could feel the atomic bonds in a “scaled up” force display for chemistry learning. The main idea is that feeling a virtual environment representing the science at hand will improve students’ intuition. Multi-modal interactions (e.g., vision + sound + touch) provide immersion and reinforcement that we hypothesize will improve understanding and retention.
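To make the “scaled up atomic bond” idea concrete, here is one possible sketch (my illustration; the project’s software may model bonds differently) of the force a student would feel, using the textbook Lennard-Jones pair potential in arbitrary teaching units:

```cpp
// Illustrative only: force along a "scaled-up" atomic bond, modeled with the
// Lennard-Jones pair potential (a common textbook choice, not necessarily what
// the project uses). The handle position is mapped to an interatomic
// separation r, and SCALE makes the result comfortable to feel on the device.

#include <math.h>

const float EPSILON = 1.0f;   // potential well depth (arbitrary teaching units)
const float SIGMA   = 1.0f;   // separation where the potential crosses zero
const float SCALE   = 0.5f;   // maps model force to device force

float bondForce(float r) {
  // F(r) = 24*EPSILON/r * (2*(SIGMA/r)^12 - (SIGMA/r)^6), the negative derivative
  // of the Lennard-Jones potential: strongly repulsive up close, gently
  // attractive a bit farther out, and essentially zero at large separation.
  float s6  = powf(SIGMA / r, 6.0f);
  float s12 = s6 * s6;
  return SCALE * 24.0f * EPSILON / r * (2.0f * s12 - s6);
}
```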
Debra Bernstein
Senior Researcher
Thanks for your response. Since you mentioned multi-modal interactions, are you picturing that the Hapkit will be one of several modes of interaction used during a learning experience (i.e., used in conjunction with sound and visuals)? Also, for frog dissection I can see how haptic feedback would be useful there, to get a sense of the elasticity or thickness of different parts of the body… so, gross but very useful!
Allison Okamura
Associate Professor
And maybe smell. :) Really, it is still a research question how many senses are helpful in a learning experience, and it is certainly going to be specific to the topic being learned. These are questions we are trying to answer through our long-term research.
Debra Bernstein
Senior Researcher
Those are great questions! It sounds like you’ve targeted a few different fields of study for your research, such as biology and physics learning. Are there others?
Allison Okamura
Associate Professor
We originally used the predecessor to the Hapkit, called the Haptic Paddle, to teach engineering undergraduate and graduate courses in Dynamic Systems as well as Haptics (of course) and Robotics. In our current research, we are focusing on physics and math.
Debra Bernstein
Senior Researcher
Thanks. This has been a great discussion. Thanks for your responsiveness, and all the best of luck with the project!
Nevin Katz
Technical Associate
This sounds like a fascinating endeavor. Could you talk more about how a haptics device would work when controlling a surgical robot?
Also, it’s very interesting that a haptics unit can be compared to an Arduino, which I’ve heard a lot about. How much computer science background would a student need to effectively program a haptics unit?
Allison Okamura
Associate Professor
Yes — most surgical robots today are teleoperated systems. That is, the surgeon manipulates a master “joystick” with many degrees of freedom, and those motions are used to command the behavior of the patient-side robot. Such teleoperation is useful because it allows for motion scaling between the master and patient sides, as well as image alignment, better ergonomics for the surgeon, and more dexterity inside the patient through a small incision (compared to manual minimally invasive surgical instruments). What haptics would do for such a system is allow the surgeon to feel their interaction with the patient’s tissues, via the master of the teleoperation system. This is not typically available clinically today.
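For readers curious what motion scaling with force feedback looks like in code, here is a deliberately oversimplified sketch of one common position-forward, force-back scheme (my illustration only; real surgical teleoperators add filtering, safety limits, and stability safeguards, and as noted above, force feedback is not typically deployed clinically today):

```cpp
// Deliberately oversimplified bilateral teleoperation step (illustration only).
// The surgeon's hand motion is scaled down to command the patient-side tool,
// and the measured tool-tissue force is scaled and reflected back to the
// surgeon's hand through the haptic master device.

struct TeleopCommand {
  float toolPositionCmd;   // position command for the patient-side robot
  float masterForceCmd;    // force to display on the surgeon's master device
};

TeleopCommand teleopStep(float masterPosition, float measuredToolForce) {
  const float MOTION_SCALE = 0.2f;  // e.g., 5:1 scaling: large hand motion -> small tool motion
  const float FORCE_SCALE  = 1.0f;  // how strongly tissue forces are reflected to the hand

  TeleopCommand cmd;
  cmd.toolPositionCmd = MOTION_SCALE * masterPosition;
  cmd.masterForceCmd  = FORCE_SCALE  * measuredToolForce;
  return cmd;
}
```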
The Hapkit Board used to control our device is based on the design of the Arduino Uno; we just added additional sensors and motor amplifiers so that all the electronics are on a single compact board. Not much CS background is needed at all, but some minimal experience with writing code is necessary. We provide tutorials about this in the online course.
Nevin Katz
Technical Associate
Fascinating! I’ll take a look at the course. Could you talk a bit about how these surgical robots are tested in the medical field? Does it start in the lab and then get tested in hospitals by physicians?
Allison Okamura
Associate Professor
Indeed, they start in research labs and companies and are then tested in the operating room (typically first on tissue, then on cadavers and/or animals, then on human patients after approval).
Nevin Katz
Technical Associate
That sounds awesome. In the field, what level of biology background does a programmer need to write the code used for these robots? Is there a lot of interaction between programmers and physicians when developing the software?
Allison Okamura
Associate Professor
Close interaction with clinician collaborators is most important during the design and specification phase, and then during testing phases. The “programming” itself is usually done only by the engineering researchers. People in my lab do not have any biology background — rather we consult with clinician collaborators to learn about specific procedures. This is a little different in other areas of medical robotics, such as rehabilitation, where the engineering researchers need to understand human motor control quite deeply in order to design devices and algorithms.
Nevin Katz
Technical Associate
Fascinating! Thanks for entertaining all my questions!!
Allison Okamura
Associate Professor
Thanks!
Vivian Guilfoy
Senior Advisor
Following this discussion makes me think that it might be a great idea to produce a video that shows the process and integrates student descriptions of what it is like to use the Hapkit for a particular exploration.
Allison Okamura
Associate Professor
Our online class includes a number of more “tutorial”-oriented videos about how to program and use the device; please see http://hapticsonline.class.stanford.edu.
Vivian Guilfoy
Senior Advisor
Thank you. Will check it out to deepen my own understanding of this fascinating field of study.
Vivian Guilfoy
Senior Advisor
I just watched your online course introductory video and loved the example of the vibration of our phones alerting us to incoming calls or messages. In your report on this project, are you planning to use student-generated responses to using the Hapkit that help to “demystify” this topic for others? Just curious.
Allison Okamura
Associate Professor
Certainly, this is part of our current Cyberlearning studies.
Vivian Guilfoy
Senior Advisor
Thanks for a rich discussion into this field of study. I learned a lot.
Allison Okamura
Associate Professor
Thanks!