NSF Awards: 2017042
2021 (see original presentation & discussion)
Grades 6-8, Grades 9-12
This project explores how Augmented and Virtual Reality (AR/VR) can support human-centered, collaborative computing. We are developing an Embodied Coding Environment for the creation of 3D assets and artwork. This platform is envisioned as a merged digital/physical workspace where spatial representations of code, the artwork that is the output of the code, and user editing activities are simultaneously located in a persistent mixed reality environment. Wearing AR/VR headsets, learners manipulate virtual code blocks to assemble programs and debug their code by evaluating the 3D representations that they create. Our approach builds on the accessibility and sense of play found in successful visual learning technologies such as Scratch. We aim to increase participation and interest among groups traditionally underrepresented in the educational and career pathways of computer science (CS), including female students and some minority students, who often exhibit lower confidence in STEM-related abilities relative to other students. We hypothesize that the highly interactive and engaging nature of our 3D AR/VR computing experiences will not only increase CS interest and engagement in these students, but will also increase self-efficacy and self-confidence relating to STEM, facilitate computational thinking, and result in high levels of computational concept learning.
Michael Chang
Postdoctoral Researcher
Thanks for sharing this project! I appreciate how this approach is driven by a desire to increase self-efficacy and self-confidence among non-dominant groups in computer science. Could you talk more about the specific affordances of the AR/VR space, and compare it against using physical embodiments of coding primitives (e.g., wooden blocks)? How does the AR/VR technology specifically augment the experience of the young people?
J. Adam Scribner
Robert Twomey
Ying Wu
Ying Wu
Project Scientist
Good questions, Michael. Part of our project centers on deciding what types of affordances are most likely to benefit learning. We have conducted a needs-finding study, interviewing several coding educators in our region. Many of them rely on physical embodiments of coding primitives to teach core concepts (e.g., using paper airplanes passed between students to represent the transfer of information between functions). In VR/AR, it is possible to visualize this sort of metaphor in a systematic way so that it gets reinforced every time a person uses a function. We hope that this type of systematicity can benefit learning.
Ying Wu
Project Scientist
Welcome to Embodied Coding, a project funded by the National Science Foundation through the program Cyberlearning for Work at the Human-Technology Frontier. This video offers an overview of our work to develop whole-body approaches to learning to code, leveraging Augmented and Virtual Reality (AR/VR). We hypothesize that human understanding of abstract computational concepts (e.g., functions, data, variables, and so forth) is grounded in embodied experience, and we aim to facilitate learning of these concepts through an AR/VR platform whose affordances allow users to exercise their embodied knowledge as they produce computer programs. Feel free to reach out with questions and comments! We look forward to your feedback.
Robert Twomey
Sara Kazemi
I am really looking forward to seeing your hypotheses tested with underrepresented students who are reluctant learners of CS. I was interviewed as a high school CS educator regarding this project! I’m moving on to specialize in interactive intelligence in a CS grad program, so I look forward to following this research.
Ying Wu
Robert Twomey
Karl Kosko
Very nice video and interesting project! I noticed in the video that you used both hand tracking and the controllers. When using the controllers to model/construct code, is there haptic feedback? Do you think the lack of haptic feedback with using one's hands may have any sort of 'negative' effect (i.e., use of the controllers may facilitate embodied interaction to a higher degree)? Or do you think the reverse may be the case?
I'm looking forward to seeing what you produce in this project!
Robert Twomey
Ying Wu
Project Scientist
Thanks, Karl! Your point about haptic feedback is intriguing. For now we are focusing on visual, auditory, and kinesthetic perception. Haptic feedback is definitely something to explore in the future.
Robert Twomey
Kimberly Arcand
I'm also interested in hearing about some of the pros/cons to using hand tracking vs. controllers. Following!
Robert Twomey
Assistant Professor
Hi Kimberly, thanks for your question! Though we have done initial design studies using the VR headset with hand controllers, we are gravitating toward hand tracking for both our VR prototypes and our eventual AR coding platform. Camera-based hand tracking (with skeleton models) lends itself to a more granular sampling of user gesture (down to wrists, fingers, etc.), unencumbered by the need to hold a physical controller. We are still determining the role of user gesture within our platform, but we expect hand tracking will support gestural expressivity from users, a more natural mode of interface. Quest 2, HoloLens 2, and WebXR, all tools/platforms we are developing with, have implemented hand tracking interfaces we are building on. I do like Karl's point above about haptic feedback being one positive affordance of the controllers.
Kit-Bacon Gressitt
Interesting, Ying. To me, one of the many uninitiated, it seems akin to VR entertainment. ... I hope outreach to BIPOC students works.
Ying Wu
Teon Edwards
Very interesting idea; I'm really interested in seeing how your hypotheses test out, as well as how the technology itself progresses. Thank you.
The video imagery focused on VR; I'm wondering about your thoughts on how embodied coding would be impacted by seeing the coding on top of a real-world situation related to what's being coded. e.g., what if (samples of) the variables are actually in the room, building on what you suggest about "dropping the variable into the container".
Ying Wu
Ying Wu
Project Scientist
Hello, Teon! Yes -- I agree that AR is a very powerful tool, and from the start, we conceptualized our coding platform in AR. For prototyping and development purposes, we have been using VR. The standalone Quest 2 is certainly much cheaper than the HoloLens. However, we will explore AR in the future as well. Please stay in touch! (ywu@ucsd.edu, https://insight.ucsd.edu)
Robert Twomey
Sean Strough
So cool when you can find a project that is both useful and fun. From what I'm seeing, VR is highly appealing to kids of all ages. I have no doubt that this can only make STEM and CS even more appealing to a wider audience! Good luck!!
Robert Twomey
Ying Wu
Andres Colubri
Assistant Professor
Very cool project; the idea of embodiment in problem solving is very powerful but overlooked. Two questions:
* The video seems to focus solely on VR, and I'd imagine that working in an AR environment would significantly expand the possibilities of mapping algorithmic entities to physical objects. Are you working in that direction as well?
* Also from the videos, it seems that you are creating some kind of visual programming environment in VR, where you have containers representing variables, blocks for operations, and connectors for flow of information. I'm wondering about even more direct translations, like in puzzle games such as LittleBigPlanet, where algorithmic rules are embedded into a virtual mechanism or system that is more immediately tangible.
Ying Wu
Project Scientist
I appreciate your feedback, Andres! Yes -- we are planning ultimately to design our platform and activities for AR -- however, we are using VR for prototyping purposes at the moment. We will definitely check out LittleBigPlanet. Thanks for the suggestion.
Suzy Gurton
As a relatively new user of VR, I find it fatiguing to use. Is there a limit to how long students are comfortable in the VR environment?
Ying Wu
Project Scientist
Good question, Suzy. There is variability in people's tolerance of VR. At times, it can induce motion sickness. Ultimately, our platform will be instantiated in Augmented Reality, which may prove less fatiguing.
Anita Crowder
Thank you for sharing this project. I have worked with high school students using VR, but not in this context. I taught programming and they built software for the HTC Vive. This is very intriguing from a meta-cognitive perspective. It is almost like physical recursion! I look forward to hearing more about the findings from your work!
Ying Wu
Ying Wu
Project Scientist
Thanks, Anita! Yes -- we are in the design phase now -- but hope to have the core components of our platform in place by next year.
Anita Crowder
John Schumann
I find this research very interesting. In my own work, I am looking at abstract concepts that do not have all the characteristics of physical entities. They lack mass, energy, and observability, but nevertheless, they can have causal effects on the world. They are concepts such as democracy, freedom, motivation, emotion, peace, obstruction etc. I'll be very interested in how this research with abstract computational concepts develops, and to see its possible relevance to the study of less-than-fully-physical concepts.
Ying Wu
Ying Wu
Project Scientist
Indeed, John -- embodied cognition is important for our ability to reason about many abstract concepts above and beyond the domain of computation. Great to hear from you!
Eric Hamilton
Ying, we are just starting a CS strand of activity with partners in the Middle East, and I would love to learn whether there are ways that we could connect with the innovation your group is advancing. I am not sure how it could play out, but my intuition is that there could be some powerful synergies. We are based up the street from you in Los Angeles and Malibu. I hope to connect post forum on this. Many thanks.
Ying Wu
Project Scientist
Thanks, Eric -- yes it seems that there are many intersections between our groups. I would love to talk further.
H Chad Lane
What wonderful ideas to explore for new coding interfaces. I can see how engaging this could be for many learners. I was specifically intrigued by the visual appeal of how control structures looked in the interface... have you considered taking advantage (or do you already take advantage) of the fact that you have a 3D space to work in? I'm curious how you would leverage that. Given that all Scratch-like or text-based coding environments are 2D, the affordance of depth seems like a highly novel aspect for investigation. Perhaps there are some ways to represent abstraction in novel ways? Thanks again, this is incredibly creative work!
Ying Wu
Robert Twomey
Robert Twomey
Assistant Professor
Thanks, Chad, for your comments. Yes, we are particularly excited about the new possibilities that arise from extending visual coding into 3D space. For instance, users can attach code to particular locations in space, harnessing coders' spatial memory and spatial organization strategies to scaffold the arrangement of code logic. With some programming tasks, code might be attached to particular objects. This becomes particularly interesting for programming robots or IoT devices, for instance, where we can display live code execution, debug, or pre-visualize future output that is spatially attached to the robot or device. Finally, we could even use the architecture of the room as a meaningful framework for code arrangement. We see many exciting avenues to explore!
Ying Wu
H Chad Lane
Jeremy Roschelle
Executive Director, Learning Sciences
Interesting video, team! The "container" metaphor reminded me of Boxer, a variant of Logo that I admittedly worked on circa 1985 or so. You might want to look up Andy diSessa's papers on Boxer -- here's one -- Andy was very thoughtful about principles like Spatial Metaphor and Naive Realism -- and although your tech is newer, the principles may be helpful to you.
Ying Wu
Ying Wu
Project Scientist
Thanks, Jeremy! It's great to get the perspective from a classical, old-school developer. It seems that some of the concepts motivating VPLs such as Scratch were already in their nascency with Boxer. I appreciate this paper.
Robert Twomey
Assistant Professor
Thanks, Jeremy, this is a great reference for us! I wasn't familiar with Boxer. More evidence of the wealth of ideas and approaches in these lesser-known histories of programming languages. I look forward to digging into the paper. I see the second author is Harold Abelson, who co-authored one of my favorite programming books of all time, Structure and Interpretation of Computer Programs (SICP).
Pendred Noyce
Very interesting video, and one that raises so many questions, such as how far can you get in teaching about abstract concepts through physical motion and extended, embodied metaphor? I will be fascinated to learn about your findings from this project, especially transfer of concepts from VR to a more traditional coding environment.
Robert Twomey
Ying Wu
Ying Wu
Project Scientist
Thanks, Pendred, for your insight. I agree that transfer of learning to 2D visual programming or text-based coding is an important question to address. For now, our focus centers on how the affordances of coding in 3D space can facilitate computational concept learning. But you make a good point -- and we must definitely keep in mind the importance of transfer of knowledge and skills to 2D platforms as we design our coding environment. Also, it may not be clear from the video, but we are ultimately planning to design a coding system (akin to Scratch in some ways) for Augmented Reality. The examples shown in the video are just prototypes created in VR for development purposes.
Robert Twomey