1792 Views (as of 05/2023)
Presenters:
  • Ying Wu, Project Scientist, UC San Diego (https://insight.ucsd.edu/our-team/)
  • Amy Eguchi, Associate Teaching Professor, UC San Diego (https://eds.ucsd.edu/discover/people/faculty/eguchi.html)
  • Monica Sweet, Co-Director of Research and Evaluation, UC San Diego
  • Robert Twomey, Assistant Professor, University of Nebraska-Lincoln (http://roberttwomey.com)

An Embodied, Augmented Reality Coding Platform for Pair Programming

NSF Awards: 2017042

2022 (see original presentation & discussion)

Grades 9-12, Undergraduate

We aim to increase participation and interest among groups traditionally underrepresented in the educational and career pathways of computer science (CS), including female students and students from some minority groups, who often report lower confidence in their STEM-related abilities than their peers. This project explores how embodied experience in extended reality can support human-centered, collaborative computing through the development of an Embodied Coding Environment (ECE) for creating interactive experiences.

The ECE comprises tools that support drawing and gestures, smart selection, a smart search/command bar, a text editor with syntax highlighting, grouping and movement tools for organizing elements in space, and the ability to save and share projects through the cloud. It allows users to be immersed in their problem space and to contextualize their code within that space. It also offers an Annotation System in which annotations take the form of 3D drawn lines, hand gestures, spoken comments, quick 3D models, and more, since free-form diagramming and whiteboarding can play a key role in understanding and breaking down problems and in working through the design of algorithms to solve them.

Finally, coders can use spatial signals from hand and controller tracking to directly select locations in space and to specify movements, control signals, and other parameters through gestural time-series input. Controller and hand movements can be recorded and visualized as annotations or data within the ECE, and these data can be linked as input to programming nodes to drive a variety of processes. Our next goal is to field test the system with high school CS learners from underrepresented backgrounds.
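To illustrate the data-flow idea in the last paragraph, here is a minimal sketch of how a recorded gesture time series might be linked as input to a programming node. It is written in Python for brevity rather than the project's Unity code, and every name in it (GestureRecording, ScaleNode) is hypothetical, not part of the ECE's actual API.

```python
# Minimal sketch (hypothetical names, not the ECE's API): record a
# controller movement as a time series, then let a programming node
# read that signal to drive one of its parameters.
from dataclasses import dataclass, field

Vec3 = tuple[float, float, float]


@dataclass
class GestureRecording:
    """A recorded hand/controller path: timestamped 3D positions."""
    samples: list[tuple[float, Vec3]] = field(default_factory=list)

    def record(self, t: float, position: Vec3) -> None:
        self.samples.append((t, position))

    def value_at(self, t: float) -> Vec3:
        """Return the sample nearest to time t (simplest possible lookup)."""
        return min(self.samples, key=lambda s: abs(s[0] - t))[1]


@dataclass
class ScaleNode:
    """A toy programming node whose output tracks a gestural input signal."""
    source: GestureRecording

    def evaluate(self, t: float) -> float:
        # Use the gesture's height (y) at time t to drive a scale parameter.
        _, y, _ = self.source.value_at(t)
        return max(0.1, 1.0 + y)


gesture = GestureRecording()
for i in range(10):                     # pretend these came from hand tracking
    gesture.record(i * 0.1, (0.0, i * 0.05, 0.0))

node = ScaleNode(source=gesture)
print(node.evaluate(0.45))              # scale value driven by recorded motion
```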

This video has had approximately 202 visits by 159 visitors from 103 unique locations. It has been played 102 times as of 05/2023.
Discussion from the 2022 STEM For All Video Showcase (8 posts)
  • Ying Wu

    Lead Presenter
    Project Scientist
    May 10, 2022 | 11:45 a.m.

    Welcome to the Embodied Coding Environment

    This NSF-funded project explores how the affordances of 3D space in Virtual and Augmented Reality can be leveraged to support computational concept learning.  This work is motivated by embodied learning theory, which centers on the idea that learners’ abilities to understand and reason about functions, algorithms, conditionals, and other abstract computational concepts stem in part from more fundamental sensorimotor and perceptual experiences of the physical world.  Key features of the Embodied Coding Environment include the following:

    1) a novel visual-spatial XR representation of coding allowing immersion in the problem and design spaces

    2) whiteboarding/annotation tools situated in a shared environment with code activities

    3) gesture and movement paths for the direct specification of program instrumentation and data

    Our current goal centers on testing the impact of 3D spatial coding in the classroom. The research team is building partnerships with San Diego high schools and developing customized coding lessons to be implemented in the Embodied Coding Environment.
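    As a rough illustration of feature 2 above, the sketch below shows annotations and code nodes sharing one coordinate space, so a drawn stroke or spoken comment can be pinned next to the part of the program it describes. This is a hedged Python sketch with hypothetical names (Annotation, CodeNode), not the ECE's implementation.

```python
# Illustrative sketch only (hypothetical names): whiteboard-style
# annotations living in the same 3D space as code nodes.
from dataclasses import dataclass, field

Vec3 = tuple[float, float, float]


@dataclass
class Annotation:
    kind: str          # e.g. "stroke", "voice", "model"
    position: Vec3     # where it sits in the shared 3D scene
    payload: str       # stroke data, audio clip id, etc.


@dataclass
class CodeNode:
    name: str
    position: Vec3
    annotations: list[Annotation] = field(default_factory=list)

    def annotate(self, note: Annotation) -> None:
        self.annotations.append(note)


# A loop node with a sketched arrow and a spoken comment pinned beside it.
loop = CodeNode(name="for-each-sphere", position=(1.0, 1.5, 0.0))
loop.annotate(Annotation("stroke", (1.1, 1.6, 0.0), "arrow.json"))
loop.annotate(Annotation("voice", (0.9, 1.5, 0.0), "clip-042"))
print(f"{loop.name} has {len(loop.annotations)} annotations")
```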

    Your comments and feedback are greatly appreciated!

    https://embodiedcode.net/

    https://insight.ucsd.edu/

     
  • Dan Roy

    Facilitator
    Research Scientist, Interest-based Learning Mentor, Learning Game Designer
    May 11, 2022 | 06:26 a.m.

    Thanks for sharing Embodied Code. It looks like an intriguing tool to explore. A few questions about the role embodied cognition plays in making learning more accessible:
    -How much of the value of embodied learning in CS comes from making the abstract concrete?
    -Is this a question you explored or found in the literature?
    -Does making the abstract concrete without embodiment capture much of the value?

    A few questions about the value of sensorimotor and perceptual experiences of the physical world:
    -To the extent that it makes learning easier, could it be from reducing cognitive load?
    -Maybe processing certain ideas without using the body takes more mental effort to the extent that it becomes less efficient.
    -Should we think of embodied cognition like adding a GPU to a task that's currently relying fully on a CPU?

    About the kinds of topics best suited to embodied cognition:
    -Does EC help with anything abstract and challenging? 
    -Does it mainly help with spatial concepts?
    -How much overlap is there between spatial concepts and challenging abstract concepts?

    Have you already tested Embodied Code with users? What reactions have you seen? Any insights into efficacy, and for which topics in particular?

    How would you like to build on Embodied Code going forward, if you get the chance?

    Looking forward to discussing more!

     
  • Victor Minces

    Researcher
    May 11, 2022 | 08:51 p.m.

    Hey Ying! Can't wait to try it. Should we have an in-person meeting now that we can? Our flow-based music programming language is ready to use; I can show it to you as well.

     

  • Ying Wu

    Lead Presenter
    Project Scientist
    May 12, 2022 | 11:22 a.m.

    Hi Victor!  Yes -- we should get together in person soon.  Perhaps next week or the week after that?  I will see if Robert, Tommy, and Amy are available for a meeting -- because I'm sure they would love to see your music programming language as well.

  • Lorna Quandt

    Facilitator
    Asst. Professor, Educational Neuroscience
    May 12, 2022 | 09:38 a.m.

    Hello team! I love this project! I also work in VR development, using gestures and sign language, driven by theories of embodied cognition. So we are speaking the same language. 

    There is so much potential here to build embodied coding--in my team, we have discussed the idea of using sign language as well, so that deaf signers can code in VR space using a mixture of sign, text, and gesture. We have not yet developed anything like that, but we think it would be a very cool idea and I see you're thinking along those same lines already. 

    Your environment appears powerful and also rather complex. How do you teach users how to use the commands and execute actions in the space? Do you have any sense of how long it takes for people to become familiar with the coding environment? I would also be curious if principles of universal design can be applied to this framework to make it more accessible to people with disabilities and/or neurodivergence. Have you thought about that at all? Thanks for the interesting video!

     
  • Ying Wu

    Researcher
    May 12, 2022 | 11:13 a.m.

    Thanks for these comments, Lorna! We have created several tutorials offering guidance on the use of the system and have shared our platform with partners in the San Diego Unified School District. We are hoping to gather feedback from teachers and other educators to ensure that our system is user-friendly and suitable for the coding activities we would like to engage learners in.

    Your question about neurodivergence and/or disabilities is very interesting and opens up new considerations for our group. Indeed, one of our developers is very interested in coding through sign language. It would be great if we could bring our groups together to chat sometime! Feel free to email me: yingchoon@gmail.com or message me on Discord: YingChoonWu#9772.

     
  • Marcelo Worsley

    Facilitator
    Assistant Professor
    May 12, 2022 | 04:16 p.m.

    That was a very informative video. I really appreciate the different modes of interaction with your coding platform. One question I have is whether or not you all incorporate sound into the experiences. Would participants hear the created spheres fall? Also, how are you handling version control? How easy is it for students to undo or redo previous actions?

  • Ying Wu

    Lead Presenter
    Project Scientist
    May 12, 2022 | 06:55 p.m.

    Thanks, Marcelo, for your suggestions and questions. With respect to version control, we haven't implemented a version control system or undo feature yet, but the plan is to save the code to a server that uses git for version control, so users can undo and redo across automated commits (when those commits will be made is still under discussion). Additionally, the Unity engine has its own undo system that, once integrated with ours, will allow users to undo smaller actions. With respect to sound, you hit on an excellent topic that Victor Minces (above) and I have discussed. In short, the answer is yes: sound can and will be incorporated. Our longer-term plan is to customize the environment for particular users and classroom objectives. Depending on the activities that the platform supports for a particular group, we will add sound and build out the environment.
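    To make the git-backed plan above concrete, a server-side save handler along these lines could record every save as a commit and expose undo as a checkout of the previous revision. This is a sketch under assumed details (repository path, one-commit-per-save policy, JSON project files), not the project's actual design.

```python
# Sketch of the auto-commit idea described above: each save writes the
# project into a git-backed directory and commits it, so undo/redo can
# walk the commit history. Assumes the repository at REPO already exists
# and is initialized with `git init`; all paths here are hypothetical.
import subprocess
from pathlib import Path

REPO = Path("/srv/ece-projects")   # hypothetical server-side repository


def save_and_commit(user: str, project: str, code: str) -> None:
    """Persist a project file and record the save as a git commit."""
    path = REPO / user / f"{project}.json"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(code)
    subprocess.run(["git", "-C", str(REPO), "add", str(path)], check=True)
    subprocess.run(
        ["git", "-C", str(REPO), "commit", "-m", f"Save {user}/{project}"],
        check=True,
    )


def undo(user: str, project: str) -> None:
    """Restore the previous saved version of one project file."""
    subprocess.run(
        ["git", "-C", str(REPO), "checkout", "HEAD~1", "--",
         f"{user}/{project}.json"],
        check=True,
    )
```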

  • Further posting is closed as the event has ended.