Lorna Quandt, Assistant Professor, Gallaudet University — http://www.tinyurl.com/actionbrainlab
Melissa Malzkuhn, Creative Director, Motion Light Lab, Gallaudet University — http://www.motionlightlab.com
Athena Willis, Graduate Student, Gallaudet University

Signing Avatars & Immersive Learning (SAIL): Development and Testing of a Nov...

NSF Awards: 1839379

2019

Target audience: Adult learners

Improved resources for learning American Sign Language (ASL) are in high demand. The aim of this Cyberlearning project is to investigate the feasibility of a system in which signing avatars (computer-animated virtual humans built from motion capture recordings) teach users ASL in an immersive virtual environment. The system is called Signing Avatars & Immersive Learning (SAIL). The project focuses on developing and testing this entirely novel ASL learning tool, fostering the inclusion of underrepresented minorities in STEM. 

This project leverages the cognitive neuroscience of embodied learning to test the SAIL system. Signing avatars are created using motion capture recordings of native deaf signers signing in ASL. The avatars are placed in a virtual reality landscape accessed via head-mounted goggles. Users enter the virtual reality environment, and the user's own movements are captured via a gesture-tracking system. A "teacher" avatar guides users through an interactive ASL lesson involving both the observation and production of signs. Users learn ASL signs from both the first-person perspective and the third-person perspective. The inclusion of the first-person perspective may enhance the potential for embodied learning processes. Following the development of SAIL, the project involves conducting an electroencephalography (EEG) experiment to examine how the sensorimotor systems of the brain are engaged by the embodied experiences provided in SAIL. The project team pioneers the integration of multiple technologies: avatars, motion capture systems, virtual reality, gesture tracking, and EEG with the goal of making progress toward an improved tool for sign language learning.

This video has had approximately 589 visits by 468 visitors from 236 unique locations. It has been played 327 times.
The map reflects activity with this presentation from the 2019 STEM for All Video Showcase: Innovations in STEM Education website, as well as the STEM For All Multiplex website. It is based on periodically updated Google Analytics data, which shows usage trends but may not capture all activity from every visitor.