1802 Views (as of 05/2023)

Kimberly Arcand
http://kimarcand.com
Visualization scientist & Emerging technology lead
Presenter's NSF Resource Centers: Smithsonian Astrophysical Observatory, CfA Harvard &...

Kristin A. DiVona
Visual Information Specialist
Presenter's NSF Resource Centers: Smithsonian Astrophysical Observatory, CfA Harvard &...

April Jubett
http://chandra.si.edu
Video Producer
Presenter's NSF Resource Centers: Smithsonian Astrophysical Observatory, CfA Harvard &...

NASA Astrophysics Data Sonifications

NAS8-03060; NNX16AC65A

2022 (see original presentation & discussion)

Informal, All Age Groups

A Universe of Sound is an accessible digital project that connects users - particularly those who are blind, have low vision, or have different learning needs - with the science of NASA's Chandra X-ray Observatory, as well as NASA's Hubble Space Telescope, Spitzer Space Telescope, and others. Sonification, the translation of astrophysical information into sound, is a growing area in astronomy for research, discovery, and communication. The Universe of Sound project was created as a rapid response to pandemic-related barriers to reaching our existing communities and supporting users. Impacts range from tens of millions of listens (or views) worldwide and features in broadcast media to tens of thousands of shares across social media platforms. We conducted a research study on perception of and response to the sonifications across the expert-to-nonexpert spectrum, with 4,500 responses and highly positive results overall (paper in preparation). This project has greatly transformed the way our program "looks at" and considers how to process our scientific data and share it with community members. In short, when visualizing - or vivifying - the Universe, an "availability of data does not equal accessibility and equity of data."

 

Sonification Credits: NASA/CXC/SAO/K.Arcand, SYSTEM Sounds (Matt Russo, Andrew Santaguida); Video Credit: NASA/CXC/SAO/A.Jubett et al.; Narration Credits: Kimberly Arcand, April Jubett, Christine Malec, Garry Foran & Matt Russo.

This video has had approximately 332 visits by 246 visitors from 179 unique locations. It has been played 168 times as of 05/2023.
Map reflects activity with this presentation from the 2022 STEM For All Video Showcase website, as well as the STEM For All Multiplex website.
Based on periodically updated Google Analytics data. This is intended to show usage trends but may not capture all activity from every visitor.
Discussion from the 2022 STEM For All Video Showcase (31 posts)
  • Kimberly Arcand
    Lead Presenter
    Visualization scientist & Emerging technology lead
    May 9, 2022 | 05:10 p.m.

    Hi everyone! Welcome! And thank you so much for stopping by. To briefly introduce myself, I am a visualization scientist for NASA's Chandra X-ray Observatory at the Center for Astrophysics, Harvard & Smithsonian. I like to say that I tell stories with data, and these days we go well beyond "just" making a 2D image with that data - we have branched out into 3D prints, extended reality, and now sound. This project on sonification that we present here began during the pandemic, when I was particularly frustrated at not being able to work with community partners we had built relationships with in the blind and low vision communities. But the project exploded (ooo, pun!) into something much bigger than I had originally pictured (happily so, of course), both in content/technique and in impacts. Just last week, for example, we released a new sonification of the sound waves from a black hole. It has been exciting to see the response. We also have a paper we are about to submit that shows the positive user response (from experts and non-experts, and from BVI and sighted folks). Anyway, I would really like to hear what everyone thinks about this project!

     
  • Clara Cogswell
    Community Support Hydrologist
    May 10, 2022 | 09:23 a.m.

    This project strikes me as a true foray into accessibility in traditionally inaccessible STEM fields, in a way that also brings new perspective to visual data. I wonder how we can adapt these methods to other fields where blind and low vision scientists and potential students have traditionally not been made welcome due to a lack of accessibility. I would love to see innovation like this in terrestrial geology as well.

     
  • Kimberly Arcand
    Lead Presenter
    Visualization scientist & Emerging technology lead
    May 10, 2022 | 11:01 a.m.

    Thank you Clara! I completely agree it would be great to study how these techniques can be used across many scientific disciplines. There is a knowledge-share group called Sonification World Chat that has members across a number of different kinds of science, though astronomy is most heavily represented. Geology would truly be a fantastic place for this kind of data representation; the data would easily translate to these formats, if it isn't being done already. I would love to chat more on this!

     
  • Lexi (Elizabeth) Phillips
    Researcher
    May 10, 2022 | 10:32 a.m.

    Great video! I like how you focused on the senses for viewers, letting us watch and listen in whatever way we are able. This is an amazing gift you're giving to those unable to experience new discoveries in the typical ways.

    Our video tells a heartfelt story of a STEM teacher's journey.

    https://videohall.com/p/2511

     
  • Kimberly Arcand
    Lead Presenter
    Visualization scientist & Emerging technology lead
    May 10, 2022 | 11:02 a.m.

    Thanks Lexi for the support! We are so interested in seeing how these techniques can be applied in the classroom too. And thanks for sharing your video link - inspiring!!

  • Karen Royer
    Graduate Student
    May 10, 2022 | 10:51 a.m.

    Your work is inspiring. I was pleased and surprised by the beauty of the images while experiencing them through sound at the same time. I was reminded of other ways that people have told stories with data, or rather interacted with computers. Your work reminded me of the work of the Tangible Media Group at MIT. Imagine what this interaction could sound and feel like. Ken Nakagaki created a project called Trans-Dock, which enabled transducers to respond electronically and move. I imagine placing my hand over a surface through which I can hear your audio representation of space while feeling it at the same time through my fingers. This is excellent. What are your plans for the future? Are you thinking about combining other interactions with your work?

     
  • Kimberly Arcand
    Lead Presenter
    Visualization scientist & Emerging technology lead
    May 10, 2022 | 11:06 a.m.

    Karen, thank you so much! I really enjoyed hearing about the MIT project (I've just looked it up - super interesting!), as truly that is the sort of thing that inspires me to think outside the box, outside the sense of sight.

    We are now starting to research some techniques to help us potentially compare the geometries of astrophysical objects through sonification, whether radial slices moving through a cluster, or stepping along an edge-on galaxy, etc. We have also been working on attaching sound to geospatial (astrospatial?) data, e.g., in extended reality experiences, to expand their multimodality. Essentially we are looking for places where sonification can add to the information quotient, the meaning-making, the overall experience of the user.

     

     

  • Karen Royer
    Graduate Student
    May 10, 2022 | 11:37 a.m.

    You are welcome, and thanks for your response.

  • Susan Letourneau
    Researcher
    May 10, 2022 | 07:11 p.m.

    This is such a fascinating project. I'm a researcher in a science center, and we often think about using multiple modalities to help children connect to STEM concepts on multiple levels. Do you have any insights about how these kinds of approaches make STEM concepts more accessible or engaging for both sighted and BVI folks, or people of different ages?

     
  • Kimberly Arcand
    Lead Presenter
    Visualization scientist & Emerging technology lead
    May 11, 2022 | 08:46 a.m.

    We do, yes. We have a manuscript in preparation that talks about the positive benefits for learning, enjoyment, and engagement for sighted and blind or low vision groups across the range of expertise in astronomy, but particularly on the more non-expert side. There seems to be some triggering of emotional response with this multimodal approach, which perhaps can help make the more esoteric-seeming concepts of astrophysics a bit more approachable. There are perhaps also some signifiers of increased trust in science, and a bit of data to show that multimodal projects can help raise awareness of how other people access information. More to come in the paper though! Thanks so much Susan for your comments and questions.

  • Matt Russo
    Lecturer, Sonification Specialist
    May 11, 2022 | 10:16 a.m.

    I can also add that I've used these sonifications in presentations and planetarium shows for people of all ages, sight levels, and levels of scientific background, and I get similar responses every time, although groups of BVI people have been more engaged and reactive. It seems to provoke active listening and hence active learning, since people know there is some information they can extract from the sound (kind of like a puzzle to be solved). My foray into sonification actually started when I was preparing a presentation on TRAPPIST-1 for 12-year-olds. I knew that explaining the pattern of resonances or showing images or graphs wouldn't have much effect, but by sonifying the orbits they were able to experience it themselves, and it got an instant reaction. We've had similar experiences creating a sonification-based exhibit for the Ontario Science Centre. Parents had a hard time pulling their kids away from it!
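    [Editor's illustration: a rough sketch, in Python/NumPy, of the general idea behind an orbit sonification like the TRAPPIST-1 example above - it is not SYSTEM Sounds' actual pipeline. The orbital periods are approximate published values; the speed-up factor, function name, and simple sine-tone synthesis are assumptions for illustration.]

    import numpy as np

    # Approximate published orbital periods (days) for TRAPPIST-1 b through h.
    PERIODS_DAYS = [1.51, 2.42, 4.05, 6.10, 9.21, 12.35, 18.77]
    SPEED_UP = 2.0e8        # assumed factor that lifts the orbital frequencies into audible pitch
    SAMPLE_RATE = 44100

    def trappist_chord(duration_s=6.0):
        """One sustained tone per planet; resonant period ratios become pitch intervals."""
        t = np.arange(int(duration_s * SAMPLE_RATE)) / SAMPLE_RATE
        audio = np.zeros_like(t)
        for period_days in PERIODS_DAYS:
            orbital_freq_hz = 1.0 / (period_days * 86400.0)          # orbits per second
            audio += np.sin(2 * np.pi * orbital_freq_hz * SPEED_UP * t)
        return audio / np.abs(audio).max()                            # normalize to [-1, 1]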

  • Lorna Quandt
    Facilitator
    Asst. Professor, Educational Neuroscience
    May 10, 2022 | 09:11 p.m.

    Wow, I am really moved by your video. I find it incredible to think of listening to the tracks and forming a detailed perception of the same underlying data. Given my background in educational neuroscience, I would be interested to know if this type of sonification could be used to teach new scientific content, in a way that would transfer to a content knowledge task of some sort. At this time, do you have any evaluations like that in mind, or is the project more in a development phase right now? Thanks again for a really fantastic presentation. 

     
  • Kimberly Arcand
    Lead Presenter
    Visualization scientist & Emerging technology lead
    May 11, 2022 | 09:16 a.m.

    Thank you! And wonderful question! For our first study we did a quick look at user perception/response, with basic questions on engagement, learning, etc., looking for differences between expert and non-expert, and BVI and sighted, audiences. I'm particularly inspired by my colleague Dr. Wanda Diaz's research, which showed that astronomers can become better listeners and use that skill for their data. So I imagine it would be a useful study to look at this type of multimodality's potential for what you're describing. I would be happy to chat more about this - really interesting.

     
  • Dan Roy
    Facilitator
    Research Scientist, Interest-based Learning Mentor, Learning Game Designer
    May 11, 2022 | 07:19 a.m.

    Congratulations on such a creative approach to broadening sensory experiences and audiences. I'm curious to hear more about BVI astronomy communities and how they're using these sonifications. Is it mainly a source of inspiration? A way to feel included? A tool for accessing and analyzing data? All of the above? 

    I'm also curious how sonifications might create connections between BVI and sighted communities. Are those connections already well-established? If not, did this project help build those connections? What are the main points of discussion between communities?

    Keep up the good work!

     
  • Kimberly Arcand
    Lead Presenter
    Visualization scientist & Emerging technology lead
    May 11, 2022 | 09:54 a.m.

    Hi Dan, thanks for stopping by! My colleague Christine Malec, who is blind, said recently that for sighted folks these sonifications are like dessert, but for her, they're like a main meal. To me, those are both perfectly valid forms of eating (I love dessert), but the nutritional value - and necessity - is going to be higher in the meal than the dessert. So yes, it can be inspiration, but it's also very much about access to the data and meaning-making of that data. There are a number of projects in astronomy now that are trying to establish sonification as a method for scientific data analysis as well.

    With our project, I was rather surprised at the large, positive response from sighted communities, as I had thought primarily BVI communities would engage with it. But as can sometimes happen, when you build something better for someone, you often make it better for everyone (e.g., the curb-cut effect). In our preliminary study of this project we saw evidence that sighted audience members learned about accessibility of data for BVI communities, so there are some promising nuggets in that. I really like framing this project as having potential to create more connections between communities and will think more on that. Hope that helps.

     
  • Matt Russo
    Lecturer, Sonification Specialist
    May 11, 2022 | 09:56 a.m.

    Hi everyone, Matt Russo here. I worked on these sonifications along with Andrew Santaguida, my partner in SYSTEM Sounds. Thanks for stopping by and for all of the positive comments and support! Feel free to ask me any questions about the sonification process, or about anything at all! 

     
  • Cali Anicha
    Researcher
    May 11, 2022 | 01:51 p.m.

    Sonification -- "Brilliant!" to use a term often associated with light ;) - to translate light to sound - and to experience the content-richness of that sound was... something!! And - in agreement with the 'curb-cut effect' comment -- this project is an awesome demonstration of the ways that universal design benefits us all - my experience of this short video was that this is just the beginning of a vital source of scientific knowledge and meaning - thank you for sharing your work.

    Our project also speaks to universal design - focused on how that mindset promotes disability equity for STEM academics.

    https://stemforall2022.videohall.com/presentati...

  • Kimberly Arcand
    Lead Presenter
    Visualization scientist & Emerging technology lead
    May 11, 2022 | 02:09 p.m.

    Thanks Cali - good pun! ;)  Your project looks great, I'm such a fan of universal design.  Thanks so much for sharing!

  • Victor Minces
    Researcher
    May 11, 2022 | 09:21 p.m.

    This sounds so beautiful. It would be great if we could upload a picture to your website and have it sonified. Thanks for sharing!

    V

  • Matt Russo
    Lecturer, Sonification Specialist
    May 11, 2022 | 09:32 p.m.

    Thanks Victor! You can actually do just that on this web-based sonification app we're developing. It uses the same sonification mapping used in some of the Chandra sonifications (a left to right scan, mapping height to pitch). It's originally by an artist named Olivia Jack. We modified it, added accessibility features, and did testing with sighted and BVI people as part of the Sonification World Chat group Kim mentioned earlier. 
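    [Editor's illustration: a minimal, hypothetical Python/NumPy sketch of the left-to-right scan described above, where each image column becomes a moment of sound, vertical position sets pitch, and pixel brightness sets loudness. The frequency range, the brightness-to-volume choice, and the function name are assumptions for illustration, not the app's actual code.]

    import numpy as np

    def scan_image_to_audio(image, duration_s=10.0, sample_rate=44100,
                            f_min=220.0, f_max=1760.0):
        """image: 2D array with values in [0, 1]; row 0 is the top of the picture."""
        n_rows, n_cols = image.shape
        samples_per_col = int(duration_s * sample_rate / n_cols)
        # Top rows get higher pitches, spaced logarithmically from f_max down to f_min.
        freqs = f_max * (f_min / f_max) ** (np.arange(n_rows) / (n_rows - 1))
        t = np.arange(samples_per_col) / sample_rate
        pieces = []
        for col in range(n_cols):                        # the left-to-right scan
            amps = image[:, col][:, None]                # brightness -> loudness per row
            pieces.append((amps * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0))
        audio = np.concatenate(pieces)
        peak = np.abs(audio).max()
        return audio / peak if peak > 0 else audio       # normalize to [-1, 1]

    # Example: a streak rising from bottom-left to top-right sweeps upward in pitch.
    demo = np.zeros((64, 200))
    demo[np.linspace(60, 4, 200).astype(int), np.arange(200)] = 1.0
    waveform = scan_image_to_audio(demo)    # save with e.g. scipy.io.wavfile.write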

  • Victor Minces
    Researcher
    May 11, 2022 | 09:48 p.m.

    I just tried it. It is wonderful. I too create tools connecting science and music. I actually wanted to include a somewhat similar feature in our musical spectrogram: www.listeningtowaves.com/spectrogram

    If you have ideas for making the musical spectrogram even more musical I'd love to hear them. Thanks so much!

  • Matt Russo
    Lecturer, Sonification Specialist
    May 12, 2022 | 11:44 a.m.

    Wow, very well done, that's the real deal! Very fun, intuitive, and educational. Some kind of rhythm loop functionality (like a visual drum machine) might be cool, but I like the simplicity of the app as is. I'd love to use your app the next time I teach sonification/music, thanks for passing that along!

  • Marcelo Worsley
    Facilitator
    Assistant Professor
    May 12, 2022 | 07:27 a.m.

    As I watched this video, I wondered what it might look like to include these features in a VR setting where the user controls which part of the sky they are looking at. This could be with immersive VR, or just by capturing head pose with a web camera. Have you thought at all about ways of combining this with VR? The sounds are so engaging that I could see wanting an immersive listening experience.

     
  • Dan Roy
    Facilitator
    Research Scientist, Interest-based Learning Mentor, Learning Game Designer
    May 12, 2022 | 10:47 a.m.

    One partial answer: if you've seen the VR app Notes on Blindness, it does an interesting job visualizing a soundscape (sort of the inverse of hearing a picture). I wrote a blog post on it a while back.
    https://www.playfulrealities.com/2021/01/creati...

  • Matt Russo
    Lecturer, Sonification Specialist
    May 12, 2022 | 11:54 a.m.

    Thanks! The closest thing we've done is a 360 video of the first 5000 exoplanets: https://www.youtube.com/watch?v=OOwI3nTAHIc and the blind listeners we talked to did report enjoying it more than the non-360 version. The visuals were more engaging in the standard one so there's a trade-off, which is why we did both. But, I agree that full VR would really add a lot to the experience since it forces active exploration. I assume it would be more natural for rendered 3D scenes or full sky data (rather than small field of view static images) but would love to try either way. 

  • Kim Holloway
    May 12, 2022 | 11:13 a.m.

    Very cool! Are the sounds we're hearing throughout the video actual sonifications? Or just background music? If these are actual sonifications, that's miraculous! 

  • Kimberly Arcand
    Lead Presenter
    Visualization scientist & Emerging technology lead
    May 12, 2022 | 11:15 a.m.

    Hi Kim! Yes, they are actually the sonifications threaded throughout the video :) You can hear all of them in full, including as separate pieces, on our site at chandra.si.edu/sound. Thanks for the nice note and for stopping by!

  • Kim Holloway
    May 12, 2022 | 11:25 a.m.

    Amazing, thank you!

  • Kimberly Arcand
    Lead Presenter
    Visualization scientist & Emerging technology lead
    May 12, 2022 | 11:18 a.m.

    Hi all, I will try to get back to this fantastic discussion soon, but in case you haven't seen it yet, there was a big black hole story released today that we've been rather busy with: our Milky Way's supermassive black hole imaged by the NSF's Event Horizon Telescope and observed as part of a multiwavelength campaign with NASA's Chandra (https://chandra.si.edu/) and other telescopes too. So apologies for any delay today :) Related to this discussion, however, you can listen to 2 new sonifications as part of our project there, released in real time with the science press release: https://chandra.si.edu/photo/2022/sgra/animatio...

     
  • Teon Edwards
    Co-founder of EdGE & Game Designer
    May 12, 2022 | 12:52 p.m.

    Such a beautiful video, presenting such beautiful and important work. You seem to focus on broadening sensory experiences and audiences. I'm wondering if/how you see different representations possibly being pathways to new scientific discoveries. When we "look at" things in new ways, we often have new, exciting ideas. And you've definitely offered a new sensory experience. Thanks.

  • April Jubett
    Co-Presenter
    Video Producer
    May 16, 2022 | 10:08 a.m.

    Yes, certainly, we have witnessed astronomers who use sound to discover patterns in the stars, etc., and sometimes the blips in the sonified data are detected more readily than they would have been with visual data. It's pretty fascinating.

  • Further posting is closed as the event has ended.