Sonification: Listening to Data
Space Telescope Science Institute
Welcome to the AAS Education Blog! This post is part of a biweekly series of blog posts from astronomers and educators in the extended AAS community, curated by the AAS Education Committee. Subscribe here to receive future posts directly into your inbox; also follow us on Twitter. We welcome feedback and topic suggestions, as well as guest article submissions! — AAS Education Blog Editors
Have you taught a student with a visual impairment who struggled in astronomy because the data and plots were not accessible to them? How might you adapt your teaching in that situation? Without an alternative route into data analysis, such a student may be forced to change majors. We are losing potential data scientists and astronomers early in their education because we currently approach data analysis in only one way. Do you use any data analysis workflows that are cumbersome and could be reimagined? As teachers and practitioners, we can make astronomy more robust and accessible through sonification!
Though astronomy is a science that relies heavily on visual recognition and analysis, there is nothing inherent to the work that requires it be done that way. Modern astronomers do not spend hours with their eyes glued to a lens peering at objects in the sky, carefully drawing what they observe. Today, much astronomy research is data analysis done with a computer, and often the information being studied isn't visible to the human eye at all. Light curves, spectra, images captured in wavelengths outside visible light, and datasets so large they can only be analyzed through automation are all common in astronomy research. The typical workflow involves interacting with data using code, visualizing large datasets in graphs, interpreting the information, and repeating until a result takes shape. Most of these steps can be done by anyone with computer skills; only reading plots depends specifically on vision.
This paradigm of visual data analysis is so ingrained that it can be challenging to imagine trying something new, but we can incorporate other senses into our work — data can be translated into sound. Just as location, shape, and color convey data trends in visualizations, pitch, timbre, and volume can convey them in sonifications. Rather than looking at a chart or graph, data can be analyzed by listening. In some scenarios, the qualities of sound can be more useful for recognizing data trends than visual cues. To get a sense of what this can sound like, you can listen to example sonifications of time series data from the TESS telescope where pitch corresponds to brightness. You can find even more sonification examples spanning different science fields, data types, and sonification mappings at the Data Sonification Archive. Check out the animation below for a quick explanation of sonification and its relevance for astronomy.
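To make the pitch-to-brightness mapping concrete, here is a minimal sketch of one possible mapping (the function and note names are illustrative, not taken from any particular tool): each brightness measurement is linearly mapped onto a range of MIDI note numbers, so brighter points sound higher and a dip in a light curve is heard as a dip in pitch.

```python
# Illustrative sketch: map brightness values to musical pitches.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def brightness_to_note(value, vmin, vmax, midi_lo=48, midi_hi=84):
    """Linearly map a brightness value onto MIDI notes C3 (48) to C6 (84)."""
    if vmax == vmin:
        midi = (midi_lo + midi_hi) // 2
    else:
        midi = round(midi_lo + (value - vmin) / (vmax - vmin) * (midi_hi - midi_lo))
    name = NOTE_NAMES[midi % 12] + str(midi // 12 - 1)   # e.g. MIDI 69 -> "A4"
    freq = 440.0 * 2 ** ((midi - 69) / 12)               # equal-tempered pitch in Hz
    return name, freq

# A toy light curve with a transit-like dip is heard as high-low-high:
light_curve = [1.00, 1.00, 0.95, 0.90, 0.95, 1.00]
notes = [brightness_to_note(v, 0.90, 1.00)[0] for v in light_curve]
```

Real sonification tools offer many more mappings (logarithmic scaling, stereo panning, volume), but this linear pitch map is the core idea behind the TESS examples above.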
Sonifications have the potential to be incorporated into every stage of an astronomy career: engaging with the general public, teaching data-oriented science classes, analyzing data, and publishing. Though still in an early stage of adoption, sonification has already enabled several blind astronomers to pursue careers in astronomy that might otherwise have been closed to them. (Check out the research of Garry Foran and Nic Bonne.) And it is not only people with print disabilities who benefit: many learners absorb material better when it engages multiple senses, like the combination of sight and sound.
If you want to try sonifying data or sharing this option with students, the Mikulski Archive for Space Telescopes (MAST) has developed Astronify, a Python library that generates sonifications of time series and spectral data by mapping brightness to pitch. If you are interested in sonifying data, especially MAST archival data, this is a great place to start. Learn to generate your own sonifications through the Astronify documentation and GitHub repository.
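Under the hood, a sonification pipeline like this maps each data point to a pitch and renders a short tone for it. The following self-contained sketch does exactly that with only the Python standard library, writing a small WAV file from a toy light curve. It is a from-scratch illustration of the technique, not Astronify's API — for the real library, use the documentation linked above.

```python
import math
import struct
import wave

RATE = 44100  # audio samples per second

def flux_to_freq(flux, fluxes, f_lo=220.0, f_hi=880.0):
    """Linearly map a flux value onto a pitch range in Hz (A3 to A5)."""
    lo, hi = min(fluxes), max(fluxes)
    if hi == lo:
        return (f_lo + f_hi) / 2
    return f_lo + (flux - lo) / (hi - lo) * (f_hi - f_lo)

def sonify_to_wav(fluxes, path, note_dur=0.2):
    """Render each brightness sample as a short sine tone and save a WAV."""
    frames = bytearray()
    n = int(RATE * note_dur)  # samples per note
    for flux in fluxes:
        freq = flux_to_freq(flux, fluxes)
        for i in range(n):
            amp = 0.4 * math.sin(2 * math.pi * freq * i / RATE)
            frames += struct.pack("<h", int(amp * 32767))  # 16-bit PCM
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(RATE)
        wav.writeframes(bytes(frames))

# Example: a toy light curve with a transit-like dip in the middle.
light_curve = [1.0, 1.0, 0.99, 0.95, 0.95, 0.99, 1.0, 1.0]
sonify_to_wav(light_curve, "light_curve.wav")
```

Playing the resulting file, you hear the pitch drop and recover as the star dims and rebrightens, which is the audible signature of the dip you would otherwise look for in a plot.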
One way to become involved in the sonification community and meet others working on sonification projects is to attend the Sonification World Chat, a monthly meeting of data scientists, designers, and teachers. The community includes both blind and sighted practitioners from a range of science fields, with a core group of astronomers.
Are your eyes exhausted after hours of staring at a screen? Imagine closing your eyes, putting on headphones, and listening to data. Imagine teaching an astronomy class where the students can sonify telescope data as they learn how to analyze it. Imagine giving a lecture where every plot has a corresponding sonification to listen to. As a community, each one of us has the ability to contribute to how accessible or inaccessible astronomy is. I hope you feel inspired to use sonification in your work, whether you are teaching a class or presenting research.