Description
Organiser: James Trayford; co-organisers: Nicolas Bonne, Chris Harrison, Sarah Kane, Daniel Ratliff, Rose Shepherd, Andrew Spencer
Visual approaches have long dominated in astronomy. This may not be so surprising - astronomy has produced some of humanity's most scientifically rich and stunning imagery. However, as astronomy becomes more data intensive, with the volume and complexity (e.g. multi-dimensionality) of data exploding in recent years, traditional visual inspection methods have become increasingly untenable, and ever more work is ceded to complex machine-based approaches that can be difficult to interpret or verify. What's more, reliance on visuals excludes people who are blind or have low vision, and does not cater to those with different sensory preferences or learning styles. Educational research shows that a multisensory approach can reinforce learning more generally, for better learning outcomes. By making use of the strengths of our diverse senses, we may develop new and powerful multi-dimensional interfaces with the data across a range of levels, and even gain new perspectives to discover 'unknown-unknowns' in the data.
With this session, we aim to unite a rapidly growing (but disparate) field of research into multi-modal approaches (e.g. sound and touch as well as vision), and how these can be used to better inspect and communicate astronomical data. Through both talks and practical sessions (divided into two parallel session blocks) we hope to investigate ways of combining approaches, while introducing attendees to these new perspectives on astronomy. In particular, a goal of the session is to move from disparate ideas towards unified approaches and practical tools that can be used in scientific and educational contexts alike.
I will introduce STRAUSS, a free and open-source sonification package for Python, designed to flexibly produce sophisticated sonifications. STRAUSS is intended to integrate into data analysis, and follows cues from tools like plotting packages, so that users can produce sonifications for presentation and analysis. I will discuss how we try to design a 'low-barrier, high-ceiling' tool, intending...
For almost a decade, The Tactile Universe project has been creating multi-sensory resources to give blind and vision impaired students more options for accessing astronomy. While the project is well known for its work creating tactile images of galaxies, from 2020 to 2023 we worked with local VI students and gravitational wave research groups in England, Wales and Scotland to co-create...
Solar eclipses are some of the most spectacular astronomical events on our planet. However, they are usually framed in terms of visual phenomena, which excludes many in the blind and low-vision (BLV) community. The LightSound is a device that was developed for the 2017 North American solar eclipse to sonify the changing light brightness in real time. It has been used in several total and...
The reliability (and nature) of human senses in rigorous scientific observations has been questioned since antiquity, and contemporary philosophy of science continues to portray astrophysicists as narrow-minded scholars, fuelled by certainty.
Drawing on visual ethnographic fieldwork with astrophysicists and instrument scientists (in microscope labs and diverse medical settings), this talk...
Sonifying an image of a galaxy is not a straightforward task. There is a wealth of information encoded in the brightness of each pixel and how brightness changes as a function of location. Typically, brightness is mapped to pitch and spatial information is included via some form of panning across or around the image. While this approach successfully converts data from the visual to the aural...
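The mapping described above (pixel brightness to pitch, spatial position to panning) can be sketched in a few lines. This is an illustrative example only, not the method or code of any particular speaker or package; the function name, frequency range, and linear brightness-to-pitch mapping are all assumptions chosen for clarity.

```python
import numpy as np

def sonify_image_row(row, f_min=220.0, f_max=880.0):
    """Map a row of pixel brightnesses to (frequency, pan) pairs.

    Illustrative sketch: brightness is normalised and mapped linearly
    onto a frequency range (brightness -> pitch), while the pixel's
    horizontal position is mapped onto a stereo pan value in [-1, 1]
    (left to right), implementing a simple pan across the image.
    """
    row = np.asarray(row, dtype=float)
    rng = (row.max() - row.min()) or 1.0   # avoid division by zero
    b = (row - row.min()) / rng            # normalise brightness to [0, 1]
    freqs = f_min + b * (f_max - f_min)    # brightness -> pitch
    pans = np.linspace(-1.0, 1.0, len(row))  # position -> pan
    return list(zip(freqs, pans))

# A bright central "bulge" maps to the highest pitch, panned to centre
events = sonify_image_row([0.1, 0.5, 1.0, 0.5, 0.1])
```

In practice, more sophisticated schemes (logarithmic pitch scales, spectral timbres, or radial scanning paths around the image) address the limitations this abstract goes on to discuss.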
Children with complex disabilities and cognition and learning needs are often overlooked in the provision of STEM (science, technology, engineering and maths) activities. Yet working with such audiences can be hugely valuable to all involved. Even where it will not lead to a career in STEM, STEM offers a unique way to inspire young people. Creating opportunities for disabled children to love...
Astronify is an open-source Python package to sonify one-dimensional datasets, such as lightcurves or spectra, using a pitch mapping technique. The primary goal is to provide sonifications that can be used to preview and analyze astronomical data for research purposes. Each data point is translated into a tone with fully customizable properties like duration, spacing, and pitch mapping parameters. In this...
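The general technique of pitch-mapping a 1-D series, as described above, can be sketched generically. Note this is NOT Astronify's actual API; the function name, parameter names, and logarithmic mapping are assumptions for illustration only.

```python
import numpy as np

def pitch_map_series(values, f_min=200.0, f_max=2000.0,
                     note_duration=0.2, note_spacing=0.05):
    """Sketch of pitch-mapping a 1-D series into a sequence of notes.

    Each sample becomes a tone whose frequency is set by the sample's
    value (log-spaced, so equal ratios sound like equal pitch steps),
    with a fixed duration and spacing per note. Illustrative only;
    not the API of Astronify or any other package.
    """
    v = np.asarray(values, dtype=float)
    rng = (v.max() - v.min()) or 1.0       # avoid division by zero
    norm = (v - v.min()) / rng             # normalise to [0, 1]
    freqs = f_min * (f_max / f_min) ** norm  # logarithmic pitch mapping
    starts = np.arange(len(v)) * (note_duration + note_spacing)
    return [{"t": float(t), "freq": float(f), "dur": note_duration}
            for t, f in zip(starts, freqs)]

notes = pitch_map_series([1.0, 2.0, 4.0])
```

A synthesiser or audio library would then render each note dictionary as a tone at the given start time and frequency.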
Due to its chosen detection techniques, astronomy has always been a visual science, producing some of the most beautiful scientific images, engaging and awe-inspiring to many. However, scientists and communicators often overlook the need to communicate with blind and low-vision audiences, who require different channels to experience this data. This study sonified NASA data of three...