The sounds of space images


The Harvard Gazette reported earlier this year on a project by scientists, science communicators and sound engineers to turn astronomical images into music. The goal is to make scientific data more accessible: to let us experience the images with more senses than seeing alone. This process is called data sonification. Visualization researcher Kimberly Arcand is a science communicator for Chandra, the space-based X-ray telescope. She and a team are working on this project, and the video at top – and most of the other videos on this page – are examples of their work.

We don’t usually connect sound with outer space. There’s no medium such as air in space to convey sound waves, and all sci-fi movie buffs know that the noise we hear as rockets roar past on the screen is put there only to make the action more exciting and (paradoxically) more realistic. In reality, in the vacuum of space, we wouldn’t hear a thing.

But via sonification, data are given the sounds of musical instruments. As you move through the data points, you can play through an orchestral piece of a nebula, a stellar cluster, a supernova remnant or even a galaxy cluster, as observed at different frequencies. Different brightnesses or distances are rendered as different volumes and pitches. For example, bright stars might get more intense notes, and an image of an exploded star swells to a loud crescendo, then settles into calmer sounds as we move outward through the blast wave, which grows fainter the farther out we go.
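Here is a rough sketch, in Python, of how such a brightness-to-sound mapping might work. It is not the pipeline Arcand's team or SYSTEM Sounds actually uses; it simply scans a 2D brightness array from left to right, turns the vertical position of each bright pixel into a pitch and its brightness into loudness, and writes the result to a WAV file. The array shape, frequency range and threshold below are arbitrary choices for illustration.

# A toy sonification: scan an image left to right, map row -> pitch, brightness -> loudness.
# This is an illustrative sketch, not the NASA/Chandra or SYSTEM Sounds method.
import wave
import numpy as np

def sonify(image, out_path="sonification.wav", rate=44100,
           col_duration=0.05, low_hz=200.0, high_hz=1600.0, threshold=0.2):
    """image: 2D array of brightness values in 0..1; row 0 is the top of the picture."""
    rows, cols = image.shape
    freqs = np.geomspace(high_hz, low_hz, rows)    # higher pixels -> higher pitch
    n = int(rate * col_duration)                   # audio samples per image column
    t = np.arange(n) / rate
    chunks = []
    for c in range(cols):
        column = image[:, c]
        tone = np.zeros(n)
        for r in np.where(column > threshold)[0]:  # only pixels bright enough to hear
            tone += column[r] * np.sin(2 * np.pi * freqs[r] * t)
        chunks.append(tone)
    samples = np.concatenate(chunks)
    peak = np.max(np.abs(samples))
    if peak > 0:
        samples = samples / peak                   # global normalization keeps loudness contrast
    pcm = (samples * 32767).astype(np.int16)
    with wave.open(out_path, "wb") as w:           # 16-bit mono WAV via the standard library
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(pcm.tobytes())

# Demo: two crossing diagonal streaks produce a falling and a rising glissando.
demo = np.zeros((64, 120))
for c in range(120):
    demo[c % 64, c] = 1.0
    demo[(63 - c) % 64, c] = 0.6
sonify(demo)

Feeding this sketch a real telescope image, scaled to values between 0 and 1, would produce a crude version of the left-to-right sweeps heard in the videos on this page.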

The video at the top plays the sounds of our Milky Way galaxy as we move through its central region from left to right, in the wavelengths of X-ray, optical and infrared light (data from the space telescopes Chandra, Hubble and Spitzer). The bright region in the lower right is glowing gas and dust in front of the central supermassive black hole, which can be heard as a crescendo in the sound. Arcand told the Harvard Gazette:

It is a way of taking the data and trying to mimic the various parts of the image into sounds that would express it, but also still be pleasing together so that you could listen to it as one mini-composition, one mini-symphony.

Blond smiling woman in black dress with white flowers on it.

Kimberly Arcand is a visualization scientist at Chandra and Harvard, working on making astronomical data accessible. Image via Chandra.

The following three videos are sonifications of three supernova remnants, that is, the clouds of gas hurled outward when a massive star exploded.

Arcand's original intention was to help people with visual impairments experience space through other senses, such as touch and hearing, using virtual reality and 3D simulations. She has since noticed that the approach can be useful for everyone else as well. She said:

When you help make something accessible for a community, you can help make it accessible for other communities as well. That kind of more rounded thought into the design of the materials and the intended outcomes for audiences can have very positive benefits.

(As an aside, we actually do talk about sound waves in connection with gravitational waves, where space itself acts as the medium, compressed and stretched as the waves move through it. That, however, is a somewhat different kind of sound than the kind we refer to in this article.)

The Pillars of Creation in the Eagle nebula are large clouds of molecular hydrogen and dust, and a birthplace of new stars:

The next two videos are of clusters of galaxies. The Bullet cluster provided the first direct evidence for dark matter:

Sonification of a Hubble deep space image, where every speck is a distant galaxy (except the few with spikes, which are foreground stars):

The Helix nebula is a planetary nebula, the remnant left behind after an intermediate-mass star came to the end of its life:

All the videos featured above were made by NASA/CXC/SAO/K. Arcand and SYSTEM Sounds (M. Russo, A. Santaguida), except the last two, which are credited to SYSTEM Sounds (M. Russo, A. Santaguida).

The following video explains more about these videos and how they were made:

Watch and listen to more sonification videos at System Sounds.

Bottom line: Sonification is the process of applying sound to astronomical images. It enables us to use another sense – hearing – to experience the visual data as music. This post contains several sonification videos.

Via Harvard Gazette



from EarthSky https://ift.tt/3lbqaPB
