"There's not the smallest orb which thou behold'st But in his motion like an angel sings..." — William Shakespeare, from The Merchant of Venice

I recently stumbled upon some of NASA's sonifications of the data they've collected over the years from their many interplanetary space probes. In general, these consist of raw sensor information that they have translated into an audible range of frequencies (roughly 20 Hz to 20 kHz). One audio clip in particular caught my attention. It was captured in the vicinity of Jupiter's moon, Europa.

Here we'll dive into the data, the sounds, and at the end, an opportunity to participate in a music cognition experiment.

The Juno Spacecraft and Europa

On August 5, 2011, NASA launched its Juno space probe from Cape Canaveral, Florida. The broad purpose of the mission was to deepen our understanding of the solar system by performing a detailed study of its largest planet, Jupiter. Because of its enormous mass, Jupiter has largely retained its composition from the time of its formation. This makes it an ideal subject for learning about the formation and evolution of our entire solar system.

With its enormous mass (318 times that of Earth) and its constellation of moons (95 at last count), Jupiter is often thought of as a miniature planetary system of its own. Discovered by Galileo Galilei in 1610, the four Galilean moons are the largest and most famous of its moons. The closest to Jupiter is Io, followed by Europa, Ganymede, and Callisto. Of these, Europa has attracted major attention for its water-ice surface and the possibility of a vast ocean of water beneath it. For reference, its diameter is roughly 90% that of Earth's moon.

Ice rafting on Europa. (Public domain: NASA, enhanced by Author)

Plasma Waves

One core component of Juno's mission is to study and map the powerful magnetic fields of Jupiter, which are driven by the movement of metallic hydrogen in its interior. The fields give rise to an enormous magnetic bubble called the magnetosphere that surrounds the planet and traps particles from both the solar wind and from volcanoes on Io. These particles, mostly electrons and ions, form an electrically charged gas called plasma that interacts with bodies passing through it.

Europa, for its part, experiences an induced magnetic field that interacts strongly with the surrounding plasma. This interaction produces plasma waves whose intensity varies as the density of the particles changes, and these waves propagate throughout the magnetosphere.

Juno measures these waves using an instrument called… yes, "Waves." It was designed to measure both the electric and magnetic components of the plasma waves.

The Data

On September 29th, 2022, Juno flew past Europa and was able to record plasma data for roughly an hour and a half. Here is a spectrogram for the electric field data:

NASA's spectrogram of the Europa flyby on September 29th, 2022. Time of day is on the x-axis and frequency in kilohertz is on the y-axis. (Public domain: NASA, modified by Author)

The x-axis is Coordinated Universal Time. The y-axis shows the frequency of the waves in kilohertz. Warmer colors indicate higher intensity. For example, at just before 9:40, there is a strong peak at around 90 kHz. This corresponds to a density of roughly 300 electrons per cubic centimeter.

You might notice that the entire range of frequencies on the y-axis is well above the human hearing range. NASA scientists therefore translated the data into the audible range for listening. They also compressed the time frame from 1.5 hours to 12 seconds. Here is what the result sounds like:

Interesting, yes. Spooky, perhaps. According to my daughter, disturbing. It is certainly telling us something, but it's not the kind of audio file most of us would listen to more than once or twice.
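NASA's two steps, translating the data into the audible band and compressing 1.5 hours into 12 seconds, can be sketched in Python. NASA has not published the exact mapping used for this clip, so the logarithmic frequency mapping below (`to_audible`) is an illustrative assumption, not their actual procedure:

```python
import numpy as np

def to_audible(freqs_khz, f_lo=20.0, f_hi=20_000.0):
    """Log-map measured frequencies (in kHz) into the audible band
    [f_lo, f_hi] Hz.  A hypothetical stand-in for NASA's translation
    step; the mapping they actually used is not specified."""
    f = np.asarray(freqs_khz, dtype=float) * 1e3      # kHz -> Hz
    lo, hi = np.log(f.min()), np.log(f.max())
    t = (np.log(f) - lo) / (hi - lo)                  # 0..1 in log space
    return f_lo * (f_hi / f_lo) ** t                  # log interpolation

# Time compression: 1.5 hours of data rendered as 12 seconds of audio
speedup = (1.5 * 3600) / 12.0                         # a 450x speedup
```

A logarithmic mapping is a natural choice here because human pitch perception is itself roughly logarithmic, but a linear mapping would also preserve the relative ordering of the data.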

Interplanetary Jazz

Many people have a built-in capacity to identify patterns in music that they might otherwise miss when listening to audio of raw or lightly processed data. The challenge for me, when sonifying data, is to find a truly musical expression for it without compromising the integrity of the data. Even simple choices, such as the note range or the musical scale, can have the effect of filtering the data and introducing bias for the listener.

To get at the essence of this signal, I tried several approaches based on Fourier analysis. In this case, the most fruitful technique was to look at where the majority of the energy occurs in each time window. I then quantized the range of the resulting time series to give me 50 notes above and below middle C, for a total of 100 notes. For reference, a grand piano has 88 keys.
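A rough sketch of this windowed dominant-energy approach follows. The window size and the linear quantization are illustrative assumptions, not the exact parameters used:

```python
import numpy as np

def dominant_freqs(signal, fs, win=1024):
    """Frequency (Hz) of the bin carrying the most energy in each
    non-overlapping time window of the signal."""
    freqs = []
    for start in range(0, len(signal) - win + 1, win):
        spec = np.abs(np.fft.rfft(signal[start:start + win]))
        k = int(np.argmax(spec[1:]) + 1)              # skip the DC bin
        freqs.append(k * fs / win)
    return np.array(freqs)

def quantize_to_notes(freqs, n_notes=100, center=60):
    """Map the frequency series onto n_notes MIDI pitches centered on
    middle C (MIDI note 60): 50 notes below, 50 above."""
    lo, hi = freqs.min(), freqs.max()
    idx = np.round((freqs - lo) / (hi - lo) * (n_notes - 1)).astype(int)
    return idx + center - n_notes // 2                # MIDI 10..109
```

For a pure test tone, `dominant_freqs` recovers the tone's frequency in every window; for the plasma data, it tracks the band where the energy concentrates over time.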

Using the Western 12-tone scale does, necessarily, reduce the resolution of the data. However, I avoided the bias that arises from mapping the quantized values to a typical five- or seven-note scale by mapping the converted data directly to a chromatic scale (the Western scale that includes all twelve notes in the octave).
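To see the filtering effect that a scale choice introduces, here is a hypothetical helper (not part of my actual pipeline) that snaps chromatic MIDI notes to C major. Distinct data values collapse onto the same pitch, which is exactly the loss the chromatic mapping avoids:

```python
# Snapping to a 7-note major scale collapses 12 pitch classes to 7,
# the filtering bias that a direct chromatic mapping avoids.
MAJOR = {0, 2, 4, 5, 7, 9, 11}                 # pitch classes of C major

def snap_to_scale(midi_notes, scale=MAJOR):
    """Force each MIDI note to the nearest pitch whose pitch class is
    in the given scale (ties resolve to the lower candidate)."""
    out = []
    for n in midi_notes:
        # search nearby pitches; prefer in-scale, then smallest distance
        snapped = min((n + d for d in range(-6, 7)),
                      key=lambda m: (m % 12 not in scale, abs(m - n)))
        out.append(snapped)
    return out
```

A full chromatic octave of twelve distinct notes survives as only seven distinct pitches after snapping, so roughly 40% of the pitch resolution is discarded before the listener ever hears it.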

Here is what the processed data looks like:

The 12 seconds of NASA data after processing.

When I slowed the processed data down to a little less than half-speed, the resulting sonification reminded me of jazz from the era of Weather Report, the band that featured the legendary bassist Jaco Pastorius, and of Herbie Hancock's piano work from the same period. I heard two reasonably distinct lines happening (you can see the suggestion of this in the image above). I therefore split the data into two voices, an upper range and lower range, and assigned them appropriate instrument sounds. At that point, I could not resist completing the sonic picture by adding a drummer.
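The voice-splitting step is simple to sketch: partition the note stream around middle C, keeping rests so the two voices stay time-aligned. This is a simplified illustration, not the exact processing used:

```python
def split_voices(notes, split=60):
    """Split a MIDI note sequence into upper (>= split) and lower
    voices, using None as a rest so both voices keep the original
    timing.  split=60 is middle C."""
    upper = [n if n >= split else None for n in notes]
    lower = [n if n < split else None for n in notes]
    return upper, lower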

Here is the video I produced to accompany the sonification:

I believe that this audio interpretation of the data represents a case where loosening one's focus helps to discern patterns. I've written extensively on the subject of binning musical data to detect power laws; here we seem to have a similar phenomenon, where reducing our resolution casts the picture in a clearer light.

Making Sense of What We're Hearing

I don't know enough about plasma physics and astrophysics to propose an explanation for the observed stratification of intensity levels. However, it's clear that the magnetic-field and plasma interactions between Jupiter and Europa are complex (Europa, for example, releases plumes of ionized water). We can also add the potential interaction with neighboring Io's plasma torus. Then there are the interdependent couplings between these macroscopic effects and microscopic wave-particle interactions to consider.

In the paper referenced below, William S. Kurth et al. discuss structure in the upper hybrid frequency band from which the sonification is derived. Here is a zoom into the time period around closest approach and highest density:

Figure 3B from the Kurth paper. "E SD" refers to "electric spectral density." (W. S. Kurth, et al.)

It shows evidence of discrete steps in frequency. The authors attribute this step-like behavior to the electron cyclotron frequency, f_ce, which refers to the rate at which electrons spiral around magnetic field lines. From the paper:

"The expanded time scale allows details of the upper hybrid frequency-time structure, particularly near the peak density ∼09:35:30 where these emissions appear to step down in regular frequency increments similar in magnitude to f_ce. This is a common feature of upper hybrid emissions that are organized by harmonics of f_ce and happens when the plasma density, hence f_pe​, decreases (or increases) with respect to the cyclotron frequency."

The plasma frequency they refer to, f_pe, is the natural frequency at which electrons oscillate when displaced from equilibrium. It depends on the electron density. Together, f_ce and f_pe define the upper hybrid frequency f_UH by relating electron density and magnetic field strength:

f_UH = √(f_pe² + f_ce²)
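In code, the relationship looks like the following. The density formula f_pe ≈ 8980·√(n_e) Hz (for n_e in electrons per cubic centimeter) is the standard plasma-physics approximation, included here for context rather than taken from the paper:

```python
import math

def f_upper_hybrid(f_pe, f_ce):
    """Upper hybrid frequency: f_UH = sqrt(f_pe^2 + f_ce^2).
    Works in any frequency unit, as long as both inputs match."""
    return math.hypot(f_pe, f_ce)

def f_plasma(n_e_per_cc):
    """Electron plasma frequency in Hz from electron density in cm^-3,
    using the standard approximation f_pe ~ 8980 * sqrt(n_e)."""
    return 8980.0 * math.sqrt(n_e_per_cc)
```

Note that when the density (and hence f_pe) is much larger than f_ce, f_UH is dominated by f_pe, which is why the upper hybrid band serves as a good tracer of electron density.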

This is as far as I can reasonably get by way of explanation. The paper does not mention a broad binary stratification of frequencies that occurs in the upper hybrid frequency band over *almost* the entire 1.5 hour recording. I say "almost" because the piano drops out for about the last 11 minutes of data (actual time), leaving us with just the funky bass line groove. Perhaps with a few more minutes of data, the piano would return. On the other hand, if the pianist indeed left the bandstand, Juno's position at that time could offer us a clue about the structure we are hearing.

If you happen to be an astrophysicist, it would be terrific to hear your thoughts on possible explanations for the apparent stratification!

Coda: The Challenge

This investigation took an interesting turn when, after producing the music, I went back and listened to the original NASA file at the somewhat slower speed at which I had rendered the music. I had the definite impression of listening to the Herbie and Jaco sonification embedded in the signal! My wife experienced the same impression (she has also had musical training).

Now I'm curious as to whether the phenomenon of not being able to "unhear" the jazz in the signal is a result of consistent and prolonged exposure to the music or whether this is a more interesting cognitive phenomenon.

If you're interested, we can perform an informal experiment right here on Medium.

Here is a mix of the sonification without drums along with the raw audio. Try listening to it twice (or more, if you like):

Now, here is the raw audio by itself:

If you can hear the suggestion of the piano or bass lines while listening to the raw audio, please share your experience. I'd be really interested to know!

Down the Rabbit Hole…

Thank you for reading! If you enjoyed this piece, please consider following me and hitting the "Applause" icon as many times as you'd like. You can also subscribe to get my latest content straight to your inbox. I write regularly on fun topics in math, music, and science.
