
Data Sonification: Hearing Information

Written by EvolveKev | Sep 14, 2020 3:13:58 PM

Recently we visualized a month's worth of global earthquake data in a short minute-and-a-half spatial animation. I was happy with the result, but I felt it was missing something. The visualization felt clinical, and it didn't convey the unfolding drama I saw in the dataset. So, we upgraded our visual by adding another layer of information storytelling - a soundtrack.

 

The soundtrack - or sonification - was built using various data points relating to earthquakes and carefully arranged to synchronize with the animated visualization. The result: everything you hear is a piece of data reflected in the visualization. Take a listen below.

 

 

Here's a basic overview of the sonification (with a rough code sketch of the mapping after the list).

  • Each bass note represents an hour (which serves to keep the tempo).
  • The volume of the hi-hat corresponds to the combined magnitude of the earthquakes occurring during that hour.
  • The number of sounds you hear at any given time reflects the actual number of earthquakes occurring during that hour.
  • Vocal chants indicate a high volume of earthquakes (more than six individual quakes in an hour).
  • The drone sound (most audible in the video intro and outro, though it persists through the entire video) is the audio profile of an actual earthquake passed through a synthesizer to add harmonic content.
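
If you're curious what that mapping might look like in code, here's a minimal sketch in Python. It isn't the project's actual code - the bucket structure, sound names, and scaling factors are placeholders - but it captures the hour-by-hour logic described above.

    # Rough sketch: turn hourly earthquake data into simple "note events".
    # Field names, sound names, and scaling factors are illustrative only.
    from dataclasses import dataclass

    @dataclass
    class HourBucket:
        hour: int                 # hour index within the month
        magnitudes: list[float]   # magnitudes of the quakes in that hour

    def hour_to_events(bucket: HourBucket) -> list[dict]:
        events = [{"sound": "bass", "hour": bucket.hour}]   # one bass note per hour keeps the tempo
        combined = sum(bucket.magnitudes)
        events.append({"sound": "hihat", "hour": bucket.hour,
                       "volume": min(1.0, combined / 40.0)})  # louder hi-hat for bigger combined magnitude
        for mag in bucket.magnitudes:                         # one hit per individual quake
            events.append({"sound": "hit", "hour": bucket.hour, "magnitude": mag})
        if len(bucket.magnitudes) > 6:                        # vocal chant for busy hours
            events.append({"sound": "chant", "hour": bucket.hour})
        return events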

 

How Was the Data Sonified?

 

I'll save the particulars for another blog post, but we used Tableau to build a data map from data provided by the European-Mediterranean Seismological Centre, Excel to create a "score," and Ableton Live to record the arrangement. We used various synthesizers to design the individual sounds.
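
For a rough idea of the "score" step - binning the raw quake list into hours before arranging it - here's a Python equivalent of what we did in Excel. The file and column names ("time", "magnitude") are assumptions about the EMSC export, not the exact ones we used.

    # Sketch of the "score" step: bin a raw earthquake export into hourly rows.
    # File and column names are assumptions about the EMSC export format.
    import pandas as pd

    quakes = pd.read_csv("emsc_quakes.csv", parse_dates=["time"])

    score = (
        quakes
        .groupby(pd.Grouper(key="time", freq="1H"))
        .agg(quake_count=("magnitude", "size"),
             combined_magnitude=("magnitude", "sum"))
        .reset_index()
    )

    # Hours with more than six quakes trigger the vocal chants.
    score["chant"] = score["quake_count"] > 6

    score.to_csv("sonification_score.csv", index=False)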

 

We gave the soundtrack a dramatic tone to heighten the intensity of the data - a buddy described it as a John Carpenter-esque vibe, so we'll go with that. It's a little ominous and '80s-sounding.

 

Let us know what you think!