Make music live again!

Alexander Shevelev
Mar 24, 2019 · 3 min read

What it is we’re going to achieve

In this short article, I'm going to show how we can provide a visual representation of the currently playing audio. It's not very hard, especially once you know how to do it: the Android SDK has all the tools we need.

First of all, let’s take a look at what it is we’re going to achieve:

We're going to implement a simple visualizer of the currently playing audio (the central one in the screenshot above; you can find the other two in the source code for this article).

A quick look at Visualizer class

For this purpose, we are going to use the android.media.audiofx.Visualizer class, which was introduced in API 9 (aka "Gingerbread"). It requires the android.permission.RECORD_AUDIO permission, which makes sense, because we are capturing an audio stream. How to obtain this permission is a topic for a separate talk and is beyond the scope of this article (you can find it in the official documentation, for example).

So, let’s start from the beginning and take a quick look at a utility class for playing sound.

It's very easy to understand, but pay attention to the last function: getAudioSessionId(). As you might guess, it returns the id of the current audio session, and we'll use this id to link the playing audio stream to the visualizer.
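The actual class lives in the article's linked source code; a minimal sketch of such a player might look like the following. The class and method names here are my own, built on the standard MediaPlayer API:

```kotlin
import android.content.Context
import android.media.MediaPlayer

// A hypothetical sound-playing utility; only getAudioSessionId()
// matters for the visualizer later on.
class SoundPlayer(private val context: Context) {
    private var mediaPlayer: MediaPlayer? = null

    fun play(rawResId: Int) {
        stop()
        mediaPlayer = MediaPlayer.create(context, rawResId)?.apply {
            isLooping = true
            start()
        }
    }

    fun stop() {
        mediaPlayer?.run {
            stop()
            release()
        }
        mediaPlayer = null
    }

    // The id of the current audio session; we'll pass it to the
    // visualizer to link it to the stream this player produces.
    fun getAudioSessionId(): Int = mediaPlayer?.audioSessionId ?: 0
}
```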

Let’s start visualizing

And here it is — the base class for all visualizers.

Please pay attention to a few key points.

First of all, we inherit this class from View, because we're going to draw a visual representation of the sound on a canvas.

Second, as mentioned above, we use the android.media.audiofx.Visualizer class to get the source data for the representation. We should free its resources when we no longer need to visualize audio, which we do by calling its release() method.
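The full base class is in the linked source; the skeleton below is a sketch covering just those two points, with class and field names of my own choosing:

```kotlin
import android.content.Context
import android.media.audiofx.Visualizer
import android.util.AttributeSet
import android.view.View

// A hypothetical base class for all visualizers: it is a View (so we
// can draw on a canvas) and owns a Visualizer that must be released.
abstract class BaseVisualizerView @JvmOverloads constructor(
    context: Context,
    attrs: AttributeSet? = null
) : View(context, attrs) {

    protected var visualizer: Visualizer? = null

    // A copy of the last captured waveform, filled by the capture listener
    protected var waveform: ByteArray? = null

    // Call this when we no longer need to visualize audio
    fun release() {
        visualizer?.release()
        visualizer = null
    }
}
```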

And here the magic begins

All the magic happens in the setAudioSessionId(…) method. Let's take a closer look at it.

First, we create an instance of the Visualizer class (pay attention to the audioSessionId parameter; it lets us link our audio stream to the visualizer).

After that, we define the size of the output array of audio data by setting the captureSize property (I use the maximum possible value, which is 1024).

Then we set up a listener for the audio data via the setDataCaptureListener(…) method. Let's look at it more closely. First of all, you should know that the listener has two callbacks for retrieving audio data. The first one, onWaveFormDataCapture, returns waveform data (based on the sound's amplitude), and the second one, onFftDataCapture, returns frequency data (the result of a fast Fourier transform).

We only need one of them, and my personal choice is onWaveFormDataCapture. So we must turn it on and turn onFftDataCapture off, which is done with the last two (boolean) parameters of setDataCaptureListener().

Let's dive into the implementation of onWaveFormDataCapture. It's simple: we get the sound data, make a copy of it (a key point, because the array is only valid inside the listener), and call the invalidate() method to redraw our view.
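Putting these steps together, a sketch of setAudioSessionId(…) might look like this (the waveform field and the release() helper are names I'm assuming for the view, not necessarily the article's exact code):

```kotlin
fun setAudioSessionId(audioSessionId: Int) {
    release() // drop the previous visualizer, if any

    visualizer = Visualizer(audioSessionId).apply {
        // The largest supported capture size (1024 at the time of writing)
        setCaptureSize(Visualizer.getCaptureSizeRange()[1])

        setDataCaptureListener(
            object : Visualizer.OnDataCaptureListener {
                override fun onWaveFormDataCapture(
                    visualizer: Visualizer, data: ByteArray, samplingRate: Int
                ) {
                    // Copy the data: the array is only valid inside the callback
                    waveform = data.copyOf()
                    invalidate()
                }

                override fun onFftDataCapture(
                    visualizer: Visualizer, fft: ByteArray, samplingRate: Int
                ) {
                    // Never called: FFT capture is disabled below
                }
            },
            Visualizer.getMaxCaptureRate() / 2, // capture rate, in milliHertz
            true,  // enable waveform capture
            false  // disable FFT capture
        )

        setEnabled(true)
    }
}
```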

Let’s start to draw

Ok, we have the source data, so let's draw it! It's actually pretty simple:

We create a Paint object for drawing, calculate the points to draw, and connect them with lines. That's all, folks! The source code for this article can be found here.
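As a sketch of that drawing step, assuming the copied capture bytes are kept in a `waveform: ByteArray?` field of the view (a name of my own choosing), an onDraw implementation could look like this:

```kotlin
import android.graphics.Canvas
import android.graphics.Color
import android.graphics.Paint

private val paint = Paint().apply {
    color = Color.WHITE
    strokeWidth = 2f
    isAntiAlias = true
}

// Reused buffer: 4 floats (x0, y0, x1, y1) per line segment
private var points: FloatArray? = null

override fun onDraw(canvas: Canvas) {
    val data = waveform ?: return
    if (data.size < 2) return

    if (points == null || points!!.size < data.size * 4) {
        points = FloatArray(data.size * 4)
    }
    val pts = points!!

    val step = width / (data.size - 1).toFloat()
    for (i in 0 until data.size - 1) {
        // Waveform bytes are unsigned (0..255); map them to the view height
        pts[i * 4] = i * step
        pts[i * 4 + 1] = height * ((data[i].toInt() and 0xFF) / 255f)
        pts[i * 4 + 2] = (i + 1) * step
        pts[i * 4 + 3] = height * ((data[i + 1].toInt() and 0xFF) / 255f)
    }
    canvas.drawLines(pts, 0, (data.size - 1) * 4, paint)
}
```

Reusing one FloatArray and a single drawLines(…) call keeps onDraw allocation-free, which matters since the capture listener invalidates the view many times per second.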


Alexander Shevelev

An Android developer from Yandex LLC, Moscow, Russia. That’s all.