Sep 29, 2015
How to create a music visualizer that processes an MP3 file and streams the data in realtime, triggering multiple smartphone flashes in time with the music.

In this tutorial, we’re going to build a captivating music visualizer using smartphone flashes. The distributed application, called CymaticRadio.xyz, takes advantage of the free APIs from PubNub and Echo Nest. Any number of iOS phones can subscribe to a channel playing a song controlled through a web music player, and the web application coordinates strobes to each subscriber so that all phones work in harmony to create a potentially giant music visualizer.

The full GitHub repository for the project is available here.

With that, let’s review the APIs used for the project, then dig deeper into how the application works.

Echo Nest for Processing

Echo Nest is an API that lets you upload a song and receive a JSON file containing data points for the entire track. It’s easy to use, and you don’t need a background in music technology to build an application around it.
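As a rough sketch of what that JSON looks like, here is an abridged, made-up analysis payload and a small helper for picking the strongest pitch class in a segment. The field names follow the public Echo Nest analyze format, but the values and the helper are illustrative, not taken from the project:

```python
# Abridged, made-up sketch of an Echo Nest track analysis. Each segment
# carries 12 "pitches" values: chroma strengths for C, C#, ..., B,
# each in the range 0-1.
analysis = {
    "segments": [
        {
            "start": 0.0,          # seconds into the track
            "duration": 0.25,      # segment length in seconds
            "loudness_max": -12.3, # peak loudness in dB
            "pitches": [0.1, 0.9, 0.2, 0.05, 0.1, 0.3,
                        0.0, 0.4, 0.1, 0.2, 0.15, 0.05],
        },
    ]
}

def dominant_pitch(segment):
    """Index (0 = C ... 11 = B) of the strongest chroma bin in a segment."""
    pitches = segment["pitches"]
    return max(range(len(pitches)), key=pitches.__getitem__)
```

A player can walk the `segments` list in time order and use the dominant pitch of each segment to decide which channel to flash.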

PubNub for Data Streaming

The PubNub Data Streams API powers all communication between the web application and the smartphones. The core concept of the application is that users can create “channels,” which aligns perfectly with PubNub’s architecture, where applications can subscribe and publish to any number of channels in realtime.

How the Music Visualizer Works

Each phone, or “subscriber,” registers to a user-input channel that corresponds to a user-defined channel on the web music player. Upon establishing a connection, the web server responds with a different channel for the phone to subscribe to, corresponding to a subset of the possible pitches in a given song. As more phones subscribe, the pitches are divvied out across more channels.

For example, if only one phone registered, all possible pitches would be directed to one channel; if hundreds of phones subscribed, each possible pitch would be directed to its own channel, with several phones subscribed to each.
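One hypothetical way to implement that divvying, assuming 12 pitch classes and round-robin assignment (the function name and strategy are illustrative; the project’s actual server-side logic may differ):

```python
def assign_channels(num_subscribers, num_pitches=12):
    """Split the pitch classes across per-pitch channels.

    With 1 subscriber, all pitches land on a single channel; with
    num_pitches or more subscribers, each pitch gets its own channel.
    Returns a dict mapping channel index -> list of pitch indices.
    """
    num_channels = min(max(num_subscribers, 1), num_pitches)
    channels = {i: [] for i in range(num_channels)}
    for pitch in range(num_pitches):
        # Round-robin: pitch 0 -> channel 0, pitch 1 -> channel 1, ...
        channels[pitch % num_channels].append(pitch)
    return channels
```

With one phone, `assign_channels(1)` puts all 12 pitches on channel 0; with hundreds of phones, every pitch maps to its own channel and excess phones share channels.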

*Smartphone camera flashes triggered to the music beat*

Users upload an MP3 to the web application, which sends it to Echo Nest for analysis. Once analysis completes, the web player steps through the resulting JSON over time, triggering messages on the pitch channels that have been registered. Each message contains the intensity of the flash (which also corresponds to pitch) and its duration.
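A minimal sketch of what such a flash message might look like before it is published to a pitch channel. The field names here are illustrative, not the project’s exact wire format:

```python
import json

# Hypothetical flash message for one pitch channel. In the real app this
# payload would be published via PubNub to the channel named below.
flash_message = {
    "channel": "pitch-3",     # which per-pitch channel to publish on
    "intensity": 0.8,         # flash brightness; also encodes pitch
    "duration_ms": 120,       # how long the flash stays lit
    "sent_at": 1443484800.0,  # sender timestamp (epoch seconds)
}
payload = json.dumps(flash_message)
```

Including the sender timestamp lets each phone measure delivery time, which matters for the synchronization scheme described next.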

Messages are sent 2 seconds before the corresponding moment in the music is played, providing a buffer to absorb latency from varying network conditions. Each client phone receives a message with the flash information and subtracts the message’s delivery time from that 2-second buffer before firing. This keeps all client phones synchronized and ensures flashes fire in the correct order.
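The delay each phone should wait can be sketched as a small helper. This is a hypothetical function mirroring the scheme above, not code from the project:

```python
BUFFER_S = 2.0  # messages lead the audio by 2 seconds

def flash_delay(sent_at, received_at, buffer_s=BUFFER_S):
    """Seconds a phone should wait before firing its flash.

    Subtracts the observed network transit time from the fixed buffer,
    clamping at zero if the message arrives later than the buffer allows.
    """
    transit = received_at - sent_at
    return max(buffer_s - transit, 0.0)
```

A message that took 0.3 s to arrive is held for another 1.7 s, so every phone fires at the same wall-clock moment regardless of its individual network latency.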

Scott Lobdell is a software engineer. Check out Scott’s blog for a wide variety of technical tutorials and demos.
