I'm curious how he was able to synchronize to the audio. I've found that with at least Chrome on OSX, the audio tag's callbacks have a varying delay relative to the audio. It seems to be at the OS level and beyond Chrome's control. The delay mostly goes away the second time through, as if whatever's actually playing the mp3 is cached and primed.
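One way around flaky callback timing (a sketch of my own, not necessarily what this library does) is to ignore the callbacks entirely and derive the visual state from `audio.currentTime` on each animation frame, so the playback clock itself drives the drawing. The `frameForTime` helper and the `player`/`draw` names below are illustrative:

```javascript
// Pure helper: map the playback position (seconds) to the index of a
// precomputed analysis frame, given the analysis frame rate.
function frameForTime(currentTime, framesPerSecond) {
  return Math.max(0, Math.floor(currentTime * framesPerSecond));
}

// Browser usage (assumes an <audio> element `player` and a draw(frame) fn):
// function tick() {
//   draw(frameForTime(player.currentTime, 60));
//   requestAnimationFrame(tick);
// }
// requestAnimationFrame(tick);
```

Since the position is re-read every frame, any startup delay in the underlying audio pipeline self-corrects instead of accumulating.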
very nice. good to see this as an open js library which uses canvas. seeing these kinds of visualisations always makes me wonder though... is the amplitude of the spectrum all the info we can obtain and use for visualisation? how difficult would it be to make an api that can detect the different types of sound in a sample and put out a separate data stream for each?
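There's a bit more available than raw spectrum amplitude: the Web Audio `AnalyserNode` exposes both frequency-domain and time-domain data, and a crude version of "different data streams per sound type" is to split the spectrum into coarse bands (bass / mids / treble) and feed each to a separate visual element. A minimal sketch, where `splitIntoBands` and `player` are my own illustrative names:

```javascript
// Pure helper: average an FFT byte array into `numBands` equal buckets.
function splitIntoBands(spectrum, numBands) {
  const bandSize = Math.floor(spectrum.length / numBands);
  const bands = [];
  for (let b = 0; b < numBands; b++) {
    let sum = 0;
    for (let i = b * bandSize; i < (b + 1) * bandSize; i++) sum += spectrum[i];
    bands.push(sum / bandSize);
  }
  return bands;
}

// Browser usage (assumes an <audio> element `player`):
// const ctx = new AudioContext();
// const analyser = ctx.createAnalyser();
// ctx.createMediaElementSource(player).connect(analyser);
// analyser.connect(ctx.destination);
// const freq = new Uint8Array(analyser.frequencyBinCount);
// analyser.getByteFrequencyData(freq);   // frequency-domain snapshot
// const [bass, mids, treble] = splitIntoBands(freq, 3);
```

Detecting actual instrument or sound types would need real source-separation or classification, which is much harder than band-splitting.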
Someone has gotten it working with the soundcloud api to pull in songs from soundcloud and visualize them; there's an example up on the repo :) or do you mean natively on the soundcloud website? (something else I'd love to do!)