I’m currently working on a music app for tablets and needed a metronome to synchronize musical events. Simple enough, right? Well, it turns out that creating an accurate time reference in ActionScript is not as easy as I initially thought it would be.
If you are facing similar timing problems, I encourage you to read on…
My initial thought, which seemed reasonable, was to use a `Timer` object. After all, you can set the delay in milliseconds. It has to be precise, right? Well… not quite. As I discovered, there is one major issue with timers: they are tied to the frame rate. Wait, what?!
What this means is that the `TIMER` event will actually fire in sync with the closest `ENTER_FRAME` event. This can mean missing the mark by as much as 50-100 milliseconds, depending on the frame rate. For casual uses, this might be fine, but for music, this is a BIG problem…
Even worse, the effect of that inaccuracy is cumulative. After a minute of such errors, your timing could be off by seconds… Here are the results of a simple test I ran. I triggered the timer 240 times at 250 ms intervals (the equivalent of 240 BPM) for a total duration of one minute. Here is the total drift observed over that 60-second period:
- 0.9 seconds (with a frame rate of 120 fps)
- 1.7 seconds (with a frame rate of 60 fps)
- 4.0 seconds (with a frame rate of 30 fps)
- 7.2 seconds (with a frame rate of 12 fps)
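For reference, a test like the one above can be reproduced with just a few lines. This is my own sketch, not the original test code, and the class and listener names are mine:

```actionscript
package {
    import flash.display.Sprite;
    import flash.events.TimerEvent;
    import flash.utils.Timer;
    import flash.utils.getTimer;

    public class DriftTest extends Sprite {
        // 250 ms × 240 ticks = one minute at 240 BPM
        private var _timer:Timer = new Timer(250, 240);
        private var _startTime:int;

        public function DriftTest() {
            _timer.addEventListener(TimerEvent.TIMER_COMPLETE, onComplete);
            _startTime = getTimer();
            _timer.start();
        }

        private function onComplete(e:TimerEvent):void {
            var elapsed:int = getTimer() - _startTime;
            // Ideally 60000 ms; anything beyond that is accumulated drift.
            trace("Drift: " + (elapsed - 60000) + " ms");
        }
    }
}
```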
However, fixing this drift was actually not that hard. I simply saved the start time and adjusted the interval to compensate for the drift each time the `TIMER` event fired. Still, I did not expect to have to do that.
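The compensation idea can be sketched as follows. Instead of trusting the nominal delay, compute when the next tick *should* fire in absolute time (using `getTimer()`) and stretch or shrink the next delay accordingly. Variable names here are mine, not from the actual class:

```actionscript
private var _startTime:int;          // getTimer() value when the timer started
private var _interval:Number = 250;  // nominal interval in ms
private var _count:uint = 0;         // ticks fired so far

private function onTick(e:TimerEvent):void {
    _count++;
    // Absolute time at which the *next* tick should fire.
    var target:Number = _startTime + (_count + 1) * _interval;
    // Adjust the delay so the next tick lands on that target,
    // cancelling out whatever drift has accumulated so far.
    Timer(e.currentTarget).delay = Math.max(1, target - getTimer());
    // ...trigger the musical event here...
}
```

The key point is that each delay is recomputed from the original start time, so errors cannot accumulate; each tick only carries its own local inaccuracy.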
Fixing the accuracy problem proved to be much harder. I ran more tests and found that the standard deviation of the timer’s accuracy was around 45 ms at a “normal” frame rate of 60 fps. In other words, completely unacceptable for music. What could be done? With the `Timer` object depending on the frame rate, it seemed the only hope was to raise the frame rate high enough to mask the problem. But even at 120 fps, I could still measure a standard deviation of about 25 ms. Furthermore, using such a high frame rate meant a much higher (and unnecessary) hit on the processor. Hmmm…
I dug around and found a hint on StackOverflow from Joa Ebert, one of the authors of the amazing AudioTool. He suggested looking into the `SampleDataEvent` of the `Sound` object. The idea is that, unlike timers, sound playback is not tied to the frame rate, so we can use that to our advantage. Basically, you create a new, empty sound and play it. The player, having no data to play, then triggers the `SampleDataEvent`, requesting us to feed it playable data. By carefully providing it with the right amount of samples to play, I was able to use it as a timer.
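Here is a hedged sketch of that technique. The class name, buffer size, and tick callback are my own assumptions for illustration; the actual implementation may differ. The core idea is that at 44,100 samples per second, counting the samples you write gives you a sample-accurate clock:

```actionscript
package {
    import flash.events.SampleDataEvent;
    import flash.media.Sound;
    import flash.media.SoundChannel;

    public class SampleClock {
        private const SAMPLE_RATE:Number = 44100;   // samples per second
        private var _sound:Sound = new Sound();
        private var _channel:SoundChannel;
        private var _samplesPerTick:Number;
        private var _samplesUntilTick:Number;

        public function SampleClock(intervalMs:Number) {
            _samplesPerTick = intervalMs / 1000 * SAMPLE_RATE;
            _samplesUntilTick = _samplesPerTick;
            _sound.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
            // An empty Sound has no data of its own, so playback
            // keeps requesting samples from our handler.
            _channel = _sound.play();
        }

        private function onSampleData(e:SampleDataEvent):void {
            // Each event must receive between 2048 and 8192 sample frames,
            // or playback stops. We feed silence and count samples down
            // to know exactly when the next tick falls.
            for (var i:int = 0; i < 2048; i++) {
                e.data.writeFloat(0); // left channel (silence)
                e.data.writeFloat(0); // right channel (silence)
                if (--_samplesUntilTick <= 0) {
                    _samplesUntilTick += _samplesPerTick;
                    onTick();
                }
            }
        }

        private function onTick():void {
            // ...dispatch the metronome tick to listeners...
        }
    }
}
```

Note that the tick is *detected* with sample accuracy, but the notification still reaches your code in batches of 2048 samples (about 46 ms of audio), which likely explains why some jitter remains.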
I ran new tests and was pleased to see that I could now achieve a timing standard deviation of around 10 ms (no matter the frame rate). Yeah! However, while this is much better, I still find it a bit high and would welcome any ideas on how to improve it.
In any case, I decided to share the resulting `Metronome` class, which uses the approach explained above. I hope it can be useful to you too.