Anyone who knows my programming style knows that I live and breathe with the enterFrame event. Even in AS3 projects, with not a timeline in sight, I still addEventListener(Event.ENTER_FRAME, onEnterFrame). It’s mostly due to old habits dying hard, not being able to teach an old dog new tricks, maybe a few other cliches.
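For the record, the pattern I'm talking about is nothing more than this (a bare-bones sketch, with my own names for the class and handler):

[as]package {
	import flash.display.Sprite;
	import flash.events.Event;

	public class FrameLoop extends Sprite
	{
		public function FrameLoop()
		{
			// fires once per frame, at roughly the swf's frame rate
			addEventListener(Event.ENTER_FRAME, onEnterFrame);
		}

		private function onEnterFrame(event:Event):void
		{
			// all the per-frame logic goes here
		}
	}
}[/as]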
Most of the new kids are using timers these days, not that there's anything wrong with that… But I want to clear up one vital misconception that almost everyone repeats when explaining why they use timers instead of enterFrame:
“Timers are more accurate than enterFrame.”
In other words, using enterFrame, you are at the mercy of a potentially slow CPU, which can destroy your frame rate, whereas with timers, a millisecond is a millisecond; it doesn't depend on frame rate. OK, I'm not saying that's a totally false statement, but there's a concept behind it that needs some investigation:
“Timers are accurate.”
In other words, people believe that if they set a timer with a 1000-millisecond interval, their handler is going to run pretty damned close to every 1000 milliseconds. Well, let's do a simple test. We'll set up a timer to run every 1000 milliseconds and trace out how much time has elapsed since the last time the function ran. The expected result is that it will trace out 1000 each time it runs.
[as]package {
	import flash.display.Sprite;
	import flash.events.TimerEvent;
	import flash.utils.getTimer;
	import flash.utils.Timer;

	public class Tests extends Sprite
	{
		private var start:Number;

		public function Tests()
		{
			var timer:Timer = new Timer(1000);
			timer.addEventListener(TimerEvent.TIMER, onTimer);
			start = getTimer();
			timer.start();
		}

		private function onTimer(event:TimerEvent):void
		{
			trace(getTimer() - start);
			start = getTimer();
		}
	}
}[/as]
OK, I get an average of about 1020 milliseconds. 2% over. That’s not bad. I could actually live with that and say that is damned accurate. But now let’s add some stuff into the function.
[as]package {
	import flash.display.Sprite;
	import flash.events.TimerEvent;
	import flash.utils.getTimer;
	import flash.utils.Timer;

	public class Tests extends Sprite
	{
		private var start:Number;

		public function Tests()
		{
			var timer:Timer = new Timer(1000);
			timer.addEventListener(TimerEvent.TIMER, onTimer);
			start = getTimer();
			timer.start();
		}

		private function onTimer(event:TimerEvent):void
		{
			trace(getTimer() - start);
			start = getTimer();
			for(var i:uint = 0; i < 1000000; i++)
			{
				var j:Number = Math.random();
			}
			trace("elapsed time in function: " + (getTimer() - start));
		}
	}
}[/as]
Whoa. Now we are up to around 1600 milliseconds! That's 60% over. Horrible. OK, we have an exaggeratedly huge loop there, but still, it was pretty easy to blow the illusion of accuracy out of the water.
What's happening here? I don't have the technical details behind it, but it seems that timers work in pretty much the same way that setInterval worked way back in AS2. The interval itself is pretty accurate, but the next interval doesn't start until the timer is done processing all of its handlers. In other words:
[as]1. Start timer.       0
2. Timer completes.    1000
3. Start handlers.     1000
4. Handlers complete.  1600
5. Start timer.        1600
6. Timer completes.    2600
...[/as]
So the time between 1 and 2 is perfectly accurate, as far as I can tell. But the timer doesn't start timing again until 3 and 4 are done, so the elapsed time of the handler function gets added to the overall interval.
At first, this kind of surprised me. I'd done the same tests with setInterval in AS2 and knew it was happening there, but had kind of assumed that the new Timer class was different. Looks like it's built on the same foundation, though. I haven't tried it yet, but I assume that a timer, which can have multiple handlers, waits until all of its handlers are complete before restarting.
I'm not complaining. In fact, after thinking it over, it seems like it might be a necessity. What if you had a timer handler that took longer to execute than the timer's interval? The next interval would be called before the last one was finished, and you'd have multiple handlers running simultaneously, which I don't think is even possible, since ActionScript is not threaded. Forcing all handlers to finish before running them again avoids that situation.
So I'm not saying it's a bad thing. Just realize that timers are susceptible to being slowed down by intensive code and slow CPUs, just like enterFrame is. So if you are expecting to get a perfectly accurate "frame rate" by using timers, forget about it.
If you are doing some kind of timing-crucial simulation or game where the accuracy of the speed of animation is vital, the way to do it is to use a time-based animation setup. This checks the actual time elapsed since the last frame and adjusts the motion or animation based on that. I covered that technique in Foundation ActionScript Animation: Making Things Move!, and it's been covered countless other times in countless other places.
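To make that concrete, here's a minimal sketch of the time-based approach (the class name, variable names, and numbers are just mine for illustration):

[as]package {
	import flash.display.Sprite;
	import flash.events.Event;
	import flash.utils.getTimer;

	public class TimeBasedMotion extends Sprite
	{
		private var ball:Sprite;
		private var speed:Number = 100; // pixels per second, regardless of frame rate
		private var lastTime:int;

		public function TimeBasedMotion()
		{
			ball = new Sprite();
			ball.graphics.beginFill(0xff0000);
			ball.graphics.drawCircle(0, 0, 20);
			ball.graphics.endFill();
			addChild(ball);

			lastTime = getTimer();
			addEventListener(Event.ENTER_FRAME, onEnterFrame);
		}

		private function onEnterFrame(event:Event):void
		{
			// measure how much time actually passed since the last frame
			var now:int = getTimer();
			var elapsed:Number = (now - lastTime) / 1000; // in seconds
			lastTime = now;

			// scale the motion by elapsed time, so the ball moves at
			// the same real-world speed even if the frame rate chokes
			ball.x += speed * elapsed;
		}
	}
}[/as]

If a frame takes 100 milliseconds instead of the 33 or so you asked for, the ball simply moves three times as far on that frame, and the overall motion stays on schedule.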