Anyone who knows my programming style knows that I live and breathe with the enterFrame event. Even in AS3 projects, with not a timeline in sight, I still addEventListener(Event.ENTER_FRAME, onEnterFrame). It’s mostly due to old habits dying hard, not being able to teach an old dog new tricks, maybe a few other cliches.
Most of the new kids are using timers these days, not that there’s anything wrong with that… But I want to clear up one vital misconception that almost everyone repeats when explaining why they use timers instead of enterFrame:
“Timers are more accurate than enterFrame.”
In other words, using enterFrame, you are at the mercy of a potentially slow CPU, which can destroy frame rate, whereas with timers, a millisecond is a millisecond; it doesn’t depend on frame rate. OK, I’m not saying that’s a totally false statement, but there’s a concept behind it that needs some investigation:
“Timers are accurate.”
In other words, people believe that if they set a timer with 1000 millisecond interval, their handler is going to run pretty damned close to every 1000 milliseconds. Well, let’s do a simple test. We’ll set up a timer to run every 1000 milliseconds, and trace out how much time has elapsed since the last time the function ran. The expected result is that it’s going to trace out 1000 each time it runs.
[as]package {
    import flash.display.Sprite;
    import flash.events.TimerEvent;
    import flash.utils.getTimer;
    import flash.utils.Timer;

    public class Tests extends Sprite
    {
        private var start:Number;

        public function Tests()
        {
            var timer:Timer = new Timer(1000);
            timer.addEventListener(TimerEvent.TIMER, onTimer);
            start = getTimer();
            timer.start();
        }

        private function onTimer(event:TimerEvent):void
        {
            // how long since the last tick? (expected: 1000)
            trace(getTimer() - start);
            start = getTimer();
        }
    }
}[/as]
OK, I get an average of about 1020 milliseconds. 2% over. That’s not bad. I could actually live with that and say that is damned accurate. But now let’s add some stuff into the function.
[as]package {
    import flash.display.Sprite;
    import flash.events.TimerEvent;
    import flash.utils.getTimer;
    import flash.utils.Timer;

    public class Tests extends Sprite
    {
        private var start:Number;

        public function Tests()
        {
            var timer:Timer = new Timer(1000);
            timer.addEventListener(TimerEvent.TIMER, onTimer);
            start = getTimer();
            timer.start();
        }

        private function onTimer(event:TimerEvent):void
        {
            trace(getTimer() - start);
            start = getTimer();
            // simulate a heavy handler
            for(var i:uint = 0; i < 1000000; i++)
            {
                var j:Number = Math.random();
            }
            trace("elapsed time in function: " + (getTimer() - start));
        }
    }
}[/as]
Whoa. Now we are up to around 1600 milliseconds! That's 60% over. Horrible. OK, we have an exaggeratedly huge loop there, but still, it was pretty easy to blow the illusion of accuracy out of the water.
What's happening here? I don't have the technical details behind it, but it seems that timers work in pretty much the same way that setInterval worked way back in AS2. The interval itself is pretty accurate, but the new interval doesn't start until it is done processing all of its handlers. In other words:
[as]
1. Start timer.        0
2. Timer completes.    1000
3. Start handlers.     1000
4. Handlers complete.  1600
5. Start timer.        1600
6. Timer completes.    2600
...
[/as]
So the time between 1 and 2 is perfectly accurate, as far as I can tell. But the timer doesn't start timing again until 3 and 4 are done, so the elapsed time of the handler function gets added to the overall interval.
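That scheduling model is easy to simulate. Here’s a sketch in JavaScript rather than ActionScript, so the arithmetic can run anywhere (simulateTimer is a made-up name, not a Flash API), reproducing the timeline above with a 1000 ms interval and a 600 ms handler:

```javascript
// Simulate the "interval restarts only after the handlers finish" model.
// Each tick fires a full interval after the PREVIOUS handlers returned,
// so the handlers' running time gets added to the effective interval.
function simulateTimer(intervalMs, handlerMs, nTicks) {
  const fireTimes = [];
  let t = 0;
  for (let i = 0; i < nTicks; i++) {
    t += intervalMs;    // the timer itself counts accurately...
    fireTimes.push(t);  // ...and fires here
    t += handlerMs;     // ...but doesn't restart until handlers return
  }
  return fireTimes;
}

console.log(simulateTimer(1000, 600, 3)); // [ 1000, 2600, 4200 ]
```

With a 600 ms handler, the gap between ticks settles at 1600 ms, which is exactly the behavior the trace showed.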
At first, this kind of surprised me. I'd done the same tests with setInterval in AS2 and knew it was happening there, but had kind of assumed that the new Timer class was different. Looks like it's built on the same foundation though. I haven't tried it yet, but I assume for a timer, which can have multiple handlers, it waits until all handlers are complete before restarting.
I'm not complaining. In fact, after thinking it over, it seems like it might be a necessity. What if you have a timer handler that takes longer than the interval to execute? You'd run into a situation where the next interval fired before the last one was finished, with multiple handlers running simultaneously, which isn't even possible, I think, because ActionScript is not threaded. Forcing all handlers to finish before running them again avoids this situation.
So I'm not saying it's a bad thing, just realize that timers are susceptible to being slowed down by intensive code and slow CPUs, just like enterFrame is. So if you are expecting to get a perfectly accurate "frame rate" by using timers, forget about it.
If you are doing some kind of timing-crucial simulation or game where the accuracy of the speed of animation is vital, the way to do it is to use a time-based animation setup. This checks the actual time elapsed since the last frame and adjusts the motion or animation based on that. I covered that technique in Foundation ActionScript Animation: Making Things Move!, and it's been covered countless other times in countless other places.
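As a minimal sketch of the idea (in JavaScript for portability, with names of my own choosing, not the book's code): express speed in units per second, measure the real time elapsed each tick, and scale the motion by it.

```javascript
// Time-based animation: distance moved = speed * real elapsed time.
// The math is pure, so the same tick handler works whether ticks come
// from enterFrame, a Timer, or anything else.
function advance(position, speedPerSecond, elapsedMs) {
  return position + speedPerSecond * (elapsedMs / 1000);
}

// Tick handler sketch: motion stays correct however irregular the ticks are.
let x = 0;
let lastTime = Date.now();
function onTick() {
  const now = Date.now();
  x = advance(x, 100, now - lastTime); // 100 px/sec regardless of frame rate
  lastTime = now;
}
```

Because the math only depends on elapsed time, two short ticks move the object exactly as far as one long tick; dropped or late frames cost smoothness, not correctness.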
I strongly agree, especially with the last two paragraphs: if you need accurate time control, the best way to do it is by checking the time spent since the last update. Trying to do it with timers just won’t cut it, especially for screen updates, where you’d probably be wasting processing time by upscaling the amount of iterations needed for accurate animation. Updating on redraws (onEnterFrame) is the best way to go; it adapts to the player speed, whatever it is.
Overall, the way I see it, many people seem to think that onEnterFrame is some kind of evil programming event, one that spends a lot of time doing nothing, and that merely replacing it with some event fired once every X milliseconds will magically make things much faster. This is a common misconception that’s not easily cleared up.
Hi Keith!
I always use onEnterFrame too – I think any programmer with an eye for animation will easily see the advantages! But I’m very surprised about the inaccuracies in setInterval – thanks for the information!
See you at Flash on the Beach!
cheers
Seb Lee-Delisle
PS your email address seems to be bouncing at the moment!
onEnterFrame is natural to work with, even with the Flash Player’s CPU issues (especially math calculations). For accurate timing in Flash there is no great solution. If I were just counting seconds, I would still use onEnterFrame.
Some very interesting info on Timer behavior that I’m glad to know. Personally, I tend to avoid “onEnterFrame” events, not because they’re inherently bad, but because they require the use of an extra object (i.e. MovieClip in AS2) that can’t always be logically justified for that purpose. Reading this may make me change my tune, though. But one thing I do love about the AS3 Timer is the repeatCount argument. If you call a method once per element in some given array and use that array’s length as the repeatCount, it’s almost like a for..each with a specified amount of delay. No more testing some conditional to clear an interval. That’s quite a handy little feature.
Yes, Zeh, I didn’t even get into any points about why enterFrame might be better, but there is that issue of screen updates, which are completely tied to the frame rate. Sure you can force it with updateAfterEvent, which is now a method of TimerEvent, but now you have this other whole set of screen refreshes, which is out of sync with the screen refreshes that Flash is already taking care of, based on frame rate. So say your frame rate is 31, meaning you are getting a screen refresh about every 32 or so milliseconds. Then you set a timer to run every 30 milliseconds, with updateAfterEvent. I think that means you’re going to get a new screen redraw 64 times a second (31 for your enterFrame and 33 for your timer). Doubling your screen redraws can’t be good. You can crank down your frame rate, I guess, and just rely on the timer, but in the past, you could only get 10 intervals actually firing per frame, or something like that. I’m assuming there is still some kind of limit. So you can’t set it too low.
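Assuming the two sets of redraws simply add together (a back-of-envelope model, not measured behavior), the arithmetic looks like this:

```javascript
// Back-of-envelope: total redraws per second from the frame rate plus an
// out-of-sync timer calling updateAfterEvent every timerIntervalMs.
function redrawsPerSecond(fps, timerIntervalMs) {
  return fps + Math.floor(1000 / timerIntervalMs);
}

console.log(redrawsPerSecond(31, 30)); // 64
```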
Timers seem so clean on the surface, but once you dig in, there are ugly issues like this.
Anyway, once again, not saying timers are bad, just that there is more to know about them before you go gleefully using them thinking they are the answer to all your problems.
Hi Keith, I also tried timers and did some tests. I use them sometimes, but for animation, it’s me.addEventListener(Event.ENTER_FRAME, onEnterFrame) for me too.
Devon: Actually you don’t need to create a Movie Clip to harness enterFrame in AS2. The _root timeline is already a MovieClip. Just declare an onEnterFrame function at _root level and you’re set.
Completely agree. Frame sync is essential to any graphic intensive application. With code designed around building the next frame, the ENTER_FRAME event is the best solution we’ve got today. I don’t see timers being useful in this situation, in fact it’d be great to have some frame redraw info and of course, double buffering.
I’ve actually been using an adaptation of mx.transitions.OnEnterFrameBeacon for all of my enterFrame needs. This allows me to listen to the enterFrame of a single clip for all of my needs.
I modified it so all I have to do is say BrentsEnterFrameBeacon.addListener(fDelegate).
Cheers,
Brent Bonet
EnterFrame is usually the best for rendering, but it’s better with a check of the time between renders… sometimes an enterFrame event won’t fire for a long time (especially during plug-in start-up)…
var t:int = getTimer() - lasttime;
lasttime = getTimer();
move = somevalue * (t / frame_interval);
Yes, Daniel, that’s exactly what I was referring to in the last paragraph of the post.
I agree that timers aren’t perfectly accurate; however, to me the benefit of using them for something like an animation is the ability to use ‘updateAfterEvent’ to refresh the display on command. I have seen maximum performance with my personal animation system calling an interval at 15 milliseconds with a framerate of 31 frames per second. Even though the timers aren’t accurate, you still get around twice the display updates you would with onEnterFrame, which results in noticeably smoother animation without affecting performance.
I do want to emphasize that you need to be smart about how many different objects are calling ‘updateAfterEvent’ to avoid performance issues. In the end, it all comes down to good code architecture. There is a time and a place for both enterFrame and setInterval/Timer; knowing when and how to use each is the key.
I prefer intervals to update data instead of animation. I have used it a few times for animation and noticed that if the callback isn’t too cpu intensive it actually works better than an onEnterFrame event. There are downfalls to both methods and you did a great job of illustrating that. It’s always great to get some more insight into the inner workings of actionscript.
I guess using one over the other is a matter of preference.
A basic hint: when you need Event.ENTER_FRAME ticks under AS3 you can get them off of a Shape instance without it being on the display stack. Normally for a DisplayObject (like Sprite) to generate Event.ENTER_FRAME Events it has to be added as a child of some DisplayObjectContainer, but a Shape() can be instantiated and used to generate ticks without ever being added to any container/display list.
As an aside, there is one way to generate extremely accurate clock timing in Flash: use an onSoundComplete() to generate the ticks. Basically, if you add a listener to onSoundComplete of a playing sound it will align the event to the nearest audio buffer cycle, which is an exact factor of the sample rate, depth and number of channels.
This works in Flash8 as well. Basically, you create/attach a sound with (for example) only 1 sample of data (1 byte of sample data), or a ‘duration’ of sample data that corresponds to the nearest buffer cycle that will be ‘over’ your intended clock interval (i.e. if each buffer cycle was exactly 100 milliseconds and you wanted a 254 millisecond timer interval, you would create/attach a sound with 300 milliseconds worth of audio data in it). You set the sound to play and assign a callback/listener to its onSoundComplete (and, as long as you need the clock generator active, have it also re-play itself onSoundComplete).
The handler receiving the callback/event would then ‘know’ that it is getting an accurate tick that would be X milliseconds ‘longer’ than the intended event start time, and could then fire off the event, cutting into its start position by X. For example if you knew the buffer cycle was going to be every 300 ms, and you wanted a 254 ms timer interval, you would know all your events would need to be adjusted back (or ‘cut into’ when they are triggered) by 46ms.
So, for example if you were trying to play a sound or trigger a video file to play, or a swf timeline… you would derive the ‘overlap’ amount and then start that sound, video or movieclip with a start point offset (sound would just use the secondsOffset value of sound.start(), video would use a seek(), and a movieclip you would basically translate the offset to the nearest frame and jump to that frame).
For an example of this approach applied to an audio buffer see:
http://code.google.com/p/popforge/
http://popforge.googlecode.com/svn/trunk/flash/PopforgeLibrary/src/de/popforge/audio/output/AudioBuffer.as
Neil,
“Normally for a DisplayObject (like Sprite) to generate Event.ENTER_FRAME Events it has to be added as a child of some DisplayObjectContainer”
Not true, as easily demonstrated.
package {
    import flash.display.Sprite;
    import flash.events.Event;

    public class Test extends Sprite
    {
        public function Test()
        {
            var sprite:Sprite = new Sprite();
            sprite.addEventListener(Event.ENTER_FRAME, onEnterFrame);
        }

        private function onEnterFrame(event:Event):void
        {
            trace("hello world");
        }
    }
}
As far as using sound to sync frame rates, yes, various hacks like that have existed for years, back to flash 6 I think. Maybe 5. I consider them hacks. I would never consider using a sound to force my animation to play at a specific frame rate.
Nice overview. I would have liked to have seen a few more issues with using enterFrame, to make comparison easier.
enterFrame limits Flash to 50% of the computer’s CPU, and is more likely to hit the slow-script error when running complicated functions.
Intervals use up to 100% of the computer’s CPU, and thus have a bit more elbow room before they get bogged down by frame rate.
Intervals are far more important to clean up; they can exist in the parent swf even after the child swf that called them has been removed.
Using the two together gets things out of sync quite a bit.
In the end I do believe it’s best to use enterFrame, due to the fact that it is tied directly to the refresh rate (even most calculations won’t need to happen before the frame refreshes).
Intervals would be an interesting workaround for a for() statement: sometimes you’re loading a lot of data from a database and evaluating it in a for statement… using a very fast interval and a completion checker would keep this from triggering the slow-script error.
For the record, I’m still in AS2, at least for a few more weeks.
I’m interested to know where you got the 50% vs. 100% data. I haven’t heard that before.
If you are doing anything in an enterFrame that gets you anywhere near the slow script error, I think you need to refactor to break the function up over time. You shouldn’t be writing functions that grab the cpu for 15 seconds in a single threaded environment.
Guys, event priority plays a role in this. Try setting it to a higher priority to decrease (not eliminate) your time discrepancy.
This forces the elastic racetrack model of the AVM2 to give your timer events greater priority, hence accuracy, over other stuff.
I haven’t been bothered to test this out at all, but instead of using setInterval (you briefly mentioned it) or Timers, can one not just use setTimeout?
Most of the time people use it like this:
var a = 0;
function repeatMe() {
    trace(getTimer() - a);
    a = getTimer();
    setTimeout(repeatMe, 1000);
}
repeatMe();
and that’ll call repeatMe, do the stuff, and then call repeatMe again in 1000ms, then stuff, then call in 1000ms. Not good if ‘stuff’ takes 600ms. How about this:
var a = 0;
function repeatMe() {
    setTimeout(repeatMe, 1000);
    trace(getTimer() - a);
    a = getTimer();
}
repeatMe();
What this one does is calls repeatMe which calls repeatMe in 1000ms then does stuff. So the 1000ms is ticking away while ‘stuff’ happens.
Like I said, I haven’t bothered to test this (as I don’t need it; I happened upon this page while looking for a way to do a synchronous LoadVars-like thing, unfortunately not), and it either won’t work or will run into those problems you mentioned (two possible calls at the same time if one lasts too long), but it’s worth a try if you’re desperate. Having said that, if this works and setInterval/Timer doesn’t, that’s just CRAZY!
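For what it’s worth, one refinement on that second pattern: even when you reschedule first, each tick is still measured from “now,” so any lateness accumulates as drift. A self-correcting variant computes each delay against an absolute schedule instead. This is only a JavaScript sketch of the idea (nextDelay and accurateInterval are names I made up):

```javascript
// Self-correcting repeating timeout: the next delay is computed from an
// absolute target time, so lateness in one tick is subtracted from the
// next wait instead of accumulating.
function nextDelay(targetTime, intervalMs, now) {
  // If this tick fired late, the next wait shrinks; never wait negative time.
  return Math.max(0, targetTime + intervalMs - now);
}

function accurateInterval(intervalMs, fn) {
  let target = Date.now() + intervalMs;  // when the next tick SHOULD fire
  function tick() {
    fn();
    const delay = nextDelay(target, intervalMs, Date.now());
    target += intervalMs;
    setTimeout(tick, delay);
  }
  setTimeout(tick, intervalMs);
}
```

If a tick fires 50 ms late, the next wait is 950 ms rather than a full 1000, so the schedule stays anchored even though individual ticks jitter.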
Can you explain why reducing the frame rate would affect my use of timers?
Scenario:
1. frame rate of 2 frames per second
2. timer fires off every 1000ms
3. timer listener updates the display objects and calls the timer’s updateAfterEvent method
4. display objects don’t update every 1000ms
if I up the frame rate to above 20, it seems fine (to the eye anyway).
oshevans: timers are somewhat tied to frame rate as well. I know that in AS2, using setInterval, you could get about 10 setInterval calls per frame. No more. I’m not sure how much of that carried over to AS3 but I assume it’s a similar situation.
I came across another issue regarding the accuracy of the getTimer and Date.getTime functions.
I use this simple script (AS2)
stop();
// START TIMER
var tObj:Object = new Object();
tObj.startDate = new Date();
tObj.startDateTime = tObj.startDate.getTime();
tObj.startTimer = getTimer();
var gTimerID:Number;
gTimerID = setInterval(this, "checkT", 100);
function checkT() {
    tObj.msTimer = getTimer() - tObj.startTimer;
    var my_date:Date = new Date();
    tObj.msDateTimer = my_date.getTime() - tObj.startDateTime;
    tObj.diffTime = Math.abs(tObj.msTimer - tObj.msDateTimer);
    trace("difference:" + tObj.diffTime);
}
What happens if you let this simple app run is that the trace will show that the difference between the getTimer and Date.getTime results slowly increases.
I would assume some inaccuracy, but not one that would slowly increase during the lifetime of the application.
The reason I want to compare the two values is that I want to see if someone is tampering with the application speed with a program such as Cheat Engine. And I noticed the date.getTime function isn’t affected by that program, but the getTimer function is.
If you bother to read the AS3 language reference at all…
“Depending on the SWF file’s framerate or Flash Player’s environment (available memory and other factors), Flash Player may dispatch events at slightly offset intervals. For example, if a SWF file is set to play at 10 frames per second [fps], which is 100 millisecond intervals, but your timer is set to fire an event at 80 milliseconds, Flash Player will fire the event close to the 100 millisecond interval. Memory-intensive scripts may also offset the events.”
As you can see, Timer granularity is tied 1:1 to framerate. Hate to break the illusion, but Event.ENTER_FRAME is still the most accurate time interval in AS3.
Could someone please post the exact script for making a function move something with the ENTER_FRAME event and “rechecking” it with getTimer()? I understand that we need to get ticks, but what then? I’m a beginner, and I just can’t work it out to the end.
You do realise how much of a pain it is to delete 1. all 2. your 3. line 4. numbering??
You do realize you can click “plain text” and get rid of the numbering?
@kp: Funny, I also used to manually remove the line numbers from any code samples I got from your site. It never occurred to me that “Plain Text” at the top was a button/link. I wonder how many other folks miss this?
P.S. Even when I “had to” remove the line numbers, I was very grateful for the sharing. 😉
You should have tested what influence the framerate has on this.
I see no point in using timers. Use onenterframe and check the time.
I’ve been avoiding onEnterFrame and using intervals to fire functions, to get smooth/fast functions and be independent of the fps.
Is this correct thinking? onEnterFrame depends on fps, right?
Well, one nice thing about timers is that they can be reset. So say you want some event, event-A, to hide a graphic (graphic-A), UNLESS event-B cancels that.
With onEnterFrame you have something being calculated all the time. If you set a timer to hide graphic-A beginning with event-A, and event-B happens before the timer completes, it can reset the timer, canceling the hide. Now, because the timer has been reset, nothing is being calculated until the timer is once again started, saving CPU work. No?
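That reset pattern is easy to sketch in JavaScript, with setTimeout/clearTimeout standing in for Timer.start()/Timer.reset() (scheduleHide and cancelHide are hypothetical names, not any real API):

```javascript
// "Event-A" schedules the hide; "event-B" can cancel it before it fires.
// While no timeout is pending, nothing runs at all, whereas an
// enterFrame handler would be polling every single frame.
let hideTimer = null;

function scheduleHide(delayMs, hideFn) {   // event-A
  cancelHide();                            // restart if one is already pending
  hideTimer = setTimeout(() => { hideTimer = null; hideFn(); }, delayMs);
}

function cancelHide() {                    // event-B
  if (hideTimer !== null) {
    clearTimeout(hideTimer);
    hideTimer = null;
  }
}
```

Usage: call scheduleHide(2000, hideGraphicA) on event-A; if event-B arrives within those two seconds, cancelHide() and the hide never happens, at zero per-frame cost.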
Minor inaccuracies aside, I think the Timer class does exactly what it’s meant to. Unless you want to get into some sort of virtual multi-threading, you really don’t want the timer to fire again before your code finishes execution, as all sorts of hard-to-debug nasties are likely to occur. Thus, if your code executes longer than the interval, it will wait for it to finish before firing the next event. (I.e., like any other single-threaded application.)
Dude, you’re my hero! I’m also sick of using Flash timers because of the terrible inconvenience they cause where you could simply use a counter. Thanks for posting this; I thought I was the only one who felt this way.