What is the most accurate way to track time in AS3?

The Timer class seems to accrue a lot of error over time, so I'm wondering what other solutions people have come up with for a more accurate timer. Thanks!


Obviously you can measure delays accurately enough, so I assume you're trying to fire an event after a delay and your resolution isn't good enough.

Yes, the Timer class is a bit terrible. But you can use another trick:

If you need a delay of 2ms (for example), simply use a while loop:

var startTime:int = getTimer(); // flash.utils.getTimer(), ms since the player started
while (getTimer() < startTime + 2) {
    // busy-wait until 2ms have elapsed
}
doStuff();

It should be self-evident that doStuff() won't execute until at least 2ms have elapsed.

Obviously, don't use this for delays longer than the frame length, as it will make the player sluggish. Use the Timer to get close, then use this trick for the final stretch. You'll get much better resolution.

If you need more than one event to fire, you'll need to sort them ahead of time.
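
Putting the two together, here's a minimal sketch of the combined approach, assuming a single target time; the fireAt name and the 5ms safety margin are mine, not anything built in:

import flash.utils.Timer;
import flash.utils.getTimer;
import flash.events.TimerEvent;

// Sketch: let a one-shot Timer get within ~5ms of the target, then
// busy-wait on getTimer() for the final stretch. The 5ms margin is an
// assumption; tune it to your frame length and typical Timer error.
function fireAt(targetTime:int, callback:Function):void {
    var coarseDelay:Number = Math.max(targetTime - getTimer() - 5, 1);
    var coarse:Timer = new Timer(coarseDelay, 1);
    coarse.addEventListener(TimerEvent.TIMER_COMPLETE, function(e:TimerEvent):void {
        while (getTimer() < targetTime) {
            // spin until the exact millisecond
        }
        callback();
    });
    coarse.start();
}

fireAt(getTimer() + 250, doStuff); // run doStuff() roughly 250ms from now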


I don't think it's possible, considering the Flash architecture. Basically, each frame in Flash is divided into phases (there are more than this and they might be named differently, but overall that's how it works):

  1. Enter-frame event execution.
  2. Listening for and executing other events (OS, Timer, input, etc.).
  3. Drawing to the screen.

So, let's say you have an application running at 20 FPS, which means one frame every 50ms. Your enter-frame events take 10ms and drawing the screen takes 10ms, so in each frame you're only left with 30ms in which to catch Timer events. If your timer's delay is 1ms, then 20 times per frame it will have to wait for the enter-frame and drawing phases to finish. Quite sucky, considering that all these phases can take even more time, and Flash has a tendency to lag from time to time without any apparent reason, so you're in an even worse situation.
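
You can actually watch this frame budget yourself with a quick probe; a sketch, assuming it runs as timeline code on the main timeline:

import flash.events.Event;
import flash.utils.getTimer;

// Trace the real time between frames; at 20 FPS the ideal gap is 50ms,
// and Timer callbacks have to fit into whatever is left of each frame.
var lastFrame:int = getTimer();
addEventListener(Event.ENTER_FRAME, function(e:Event):void {
    var now:int = getTimer();
    trace("frame took " + (now - lastFrame) + " ms");
    lastFrame = now;
});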

Of course I might be wrong somewhere, but I don't think it is possible to do it unless you know of some secret ingredient.


Not totally sure where you are going with your question and situations vary, but a lot of time-related problems in Flash can be solved by doing the following:

  1. Add some frames to your main timeline. Typically, I add two or three seconds' worth.
  2. Make the main timeline loop over this section.
  3. Create an audio file containing nothing but silence (true audio black). One second of silence is normally fine.
  4. Import this to the library.
  5. Add this to the main timeline in the area you padded out. Set the loop count so that it is always playing, and make sure its sync is set to stream.

This effectively causes Flash Player to honor the timing of the main timeline (because it has audio synced to stream), instead of trying to keep up with every graphic frame.

Normally I do this to make sure animations play at the same rate across different machines, but I have also used it to fix wonkiness with timers.


I came across this some time ago in AS2, but I've found AS3 to be a lot more reliable.

What I did in AS2 was set the timer to run about 100ms less than the full delay, and fire only once. Then I had it sit in the getTimer() loop described earlier until it reached the desired time, and then do whatever it needed to. This is not a good idea in 99% of cases, though: the loop eats a lot of CPU, and your event could still be over 100ms late depending on the CPU load.

All in all, after running the simplest Timer test in AS3, I found the offset to be pretty consistent, and quite reliable when run under a small CPU load.

As far as the explanation goes, it's more of an opinion, and I take issue with the logic used to count the cumulative time drift.

The author adds up all the differences. So let's say an event fires when getTimer() is at 1000, 2002, and 3002: that is 2ms late twice, which doesn't mean the second event is 4ms late. It is 2ms late, and compared to the previous event it is exactly 1000ms apart, so not late at all. So while the math makes sense, 2 + 2 = 4, I really don't see how it is useful in any way.

When I ran my test:

var delay:int = 1000;
var myTimer:Timer = new Timer(delay, 0); // repeatCount 0 = fire indefinitely

function timerHandler($evt:TimerEvent):void {
    trace(getTimer()); // log elapsed ms at each firing
}

myTimer.addEventListener(TimerEvent.TIMER, timerHandler);
myTimer.start();
trace(getTimer());

I get this a few seconds in while running some video:

15011
16011
17011
18054
19054
20054
21054
22054
23057
24054
25054
26054
27054
28011
29011
30011
31011
...
135047
136194 <- big one here
137246 - and here
138167 -
139242 -
140173 -
141016 <- back
142018 <- normalized
143018
144018
145019
146018
147018

It is interesting to see this "normalizing". It is sometimes off by 54ms consistently, then goes down to 27ms, later up to 18ms, etc., and stays there for a while before normalizing around another value. But this offset is not cumulative; it stays pretty close to the base, i.e. it doesn't keep growing. You would see a cumulative difference if you repeatedly created a new single-use Timer for each firing.
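
For comparison, here is a sketch of that single-use case, where each firing schedules the next one-shot Timer from inside the handler, so any lateness compounds across firings:

import flash.utils.Timer;
import flash.utils.getTimer;
import flash.events.TimerEvent;

// Each firing creates the next one-shot Timer, so a 2ms-late firing
// pushes every later firing back by 2ms as well: the drift accumulates,
// whereas the repeating Timer above stayed close to its base offset.
function scheduleNext():void {
    var oneShot:Timer = new Timer(1000, 1);
    oneShot.addEventListener(TimerEvent.TIMER_COMPLETE, function(e:TimerEvent):void {
        trace(getTimer()); // gaps stay ~1000ms, but absolute times drift
        scheduleNext();
    });
    oneShot.start();
}
scheduleNext();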
