Allegro.cc - Online Community

Allegro.cc Forums » Programming Questions » New Timing System

This thread is locked; no one can reply to it.
New Timing System
gillius
Member #119
April 2000

I've been talking about this method of timing for some time on the forums, and I decided to back it up with some code. It is meant to be a replacement for the timing scheme in Allegro, and an answer to why I think any method using yield_timeslice and sleep is wrong. This version is Windows-only as it's a quick hackup, but I know how to make it work on Linux/UNIX with only a little bit of work since I use similar code in GNE for Linux.

This post could get enormously long, so I will try to keep it short and to the point.

Justifications:

  • Allegro timers are bad because they require another thread for no real reason, and their resolution is extremely coarse at 10ms

  • The traditional Allegro timing system is fixed rate. Fixed rate can be good depending on your game, but this timer is for dynamic rate. For games where you don't need fixed rate (net games usually benefit from fixed rate but don't strictly need it), dynamic rate allows you to adjust your processing to the computer, giving fast computers more accurate physics and slower computers less accurate physics, w/o resorting to frame skips.

  • This algorithm plays nice with laptops and multi-process environments.

Some people have been suggesting yield_timeslice or sleeping, both of which are very bad:

  • if you call yield_timeslice all the time, all you are doing is putting your process at the lowest priority, since you will always yield to everyone else.

  • yield_timeslice will still use 100% CPU, meaning you will eat up a laptop's battery by triggering the CPU's "performance mode."

  • sleep(1) will usually sleep longer than a millisecond, and if you call it unconditionally, you impose a maximum frame rate. As your framerate decreases, CPU use goes up, but you will never be able to use 100% CPU if you need it.

The algorithm I have implemented takes a pure dynamic-frame system and augments it by introducing three concepts:

  • Min graphics frame rate (ex: 10fps, pauseDt)

  • Min logic frame rate (ex: 30fps, maxDt)

  • Max logic frame rate (ex: 60fps, minDt)

Remember that smaller dt means smaller frame size. I use dt values in seconds, which makes implementing physics easier by using real-life metric units (for example meters per second rather than pixels per frame).

The algorithm takes into account a minimum operating system sleeping time. It allows the game to run faster than real time, showing game state ahead of real time, and then it sleeps.

The algorithm uses a high-performance system counter that increments based on physical time (QPC or gettimeofday).

The ideal situation, with the 10ms Windows sleep resolution, is to run the game until you are 5ms ahead, then sleep 10ms so that you are 5ms behind. The sleeping allows the CPU usage to decrease. There's no reason to render 500fps, so why do it?
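As a toy model of that lead/sleep bookkeeping (every number here -- the 2ms frame cost, the 5ms threshold, the 10ms sleep -- is invented for the sketch, not taken from the attached code):

```c
/* Toy model of the lead/sleep idea. Each frame, a fast machine spends
   2ms of CPU to simulate 1/60s of game time, so game time runs ahead;
   once more than 5ms ahead we take 10ms "sleeps" until we fall back
   behind real time. Returns how many sleeps happened. */
int simulate_frames(int frames)
{
    const double dt = 1.0 / 60.0;    /* game time advanced per frame */
    const double frameCost = 0.002;  /* pretend CPU cost per frame */
    const double maxLead = 0.005;    /* 5ms ahead triggers sleeping */
    const double sleepStep = 0.010;  /* one 10ms sleep */

    double lead = 0.0;  /* gameTime - realTime, in seconds */
    int sleeps = 0;

    for (int i = 0; i < frames; i++) {
        lead += dt - frameCost;      /* fast machine: lead grows */
        while (lead > maxLead) {     /* sleep off the surplus */
            lead -= sleepStep;
            sleeps++;
        }
    }
    return sleeps;
}
```

Over 60 simulated frames (one game second) this machine sleeps 88 times, so nearly the whole wall-clock second is spent idle instead of busy-waiting.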

Here are the actions my algorithm takes:

  • If dt is less than minDt ( >60 fps ), the game is allowed to run ahead of time. When the game gets too far ahead, sleep. This has the effect of fixing the game to 60FPS and sleeping for the extra time, conserving laptop battery and allowing the OS to function. 1:1 graphics:logic ratio.

  • If dt is between minDt and maxDt ( 30 to 60 fps ), the dt value is passed unmodified, and graphics and logic run at a 1:1 ratio.

  • If dt is between maxDt and pauseDt (10 to 30fps), the dt value is divided, and frames are skipped. Logic runs at a constant 30FPS but graphics are drawn between 10 and 30FPS.

  • If dt is greater than pauseDt, dt is extremely large. The game is considered paused for this time and maxDt is returned. If the user leaves the game and comes back, dt may be let's say 2 minutes. The game will act as if it was paused for that time.
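The four cases above could be sketched roughly like this, using the example thresholds from the post (the classify_dt function and its steps out-parameter are invented for this sketch; they are not the attached code's API):

```c
/* Example thresholds from the post, as dt values in seconds. */
static const double minDt   = 1.0 / 60.0;  /* max logic rate: 60 fps */
static const double maxDt   = 1.0 / 30.0;  /* min logic rate: 30 fps */
static const double pauseDt = 1.0 / 10.0;  /* min graphics rate: 10 fps */

/* Given a raw measured dt in seconds, return the dt to feed the game
   logic; *steps receives how many logic steps to run this frame. */
double classify_dt(double dt, int *steps)
{
    if (dt < minDt) {
        *steps = 1;     /* running ahead; the caller sleeps off the lead */
        return dt;
    } else if (dt <= maxDt) {
        *steps = 1;     /* 30-60 fps: pass dt through, 1:1 ratio */
        return dt;
    } else if (dt <= pauseDt) {
        *steps = (int)(dt / maxDt) + 1;  /* split dt so logic stays fast */
        return dt / *steps;              /* graphics frames get skipped */
    }
    *steps = 1;         /* huge dt: treat as a pause, advance one maxDt */
    return maxDt;
}
```

For example, a 50ms frame (20fps) comes back as two 25ms logic steps, and a 2-minute gap comes back as a single maxDt step, as if the game had been paused.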

The "pausing" functionality has a dual purpose. It pauses the game if the OS decides to steal the CPU for very long times (non-DMA I/O access for example). Instead of the game advancing instantly in time it acts as if it was paused.

On very very slow computers that can't render 10fps, it will force the computer to render at 10 frames per game second, but in this case game time is progressing at a speed less than real-time.

I've attached an Allegro 4.1.11 application to this post along with MSVC.NET 2003 project files.

Controls:

  • hold A to increase frame processing time by 10ms

  • hold S to increase frame processing time by 20ms

  • hold D to increase frame processing time by 30ms

  • hold F to increase frame processing time by 40ms

  • ESC to quit

The program shows game time and real time since program start. If you click away from the window, the program will pause (because of how Allegro works), so when you go back dt will be extreme and the game will act paused. You will also notice the program not using 100% of your CPU. The default settings are 10, 30, 60, but you can change them.

The inner circle revolves at 1Hz, the outer at 0.5Hz, and the third circle once a minute.

Gillius
Gillius's Programming -- https://gillius.org/

Frank Drebin
Member #2,987
December 2002
avatar

sounds nice but you only attached the sources, right? a compiled version would be nice.

miran
Member #2,407
June 2002

Wow, excellent, I think I'll use that from now on!

Frank: You can't compile a C++ program yourself?

--
sig used to be here

Frank Drebin
Member #2,987
December 2002
avatar

NO. currently at a pc without compiler or allegro.

gillius
Member #119
April 2000

Note that this method is experimental. I just invented it; I'm sure it's been used before by other games, but it is new to me. The key thing with this algorithm is the sleeping, making sure that the game doesn't look like it is stuttering.

Originally I thought Sleep slept at least 10ms, but I was wrong. With Sleep( 10 ) I measured the sleep time using QueryPerformanceCounter. Now, I know the two timing schemes may not agree -- hardware timing is usually way off when measured over small periods, I believe, but on the order of micro- rather than milliseconds.

Either way, my original algorithm I implemented assumed the ideal Sleep case, but I found that Sleep( 10 ) actually sleeps between 2 and 8ms. Sleep( 1 ) sleeps the same time. I'm not sure why as every sleep I've used has the semantics of sleeping at least the time you give it.
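For what it's worth, the equivalent measurement on Linux/UNIX can be sketched with clock_gettime and nanosleep (the POSIX counterparts of QueryPerformanceCounter and Sleep; the wrapper function name here is my own):

```c
#define _POSIX_C_SOURCE 199309L
#include <time.h>

/* Time how long a requested sleep actually takes, using a monotonic
   high-resolution clock that brackets the nanosleep call. */
double measure_sleep_ms(long request_ms)
{
    struct timespec req, before, after;
    req.tv_sec = 0;
    req.tv_nsec = request_ms * 1000000L;  /* ms -> ns; valid below 1000ms */

    clock_gettime(CLOCK_MONOTONIC, &before);
    nanosleep(&req, NULL);
    clock_gettime(CLOCK_MONOTONIC, &after);

    return (after.tv_sec - before.tv_sec) * 1000.0
         + (after.tv_nsec - before.tv_nsec) / 1e6;
}
```

Unlike Windows Sleep, POSIX specifies that nanosleep rests at least the requested interval (absent signals), typically a bit longer, so the measured value should always come back at or above what was requested.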

From experience I'm guessing that Windows and Linux both use some timer that beats every 10ms, and this timer is used for process/thread scheduling and preemption. It is also probably the same timer used by GetTickCount and _ftime, as they both increase once every 10ms. I'm guessing that the way Sleep is implemented, it waits for that tick, and when it gets that tick it leaves immediately. So because the timer fires every 10ms, I may call Sleep 2-8ms before it fires.

But because Sleep may not sleep for very long at all, I had to change my if ( dt < minDt ) implementation to use a while loop, and I had to do a lot of tweaking of the values in that loop.

Originally I did a Sleep( 10 ) without assuming how long it actually slept, but when it slept for only 2ms, a lot of the time the sleeping wasn't long enough to reduce the lead time, because I would never sleep more than once per loop. I then tried a longer sleep, but I'm afraid longer sleep values may show more jitter. So instead I compromised: a short Sleep( 1 ) in a while loop, with a greatly lowered threshold for when I perform the sleeps, to allow for extreme variance in sleep times.

During some trials of tweaking the loop structure and constants I did see some jittering, particularly when the timer rate was set below 60fps. The reason could be a frame mismatch with my monitor's refresh; if the sleeping delay and refresh delay are both caught at the wrong times, they may produce a slight jitter.

So the point is that the code needs more testing in a more complex (i.e. more CPU-intensive) loop, and on different systems. And when a Linux version is made, there will likely need to be specific tweaking to accommodate how the Linux scheduler works.

Gillius
Gillius's Programming -- https://gillius.org/

CGamesPlay
Member #2,559
July 2002
avatar

It still used 100% of my CPU, but dropped to 30 when I held f.

BTW, perhaps you could do something like the following to assure Sleep works?

void Rest(unsigned int ms)
{
    /* convert ms to clock ticks and sleep in small chunks until then */
    clock_t end = clock() + ms * CLOCKS_PER_SEC / 1000;

    while(clock() < end)
        Sleep(1);
}

--
Tomasu: Every time you read this: hugging!

Ryan Patterson - <http://cgamesplay.com/>

A J
Member #3,025
December 2002
avatar

Quote:

specific tweaking to accomodate for how the Linux scheduler works.

which one ?

depending on the thread's priority, your algorithm will be either very wrong or suddenly very right. Either way the user is going to get varied performance, and since you cannot regulate when your thread will be called, all those Sleep()s will be screwing with your precision timing.

your intentions are good, but I'm not sure you will achieve much, except maybe more complexity.

___________________________
The more you talk, the more AJ is right. - ML

gillius
Member #119
April 2000

I make no assumption about how long Sleep rests for. I know this; if you had looked at the code, AJ, you would know. The tweaking I had to do was with the sleep time, dropping the assumption that Sleep( 10 ) would sleep near 10ms on average, when it actually averages closer to 3ms. So now I make no assumption at all about how long Sleep rests. It could be 2 minutes, it could be 0ms, it could be 10ms, and my code will work fine.

I do use a loop, as CGamesPlay suggested; this fixed the problem of never being able to rest enough, because calling Sleep only once per frame didn't do it.

        do {
          Sleep( SLEEP_TIMEI );
          afterTime = getTime();
          sleepTime = (double)( afterTime - currTime ) / freqd;
        } while ( sleepTime < targetSleep );

CGames: if it was running at 60 fps, it should not have been using 100% CPU. At anything less than 60fps it should use 100% CPU. Did you see the framerate fixed at 60 fps but still 100% CPU usage? Which Windows version were you running if you saw 100% CPU use at 60 fps?

Gillius
Gillius's Programming -- https://gillius.org/

CGamesPlay
Member #2,559
July 2002
avatar

Oh, well, I had to lower the max FPS to 20 to see it use 50% of my CPU. So it definitely is a nice timer system, and I think I might modify it for use in my engine. However, I don't like the API; I think it could use some work. I don't like how nextFrame blocks -- it doesn't match what the function name says. It should either be callback based, or the function should be renamed WaitForNextFrame or something...

--
Tomasu: Every time you read this: hugging!

Ryan Patterson - <http://cgamesplay.com/>

gillius
Member #119
April 2000

Yes the code is completely and purely experimental. I meant it to be a working version of my theory rather than production code. I didn't even write any documentation for it.

I was playing with the timer system I currently use, trying to fit this timer into that scheme. I see how the function name is misleading. I was also trying to consider how to better handle statistics, frame skips, and whatnot. It's the logic of the nextFrame function that I really wanted to focus on.

Were I to finish this code up I'd write documentation for it and make it more portable.

I appreciate the suggestions though, as it helps me see what I need to do to clean it up. If there is enough interest I may release the code more formally on my website.

Gillius
Gillius's Programming -- https://gillius.org/

Thomas Fjellstrom
Member #476
June 2000
avatar

I'd test, but I'm too lazy to get it to compile in linux ;) maybe later. :D

--
Thomas Fjellstrom - [website] - [email] - [Allegro Wiki] - [Allegro TODO]
"If you can't think of a better solution, don't try to make a better solution." -- weapon_S
"The less evidence we have for what we believe is certain, the more violently we defend beliefs against those who don't agree" -- https://twitter.com/neiltyson/status/592870205409353730

Kitty Cat
Member #2,815
October 2002
avatar

I got a question about this(and I can't find you in #allegro, so I'm bumping the thread ;)). My game logic is made to run at a locked 60fps, and I only use Sleep(1) when there's extra time after drawing a frame(so, if it's right on time, or behind, it won't wait at all).

How much would my game, which is written in pure C mind you, benefit from such a system? I don't want to have to go through the trouble of converting your code to C and implementing it if there's not gonna be much difference. Also, isn't gettimeofday() a bit slow to use for millisecond accuracy?

--
"Do not meddle in the affairs of cats, for they are subtle and will pee on your computer." -- Bruce Graham

gillius
Member #119
April 2000

Kitty that should work fine, because you have some method of knowing when you have "extra time". As long as you realize that Sleep(1) can sleep up to 10 milliseconds (ie your algorithm does not assume how long you slept to do timing), you should be fine. If you are doing that, you are doing basically what I'm doing. I implement dynamic-range dt value (non-fixed frame length) and frameskipping, so if your game uses dynamic frame length, then you might want to look at my code if you want to adapt it.

Gillius
Gillius's Programming -- https://gillius.org/

Kitty Cat
Member #2,815
October 2002
avatar

This is basically what I do:

while(1)
{
  while(logic_timer > 1)
  {
    --logic_timer;
    do_logic();
  }

  draw_scene();
  while(!logic_timer)
    Sleep(1); //or yield_timeslice() on non-Win32
}

--
"Do not meddle in the affairs of cats, for they are subtle and will pee on your computer." -- Bruce Graham

gillius
Member #119
April 2000

That should work fine, except the while loop should test for > 0 not > 1.

You handle 3 of the 4 cases that I handled. If the computer goes too fast, you sleep. If the computer goes too slow, you frameskip. You don't handle pauses.

One thing I noticed is that if, when you sleep, the counter goes up by more than 1 (I'm assuming the while is > 0 and not > 1 as shown), you will frameskip some frames even though the computer is fast enough to render them.

I wonder if I handle that situation myself. Hmm, yes -- thinking about it, I do handle it, since I start re-measuring time after the sleep returns.

One thing I didn't do, because I was a little too lazy, but an improvement to make: instead of frameskips I should first check my "lead" and subtract from it. This would effectively implement fractional frame skips -- as it is, I can draw every other frame or every third frame, but I can't skip 1/3rd of a frame. I can add that by allowing negative leads, letting the game slide around real time by a few ms, so that I could draw 2/3 or 9/10 of all frames rather than only 1/n where n is a positive integer.

Gillius
Gillius's Programming -- https://gillius.org/

Kitty Cat
Member #2,815
October 2002
avatar

Quote:

That should work fine, except the while loop should test for > 0 not > 1.

Right.. simple mistake when I threw that code up there(the actual code is a bit different, but is to the same effect).

Quote:

You don't handle pauses.

I probably could, easily enough. What I actually do is set a variable to increment every 1/60th of a second; for every logic loop I increment a second variable, then break out, draw, and wait while the two variables match. I have a catchup_timer() function that sets the real-time variable to whatever the game-time variable is (I call this after a particularly CPU-heavy operation, like loading a level, or if I've missed, say, 60 frames, to force the next frame to draw).

Quote:

One thing I noticed is that if when you sleep the counter goes up by more than 1, you will frameskip some frames even though the computer is fast enough to render them.

I don't see any way to really do that without quickly and actively monitoring performance, and delaying the frame skip until the frame after. Besides, at 60fps it's not likely that the sleep call will make the real-time timer jump 2 tics instead of 1. Sleep(1) sleeps a max of 10ms (unless another process steals more, but that's not Sleep()'s fault per se) and 1/60th of a second is 16ms, so there's about a 4-in-10 chance that a sleep call will rest too long. And if your game's falling behind, a single lost frame, 1 out of 60, isn't going to be noticeable; it's when you lose more that it's noticed, and by that time you want to make sure you catch up quickly.

--
"Do not meddle in the affairs of cats, for they are subtle and will pee on your computer." -- Bruce Graham

A J
Member #3,025
December 2002
avatar

edit: never mind.

___________________________
The more you talk, the more AJ is right. - ML

amarillion
Member #940
January 2001
avatar

I prefer a fixed timer rate in my games, so I won't be using this method soon. But your idea of "conditional sleep" is very good.

gillius
Member #119
April 2000

The system, of course, could be modified for a fixed-rate system by making the max framerate and min framerate the same. It might be worth looking over the comparison code again to make sure that would be OK. If moved to a fixed system, negative lead would be an even more important feature to add to the timer system, so that frameskipping starts only when the game is too far (say 5ms) behind rather than immediately -- allowing for some jitter and also allowing for fractional frameskip.
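A toy fixed-rate simulation of that negative-lead idea (all numbers are invented): logic steps at 60Hz, a render is pretended to cost 20ms, and a render is skipped only once game time has slid more than 5ms behind real time:

```c
/* Count how many of `frames` fixed-rate logic steps get rendered when
   game time may slide up to maxBehind seconds behind real time before
   a render is skipped. Purely illustrative numbers. */
int frames_rendered(int frames, double maxBehind)
{
    const double dt = 1.0 / 60.0;     /* fixed logic step */
    const double renderCost = 0.020;  /* pretend render cost: 20ms */

    double lead = 0.0;  /* gameTime - realTime, in seconds */
    int rendered = 0;

    for (int i = 0; i < frames; i++) {
        if (lead > -maxBehind) {
            rendered++;
            lead += dt - renderCost;  /* rendering loses ground */
        } else {
            lead += dt;               /* a skip catches us back up */
        }
    }
    return rendered;
}
```

With a 5ms allowance this draws 50 of 60 frames -- a fractional skip rate, rather than the every-other-frame pattern a whole-frame 1/n skip would force when each render costs more than one 60Hz step.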

Gillius
Gillius's Programming -- https://gillius.org/
