Game running much smoother WITHOUT timer
jeronimo
Member #7,645
August 2006

Hey there,
I'm new to Allegro and game programming in general.
After doing a tutorial (allegro Vivace), I made a little Pong game for practice.
I implemented a regular timer system (as suggested by the FAQ): it executes the game logic several times, until the timer and game-cycle counters are equal, and then updates the graphics.
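Roughly this kind of loop (a minimal sketch; update_game() and draw_frame() stand in for my actual functions):

#include <allegro.h>

volatile int game_ticks = 0;

void ticker (void)
{
   game_ticks++;
}
END_OF_FUNCTION(ticker);

int main (void)
{
   int cycles = 0;
   int done = 0;   /* set when the player quits */

   allegro_init();
   install_timer();
   LOCK_VARIABLE(game_ticks);
   LOCK_FUNCTION(ticker);
   install_int_ex(ticker, BPS_TO_TIMER(60));   /* 60 logic steps per second */

   while (!done) {
      while (cycles < game_ticks) {
         /* update_game(); -- one step of game logic */
         cycles++;
      }
      /* draw_frame(); -- then draw once */
   }
   return 0;
}
END_OF_MAIN()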

However, graphics aren't very smooth, and they appear MUCH smoother when I comment out all the timer stuff. I tried double buffering, page flipping and triple buffering, same story for all techniques.

I think the main problem is that one graphics update shows, say, 5 steps of ball movement, and the next shows just 3. This makes the movement quite jerky. Whereas without the timer, every frame update is exactly one step of game logic, and although there are some missed frames, I found it impossible to spot them by eye (I only noticed by comparing counters).

So, what do you think: is it a Very Bad Thing to skip the timer and adjust the game speed to the user's refresh rate (obtainable via get_refresh_rate())?
'Cause that seems to give very smooth performance, and still the same speed on all computers, as long as get_refresh_rate() reports correctly.

George Foot
Member #669
September 2000

You do get smoother results if you accurately sync to the refresh rate of the monitor, but it's very hard to do reliably on modern systems. Calling vsync every loop gets you half of the way, but you'll get occasional frame drops when either your code takes too long to execute, or Windows decides to click the hard disk drive for no apparent reason. If you don't care about this then it's OK to just use vsync, though as you say you might want to adjust your game's processing according to what the actual refresh rate is, so it doesn't run faster or slower depending on display settings. I wouldn't trust get_refresh_rate too much - it might be worth timing it yourself.
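If you do time it yourself, one rough sketch (using a 1 ms Allegro timer; measure_refresh_rate() is just an illustrative name):

#include <allegro.h>

volatile int ms_elapsed = 0;

void ms_ticker (void)
{
   ms_elapsed++;
}
END_OF_FUNCTION(ms_ticker);

double measure_refresh_rate (void)
{
   const int frames = 120;
   int i, elapsed;

   LOCK_VARIABLE(ms_elapsed);
   LOCK_FUNCTION(ms_ticker);
   install_int(ms_ticker, 1);   /* tick once per millisecond */

   vsync();                     /* align to a retrace first */
   ms_elapsed = 0;
   for (i = 0; i < frames; i++)
      vsync();
   elapsed = ms_elapsed;
   remove_int(ms_ticker);

   return 1000.0 * frames / elapsed;   /* refreshes per second */
}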

It's worth mentioning that in many games it's quite hard to make the logic work in that way. It does depend on the type of game though. It also makes your game less repeatable - things like recording replays can be more awkward.

You can still work around the stuttering from missing the occasional frame if you have access to accurate timing routines. An Allegro timer firing at twice the frame rate might be enough, so long as it lets you tell how many frames it's been since you last updated the display.

jeronimo
Member #7,645
August 2006

Thanks.
A timer with 1 ms resolution gives somewhat reasonable results, but a timer twice as fast as the refresh rate (2×60 = 120 ticks per second) looks really jerky, because the difference between drawing one step and two steps at once is quite visible.

Still I wonder why a typical modern game (like Call of Duty 2) runs much smoother on my laptop than this *** pong game... Such games use timers too, I suppose... What is a typical timer interval for such games?

Jonatan Hedborg
Member #4,886
July 2004

I would say that most modern games (3D games mostly) use some form of delta-timing.
That is, instead of having a fixed timestep (like in the FAQ), you measure the time it took from the last frame to the current one, and multiply all your movements by this. This of course brings up a host of other problems; simple Euler integration will not work very well, for example (then again, it never did).
One of the problems with this system is that if you have a sudden spike in time-per-frame (if, for example, something happens in the background that causes the program to freeze momentarily), all your objects could jump across the screen.
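A rough sketch of the idea (get_time_seconds() is a placeholder for whatever high-resolution clock you have, and the clamp guards against the spike problem above):

#define MAX_DT 0.1f   /* ignore anything over 100 ms so objects can't teleport */

extern double get_time_seconds(void);   /* placeholder high-res clock */

void game_loop (void)
{
   double last = get_time_seconds();
   float ball_x = 0.0f;
   float ball_vx = 120.0f;   /* pixels per second */

   for (;;) {
      double now = get_time_seconds();
      float dt = (float)(now - last);
      last = now;
      if (dt > MAX_DT)
         dt = MAX_DT;             /* spike protection */

      ball_x += ball_vx * dt;     /* scale every movement by dt */
      /* ... draw the frame ... */
   }
}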

Oh, and because the Allegro timer is quite inaccurate on Windows, I would recommend using QueryPerformanceCounter instead :) This is, however, not platform independent.
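For reference, a minimal wrapper (QueryPerformanceCounter/QueryPerformanceFrequency are the actual Win32 calls; the wrapper function name is made up):

#include <windows.h>

double qpc_seconds (void)
{
   static LARGE_INTEGER freq = { 0 };
   LARGE_INTEGER now;

   if (freq.QuadPart == 0)
      QueryPerformanceFrequency(&freq);   /* ticks per second, constant */
   QueryPerformanceCounter(&now);         /* current tick count */
   return (double)now.QuadPart / (double)freq.QuadPart;
}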

George Foot
Member #669
September 2000

jeronimo, I didn't mean update your game twice per frame - I just meant that with a timer ticking twice per frame you can tell easily whether you missed a vsync. e.g. you'd normally expect the timer to advance either one or two ticks; if it advances three or four ticks, you need to run your update twice; five or six, three times. You can do something similar if the timer is accurately running at the same rate as the vsync, but doubling the timer frequency like this makes you less dependent on this accuracy.

Actually I think you'd want your timer running somewhat under twice per frame, but certainly more than once per frame. It still breaks down if you don't accurately know the vsync rate, but it should be good for compensating for skipping just one or two frames.
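In code, the idea looks roughly like this (a sketch assuming a 60 Hz refresh; the names and the 110 Hz rate are illustrative):

#include <allegro.h>

volatile int ticks = 0;

void tick_counter (void)
{
   ticks++;
}
END_OF_FUNCTION(tick_counter);

void init_tick_counter (void)
{
   LOCK_VARIABLE(ticks);
   LOCK_FUNCTION(tick_counter);
   install_int_ex(tick_counter, BPS_TO_TIMER(110));   /* a bit under 2x 60 Hz */
}

/* call this after each vsync'd flip */
void catch_up_logic (void)
{
   int elapsed = ticks;
   int updates;

   ticks = 0;
   updates = (elapsed + 1) / 2;   /* 1-2 ticks: 1 update; 3-4: 2; 5-6: 3 */
   while (updates-- > 0) {
      /* update_logic(); -- one fixed step */
   }
}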

Kris Asick
Member #1,424
July 2001

You will want to get used to using timers, because without them there's no way to sync your game to everyone's computer. Yes, VSync can work, but the refresh rate of your monitor will not necessarily be the same as someone else's, and you can't just set a timer to the refresh rate, because get_refresh_rate() does not work on every system. (It didn't on mine, at least, the last time I tried it, which was with Allegro v4.0.2.)

The jerky movement you're experiencing can be helped if you average the change in time between frames. For instance, if you're using a 1 ms timer to time the game to milliseconds, you can keep two timer variables, then every time you draw a new frame you switch which timer to update and make your new timing value the average of the two.

For instance:

volatile int *tref;          /* Timer reference pointer to keep Timer_Routine() small. */
volatile int timer1, timer2; /* Internal timer values for averaging. */
float t;                     /* Timing value to use in the game. */

void Timer_Routine (void)
{
   (*tref)++;   /* increment the referenced value, not the pointer */
}
END_OF_FUNCTION(Timer_Routine);

void Update_Frame (void)
{
   /* Vsync, then draw the screen. */
   vsync();
   blit(write_page, screen, 0, 0, 0, 0, XRES, YRES);   /* double buffering: write to screen */

   /* Update the game timer variable t: the average of the last two frame times, in ms. */
   t = (float)(timer1 + timer2) / 2.0f;
   /* Switch the tref pointer between the two timer samples. */
   if (tref == &timer1) tref = &timer2; else tref = &timer1;
   /* Reset the referenced timer to 0 to count the ms for the next frame. */
   *tref = 0;
}

Then you simply call Update_Frame() at the end of every game frame to draw your double-buffer to the screen and update the timer.

You'll also need to lock tref, timer1 and timer2 with LOCK_VARIABLE(), and Timer_Routine() with LOCK_FUNCTION(), before you install the timer.
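Something like this, as a sketch of that setup:

/* one-time setup, before the game loop */
void init_timers (void)
{
   LOCK_VARIABLE(tref);
   LOCK_VARIABLE(timer1);
   LOCK_VARIABLE(timer2);
   LOCK_FUNCTION(Timer_Routine);
   tref = &timer1;                  /* must point somewhere before the first tick */
   install_int(Timer_Routine, 1);   /* call Timer_Routine() every millisecond */
}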

Averaging for more than two frames might work better, but probably won't. If you sample too many frames at a time you'll get speed alterations when the framerate changes for longer than a split second.

This code is totally untested and written just now, but I believe it should work.

--- Kris Asick (Gemini)
--- http://www.pixelships.com


Todd Cope
Member #998
November 2000

Quote:

I would recommend using QueryPerformanceCounter instead

Ditto. I think Linux has an equivalent high-precision timer as well. Look into these if you want to use a fixed logic rate. Even if the timer and refresh don't match up, it still gives the illusion of smoothness if the timer is precise. It's kind of like how movies are recorded at 24 FPS, yet you can play them back on a 60 FPS TV and they still look right.

Tobias Dammers
Member #2,604
August 2002

TV is 60 Hz interlaced. A TV (at least a CRT one) draws even scanlines on one pass, odd ones on the next, giving you a frame rate of 30 Hz. The interlaced approach is used to prevent flicker. Nowadays TV screens display at a higher rate, but that doesn't change the actual frame rate of the TV signal.

The Linux high-precision timer is gettimeofday().
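For example (a minimal POSIX sketch; the wrapper name is made up):

#include <sys/time.h>

double posix_seconds (void)
{
   struct timeval tv;
   gettimeofday(&tv, NULL);
   return (double)tv.tv_sec + (double)tv.tv_usec / 1000000.0;
}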

You don't need a delta-timer per se; a fixed delta usually works well for 2D games, provided you choose a sensible logic rate. It should be significantly higher than the frame rate, and a multiple of as many refresh rates as possible: 360 Hz works well for 60, 72 and 120; 300 works for 50, 60, 75 and 100.
OTOH, a delta-timer works for all refresh rates, and, depending on your logic, goes a little easier on your CPU because you only do one update per frame.

---
Me make music: Triofobie
---
"We need Tobias and his awesome trombone, too." - Johan Halmén

Jonatan Hedborg
Member #4,886
July 2004

While 360 Hz seems like a very nice logic update rate, it leaves only 2.78 ms per logic step. That may be too little. If you have some form of AI with pathfinding, a bunch of particles and perhaps some fancy sorting, that will regularly take more than 2.78 ms, which will lead to skipped frames, yielding stuttering gameplay.

Tobias Dammers
Member #2,604
August 2002

Pathfinding is a b1t$%^ indeed, but there's no need to pathfind every single frame. Sorting is usually not much of an issue because of temporal coherence (the list from the last frame is very likely to be almost sorted already).
Anyway, for complicated logic I'd go for delta-time.
One thing that is nice about constant-time logic, though: it is perfectly predictable and repeatable, which is useful if you want to record games and replay them. All you need to do is store the initial state of the random generator and record time-stamped controller changes; to play back, reset the RNG and insert the control changes where necessary. This won't work with delta-time, if only because of rounding errors.
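A sketch of that recording scheme (types and names are illustrative):

#include <stdio.h>
#include <stdlib.h>

typedef struct {
   unsigned int tick;   /* logic tick at which the input changed */
   int buttons;         /* controller state bitmask */
} input_event;

void start_recording (FILE *f, unsigned int seed)
{
   srand(seed);                        /* deterministic RNG from here on */
   fwrite(&seed, sizeof seed, 1, f);   /* store the seed in the replay */
}

void record_input (FILE *f, unsigned int tick, int buttons)
{
   input_event e;
   e.tick = tick;
   e.buttons = buttons;
   fwrite(&e, sizeof e, 1, f);
}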

---
Me make music: Triofobie
---
"We need Tobias and his awesome trombone, too." - Johan Halmén

Todd Cope
Member #998
November 2000

Quote:

TV is 60 Hz interlaced.

I am aware :P. My point is still valid.

jeronimo
Member #7,645
August 2006

Thank you all.
When I have the time, I'll check some things out.
