Allegro.cc - Online Community

Allegro.cc Forums » Game Design & Concepts » Variable Time Steps – Vsync – Fixed Time Steps

This thread is locked; no one can reply to it.
Variable Time Steps – Vsync – Fixed Time Steps
GaryT
Member #14,875
January 2013

I’m using Allegro 5.0.8 with Windows.

My understanding of variable time steps is to time the game loop off Vsync and then calculate the sprite position based on the Vsync rate.

Is it correct that variable time steps use Vsync? I’m guessing it could still work without Vsync, but then the FPS would be the maximum any given computer could manage, since it’s Vsync that paces the game loop by waiting for the retrace signal, and therefore FPS = Vsync rate. Please comment.

After looking into fixed time steps, it seems the easiest way is just to use a timer to fix the rate at, say, 60 FPS (a typical monitor refresh rate). The massive problem I have with this, as I’ve noticed others have had too, is the jerky motion you get, which I hate even more when the FPS and Vsync rates are very close but not exactly the same. Please comment. Also, here’s a great link: http://allefant.com/articles/vsync/

My observations moving at 600 pixels per second on a 60Hz monitor:

60 FPS = Smooth sharp motion for 2 to 3 seconds and then a severe jerk/pause/skip
120 FPS = As above but less noticeable and still quite bad
240 FPS = As above but quite a lot less noticeable
90 FPS = Smooth blurry motion but then only a slight jerk.

I’m guessing these results are entirely due to frames being skipped or displayed twice, as explained in the link above. At 240 FPS, moving 2.5 pixels per frame, the jerking is only a quarter of that at 60 FPS moving 10 pixels per frame. I wonder if at 90 FPS the jerking is hidden slightly amongst the continual smooth but blurry motion, since 60 and 90 are so out of phase. Please comment.

I’ve researched interpolation and understand the basic concept, and even though I’m not sure how to implement it in code, I’m very sceptical of whether it could improve the situation where the typical 60 FPS and 60 Hz Vsync are so very close (but not exactly the same). Please comment.

I’ve used Vsync (without variable time steps) and the results are fantastic: stunning, smooth, continuous, sharp motion. But is it true that it’s not a good idea to rely on Vsync, since some graphics cards override any attempt to switch it on?

Please can somebody shed light on why some people use the simple fixed time step system with a timer (no interpolation, but would that help anyway at 60 FPS and 60 Hz Vsync?) when it results in awful jerking? I know sometimes you get lucky and the FPS and Vsync rates can be so close to identical (depending on whose computer the game is running on) that you might only get a jerk every 10 or even 20 seconds or more. Please comment.

To conclude: is Vsync simply the best and only way to get perfect motion, despite the potential compatibility issues, and would you use it? Also, if you do use Vsync, is there a 2nd or even 3rd method you fall back on in case Vsync fails? Thank you very much. :)

Kris Asick
Member #1,424
July 2001

Vsyncing and interpolation are two very different methods for timing your game.

When you use vsync, all you need to do is access al_get_time() every frame and compare with the previous call to al_get_time() to determine how many seconds have passed. Then move all game objects by this amount.

Interpolation is a more complex process where your game logic and rendering happen at independent framerates and you must shift the positions of objects when you render them based on how much of the next logic frame has passed.

I know you were asking about this before, but unless you're aiming to make a commercial-quality product, interpolation is overkill and more work than you really need to do. Stick with straight vsyncing and simply move objects based on how much time has passed. I.e.: if you have an object moving 100 pixels per second and your FPS is currently 75, subtracting the previous frame's al_get_time() value from the current frame's gives 0.0133333, so your object would move a distance of 1.3333333 pixels that frame.

I read your previous topic again too. Jerking when doing interpolation is often the result of interpolating backwards. (IE: Your object needs to be shifted by 25% of a frame, but you accidentally shift it a distance of 25% from its next position towards its current position, not from its current position towards its next position.)

--- Kris Asick (Gemini)
--- http://www.pixelships.com

SiegeLord
Member #7,827
October 2006

Kris Asick, you crazy. vsync has nothing to do with delta timing.

GaryT said:

I’ve used Vsync (without variable time steps) and the results are fantastic with stunning smooth continuous sharp motion.

I don't get this though. I'd like to see code comparing the different things you've tried. I have absolutely no clue how vsync would help or hinder stuttering in a classic game loop.

EDIT: Here's an implementation of that interpolation idea, incidentally... it's butter smooth with any value of FPS on my computer, with and without vsync:

#include <stdio.h>
#include <allegro5/allegro.h>
#include <allegro5/allegro_primitives.h>

int main()
{
    al_init();
    al_init_primitives_addon();

    ALLEGRO_DISPLAY* d = al_create_display(800, 600);
    ALLEGRO_EVENT_QUEUE* q = al_create_event_queue();

    float FPS = 59;
    float dt = 1.0 / FPS;

    ALLEGRO_TIMER* t = al_create_timer(dt);

    al_register_event_source(q, al_get_display_event_source(d));
    al_register_event_source(q, al_get_timer_event_source(t));
    al_start_timer(t);

    float old_x = 0;
    float cur_x = 0;

    float old_t = 0;
    float cur_t = 0;
    float offset = al_get_time();

    bool quit = false;
    bool redraw = false;

    while(!quit)
    {
        ALLEGRO_EVENT e;
        while(al_get_next_event(q, &e))
        {
            if(e.type == ALLEGRO_EVENT_DISPLAY_CLOSE)
            {
                quit = true;
            }
            else if(e.type == ALLEGRO_EVENT_TIMER)
            {
                redraw = true;

                old_t = cur_t;
                old_x = cur_x;

                cur_t = e.timer.count * dt;
                cur_x += 300 * dt;

                if(cur_x > 900)
                {
                    cur_x = -100;
                    old_x = -100;
                }
            }
        }

        float int_t = al_get_time() - offset;
        float int_x = old_x + (cur_x - old_x) * (int_t - dt - old_t) / (cur_t - old_t);
        printf("%f %f %f\n", old_x, cur_x, int_x);

        al_clear_to_color(al_map_rgb_f(0, 0, 0));
        al_draw_filled_circle(int_x, 300, 100, al_map_rgb_f(1, 1, 1));
        al_wait_for_vsync();
        al_flip_display();
    }
}

"For in much wisdom is much grief: and he that increases knowledge increases sorrow."-Ecclesiastes 1:18
[SiegeLord's Abode][Codes]:[DAllegro5]:[RustAllegro]

GaryT
Member #14,875
January 2013

Thank you both very much for your replies. :)

(For Allegro 5.0.8 on Windows, Vsync works when set with:)
al_set_new_display_option(ALLEGRO_VSYNC, 1, ALLEGRO_REQUIRE); // or ALLEGRO_SUGGEST; placed before the display is created.

But not with: al_wait_for_vsync(); // placed in the main loop (at least one other person has reported this does not work properly on Windows; the motion goes very jerky)

I’ve found Vsync does indeed work at least on the 4 different computers I’ve been testing on:
HP HDX Laptop Windows Vista (Perfect)
Samsung Laptop Windows 7 (Perfect)
Acer Desktop Windows Vista (Perfect)
HP Pavilion Windows XP (Pauses a bit, not sure if it’s the computer though, or something else)

SiegeLord, Vsync used all on its own, without any other form of time control, causes the game loop to wait until the screen has finished refreshing. At that point the back buffer and frame buffer are switched, in what must be some small time period available before the monitor starts to retrieve the next frame from the frame buffer.

As far as I can see this seems to be the absolute simplest method (just one line) for producing perfectly smooth motion on any computer that honours the Vsync request, which I’m guessing is most computers running Windows Vista or later, and possibly XP as well.

I’m only a beginner to programming in general and have chosen C++ and Allegro because of their potential.

Kris, it seems, is confirming what I thought: simply use Vsync to get perfectly smooth motion, and then use delta time (just the time between each game loop) to calculate how far to move game objects. So SiegeLord, delta time and Vsync do indeed work logically together. Timing how long Vsync causes the game loop to wait is not a problem, and using this measured time to calculate how far to move objects, so they appear to move at the same speed on monitors with different refresh rates, also makes perfect sense.

Of course as I’ve read many times now during my research, there is the issue of collision detection and game physics becoming much more complicated using delta time in this way, as I know Kris has been involved in discussing in other posts.

SiegeLord thank you for providing your code example. I will attempt to understand the workings after work today. I have for now copy and pasted and run the program. It runs fine, but I am getting the significant jerking I get when just using fixed timing. :( I don’t doubt it runs as smooth as butter on your computer, but I strongly recommend you try it on several other computers to see if you get what I’m seeing. Sometimes around 60FPS game loop and 60Hz monitor refresh rate it’s just a fluke that the 2 are very close to being exact.

Just to point out one example of how it’s possible to be deceived, relating to my last paragraph. My first language was Python, which I started last August, where I ended up using Pygame. From blitting the FPS to the screen I found that instead of getting exactly what you asked for with Clock.tick(), you got divisors of 1000 instead: 1000, 500, 333, 250, 200, 166.6 and so on. Ask for 60 FPS and you got 62.5, since 1000/16 = 62.5 and Pygame rounded up to the nearest divisor of 1000 (1000/17 is 58.8, so 62.5 is what you ended up with). I also confirmed this by moving an object across the screen; you could see the sudden increases in speed as you headed towards Clock.tick(1000). Just imagine the implications for anyone who hadn’t discovered this. It wasn’t mentioned anywhere on the web that I could find. I haven’t tested Allegro, but does anyone know if you actually get the rate you ask for, such as 58, 59, 60, 61 or 62, and not just divisors of 1000 as in Pygame?

Finally Kris, I don’t yet understand how to implement interpolation, but maybe SiegeLord’s program will help me (although clearly it’s jerking a lot on my HP HDX laptop). I’m going to take your advice, Kris, and stick with Vsync for now, and maybe use the simple delta timing method in addition, if I decide requesting a screen resolution accompanied by its available refresh rate is not as compatible as I would like. You’re right, I’m just a hobbyist and a beginner and don’t want to overcomplicate things for myself at this stage.

Thank you both very much for your input, it’s all of great help for me.

Just one last question: Is it standard practice to use Vsync in combination with a Timer? If so please describe an example of how the two work together. :-*

Kris Asick
Member #1,424
July 2001

SiegeLord said:

Kris Asick, you crazy. vsync has nothing to do with delta timing.

True, Vsync and Delta Time are two different things, but can be used in tandem to avoid the need to use a timer. ;)

However, I feel I should point something out about Vsync and hardware acceleration: You cannot assume that Vsync will be allowed by the end-user's video hardware configuration!

If the end user's graphics drivers have disabled Vsync for all applications, then any game relying on Vsync for timing will hit extremely high framerates. Some people disable Vsync by default figuring that old adage "Framerate is everything" is more important. These people are idiots. :P

If the framerate is ever higher than the monitor refresh rate, then obviously, the monitor won't actually be physically capable of displaying the higher rate, and furthermore, it's entirely possible to end up with the monitor refreshing itself during a redraw, resulting in shearing.

This is why you have to use delta time in tandem with Vsync. If you don't, and simply rely on Vsync being a specific amount, the game will go insanely fast on any system where Vsyncing is not being performed. At least with delta time present, if Vsync is off and the framerate shoots up to 327, the game will still play properly.

And yes, set Vsync into the graphics driver, don't use al_wait_for_vsync(). It's simpler and more reliable to do it this way.

--- Kris Asick (Gemini)
--- http://www.pixelships.com

GaryT
Member #14,875
January 2013

Interesting Kris, ;D

So by using delta time, in addition to keeping the graphics speed (linear at least) the same on all monitors, it also acts as a backup for when the Vsync request doesn’t work.

I know you’re a fan of an increased logic rate, which clearly improves smoothness, as many people have said and suggested.

One more question: if Vsync fails, does this mean the FPS will simply run as fast as the computer is able? I have tried running flat out with no timing control at all, and I sometimes get slight speed variation whilst the game is running, with objects speeding up and slowing down. This wasn’t my speed control plan of course, just a test.

Is this because other demands on the computer continuously cause frame rate variation, which is only noticeable when running flat out, and would delta timing correct/control this speed variation if Vsync fails?

Another question: is it standard practice to use delta timing without also using Vsync? Does using Vsync or a timer cause the game to rest(), freeing up the system? If so, does using delta timing without Vsync mean possible issues with overloading the system/CPU?

SiegeLord
Member #7,827
October 2006

GaryT said:

SiegeLord, Vsync used all on its own, without any other form of time control, causes the game loop to wait until the screen has finished refreshing. At that point the back buffer and frame buffer are switched, in what must be some small time period available before the monitor starts to retrieve the next frame from the frame buffer.

As far as I can see this seems to be the absolute simplest method (just one line) for producing perfectly smooth motion on any computer that honours the Vsync request, which I’m guessing is most computers running Windows Vista or later, and possibly XP as well.

This only works if you absolutely guarantee that your game will never drop below the refresh rate of the monitor. Additionally, since refresh rates differ between monitors, your game will run at different speeds on different systems! Maybe you should test your ideas on different computers before suggesting such a flawed algorithm.

Quote:

So SiegeLord, delta time and Vsync do indeed work logically together.

No, they don't. Delta timing does not require vsync, and vsync is not the only way to limit FPS with delta timing.

Quote:

Sometimes around 60FPS game loop and 60Hz monitor refresh rate it’s just a fluke that the 2 are very close to being exact.

As I said, I tested with all sorts of values of the FPS variable (from 20 to 70), and it was smooth for all of them. I'll try it out on other computers, but I bet it's your computer that's at fault here (something like Aero being on). On KDE, if I don't turn off desktop compositing it is also jerky, even with just vsync on, so it's absolutely the case that the user might have such a terrible graphics stack that no matter what you do it'll be jerky until the user fixes his damn system.

Quote:

This is why you have to use delta time in tandem with Vsync.

My fixed timing loop works just fine with vsync, so I absolutely do not see your point.

There's a lot of conflation (even in my code, hah) between the number of updates per second and the number of frames drawn. The idea of interpolation is to decouple the two and still get smooth motion. The reason why you want to decouple the two is because you don't control the refresh rate of the user monitor, but at the same time it is very desirable to keep the number of updates per second the same for everybody. So in practice your frames per second will vary between computers, while the updates per second will be the same. And my code above (implementing ideas by Elias) guarantees that.

Lastly, delta timing is a non-starter. I'd rather have jerking than ever use that.

"For in much wisdom is much grief: and he that increases knowledge increases sorrow."-Ecclesiastes 1:18
[SiegeLord's Abode][Codes]:[DAllegro5]:[RustAllegro]

Kris Asick
Member #1,424
July 2001

SiegeLord said:

My fixed timing loop works just fine with vsync, so I absolutely do not see your point.

I didn't mean you couldn't vsync without delta time. I meant using delta time without vsync was a bad idea, as far as burning through tons of CPU power.

Anyways, back to Gary's situation...

If the speed of the game is changing at random when using delta timing then you're not doing it right. Consider the following:

Framerate = 100 FPS
1 / 100 = 0.01 Delta Time
All objects should move by their present speed value * 0.01

Framerate = 20 FPS
1 / 20 = 0.05 Delta Time
All objects should move by their present speed value * 0.05

Framerate = 250 FPS
1 / 250 = 0.004 Delta Time
All objects should move by their present speed value * 0.004

It also helps to ensure that you never actually call al_get_time() more than once per frame, otherwise it might be different with every call. Store the value and reference that stored value instead.

But yeah, if Vsyncing is disabled at the driver level, then using delta time ensures that even at super-high framerates, the game will still run the right speed, but it will burn as much power as it can in the process.

As for your other question, everyone has their own methods for implementing timing in their games and their own opinions about the best way to accomplish it. However, smooth-as-silk gameplay without shearing is impossible without vsyncing, and the lack of vsync is far more noticeable in 2D games than in 3D. So while a lot of 3D games don't vsync by default, since vsyncing can kill the framerate if it can't be maintained at its maximum, many 2D games do.

As for freeing up CPU time, this is driver dependent. I recently found out when testing this with my current game project that normally my game burns about 50% CPU power across both cores... which seemed extremely odd, until I turned off a driver setting called "Threaded Optimization". After doing this, my game's CPU usage dropped so low I couldn't measure it. ;D

Basically, since the vsyncing in A5 is done at the hardware level, it's the hardware and drivers that will determine how much CPU power this wastes, if any. As for using rest(), it's a good idea to call rest(0) at the end of each frame to give up the remainder of the current time slice to other processes which may need it. This is mostly just to play nice with multitasking.

Using any number higher than 0 with rest() could give up more CPU time than you would want your game to part with and it may also screw with graphics driver optimizations. Plus, at least on Windows, the granularity of the event scheduler's timer can be as bad as 50 ms on some systems, so even if you called rest(1), you may lose 50 ms of time, not just 1. That said, most modern systems won't have it anywhere near that bad and may even be able to give you the rest() precision you want, but I still wouldn't rely on it.

--- Kris Asick (Gemini)
--- http://www.pixelships.com

SiegeLord
Member #7,827
October 2006

Quote:

I meant using delta time without vsync was a bad idea, as far as burning through tons of CPU power.

Even that's inaccurate. You can set up a timer that will prevent you from drawing above a certain rate without using vsync. Vsync is ultimately necessary for one thing only: removing shearing. The frame-limiting aspect of it is an incidental benefit that you can't really rely on anyway.

"For in much wisdom is much grief: and he that increases knowledge increases sorrow."-Ecclesiastes 1:18
[SiegeLord's Abode][Codes]:[DAllegro5]:[RustAllegro]

Kris Asick
Member #1,424
July 2001

As for your other question, everyone has their own methods for implementing timing into their games and their own opinions about the best way to accomplish it.

;)

But yes, I'm aware that games that rely on delta time often have a fallback timer that "caps" the framerate to a particular maximum.

--- Kris Asick (Gemini)
--- http://www.pixelships.com

GaryT
Member #14,875
January 2013

SiegeLord, of course the frame rate will vary depending on the monitor refresh rate when using Vsync. That’s what using delta timing deals with. And for basic 2D games or any game where the “game will never drop below the refresh rate of the monitor”, using Vsync with variable time steps (at least for many people, especially beginner games programmers) seems like quite a good and easy method to use, or at least to get started with. :P

Also as Kris has already mentioned as well, I also did not suggest that delta timing required Vsync. In fact I was clearly asking about both methods and how they might work together, as stated in my first entry.

Initially you stated you have no clue how Vsync could help stuttering. This part was absolutely bonkers. If you're not using any other form of time or smoothness control beyond a fixed timer (as many beginner games programmers do), then when the FPS and monitor Hz don’t match exactly, you get awful stutters. Again, I’ve previously described this. Now by using Vsync (compatibility dependent, I know) you automatically (yes, the potential FPS has to stay higher than the monitor refresh) get beautiful smooth movement. :o

So not only does Vsync prevent shearing, it also potentially eliminates the profound and fundamental misalignment of game loop FPS to monitor refresh. Please read through this article, which explains with great illustrations: http://allefant.com/articles/vsync/ Again, I describe all this in my initial entry.

I’ve just today at work tried my previous test program on the XP computer, but using full screen mode, and was absolutely 100% delighted to find that the quite bad stuttering I was getting before when using windowed mode, was totally gone. :)

So now as a beginner I’m happy to say I have a nice starting point to carry on learning, being equipped with a simple and basic solution that works with perfect 100% smooth movement on 4 different computers, ranging from 8 years old to only 6 months, and also across 3 versions of windows, XP, Vista & 7.

Incidentally, and I promise I’ve just tested this after getting home, I’ve tried your interpolation program on the other 2 computers at home. The results are: It moves completely smooth on the Samsung laptop, but still very jerky on the Vista desktop. I restarted both computers, fully let them boot up and settle down, but the jerking still happened. :'(

So to conclude and with respect, all I know and understand is what you see I have written. But the point is without a shadow of a doubt, from testing your program on 3 computers, I’ve found there to be a profound stutter on 2 of them (I wish there wasn’t but there just is). If you haven’t yet witnessed the jerking, just try 4 to 5 different computers using various versions of Windows and it’s very likely you’ll get to see. Even if you don’t, why on earth would I attempt to base my games on your example code given the results I’ve experienced and described. And remember my own test program works with beautiful 100% smooth movement on 4 different computers, ranging from 8 years old to only 6 months, and also across 3 versions of windows, XP, Vista & 7.

Finally to Kris, thank you for your last 2 entries, especially the 1st of the two. More very interesting advice. ;D

SiegeLord
Member #7,827
October 2006

So I tried my code on Windows, and yes, it caused some jerking. Then I added the code you yourself suggested (the display option thing) and it worked just fine (only for the Direct3D backend though; for OpenGL the display option was counter-productive and an explicit vsync performed better). Now why didn't you make that adjustment? Were you being purposefully obtuse? I've now tested on 3 different computers and 4 different OSs (Windows 7 with and without Aero, Windows XP, Linux Mint and an older Ubuntu), and on all of them it was butter smooth (given that you either removed vsync or correctly switched the vsync mechanism between OpenGL and Direct3D).

GaryT said:

That’s what using delta timing deals with.

Nobody should be using delta timing, period. Not beginners, not veterans, nobody.

Quote:

Now by using Vsync (compatibility dependent, I know) you automatically (yes, the potential FPS has to stay higher than the monitor refresh) get beautiful smooth movement. :o

Maybe you should learn to read. I specifically stated that my code produced smooth movement with and without vsync. Without vsync you just get tearing and 8000 FPS for no good reason. You didn't even try commenting that vsync call out to see if I was right.

Vsync does absolutely nothing for stuttering. You are completely misunderstanding how my and your code works, and the article by Elias. Look at the quote in his article:

In the following, we don’t talk about shearing any longer, but the problem of getting smooth animation with VSync enabled. So from now on, VSync is implicitly assumed to be always enabled.

His entire article after the introduction is about stuttering with vsync on. Vsync does nothing to stuttering.

Quote:

And remember my own test program works with beautiful 100% smooth movement on 4 different computers, ranging from 8 years old to only 6 months, and also across 3 versions of windows, XP, Vista & 7.

And it uses delta timing... so you've thrown out the baby with the bathwater.

"For in much wisdom is much grief: and he that increases knowledge increases sorrow."-Ecclesiastes 1:18
[SiegeLord's Abode][Codes]:[DAllegro5]:[RustAllegro]

Kris Asick
Member #1,424
July 2001

SiegeLord said:

Nobody should be using delta timing, period. Not beginners, not veterans, nobody.

I would argue that, for a beginner who wants smooth-moving gameplay, this is the simplest way to accomplish it. All other methods are more complicated in some way.

Plus, it really does help to know how delta time works because it DOES have some uses. My current project relies on an interpolation system, but it's much easier to time animations or fixed movement paths with delta time, and since the interpolation system needs to calculate delta time to work in the first place, I just borrow its delta time value to do those things. ;)

--- Kris Asick (Gemini)
--- http://www.pixelships.com

GaryT
Member #14,875
January 2013

1) From reading how many other people (including Kris in this thread) like to use variable time steps as a relatively easy way to achieve smooth movement, I don’t get why you feel so strongly against it. It’s just a choice. :-/

2) Your 2nd quote of mine was not aimed at your program at all. The only thing I said about your program was that it did not run smoothly on the computers I tested it on, as I described; that’s all. In addition, whenever I’ve declared that Vsync stops the stuttering, I’ve always been talking about using it completely on its own. Having said that, clearly using it in combination with variable time steps can be even better. I never intended to give the impression I was talking about using it in combination with fixed timing of any sort, as yes, I absolutely agree: how could Vsync work in the simple way I’ve been getting at if some other timing system was in charge of the FPS? However, without any other control method, Vsync on its very own does delay the FPS to match the monitor refresh. That’s why I think it’s incorrect to declare that using Vsync doesn’t help in any way to improve stuttering. Clearly not the case. :P

3) As for your references to the article stating that Vsync is assumed to be always enabled: yes, I agree, that’s what it says. The first point I’d like to make is that the specific examples you are referring to use fixed logic rates that would be set by timers. So as I’ve said in point 2) above, I completely agree that Vsync in these examples cannot stop frames from being skipped or displayed twice. Right from the start of this thread I’ve only been trying to identify what a wonderful job Vsync does at producing completely smooth movement when Vsync alone controls the FPS/logic rate, and also how it might work together with variable time steps.

4) As for Solution 1: Variable Time Steps in the article, this is where the author explains that rendering at the time of Vsync, and using variable time steps to calculate the amount of movement required per frame, produces smooth movement. Even better, for clarifying how Vsync on its own produces smooth movement, is Solution 4:

Solution 4 clearly states that if you can obtain, from the available graphics modes, a refresh rate close enough to your initially intended logic rate, then you might choose to simply use that as your logic rate, resulting in completely smooth movement without the need for interpolation. It then goes on to explain, in the 2nd paragraph, that if you do choose this method, be sure to time your game off Vsync. It follows on to say that if you let your game run at your initially intended logic rate, close to the Vsync rate (or some multiple) but not the same, then you will still get stutter. In other words, if you use this method, don’t force your logic rate to your initially intended rate with a timer or any other mechanism; instead, time your logic rate/game loop off the Vsync function all on its own. To achieve this you simply don’t put any timer code in your program, just the single line of code initiating Vsync. This is exactly what I’ve been doing all along: Solution 4. ::)

So now to finally conclude:

Solution 1 from the article is a method which uses Vsync and variable time steps to produce smooth movement.

Solution 4 from the article is a method which uses Vsync all on its very own to produce smooth movement, based on attempting to obtain a suitable refresh rate from the various graphics modes available.

Vsync definitely does do wonders for stuttering!

I think the baby and the bath water are still both ok. Only Joking. ;)

Just to let you know, I commented out the Vsync line, and yes, your program runs very smoothly now on my HP laptop, and no doubt on my other computers as well. I think on Windows that line is best replaced with: al_set_new_display_option(ALLEGRO_VSYNC, 1, ALLEGRO_REQUIRE); // or ALLEGRO_SUGGEST, to be included before the display is created.

One question though if you don’t mind as I’m still a bit new to all this to fully understand how it works: When I change the float FPS variable to any number, even as low as 5, the program still runs perfectly smooth and at the same speed. Please can you let me know what I should be seeing when changing this parameter. Thank You SiegeLord. :-*

SiegeLord
Member #7,827
October 2006

GaryT said:

I don’t get why you feel so strongly against it. It’s just a choice. :-/

It is not just a choice. Variable time steps are a bad design. Yes, many people use them because they are ignorant, and the users pay the price. E.g. a popular game called Super Meat Boy used a variable time step, and here's what the result was:

"Super Meat Boy" is a difficult platformer that recently came out for PC, requiring exceptional control and pixel-perfect jumping. The physics code in the game is dependent on the framerate, which is locked to 60fps; this means that if your computer can't run the game at full speed, the physics will go insane, causing (among other things) your character to run slower and fall through the ground.

Here's a great discussion about the topic here: http://gafferongames.com/game-physics/fix-your-timestep/

Also, every single physics engine suggests or forces a fixed time step on you, because otherwise it won't work:

A variable time step produces variable results, which makes it difficult to debug. So don't tie the time step to your frame rate (unless you really, really have to).

Using a fixed time step is highly recommended. Doing so can greatly increase the quality of the simulation.

If you pass maxSubSteps=0 to the function, then it will assume a variable tick rate. Every tick, it will move the simulation along by exactly the timeStep you pass, in a single tick, instead of a number of ticks equal to fixedTimeStep.

This is not officially supported, and the death of determinism and framerate independence. Don't do it.

Issues with variable time steps come up even without delving into physics. E.g. if you're implementing timeouts you can get immense variability with a variable timestep size (e.g. see this discussion: https://www.allegro.cc/forums/thread/600021/807935#target). Even the article by Elias specifically states:

For various reasons, variable timestamps are bad in a game though: If you have constant acceleration instead of constant velocity (e.g. not “move 4 pixel/second”, but “accelerate 1 pixel / second²”), you need a complicated integrator to get the right positions. Or think of non-linear motion, e.g. a circle path. And in general, physics quality might now differ depending on the vsync rate. For networked games which need to stay synchronized, it might not be possible at all, and for other multiplayer games, clients with a higher vsync might have an advantage or disadvantage due to more accurate physics prediction. For many classic style games, it simply will make the game logic and collision detection much more complicated.

So, some incredibly smart people (even I would find it incredibly daunting to implement a physics engine, much less a networked peer-to-peer physics engine as some have done) emphatically say that delta timing is bonkers. It is of course your choice to be wrong.
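For anyone following along, the core of the fixed-time-step pattern from the gafferongames article can be sketched without any library at all (a rough sketch; the names are mine, and real code would also clamp very long frames):

```cpp
// Fixed-timestep accumulator: render as often as you like, but advance the
// simulation in constant-sized steps so physics results don't depend on the
// frame rate.
struct FixedStepper {
    double step;        // constant logic step, e.g. 1.0 / 60.0
    double accumulator; // unsimulated time carried between frames

    explicit FixedStepper(double s) : step(s), accumulator(0.0) {}

    // frame_dt is the measured wall-clock duration of the last frame.
    // Returns how many logic updates to run for this frame.
    int advance(double frame_dt) {
        accumulator += frame_dt;
        int updates = 0;
        while (accumulator >= step) {
            accumulator -= step;
            ++updates;
        }
        return updates;
    }
};
```

A slow frame yields several logic updates, a fast frame may yield none, and the fraction left in the accumulator is exactly what interpolation schemes use when rendering.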

Quote:

Clearly states that if you can obtain a refresh rate from available graphics modes close enough to your initially intended logic rate, then you might choose to simply use this to set your logic rate, resulting in completely smooth movement without the need for interpolation.

And are you doing that? From everything you've told me so far, you are just using one line of code to enable vsync. That method requires quite a few more lines than that. Also, this kind of nonsense requires you to run a fullscreen game, since you can't specify the refresh rate for windowed games. Additionally, this method is still a fixed-step method; you just choose a fixed step that depends on the refresh rate. Lastly, it then says:

Of course this won’t work if it’s a network game.

The fact that you can't do windowed games with this method makes it completely untenable in my opinion.

Quote:

Solution 1: From the article is a method which uses Vsync and variable time steps to produce smooth movement.

Which is bad as is said in the article itself.

Quote:

Solution 4: From the article is a method which uses Vsync all on it’s very own to produce smooth movement based on attempting to obtain an available refresh rate from the various graphics modes available.

Which is not what you're doing, and even if you did, the method has some serious limitations.

"For in much wisdom is much grief: and he that increases knowledge increases sorrow."-Ecclesiastes 1:18
[SiegeLord's Abode][Codes]:[DAllegro5]:[RustAllegro]

GaryT
Member #14,875
January 2013

Thank you SiegeLord, I take on board all your advice in that last reply, and yes, I realise there are, without question, complications that come with using variable time steps.

Firstly to answer your question, I am just using the single line of code at the moment to initialise Vsync. As for obtaining the refresh rate I’m mostly just using the tutorial code from the Wiki, and still just in the early stages of experimenting:

ALLEGRO_DISPLAY_MODE disp_data;

// Fetch the last mode in the list, which happens to be the highest resolution.
al_get_display_mode(al_get_num_display_modes() - 1, &disp_data);

al_set_new_display_flags(ALLEGRO_FULLSCREEN);
display = al_create_display(disp_data.width, disp_data.height);

// width and height are my game's drawing area; fsx/fsy centre it on screen.
int fsx = (disp_data.width - width) / 2;
int fsy = (disp_data.height - height) / 2;

int rf = disp_data.refresh_rate; // refresh rate reported for this mode

fsx and fsy are just my offsets to centre my display when in full screen mode.
I was thinking of wrapping the al_get_display_mode call in a loop as a first attempt to retrieve all the available graphics modes; at the moment it just retrieves the last one, which has the highest resolution. So up to now this is just my very primitive effort at obtaining display mode information, accompanying the single Vsync line I referred to.
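The selection part of that loop can be sketched without touching the display at all. Here the mode list is a plain array that a real program would fill from al_get_display_mode; the Mode struct and pick_mode are my own hypothetical stand-ins, not Allegro names:

```cpp
#include <cstdlib>
#include <vector>

// Minimal stand-in for the fields of ALLEGRO_DISPLAY_MODE we care about.
struct Mode { int width, height, refresh_rate; };

// Scan a non-empty list of modes (as a loop over al_get_display_mode would
// produce) and pick the one whose refresh rate is closest to the rate we
// want, preferring higher resolutions on a tie.
Mode pick_mode(const std::vector<Mode>& modes, int wanted_hz)
{
    Mode best = modes.front();
    for (const Mode& m : modes) {
        int d_best = std::abs(best.refresh_rate - wanted_hz);
        int d_m    = std::abs(m.refresh_rate - wanted_hz);
        if (d_m < d_best || (d_m == d_best && m.width > best.width))
            best = m;
    }
    return best;
}
```

Given modes of 1024x768@75, 1280x1024@60 and 800x600@60, asking for 60Hz would pick the 1280x1024@60 entry.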

So yes I am to a limited extent implementing solution 4.

Question Relating To The Above: I’ve only really used the above sample code to set any given computer to the highest resolution, as you can see. After displaying the refresh rate on screen on my XP computer, it shows 75Hz, but the monitor is only 60Hz. So my question is: if the graphics card can put out 75Hz but the monitor can only process 60Hz, and I have Vsync turned on, will the game just run at 60Hz, making the reported 75Hz irrelevant in this instance? I’m a little confused about this. I’m assuming, because of the very smooth animation the test game is producing, that the synchronising is to the monitor, not the graphics card. If that’s the case then I suppose I need to retrieve monitor information, not graphics mode info. What do you suggest? ::)

My situation is that, for the near future, I just want to get on with learning C++ and Allegro. I desperately wanted a solution that produces smooth graphics above anything else. I cannot stand to see jerking motion on the screen. If I can make a few basic 2D games to help me on my way, ones that display smooth motion on the various computers I’m testing on, then I will be extremely happy at this stage. ;)

I am very interested in the interpolation method, and it seems many people simply mention it on the forums without really saying anything specific. I wonder if this type of interpolation is one specific technique for this application, or if there is more than one. I also wonder how many beginners ever become properly aware of this general situation relating to smooth motion and timing. I say this because, after looking on the forums for quite some time now, some people have been stuck like I was with the very common method (typically appearing in beginners’ tutorials as well) of using just a timer set to 60 FPS, nothing else, just a timer. This drove me bonkers since the stuttering was, for me, very bad. I’ve described my findings on this in my first entry.
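For the record, the core idea as I understand it from the discussions (this is a generic sketch, not SiegeLord’s actual code) is to keep the previous and current logic positions and draw a blend of them:

```cpp
// Render interpolation: logic runs at a fixed rate, but each drawn frame
// blends the last two logic states by alpha, the fraction of a logic step
// that has elapsed so far (0.0 = previous state, 1.0 = current state).
double interpolate(double previous, double current, double alpha)
{
    return previous + (current - previous) * alpha;
}
```

If the logic moved a sprite from x=100 to x=110 and we happen to draw halfway between logic ticks (alpha = 0.5), the sprite is drawn at x=105, so motion stays smooth even when the render rate and logic rate differ.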

Anyway, that’s a bit about my situation. Back to interpolation: I have run your program and it is smooth. I can’t fully follow it through yet, but as I reduced the float FPS variable down to, say, 5, I was expecting to see something such as a large reduction in speed or an increase in choppiness, yet it displays the same. Please can you confirm what should happen.

So thank you again for your latest reply, and I will be reading the links you have provided to further help me appreciate what it’s all about.

Please don’t forget to answer my question relating to my sample code above about the relationship between graphics modes, monitor refresh rates, and Vsync. Anyone’s welcome to join in ! Thank You. :)

Kris Asick
Member #1,424
July 2001

There is one thing I should probably point out about using variable time steps (IE: delta time), and that is that if the framerate drops too low, or if you are using any compounding math at all, the gameplay will fail to be consistent across all systems. :P

So yes, delta time has flaws and interpolation helps get around them, but it's still a heck of a lot simpler to code straight delta time over interpolation.
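To illustrate the "compounding math" problem Kris mentions (my own example with made-up numbers, not from anyone’s game): applying a per-frame damping factor naively makes the result depend on how many frames fit into a second, while raising the factor to the power of the elapsed time does not:

```cpp
#include <cmath>

// Naive: multiply velocity by 0.9 once per frame. Ten 0.1s frames and one
// 1.0s frame cover the same second of game time but damp by very different
// amounts, so the feel changes with the frame rate.
double damp_per_frame(double v, int frames)
{
    for (int i = 0; i < frames; ++i)
        v *= 0.9;
    return v;
}

// Frame-rate independent: damp by a factor of 0.9 per *second* of elapsed
// time, regardless of how that second is sliced into frames.
double damp_per_second(double v, double dt_seconds)
{
    return v * std::pow(0.9, dt_seconds);
}
```

Starting from 100, ten naive frames give about 34.9 while a single big frame gives 90; the time-based version gives 90 for one second of game time however the frames fall.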

But I have a question for you, Gary: Have you ever made a game before?

If the answer is "no", then you really should worry less about how your game works and more about just making it in the first place. Once you have a working prototype, then you can worry about tweaking it to be more professional and/or use more advanced/reliable techniques, otherwise you may be going through all this effort to learn interpolation and such for nothing. :P

--- Kris Asick (Gemini)
--- http://www.pixelships.com

GaryT
Member #14,875
January 2013

You are asking the right question Kris. The only game I’ve made before was during October and November last year, after learning Python. I ended up using Pygame, which as you might know is relatively easy to get started with for new programmers. It was nearly complete, but I was being incredibly silly, simply controlling the game speed by changing the Clock.tick() value and/or using a time-waster delay loop, and just moving one pixel per frame. Obviously this made things even simpler with collision detection, where in my Pac Man game (a near copy of the old Oric Munch from those days) Munch was squeezing through a 25 pixel gap the same width as himself. :D

Anyway, after using slightly more efficient code and further testing the game on other computers, I hit a speed issue. If I were more experienced I would feel embarrassed right now, but I’m not, so I’ll carry on. It became immediately obvious that running the game at around 1000 to 2000 frames a second with a delay loop of this sort was never going to work compatibly at all, not even amongst the few computers I’d been testing on.

So now I simply set Clock.tick(60), and this is exactly when this whole issue started for me; I’ve been looking into it for about a month now. The very first thing I noticed, on all four computers, was the awful jerking I described at the beginning. I researched very hard, and quite a few people were clearly unaware, including myself, that simply using a fixed timer all on its own is typically going to produce such jerking/jittering. This is very obvious and clear to me now. ;D

I also did not like the fact that I had to have the correct versions of both Python and Pygame installed on the target computer to run a game. I started learning C++ and I’m approaching a similar stage to where I reached with Python. I like very much that I can build statically and just run my programs/games on any Windows computer. Also, I know the potential of C++ and/or Allegro far exceeds Python’s in the long term.

I completely agree with you Kris about making a complete game first and worrying about the finer details later, but I never intended to go overboard on this smooth movement issue, and I’m not. What I’ve learned from you and SiegeLord, and from other people’s input in other discussions, I greatly appreciate. :)

Now that I have the Vsync and full screen functionality, I’m just going to move forward and complete a basic 2D game, knowing I can at least make it run compatibly on a few computers with smooth graphics. I don’t regret my emphasis on the smoothness issue. :P

I cannot stand jerking graphics!

Back to my two questions in my last post:

1) Monitor Hz / Graphics Card Hz / Vsync. Please have a look. Thank You.

2) SiegeLord’s interpolation program. What should I see on screen after reducing the float FPS variable to, say, 5? Thank You.

Kris Asick
Member #1,424
July 2001

It... shouldn’t be possible for the graphics card and the display itself to be running at different refresh rates. If your graphics card is running at 75 Hz and the picture is showing up on the display properly, then it’s entirely possible the display is running at 75 Hz, regardless of anything it’s telling you.

I only have a 60 Hz LCD monitor myself, but it IS capable of going as high as 72 Hz if the signal resolution is less than the native resolution.

A part of the reason for this is that text modes run 70 Hz, so most LCD monitors for computers can handle low-resolution modes at 70 Hz, even if they're only supposed to be a 60 Hz monitor.

--- Kris Asick (Gemini)
--- http://www.pixelships.com

GaryT
Member #14,875
January 2013

About 6 months ago the graphics card in my XP computer stopped working completely and we nearly scrapped the computer. Eventually we found a Nvidia card that fit within the available space. It’s been working ever since.

I’m simply reading disp_data.refresh_rate and displaying it on the screen, so I guess it is being reported correctly. The same test program displays 60Hz on both my laptops, and 75Hz when running on the XP machine, which uses a 60Hz monitor. Interesting what you said about this not necessarily being the maximum rate though.

So then, does this mean that even if I had a super duper graphics card capable of 120Hz, it should communicate with the monitor, detect the monitor’s available refresh rates, and restrict its own potential 120Hz output, displaying just 75Hz if that was the maximum possible refresh rate of the monitor for the given program/game? If that’s the case, and my new graphics card is capable of outputting over 75Hz, then perhaps 75Hz is the true limit of my old 60Hz monitor. Does this sound plausible? Please correct me if I’m misunderstanding you though. Thanks Kris. ;)

Audric
Member #907
January 2001

If the video card misunderstands the monitor capabilities and refreshes graphics faster, I think you just get the same result as if vsync is OFF and you (the programmer) modify the screen graphics faster than the screen frequency: the extra images are just not seen.

GaryT
Member #14,875
January 2013

Hello Audric, I can see how that follows through logically.

So if the game is not executing as if Vsync has failed completely, but instead the graphics card (or whatever part of the system is responsible for retrieving the information) does so in error, then yes, it follows that the game loop speed could be set by an incorrect Vsync value (faster, in your example), and this might lead to a jerky display again, with some frames being skipped.

Whereas if Vsync failed completely, and the game was designed to detect this and still continue to run, and Vsync was the only method of speed control, then the game loop would run at lightning speed.

So if the game looks perfectly smooth and is running at a typical speed, then I would assume, based on the above, that Vsync is indeed working correctly. That would mean the monitor on my XP computer at work probably is capable of 75Hz.

I still feel surprised though. Thank You Audric. :)

Kris Asick
Member #1,424
July 2001

Audric doesn't have it right... sort of. There's a difference here between sending updates to the graphics card faster than the frequency it's refreshing the display at, and actually trying to run the refresh rate faster than the display can handle.

There's a specific range of refresh rates an LCD panel can handle electronically. If the refresh rate exceeds this limit, the panel will completely fail to render the image and will usually produce an error message saying the frequency is out of range.

IE: If you try to run a 120 Hz signal through to a 60 Hz monitor, the monitor will simply throw up an error message on-screen and you won't see anything else. At least, modern monitors work this way. DO NOT attempt to run an over-frequency signal to an old CRT monitor that lacks multi-sync capabilities, or you could very well fry its electronics. :P

But yeah, typically your computer will have separate display and graphics card drivers. The graphics card will have its own limits, as will your display, and only the modes and frequencies that BOTH can handle will be available to you.

Most modern graphics cards can handle 120 Hz easily. Heck, my old GeForce 2 GTS card could! (But only at 640x480 resolution.) I think most newer cards can handle 240 Hz as well, but I haven't actually looked into this. Typically, your display will be the primary limiting factor.

--- Kris Asick (Gemini)
--- http://www.pixelships.com

GaryT
Member #14,875
January 2013

I like that answer Kris very much. I’m now feeling even more convinced that my monitor can refresh at 75Hz. Cheers. :D

But hang on a moment. I’ve just been reading how monitors are capable of displaying at higher frequencies if a lower resolution is selected. So I’m wondering now, even though my program is set to full screen mode and is achieving the maximum screen resolution, whether, because of how I’m using Allegro to obtain the refresh rate, I’m simply retrieving the maximum possible refresh rate, one which only applies to some lower resolution.

I was just assuming before that, by selecting a particular graphics mode, i.e. al_get_display_mode(al_get_num_display_modes() - 1, &disp_data); (the one at the end of the list, with the highest resolution), the refresh rate associated with it would be the one available at that resolution. I would still expect this to be the case though. ;D

Kris Asick
Member #1,424
July 2001

Well, one way to know for certain is to just put a framerate ticker in your game and see what it says. ;)
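A minimal ticker along the lines Kris suggests (the names are my own; in an Allegro program the timestamps would come from al_get_time()):

```cpp
// Frame rate ticker: call tick() once per drawn frame with the current time
// in seconds; whenever a full second has elapsed it reports the frame count
// for that second, otherwise it returns -1.
struct FpsTicker {
    bool started = false;
    double window_start = 0.0;
    int frames = 0;

    int tick(double now) {
        if (!started) {              // first call opens the counting window
            started = true;
            window_start = now;
        }
        ++frames;
        if (now - window_start >= 1.0) {
            int fps = frames;        // frames drawn in the last second
            frames = 0;
            window_start = now;      // start the next one-second window
            return fps;
        }
        return -1;
    }
};
```

In the game you would draw the reported number with al_draw_textf each time tick() returns something other than -1; if it hovers around 75 rather than 60, the display really is refreshing at 75Hz.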

--- Kris Asick (Gemini)
--- http://www.pixelships.com
