I've been reading this. It explains a couple of game loop types. I'm mostly interested in implementing the last one: "Constant Game Speed independent of Variable FPS".
I'm having a hard time understanding it. AFAIU, it basically updates the game logic 25 times per second, but there is no limit on rendering. It also computes an interpolation value for the in-between times.
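For reference, the article's loop boils down to something like the sketch below. This is my own reconstruction, not the article's exact code: the constant names are mine, and the clock is faked (it advances by `frame_time` per drawn frame) so the timing logic can run standalone without Allegro.

```cpp
// Reconstruction of the "Constant Game Speed independent of Variable FPS"
// loop, with a fake clock so it can be followed without a real game behind it.

constexpr int TICKS_PER_SECOND = 25;              // fixed logic rate
constexpr double SKIP_TICKS = 1.0 / TICKS_PER_SECOND;
constexpr int MAX_FRAMESKIP = 5;                  // cap on catch-up updates

struct LoopStats { int logic_updates = 0; int frames_drawn = 0; };

LoopStats run_loop(double frame_time, double total_time) {
    LoopStats stats;
    double now = 0.0;              // stand-in for a real clock, e.g. al_get_time()
    double next_game_tick = 0.0;
    while (now < total_time) {
        // Catch the logic up to the current time, at most MAX_FRAMESKIP steps.
        int loops = 0;
        while (now > next_game_tick && loops < MAX_FRAMESKIP) {
            ++stats.logic_updates;                // update_game() would go here
            next_game_tick += SKIP_TICKS;
            ++loops;
        }
        // How far the current moment sits between two logic ticks,
        // in [0,1) whenever the logic is keeping up.
        double interpolation = (now + SKIP_TICKS - next_game_tick) / SKIP_TICKS;
        (void)interpolation;                      // draw_game(interpolation)
        ++stats.frames_drawn;
        now += frame_time;                        // pretend rendering took this long
    }
    return stats;
}
```

With a fast renderer the logic still ticks 25 times a second while frames pile up; with a slow one, MAX_FRAMESKIP caps how many catch-up updates run per rendered frame.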
While implementing it I wanted to use Allegro 5's timer functions, so I came up with this:
But it doesn't seem to work. For some reason the logic update and the display update happen at the same time, at a rate of 5 times per second (LOGIC_FPS).
I need advice.
Thanks in advance.
I'd suggest using al_wait_for_event_until.
This is how I would do it.
That's taken straight from my code, so don't just copy and paste it. It's only an example and will not work without the rest of my class.
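Desmond's snippet isn't reproduced here, so take this as a guess at the shape of the pattern: al_wait_for_event_until waits for the next event up to a deadline, and a timeout is your cue to render. Below, the wait is modeled with a plain sorted list of event times (Sim and all its members are my own names, not Allegro's) so the flow can run standalone:

```cpp
#include <deque>

struct Sim {
    std::deque<double> event_times;   // pending timer events, sorted ascending
    double now = 0.0;
    int handled = 0, draws = 0;

    // Stand-in for al_wait_for_event_until: returns true if an event
    // arrives before the deadline, false on timeout.
    bool wait_until(double deadline) {
        if (!event_times.empty() && event_times.front() <= deadline) {
            now = event_times.front();
            event_times.pop_front();
            return true;
        }
        now = deadline;               // nothing arrived: time out
        return false;
    }

    void run(double frame_dt, double end) {
        while (now < end) {
            if (wait_until(now + frame_dt)) ++handled;  // process the event
            else                            ++draws;    // timeout: render a frame
        }
    }
};
```

The nice property is that event handling always takes priority, and rendering happens at most once per frame_dt of idle time.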
Here is what I use to separate the logic from the drawing... Obviously it isn't complete; I haven't started any project which needs to separate the logic from the drawing. If you're creating a simple game you probably don't need it, but it's good to understand and work this way from the beginning.
You can compile it right away. The Beep() function is just to make a "beep"...
I may be wrong, but you should actually also separate the input from the logic and the drawing; that is what I'm trying to do. With Allegro 5 this is very easy to do.
But to do this, each object (if you're using C++, which I hope, since it is the best for game programming) should have:
- A Draw Function
- A Logic Function
and..
- An Input function (depending on whether that object is controlled by the user)
Anyway, I'm not entirely sure about all this; I'm still learning.
Desmond, I guess it doesn't have anything to do with my game loop question, but I'm eager to know what advantage I would get by using a timeout. I originally used al_wait_for_event_until as seen in the wiki tutorials, but then I thought the simpler the better and got rid of the timeout. Thanks.
AMCerasoli, I tested it and it successfully separates game logic from FPS, but the FPS is still predetermined. As suggested in the link I gave, why run at 25 FPS when it's possible to run at 300 FPS? It's wasting clock cycles. My code basically does the same thing, except it doesn't put a limit on the FPS. I still don't see why it doesn't work, though. Any ideas?
EDIT: OK, here is a hybrid between AMCerasoli's code and mine. Can someone please tell me why "Limitless update" gets written at the same rate as "Display updated"? Change the FPS value and see for yourself. What am I doing wrong?
else if(ev.timer.source == FPS) { //DRAW
You are checking the timer field of the event without knowing the type of the event. ev.timer and ev.timer.source could have absolutely any value. Stop it.
My take on separating logic and drawing:
With this simple logic you can run at any number of logic updates per second and any number of frames per second, with frames being dropped if they take too long.
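Edgar's actual code isn't shown above, so here is a sketch of the idea as I read it, with Allegro's timer events faked as a std::queue of tags (Ev, Counts, and process are my own names): drain every pending event, run logic steps as they come, and only draw once the queue is empty, so several pending draw ticks collapse into a single redraw.

```cpp
#include <queue>

// Hypothetical event tags standing in for Allegro's two timer events.
enum class Ev { Logic, Draw };

struct Counts { int logic = 0; int draws = 0; };

// Drain every pending event, then draw once if any draw tick arrived.
// Multiple Draw events collapse into one redraw, so frames are dropped
// automatically whenever the program falls behind.
Counts process(std::queue<Ev> events) {
    Counts c;
    bool redraw = false;
    while (!events.empty()) {
        Ev ev = events.front();
        events.pop();
        if (ev == Ev::Logic)     ++c.logic;      // run one logic step
        else if (ev == Ev::Draw) redraw = true;  // just note that a draw is due
        if (events.empty() && redraw) {          // queue drained: safe to draw
            ++c.draws;
            redraw = false;
        }
    }
    return c;
}
```

Note how two queued Draw events still produce a single draw; that is the frame-dropping behavior.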
Edgar, that's what I came up with too. Can you solve my problem though?
BTW, what kind of effect does al_is_event_queue_empty(event_queue) have? Why do I need it? Thanks.
edit: also, does it still make sense to use interpolation with this method?
As suggested in the link I gave, why run at 25 FPS when it's possible to run at 300 FPS?
Drawing more frames than the number of Hz your monitor runs at is pointless, as you will never see more than that many frames anyway. If you're talking about logic updates per second, that's different though.
Can someone please tell me why "Limitless update" gets written at the same rate as "Display updated"?
Because you output 'Limitless update' each time any event occurs - whether it's a logic event or a display event.
Can you solve my problem though?
BTW, what kind of effect does al_is_event_queue_empty(event_queue) have? Why do I need it? Thanks.
You need to exhaust all of the events that are in the queue before you update your display so that all the logic events have been processed.
if (al_is_event_queue_empty(event_queue)) {break;}
This line will break out of the event loop once there are no events left to process. If there are events left, it will go to the beginning of the loop and check the next event.
If you want to adjust the rate that each timer runs, you can do that with al_set_timer_speed.
Edit for your edit
edit: also, does it still make sense to use interpolation with this method?
Well, you could do it with ALLEGRO_EVENT.timestamp and al_get_time. I don't see the point of using interpolation at this level though. It just complicates things needlessly. I also don't see the point of running logic at a different rate than the display.
Because you output 'Limitless update' each time any event occurs - whether it's a logic event or a display event.
Aah, I see. It's because of al_wait_for_event(eventQueue, &e), right? It waits until an event occurs, so "Limitless update" happens at exactly the rate of the fastest-occurring event.
How can I truly make them separate, though? I don't want to limit my FPS; I want rendering to run as fast as possible while limiting the game logic to 25 updates per second, for example.
Never mind, I'd better use a good old get_tick-style routine. Since timers in Allegro are tied to events, what I'm asking for is not possible.
I don't want to limit my FPS.
It is physically impossible to render more frames per second than the refresh rate of the monitor. So set your display timer to the refresh rate of the monitor and forget about it.
I want rendering to run as fast as possible while limiting the game logic to 25 updates per second, for example.
Is your logic really so expensive that you can't run it at the same rate as your display?
Edgar, I agree that the "FPS dependent on Constant Game Speed" type of game loop is simpler and works most of the time, but I want to try new methods and see the difference for myself.
Also, there might be scenarios where the refresh rate of the monitor is more than the computer can handle. For example, my monitor runs at 75 Hz, but 75 FPS might be too much for my old GPU. In this case, if game logic and FPS are not separated, the game logic will slow down too. That's something I especially don't want in a network game: it's easier to sync players if the game logic has its own rate.
You can easily skip logic or frames with the algorithm I suggested, and you don't have to make your game speed the same as your frame rate, but it will then look choppy if you don't interpolate.
That's what I use now. You're right that "FPS as fast as possible" is not really needed. For interpolation I use something like this:
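The snippet itself isn't preserved; a minimal sketch of the usual approach, keeping the previous and the current logic states and blending them by how far the current moment sits between two logic ticks (Pos and interpolate are hypothetical names, not from the original post):

```cpp
// A trivially small piece of game state.
struct Pos { double x, y; };

// alpha in [0,1]: 0 = previous tick's state, 1 = current tick's state.
Pos interpolate(Pos prev, Pos curr, double alpha) {
    return { prev.x + (curr.x - prev.x) * alpha,
             prev.y + (curr.y - prev.y) * alpha };
}
```

In Allegro 5, alpha would typically be computed from the time elapsed since the last logic tick, e.g. something like (al_get_time() - last_tick_time) * LOGIC_FPS.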
And the main loop now looks like this:
Thanks everyone for your help.
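The final loop isn't preserved either; a guess at its shape, with Allegro's queue replaced by pre-generated timer events carrying the time they fired (TimerEvent, Result, and main_loop are my own names) so it runs standalone: drain the queue, track when the last logic tick happened, and draw once per drained queue with an interpolation factor.

```cpp
#include <queue>

// A fake timer event: the time it fired and which timer it came from.
struct TimerEvent { double t; bool is_logic; };

struct Result { int logic = 0; int draws = 0; };

Result main_loop(std::queue<TimerEvent> events, double logic_dt) {
    Result r;
    double last_tick = 0.0;
    bool redraw = false;
    while (!events.empty()) {
        TimerEvent ev = events.front();            // al_wait_for_event(...)
        events.pop();
        if (ev.is_logic) { ++r.logic; last_tick = ev.t; }
        else             { redraw = true; }
        if (events.empty() && redraw) {            // queue drained: draw once
            double alpha = (ev.t - last_tick) / logic_dt;  // fraction of a tick
            (void)alpha;                           // draw_game(alpha)
            ++r.draws;
            redraw = false;
        }
    }
    return r;
}
```

This combines the two earlier ideas: frames are dropped when the queue backs up, and each drawn frame knows how far it is between logic ticks.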
You are checking the timer field of the event without knowing the type of the event. ev.timer and ev.timer.source could have absolutely any value. Stop it.
Damn, it's true, I forgot to change it last time!
You should separate the drawing from the logic because:
- Drawing more frames than the number of Hz that your monitor runs at is pointless.
- Drawing takes 80% more processing time than logic; basically you would be wasting GPU processing drawing the same frames, wasting battery if you're using a laptop, wearing out your GPU, etc. This is why your logic should always be faster than your frames per second.
- After 30 FPS, the human eye can't notice the difference between 30 FPS or 3000 FPS.
In this case, if game logic and FPS are not separated, the game logic will slow down too. That's something I especially don't want in a network game.
Well, I don't know what system you're using to create your network game, maybe peer-to-peer or client-server, but I think synchronizing the game completely is not a good option. I haven't created an online game yet, but I do think about it, and in a client-server game I think the server program doesn't need any logic-per-second limit, let alone FPS.
I think it should be something like this: the client sends info to the server, the server checks whether that info is correct, and then sends it to the other clients. So the server just needs to receive as many requests as possible, and you don't need to limit the LPS of the server... Anyway, read this web page, it's pure gold.
You are checking the timer field of the event without knowing the type of the event. ev.timer and ev.timer.source could have absolutely any value. Stop it.
Actually, in this case it's fine, as all event types have the three common fields: type, source, timestamp. It would be better style to use ev.any.source, though.
After 30 FPS, the human eye can't notice the difference between 30 FPS or 3000 FPS.
This old myth must die.
After 30 FPS, the human eye can't notice the difference between 30 FPS or 3000 FPS.
Actually, I can quite easily see the successive images of a mountain sticking up out of the terrain as I spin a camera around in place, at least on a CRT, even at 85 FPS. But 30 FPS is generally good enough.
I can also wave my finger around between my eye and the screen and see separate silhouettes, one for each refresh.