Thanks for the feedback, everyone. I have improved the rendering pipeline: it no longer causes unnecessary state changes when drawing all the grass and so on, because the draw calls are now batched by depth. The engine also stores the "render descriptions" instead of recreating them every frame, which should produce much less garbage for the GC to collect.
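For anyone curious, the batching idea looks roughly like this (a minimal sketch, not the engine's actual code; the engine isn't written in C, and every name here is made up for illustration):

#include <stdio.h>
#include <stdlib.h>

typedef struct {
    int   texture_id;  /* which texture/render state this draw needs */
    float depth;       /* sort key: draw back-to-front */
    float x, y;        /* position; the real thing holds more fields */
} RenderDesc;

/* Kept alive across frames instead of rebuilt each frame, which is
   what cuts the per-frame allocation (GC pressure, in my case). */
static RenderDesc descs[3] = {
    { 2, 1.0f, 10, 10 },
    { 1, 0.0f, 20, 20 },
    { 1, 1.0f, 30, 30 },
};
static const size_t desc_count = 3;

static int by_depth_then_texture(const void *a, const void *b)
{
    const RenderDesc *da = a, *db = b;
    if (da->depth != db->depth)
        return da->depth < db->depth ? -1 : 1;
    return da->texture_id - db->texture_id;
}

void draw_frame(void)
{
    /* Sort once, then bind state only when it actually changes, so a
       whole batch of grass costs one state change, not hundreds. */
    qsort(descs, desc_count, sizeof *descs, by_depth_then_texture);

    int bound = -1;
    for (size_t i = 0; i < desc_count; i++) {
        if (descs[i].texture_id != bound) {
            bound = descs[i].texture_id;
            printf("bind texture %d\n", bound);  /* state change */
        }
        printf("  draw at (%g, %g)\n", descs[i].x, descs[i].y);
    }
}

int main(void) { draw_frame(); return 0; }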
MacBook Pro, 2.4 GHz, with an NVIDIA GeForce 9400M (256 MB VRAM).
Windows 7, 32-bit.
I got between 51 and 60 FPS depending on where I was relative to the clouds; I guess the clouds were making me a bit fill-rate limited. I did not edit the graphics settings. One of my two cores was constantly at 100%. BTW, this is awesome, really nice job!
Thanks! Is there any way you could increase the resolution or set it to fullscreen, to see whether the problem is fill rate? You can press ESC to quit if you try it fullscreen. I've attached a more optimized version, so please try that if you can; it gets me an extra 2-4 FPS on my netbook.
Seems to me you have some tuning to do if the above spec can barely manage 60 FPS...
The problem is that older cards genuinely have a harder time with my rendering pipeline. I am trying to make it as fast as it can be, but the limiting factor is how quickly the shaders can execute. My game may not look like Crysis, but resolution-independent graphics do require a capable GPU.
I was fiddling with an Allegro 5 program a couple of weeks ago and it was taking up a whole core. It used a timer to limit al_flip_display() to 60 FPS, but when the timer hadn't ticked it simply skipped the flip and kept running full out through the event loop. I had to put a calculated al_rest() into it to get it down to approximately idle.
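Roughly what I mean (a sketch from memory, not my actual program; error handling and drawing omitted):

#include <stdbool.h>
#include <allegro5/allegro.h>

int main(void)
{
    al_init();
    al_install_keyboard();

    ALLEGRO_DISPLAY *display = al_create_display(640, 480);
    ALLEGRO_EVENT_QUEUE *queue = al_create_event_queue();
    al_register_event_source(queue, al_get_keyboard_event_source());

    const double frame_time = 1.0 / 60.0;  /* target: 60 FPS */
    double next_frame = al_get_time() + frame_time;
    bool running = true;

    while (running) {
        ALLEGRO_EVENT ev;
        /* Drain pending events without blocking. */
        while (al_get_next_event(queue, &ev)) {
            if (ev.type == ALLEGRO_EVENT_KEY_DOWN &&
                ev.keyboard.keycode == ALLEGRO_KEY_ESCAPE)
                running = false;
        }

        /* ... update and draw here ... */
        al_flip_display();

        /* The fix: rest for whatever is left of this frame's 1/60 s
           instead of spinning straight back through the loop. */
        double remaining = next_frame - al_get_time();
        if (remaining > 0)
            al_rest(remaining);
        next_frame = al_get_time() + frame_time;
    }

    al_destroy_event_queue(queue);
    al_destroy_display(display);
    return 0;
}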
I'll see whether that explains the high CPU usage my game seems to have. Thanks!
Thought this thread was going to be about this [thesplot.com] ...
Luckily "Project Splot" is a mere code name... Thanks for pointing that out.
Oh, and if anyone is interested, here is what the demo looks like at max settings: