|
Project Splot |
Erin Maus
Member #7,537
July 2006
|
Hey, I'm curious how well my graphics rendering pipeline performs. Everything displayed is resolution independent, but that comes at a cost! You'll need an OpenGL 3.3 capable card, so basically anything from the 8600 GT and up, I think (as far as nVidia is concerned). The engine also depends on the .NET Framework 4 Client Profile.

What I'm looking for is the FPS. For example, on my netbook with an Atom processor and an ION graphics card, I get 15 FPS at 640 x 480 (the default settings). On my desktop I get a silky smooth 60 FPS at HD resolution (nVidia 560 Ti and an early Core i7). If you could provide your graphics card, your processor, the FPS you get, and your graphics settings (edit Graphics.xml in the root directory to change the defaults), I'd be very pleased!

Here is a screenshot: https://djungxnpq2nug.cloudfront.net/image/cache/a/7/a7d026f9de3364136eab999ed916682f.png

The game is attached. WASD moves around, and that's all there is right now. Thanks for looking. --- |
jmasterx
Member #11,410
October 2009
|
Macbook Pro 2.4 GHz with Nvidia Geforce 9400m, 256MB VRAM. I did not edit the graphics settings. One of my two cores was constantly at 100%. BTW, this is awesome, really nice job! Agui GUI API -> https://github.com/jmasterx/Agui |
Neil Walker
Member #210
April 2000
|
^ seems to me you have some tuning to do if the above spec can barely manage 60fps.... Neil. wii:0356-1384-6687-2022, kart:3308-4806-6002. XBOX:chucklepie |
Arthur Kalliokoski
Second in Command
February 2005
|
I was fiddling with an A5 prog a couple weeks ago and it was taking up one core. It was using the timer to limit al_flip_display() to 60 fps, but since it simply skipped that part, it was still running full out going through the event loop. I had to put a calculated al_rest() into it to get it down to approximately idle. They all watch too much MSNBC... they get ideas. |
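Arthur's "calculated al_rest()" boils down to sleeping for whatever is left of the frame period. A minimal sketch of that calculation (the helper name calc_rest and the plain-double timestamps are illustrative assumptions, not Allegro API):

```cpp
#include <algorithm>

// Given the time the current frame started and the current time (both
// in seconds, e.g. from al_get_time()), return how long to rest so the
// loop runs at the target rate instead of spinning at 100% CPU.
double calc_rest(double frame_start, double now, double target_fps) {
    double period = 1.0 / target_fps;   // e.g. 1/60 s per frame
    double elapsed = now - frame_start; // time already spent this frame
    return std::max(0.0, period - elapsed); // never a negative rest
}
```

In an Allegro loop, the returned value would be passed to al_rest() just before looping back around, so a fast frame idles instead of busy-waiting.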
23yrold3yrold
Member #1,134
March 2001
|
Thought this thread was going to be about this ... -- |
Erin Maus
Member #7,537
July 2006
|
Thanks for the feedback, everyone. I have improved the rendering pipeline. It no longer creates unnecessary state changes when drawing all the grass and so on, because it batches the calls based on depth. The engine also stores the "render descriptions" instead of recreating them every frame, which should produce much less garbage for the GC to collect.

jmasterx said: Macbook Pro 2.4 GHz with Nvidia Geforce 9400m. 256MB VRAM.

Thanks! Is there any way you could increase the resolution or set it to fullscreen, to see whether the problem is fill rate? You can use ESC to quit if you try it fullscreen. I attached a more optimized version, so please try that if you can. I get an extra 2-4 FPS on my netbook.

Neil Walker said: seems to me you have some tuning to do if the above spec can barely manage 60fps....

The problem is that older cards genuinely have a harder time with my rendering pipeline. I am trying to ensure that it's as fast as it can be, but the limitation is the speed at which a shader can execute. My game may not look like Crysis, but resolution-independent graphics do require a capable GPU.

Arthur Kalliokoski said: I was fiddling with an A5 prog a couple weeks ago and it was taking up one core, it was using the timer to limit al_flip_display() to 60 fps but since it simply skipped that part, it was still running full out going through the event loop. I had to put a calculated al_rest() into it to get it down to approximately idle.

I'll see if this relates to the high CPU usage my game seems to require! Thanks.

23yrold3yrold said: Thought this thread was going to be about this [thesplot.com] ...

Luckily, "Project Splot" is a mere code name... Thanks for pointing that out.
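The depth batching described above can be sketched as a sort that keeps layering intact while grouping items that share expensive state. This is a toy model, not the engine's actual code: RenderItem and its integer "state" field (standing in for shader/texture/blend bindings) are illustrative assumptions. It simply counts how many state changes the sorted order would issue:

```cpp
#include <vector>
#include <algorithm>

// Hypothetical render item: 'state' stands in for whatever forces a
// pipeline change (shader, texture, blend mode).
struct RenderItem { int depth; int state; };

// Sort by depth first (to preserve layering), then by state within a
// depth layer, so consecutive items share state and changes are rare.
// Returns how many state changes drawing the sorted list would cost.
int count_state_changes(std::vector<RenderItem> items) {
    std::sort(items.begin(), items.end(),
              [](const RenderItem& a, const RenderItem& b) {
                  return a.depth != b.depth ? a.depth < b.depth
                                            : a.state < b.state;
              });
    int changes = 0, last = -1;
    for (const auto& it : items) {
        if (it.state != last) { ++changes; last = it.state; }
    }
    return changes;
}
```

For example, four grass/tree items alternating between two states at the same depth cost four state changes drawn naively, but only two once batched this way.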
Oh, and if anyone is interested, here is what the demo looks like at max: https://djungxnpq2nug.cloudfront.net/image/cache/b/1/b1a30e77392906be0588ce7cf285e64c.png --- |
jmasterx
Member #11,410
October 2009
|
With the new version: @640x480, a pretty steady 58-60 FPS. @1280x800 fullscreen, around 22-25 FPS. I think you had mentioned you use shaders to do the 2D, so that might also be hurting performance on mid-range cards like mine. On my PC (GTX 275, quad core i5), smooth 60 FPS @1920x1080. Agui GUI API -> https://github.com/jmasterx/Agui |
Thomas Fjellstrom
Member #476
June 2000
|
Aaron Bolyard said: I'll see if this relates to the high CPU usage that my game seems to require! Thanks. Do something similar to what's in the wiki example. Let al_wait_for_event actually wait, and then you won't be using up excess CPU. -- |
Erin Maus
Member #7,537
July 2006
|
I follow the example perfectly, as far as I can tell. Here is my main loop:

while (isRunning)
{
    AllegroMethods.al_wait_for_event(queue, ref e);

    AllegroEventType type = (AllegroEventType)e.type;
    switch (type)
    {
        ...
    }

    if (AllegroMethods.al_is_event_queue_empty(queue) != 0 && wasUpdated)
        Draw();
}
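For what it's worth, the coalescing behavior this kind of loop aims for (set the redraw flag on timer events, draw once when the queue drains) can be simulated without Allegro at all. This is a toy model, sketched in C++ rather than the game's C#, with a simplified event stream; the enum and helper are illustrative, not real API:

```cpp
#include <queue>

enum EventType { EVENT_TIMER, EVENT_KEY };

// Count how many draws a given event stream produces when the redraw
// flag is set only by timer events and drawing waits for an empty queue.
int count_draws(std::queue<EventType> events) {
    int draws = 0;
    bool wasUpdated = false;
    while (!events.empty()) {
        EventType e = events.front();
        events.pop();
        if (e == EVENT_TIMER) wasUpdated = true;  // update only here
        if (events.empty() && wasUpdated) {       // draw when queue drains
            ++draws;
            wasUpdated = false;
        }
    }
    return draws;
}
```

The point of the pattern: a burst of pending events (even several timer ticks) collapses into a single draw, so the loop never renders faster than it can keep up, and input-only bursts cause no redraws at all.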
--- |
Max Savenkov
Member #4,613
May 2004
|
GeForce 560Ti, Win 7 64-bit, steady 60 FPS at any resolution, although I noticed a few momentary slowdowns (which weren't reflected by the FPS counter).
|
Thomas Fjellstrom
Member #476
June 2000
|
Aaron Bolyard said: I follow the example perfectly as far as I can tell. I guess it then depends on how and where you set wasUpdated. If you do it too often, you may have some issues. -- |
jmasterx
Member #11,410
October 2009
|
In my game I check for too many timer events in addition:

void SceneManager::run()
{
    al_start_timer(m_gameTimer);

    //is the event handled?
    bool handled = false;
    ALLEGRO_EVENT evt;
    ALLEGRO_EVENT next;

    //main loop
    while(m_gameIsRunning)
    {
        handled = false;
        al_wait_for_event(queue, &evt);

        //if the next pending event is another timer tick, drop it so
        //timer events can't pile up faster than we render
        bool hasNext = al_peek_next_event(queue, &next);
        if(hasNext && next.type == ALLEGRO_EVENT_TIMER)
        {
            al_drop_next_event(queue);
        }

        //render the scene
        if(m_needsRedraw && al_is_event_queue_empty(queue))
        {
            m_currentScene->render();
            m_needsRedraw = false;
        }

        defaultBeginEventHandler(&evt);
        m_currentScene->processEvent(&evt, handled);

        //do default behavior if event was not handled by the scene
        if (!handled)
        {
            defaultEndEventHandler(&evt);
        }

        processMessages();
    }
}

Agui GUI API -> https://github.com/jmasterx/Agui |
Stas B.
Member #9,615
March 2008
|
Looks pretty good but I'm only getting around 50 fps. (GeForce 8500GT, quad core CPU) |
Erin Maus
Member #7,537
July 2006
|
Stas B. said: Looks pretty good but I'm only getting around 50 fps. (GeForce 8500GT, quad core CPU)

Parsing SVG data is the hardest part, but luckily the .NET Framework makes it pretty easy with LINQ, etc. I can provide my SVG parser if you want. As far as rendering goes, I explain it here and also on my website in an article, "Vector graphics on the GPU". --- |
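The SVG parser itself isn't shown in the thread, but to give a feel for the parsing side, here is a deliberately tiny path-data reader. It's a sketch in C++ (for consistency with the other snippets here, not the engine's C#), it handles only absolute M/L commands with space-separated numbers, and a real parser must cover the full SVG path grammar (relative forms, curves, arcs, compact syntax):

```cpp
#include <sstream>
#include <string>
#include <vector>

// One parsed path step: the command letter and its coordinate pair.
struct PathPoint { char cmd; double x, y; };

// Minimal illustrative parser for path data like "M 10 20 L 30 40".
std::vector<PathPoint> parse_path(const std::string& d) {
    std::vector<PathPoint> out;
    std::istringstream in(d);
    char cmd;
    double x, y;
    // Each iteration consumes a command letter and two coordinates.
    while (in >> cmd >> x >> y)
        out.push_back({cmd, x, y});
    return out;
}
```

With LINQ to XML on the .NET side, the surrounding work (walking the SVG document, pulling out the "d" attributes) is similarly short, which is presumably what's meant by .NET making it easy.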
Stas B.
Member #9,615
March 2008
|
That's pretty neat, but from the looks of it, pretty slow. How does it scale when you have enemies, props, and decorations in the scene? Why don't you just pre-render the relevant graphics while loading a level? If you render them large enough that you never have to scale them up, and use mipmaps when scaling down, it looks indistinguishable from real vector graphics. Maybe you could use a hybrid approach. I don't see why grass blades, for instance, have to be real vector graphics. |
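A small sketch of the sizing rule behind this suggestion: pre-render each vector asset at the largest size it will ever appear on screen, so it is only ever scaled down (where mipmaps keep it crisp), never up. The helper name and the power-of-two rounding are illustrative assumptions, not anything from the engine:

```cpp
#include <cmath>

// Choose a pre-render texture size for an asset whose base on-screen
// size is base_size pixels and whose largest zoom factor is max_zoom.
int prerender_size(int base_size, double max_zoom) {
    // Largest size the asset can ever appear at.
    int needed = static_cast<int>(std::ceil(base_size * max_zoom));
    // Round up to the next power of two so mipmap chains are simple.
    int size = 1;
    while (size < needed) size *= 2;
    return size;
}
```

A 48-pixel grass blade that can be zoomed up to 2.5x would be baked into a 128-pixel texture once at level load, then drawn as a plain quad every frame.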
m c
Member #5,337
December 2004
|
You could do something like this using flash. (\ /) |
Neil Walker
Member #210
April 2000
|
Won't even run on my laptop. A framed window appears briefly, then it closes. No log file is produced to show why, either. On my other computer, which is Windows 7 64-bit, 8GB RAM, Intel i5 with HD2000 graphics, all I got was a blue screen. Neil. wii:0356-1384-6687-2022, kart:3308-4806-6002. XBOX:chucklepie |
Erin Maus
Member #7,537
July 2006
|
m c said: You could do something like this using flash.

Pretty sure you'd get like 2 FPS doing something like this in Flash. The spec requirements would be even higher. I can't even run basic games using Flash on my netbook without them using all processor cores and still running at no more than 5 FPS.

Neil Walker said: won't even run on my laptop. Get a brief framed window then it closes. No log file is produced either to show you. On my other computer which is windows 7 64-bit, 8gb ram, intel i5 with HD2000 graphics, all I got was a blue screen.

It requires OpenGL 3.3. I am assuming neither of your cards supports it (I know the HD2000 doesn't).

edit: I managed to increase performance by upwards of 50% on my netbook by tremendously reducing redundant state calls. Attached is the modified version. Could anyone who tried running it before, such as jmasterx or Stas B., try it again and see how much (or how little) it has improved? --- |
Stas B.
Member #9,615
March 2008
|
Aaron Bolyard said: Could anyone who tried running it before, such as jmasterx or Stas B., try it again and see how much (or how little) it has improved?
FPS is up to a steady figure of 60. |
Erin Maus
Member #7,537
July 2006
|
Stas B. said: FPS is up to a steady figure of 60. Thanks a bunch. I am working on exactly that. I hope to have it done in the next few days. Thanks again! (By the way, 60 FPS is a limit of the game, so you're getting the max!) --- |
William Labbett
Member #4,486
March 2004
|
How did you set up the company?
|
|