|
Primitives drawing performance |
Aikei_c
Member #14,871
January 2013
|
I wonder how efficient drawing primitives is compared to drawing bitmaps in Allegro. Or is it the same? |
Kris Asick
Member #1,424
July 2001
|
if (Aikei_c.allegro.majorVersion == 5) { Well, typically, the GPU will outpace the CPU, but you still want to avoid drawing stuff if you don't have to, because every draw you skip is one less call out to the GPU, saving you precious CPU time. So one potential optimization is to make sure you don't draw something that can't actually be seen. If only 500 primitives are enough to kill the framerate, then you may be better off using al_draw_prim(). Yes, this function will kill the framerate MUCH more easily if you call it more than a handful of times per frame, but you can send massive amounts of data to it in a single call with virtually no penalty, so what you can do is set up all of your primitive draws as one large array of vertices and colour values, then draw the entire array with a single call to al_draw_prim(). That should go lightning fast! Beyond that, depending on the kinds of primitives you're drawing, you may be better off using bitmaps and simply scaling them. Do you have a picture of what it is you're trying to accomplish with your primitives? } if (Aikei_c.allegro.majorVersion == 4) { Why haven't you upgraded to Allegro 5 yet? } --- Kris Asick (Gemini) |
Aikei_c
Member #14,871
January 2013
|
First of all, I would like to thank you for your input. Kris Asick said: you may be better off using al_draw_prim().
I believe all high-level primitives like al_draw_circle use al_draw_prim internally, and I think they consume even less GPU/CPU time than al_draw_bitmap, since I seem to be able to draw more primitives than bitmaps. Edit: Here is a picture of what I am doing. I'm not sure yet how to use al_draw_prim to draw a lot of circles with one call, but I'll probably figure it out... Edit: I must also mention that I haven't actually tried bitmaps instead of primitives in this particular case yet, and what I said about primitives being more efficient than bitmaps only concerns my highly inaccurate tests, where I used fairly big bitmaps and fairly short lines as primitives (if that matters). |
Kris Asick
Member #1,424
July 2001
|
If you can only handle 2000 al_draw_bitmap calls without the framerate dipping then you must have a very outdated video card, or a low-end/mobile one. It also means your CPU may be outpacing your GPU. Looking at your screenshot though, even on a high-end system that kind of effect could be bad for the framerate, so I'd like to make the following suggestion: instead of drawing tons of little circles, draw two big circles representing the green and yellow ranges. Then, for every target that comes into range, draw the target with a little highlight to show that it's within a particular range. But yeah, rendering tiny tiles consumes a lot more power than one might expect. Even highly successful titles that use tiny tile sizes, like Terraria, have had performance issues for people since day one, because you can't brute-force the tiles and expect decent performance: you need SOME kind of optimization in place or it's just not going to perform well. The game I'm working on has tiny tile sizes and is optimized so that they're all drawn to an oversized back buffer, similar to how old game systems like the NES worked. Then, instead of having to redraw all the tiles every frame, only new tiles which come into view need to be drawn to this buffer, which is then drawn to the screen as a whole instead of as single tiles. I absolutely need to use al_draw_prim() for the final blit though, because I have a depth effect going on that requires me to draw the map several times, and it also needs to wrap around the edges of the buffer seamlessly. --- Kris Asick (Gemini) |
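The saving in the oversized-buffer approach comes down to a small piece of bookkeeping. A minimal sketch, in plain C with no Allegro calls (the function name is mine): it counts how many tile columns actually need drawing into the cached buffer when the view scrolls.

```c
#include <stdlib.h>

/* The view is view_w tiles wide. When the camera scrolls from tile
   column old_x to new_x, only the newly exposed columns must be drawn
   into the cached buffer (in Allegro 5 you would al_set_target_bitmap()
   the buffer, draw just those columns, then switch back to the
   backbuffer); every other column is reused from the previous frame. */
int columns_to_redraw(int old_x, int new_x, int view_w)
{
    int dx = abs(new_x - old_x);
    return dx >= view_w ? view_w : dx; /* a big jump redraws everything */
}
```

Scrolling one tile in a 64-column view redraws 1 column instead of 64; only a teleport across the map pays the full cost.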
Aikei_c
Member #14,871
January 2013
|
I have a GeForce 8600 GT, which is pretty outdated, you are right. |
Kris Asick
Member #1,424
July 2001
|
Hmm... that is a tricky situation indeed... If the state of these circles doesn't need to be updated every frame, then you may want to render them all to a bitmap first, then render that bitmap to the screen. That way you only have to render the circles once each time they need updating, and while this will hit the framerate for a fraction of a second, it shouldn't be too big a deal. Another approach using large circles would be similar to how 2D lighting engines work: calculating where separations occur and drawing a shape that encompasses only the areas that qualify. ...As said, it's a tricky situation unfortunately. --- Kris Asick (Gemini) |
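The render-once-to-a-bitmap idea is a classic dirty-flag cache. A library-free sketch, assuming a simple invalidate/draw split (the counter stands in for the expensive "redraw every circle" pass; in real Allegro 5 code that branch would al_set_target_bitmap() the cache bitmap, draw the circles, then restore with al_set_target_backbuffer()):

```c
typedef struct {
    int dirty;         /* set whenever the circles change state */
    int rebuild_count; /* how many times the expensive pass ran */
} CircleCache;

/* Call when any circle appears, disappears, or changes colour. */
void cache_invalidate(CircleCache *c) { c->dirty = 1; }

/* Call once per frame. The expensive rebuild only happens when needed;
   every frame ends with one cheap blit of the cached bitmap
   (al_draw_bitmap(cache_bmp, 0, 0, 0) in real code). */
void cache_draw_frame(CircleCache *c)
{
    if (c->dirty) {
        c->rebuild_count++; /* redraw all circles into the cache bitmap */
        c->dirty = 0;
    }
    /* blit the cached bitmap to the screen here */
}
```

At 60 FPS with the circles changing, say, once a second, that is one expensive pass per 60 cheap blits instead of 60 expensive passes.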
Aikei_c
Member #14,871
January 2013
|
Thanks for your suggestions, Kris. |
Kris Asick
Member #1,424
July 2001
|
One other advantage to consider: if you draw them all to a bitmap beforehand and you end up moving the character they're centred around, you can skip redrawing any circles that haven't changed state. You could do this by keeping an array that tracks what kind of circle has been drawn where; when you move, make a list of each circle that's changed to a different colour or disappeared, and only erase/redraw those specific circles. Since you're working with outdated hardware, though, and are probably getting about 1/5th of the performance A5 can hit, as long as your FPS doesn't drop below 15 you'll likely still get 60 FPS on mid- to high-end hardware. --- Kris Asick (Gemini) |
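The tracking array Kris describes boils down to diffing last frame's circle states against this frame's. A sketch in plain C (the state encoding and function name are assumptions of mine):

```c
#include <stddef.h>

/* One entry per target, e.g. 0 = no ring, 1 = yellow ring,
   2 = green ring (the encoding is arbitrary). Collects the indices
   whose state changed, i.e. the only circles that need to be erased
   or redrawn in the cached bitmap. */
size_t changed_circles(const int *prev, const int *cur, size_t n,
                       size_t *out_idx)
{
    size_t m = 0;
    for (size_t i = 0; i < n; i++)
        if (prev[i] != cur[i])
            out_idx[m++] = i;
    return m;
}
```

If only a handful of targets cross a range boundary per move, the redraw cost is proportional to that handful rather than to the total circle count.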
Aikei_c
Member #14,871
January 2013
|
Kris Asick said: so long as your FPS doesn't drop below 15 you'll likely still get 60 FPS on mid to high-end hardware Well, I'm not getting less than 60 fps even without any optimizations, and I'm not planning to drop below 60 on my hardware, because I can even run some modern games on this machine, and almost every game made in the previous year. I really don't want any of my future players to think I'm a lazy, stupid developer who can't make a 2D game run on not-so-good hardware, when most people can do that. |
Thomas Fjellstrom
Member #476
June 2000
|
3D hardware is rather bad at 2D, and I think people are somewhat used to fancy 2D games using a ton of processing power. 2D pushes the fillrate and bandwidth limitations of GPUs, whereas 3D games tend to push the vertex/poly rate harder. You shouldn't expect miracles just because 2D is conceptually simpler than 3D. -- |
Aikei_c
Member #14,871
January 2013
|
Thomas Fjellstrom said: 3D hardware is rather bad at 2D, and I think people are somewhat used to fancy 2D games using a ton of processing power. 2D pushes the fillrate and bandwidth limitations of GPUs, whereas 3D games tend to push the vertex/poly rate harder. You shouldn't expect miracles just because 2D is conceptually simpler than 3D.
That's right, at least in theory. But experience tells me otherwise: every 2D game I've tried ran on my hardware, no exceptions. I don't remember a single 2D game that lagged on it. I haven't tried every game, of course, so there might be some that would lag. Some examples of games I've tried: Terraria, which Kris mentioned as pretty hard on some people's hardware, runs on my computer without any lag. What else... Rogue Legacy runs fine. Awesomenauts runs fine. So if I let myself make a game that lagged on my current hardware, I'd be just too unhappy and wouldn't be able to sleep well knowing I'd done something like that. |
Thomas Fjellstrom
Member #476
June 2000
|
There was a specific game I was trying to remember the name of... but when it came out, and for quite a while after, it was capable of bringing mid-range, and possibly even top-end, GPUs to their knees. It had some super awesome graphics, shaders and effects. I wish I could remember its name. What settings do you use in those games? Low? Medium? Super high? -- |
Arthur Kalliokoski
Second in Command
February 2005
|
Thomas Fjellstrom said: There was a specific game I was trying to remember the name of... but when it came out, and for quite a while after, it was capable of bringing mid-range, and possibly even top-end, GPUs to their knees. It had some super awesome graphics, shaders and effects. I wish I could remember its name. I remember a meme about computer power: "But can it run Crysis?" They all watch too much MSNBC... they get ideas. |
Aikei_c
Member #14,871
January 2013
|
Thomas Fjellstrom said: What settings do you use in those games? Low? Medium? Super high?
I usually don't even bother with settings and use whatever is set by default, unless the game runs too slowly, in which case I use low settings. |
Thomas Fjellstrom
Member #476
June 2000
|
Yeah, I'm just saying you shouldn't spend too much effort on it. Some is good; too much just causes you to never get the game out. -- |
Aikei_c
Member #14,871
January 2013
|
Well, just posting it because I promised to. |
|