Allegro runs slow on Raspberry Pi
The Master

Hi all,
I know the Raspberry Pi has a low power CPU (700MHz when last I checked), but it has a really powerful GPU. I'm trying to run a game I wrote on a Raspberry Pi, and it compiles and runs, but has an unacceptably low FPS. Is the allegro port able to use the Raspberry Pi's GPU yet or is it still in the works?

Elias

A4 or A5?

Chris Katko

The Raspberry Pi is, by design, going to be ridiculously slower than a desktop or laptop computer. Just because it can output 1080p doesn't mean it can fill the screen at 30 FPS. It also has dedicated video-decoding hardware to mask the fact that the rest of it is really slow, whereas a traditional GPU could do the same work without any dedicated hardware and not even flinch.

That being said, A4 or A5 is an important question because A4 doesn't use GPU acceleration at all to start with.

Arthur Kalliokoski

Isn't the RPI supposed to be the equivalent of a Pentium II? The allegro.txt file used to warn about compiling for something silly like 686 (PII), because lesser computers wouldn't run it, yet some enjoyable games were made using software video buffer access.

The Master

I'm using A5, so it should be hardware accelerated. I was able to get a slight performance boost by loading images as video bitmaps, but not nearly what I was hoping for. I'm looking at optimisation strategies, but I think the biggest performance hog is bitmap drawing.

Edgar Reynaldo
al_hold_bitmap_drawing(true);  /* cache drawing calls */
/* draw all graphics from the same source texture */
al_hold_bitmap_drawing(false); /* flush the batch to the GPU and draw */

The Master

al_hold_bitmap_drawing(true);  /* cache drawing calls */
/* draw all graphics from the same source texture */
al_hold_bitmap_drawing(false); /* flush the batch to the GPU and draw */

That didn't really help much. Thanks anyway.

Since I'm running it on linux, I can use callgrind to see what function calls are consuming the most time. The weird thing is, I have set the new bitmap flags to ALLEGRO_VIDEO_BITMAP, and the program is still treating them like a memory bitmap and using al_draw_soft_triangle.

I compared it to another game I made using the same framework (I've written a framework using Allegro 5, for my own personal use). Let's call the one I'm trying to optimise "Program A" and the other game used for comparison "Program B." B consistently uses OGL drawing calls for rendering bitmaps, while A keeps using _al_draw_soft_triangle even though I set the ALLEGRO_VIDEO_BITMAP flag.

Since they use the same framework, the only real difference is the source of the bitmaps. B gets them from files on disk, while A (due to the nature of the program) loads them from memory chunks received from a server. So does al_load_bitmap_f not create a video bitmap or something?

EDIT

Did a check, and had Program A load from a file, and it still keeps treating it like a memory bitmap, even with the ALLEGRO_VIDEO_BITMAP. Does setting that flag even make a difference?

EDIT

Evidently not. Checked the manual for al_set_new_bitmap_flags. Ah, so it's a video bitmap by default. So I set the ALLEGRO_MEMORY_BITMAP flag, and B starts using _al_draw_soft_triangle to draw its images, and down goes the FPS! So clearly Program A is creating memory bitmaps regardless of the flag setting... Bizarre.

l j

Are you creating your bitmaps before creating your window?

The Master

After I create the display.

Still not sure what the problem is.

EDIT

I am still trying to figure out what the problem is. I don't know why my program is loading everything as a memory bitmap while the comparison program is doing it all in video memory.

This is really starting to drive me up the wall. Even when I put al_set_new_bitmap_flags(ALLEGRO_VIDEO_BITMAP) right before al_load_bitmap, it fails to load the file! And if I put it after creating the display, it doesn't take effect.

What is wrong with allegro?

gezegond

Are you running both programs simultaneously?

The Master

No.

However, I noticed that the bitmap loading is occurring in another thread separate from the one in which the display is created. Would that cause a problem for creating a video bitmap?

EDIT

EPIC BRAINBLAST! That was the problem. It was the same one as in this thread. So I need to make sure all video resources are created in the same thread as the one in which the display (and therefore the GPU context) was set up. OK, that problem is fixed.

Now to see if the FPS goes up on the Raspberry Pi...

gezegond

Glad to hear you got rid of the problem. ;D

The Master

Code works on raspberry pi, nice and virile 57fps!

Yeah, I wouldn't normally have made this mistake. But I'm actually downloading the images using asynchronous TCP I/O, which runs on a separate thread, and I didn't even realise it. So now it's fixed. Thanks for everyone's help.

Thomas Fjellstrom

Good to hear it!

Thread #613683. Printed from Allegro.cc