This is sad... :'(
Edgar Reynaldo
Eric Johnson

Whether right or wrong, Allegro is seen as more of a hobbyist library than a commercial one, so the results don't surprise me.

These results might surprise you though... 8-)


At least we will always know that in the past we were the clear winners:

Edgar Reynaldo
Eric Johnson said:

These results might surprise you though... 8-)


I don't really care about the Google battle crap.

What bugged me is how out of date the wikipedia page on Allegro is.

Chris Katko

I didn't notice anything super out of date.

Though we might as well add games using Allegro on Steam. Like Factorio, and that one battle dungeon game.

Edgar Reynaldo

That article didn't describe Allegro 5 at all; it was all about Allegro 4.

Chris Katko

...did you read it?


Allegro 5
Current development is focused on the Allegro 5 branch, a complete redesign of both the API and much of the library's internal operation. Effort was made to make the API more consistent and multi-thread safe. By default, the library is now hardware accelerated using OpenGL or DirectX rendering backends where appropriate. Many of the addons that existed as separate projects for Allegro 4 now interface seamlessly with Allegro proper and are bundled with the default installation. Allegro 5 is event driven.

Allegro 5 natively supports OpenGL.

It's certainly Allegro 4-centric, with Allegro 5 described as the exception. But I'd think that's common; articles tend to stay very similar to how they were first written.


What benefits does allegro5 have over allegro4?
This is a genuine question, because I've been writing a game using allegro4 (just because it's what I'm used to) and I'm thinking of trying to convert it to allegro5.

One of the things I've had the most trouble with in programming my game is that updating the screen is fairly slow, and I want my game to run well even on very old computers - Pentium 3, etc. Right now I have it running at 60fps on a Pentium 3, but only if the resolution is 320x240.
Would converting my game to allegro5 potentially help it run faster?

Edgar Reynaldo

In your case, Allegro 4 may be more appropriate with such old hardware. Allegro 5 doesn't officially support anything below XP anymore on Windows.

4 vs 5 has been done over and over. Basically, you get hwaccel, multimedia, input, and it's all event driven. No more wasted CPU. No more software drawing. Use OpenGL seamlessly alongside your A5 code.

I recommend you start here if you want to learn about Allegro 5:

Chris Katko

The only place where it's "slower" is doing lots of per-pixel operations. But if you do them in memory like before, they're just as fast. It's just that per-pixel modification of video-card RAM completely swamps the bus on a modern computer.

But if you have a huge existing codebase with heavy coupling to A4, A4 might be the best bet, just because changing any API takes time and introduces bugs. But for everything else, there's really no reason to keep A4 over A5 except stubbornness or laziness.

If you were to use any modern SDL, SFML, or anything, they're all designed like A5. A4 originated in the Amiga and DOS days and the API shows it. You don't need assembler and compiled sprites to draw bitmaps fast. You simply tell the GPU "draw this bitmap at x,y".

It's really fast for a GPU to "do X", but telling the GPU "do X" becomes a problem when "X" is really small and really numerous. It takes Y amount of time to send a command over the bus to the GPU, and that overhead multiplied by a tiny operation like "change a single pixel" completely swamps the bus. While not exactly true, to illustrate the point (ha, play on words!), it takes about as long for a modern GPU to draw an entire sprite with rotation, color, and lighting as it does to draw a single pixel. So, draw your sprites ONCE per animation (using those free gigs of VRAM) and then just draw the sprites.
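The "draw it once, then reuse it" advice can be sketched like this in Allegro 5: compose an expensive sprite into a (video) bitmap one time, then issue a single cheap draw command per frame. The function names and sizes here are made up for the example, and the circle assumes al_init_primitives_addon() has been called:

```c
#include <allegro5/allegro.h>
#include <allegro5/allegro_primitives.h>

/* Build a composite sprite ONE time; it lives in VRAM by default. */
ALLEGRO_BITMAP *bake_sprite(void)
{
   ALLEGRO_BITMAP *sprite = al_create_bitmap(64, 64);
   al_set_target_bitmap(sprite);
   al_clear_to_color(al_map_rgba(0, 0, 0, 0));
   /* ...many small draw calls happen here once, not every frame... */
   al_draw_filled_circle(32, 32, 30, al_map_rgb(255, 200, 0));
   return sprite;
}

/* Per frame: one command to the GPU, rotation and tint included. */
void draw_frame(ALLEGRO_DISPLAY *display, ALLEGRO_BITMAP *sprite, float angle)
{
   al_set_target_backbuffer(display);
   al_clear_to_color(al_map_rgb(0, 0, 0));
   al_draw_tinted_rotated_bitmap(sprite, al_map_rgb(255, 128, 128),
                                 32, 32,     /* pivot inside the sprite */
                                 320, 240,   /* destination on screen */
                                 angle, 0);
   al_flip_display();
}
```

The rotation, tint, and blit all ride along in that one al_draw_tinted_rotated_bitmap() call, which is exactly the "macro operation" being described.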

The modern way of programming graphics (and every video game by a AAA company has worked this way since like 1998) is to exploit the fact that a GPU can do macro operations, and, if you need very specific custom per-pixel operations (other than lighting and coloring, which are basically FREE on GPUs), you build a shader for it. A shader is just a little program that takes a few variables, changes them, and outputs a result. So instead of manually making per-pixel procedurally generated fire on the CPU, you put that code directly onto the video card/GPU and boom, it's literally 1000x, 10000x or more times faster than a CPU ever was, and it scales with GPU upgrades with no code changes. (More shader runs simply get divided up among more shader units, and likewise with smaller GPUs.)
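For the curious, Allegro 5 (5.1+, with a display created using the ALLEGRO_PROGRAMMABLE_PIPELINE flag) exposes that "little program" directly. A sketch of attaching a toy fragment shader (the `tint` uniform and the effect itself are invented for illustration; `al_tex` and the `varying_*` names are Allegro's standard shader interface):

```c
#include <allegro5/allegro.h>

/* GLSL fragment shader: runs once per pixel, on the GPU. */
static const char *frag_src =
   "uniform sampler2D al_tex;\n"
   "uniform float tint;\n"
   "varying vec4 varying_color;\n"
   "varying vec2 varying_texcoord;\n"
   "void main() {\n"
   "   vec4 c = texture2D(al_tex, varying_texcoord);\n"
   "   gl_FragColor = vec4(c.r, c.g * tint, c.b, c.a) * varying_color;\n"
   "}\n";

ALLEGRO_SHADER *make_tint_shader(void)
{
   ALLEGRO_SHADER *s = al_create_shader(ALLEGRO_SHADER_GLSL);
   /* Keep Allegro's stock vertex shader; only replace the pixel stage. */
   al_attach_shader_source(s, ALLEGRO_VERTEX_SHADER,
      al_get_default_shader_source(ALLEGRO_SHADER_GLSL,
                                   ALLEGRO_VERTEX_SHADER));
   al_attach_shader_source(s, ALLEGRO_PIXEL_SHADER, frag_src);
   al_build_shader(s);
   return s;
}

/* At draw time:
 *    al_use_shader(shader);
 *    al_set_shader_float("tint", 0.5f);
 *    ...draw bitmaps; the shader runs for every pixel they touch...
 *    al_use_shader(NULL);
 */
```

Every bitmap drawn while the shader is active gets the per-pixel effect at GPU speed, which is the "1000x faster than the CPU" path being described.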

So basically, unless you're doing something really wrong by completely ignoring how the hardware is designed, A5 (and any OpenGL-powered program) is gonna be balls-out fast for a 2-D game. Like, literally unimaginably faster than any DOS-era, CPU-powered blitting program that we grew up with. If the new way is slow, you're either making a mistake, or you're forgetting that you're using way higher resolutions, color depths, and numbers of sprites than you ever did before. Even 3-D games push unimaginable amounts of polygons. Even my years-old netbook with integrated Intel graphics, which runs on less than 10 watts of power, can push enough polygons to hit at least 2002-era AAA-game graphics--maybe higher.

[edit] Here's Battlefield 3 running on the same freakin' CPU as my chromebook.




My exact laptop (I think) running at 800x600 ... GTA V:


Now, there's one more huge trick: they're running custom engines that all use deferred rendering. It's what all the modern AAA games that can afford a dedicated graphics architect use. Any game that doesn't (99% of indie games, even 3-D) will run much slower, because they can't afford to tweak the graphics pipeline to squeeze 100% out of the GPU's resources.

Want to know how modern graphics engines work with deferred rendering? Check out these amazing analysis articles:



Considering the heavy streaming of assets going on and the specs of the PS3 (256MB RAM and 256MB of video memory) I’m amazed the game doesn’t crash after 20 minutes, it’s a real technical prowess.


Deus Ex HR:

Thread #617228.