Allegro.cc - Online Community


This thread is locked; no one can reply to it.
How To Render Awesomium Into Allegro 5 Buffer?
rhhzero
Member #15,887
February 2015

Hello again!

I was checking out an HTML/CSS-powered GUI for use with Allegro in another one of my game projects, and it seemed like an excellent way to implement a nice-looking and functional UI. I'm currently using the Allegro 5 library for my game, and I was wondering if anyone with Awesomium experience knows how to begin rendering the contents of a WebView into the Allegro buffer. I saw an example of someone creating a login screen that showed up in the game (actually, it was on this forum!): whatever was in the Allegro buffer was rendered as usual, and the login input was drawn on top of it, as if on a top layer.

Sorry if what I am asking is dumb. I really want to implement Awesomium in my game, and honestly I don't even know where to begin.

jmasterx
Member #11,410
October 2009

It looks like the way to do it is to get Awesomium's bitmap onto the graphics card; then, because the channel order is wrong, use a shader to swizzle the channels into the correct order.

https://www.allegro.cc/forums/thread/607732/923295

If you can get the first part working I can help with the shader part.

rhhzero
Member #15,887
February 2015

Thanks for always replying and helping!

I was looking into some documentation for Awesomium when I noticed that the BitmapSurface class, which contains a pointer to the raw pixel buffer in 32-bit BGRA format, has a method called "CopyTo" that copies the raw pixel data in the buffer to another array. It seems this method can also convert the data to RGBA format.

So now that I have extracted this data into an array in RGBA format, I have absolutely no idea how I'm supposed to pass it to Allegro for rendering. Or am I going about this completely the wrong way? I'm sorry, I wasn't quite sure what was meant by getting the bitmap data into the graphics card.

jmasterx
Member #11,410
October 2009

Like the post I linked suggests, you want to call al_create_bitmap(buffer_w, buffer_h).

Then, each frame, you want to lock the bitmap as write-only and copy the Awesomium buffer into the Allegro bitmap in BGRA format. Here is the example for locking: https://github.com/liballeg/allegro5/blob/5.1/examples/ex_lockbitmap.c

The al_lock_bitmap manual page also links to related a.cc topics on locking: https://www.allegro.cc/manual/5/al_lock_bitmap

Then you will use a shader that puts the channels in the correct RGBA order. The reason you should avoid converting to RGBA yourself is that CPUs are bad at this kind of per-pixel task, whereas GPUs excel at it, so you will get a much better framerate if you let the GPU do the conversion.

Gideon Weems
Member #3,925
October 2003

Interesting concept. Given that it's come up before, I've taken the liberty of adding an entry to the FAQ.

rhhzero
Member #15,887
February 2015

Thank you very much; I'm learning a good amount right now! That tidbit about avoiding CPU-side pixel format conversion was really nice to know, too. All the steps shown so far for using Allegro with a UI engine like Awesomium have been very clear and helpful, which is why I apologize in advance: I'm about to ask even more questions. :-[

I think the "ex_lockbitmap.c" example jmasterx linked is excellent, and I already learned a lot from inspecting it! However, I am confused about what happens in the "fill" function after the bitmap is locked. It seems the next step populates the bitmap with pixel data, but how exactly does that happen here? What is the role of the void pointer "data"? How would one go about using this to copy information stored in an unsigned char array (the Awesomium pixel data) into the bitmap? Where and how does an Allegro bitmap object hold its data?

Also, if the shader's role (after I successfully copy the Awesomium buffer into the locked bitmap) is to convert the format of the pixels in that bitmap (which are still BGRA at this point), how do you even begin doing that? I gather that you create a shader and then attach it before use. How do I do that? I'm sorry, this is actually my first hands-on experience with shaders. I really appreciate the help, and hope it also helps another newbie who might be lurking around :)

bamccaig
Member #7,536
July 2006

I'm guessing that the pixel format is specified in the lock call with the format parameter, e.g. ALLEGRO_PIXEL_FORMAT_RGB_565. You'd have to understand what those formats mean. This one (according to a comment in the example) appears to represent pixel data as 16-bit integers, as in RRRRRGGG GGGBBBBB. I agree that the actual pixel manipulations assume the reader already knows a fair amount about them. I have no idea why those values are divided by 8 and 4. Perhaps that is a binary hack? A comment explaining why would be very useful.

255/8 is 31 in integer math, which is 00011111 as an 8-bit unsigned int. Shifting it 11 bits left should give 11111000 00000000, I think. It seems to work, but I'm not sure why they're initializing the colors with 255 if there are only 5 and 6 bits per channel, respectively. Perhaps it has to do with the pixel format... Or perhaps it's magic numbers and a complete lack of comments. ::)
