"al_put_pixel" not working
RPG Hacker

I'm having trouble getting al_put_pixel to work as intended. Whenever I use it, it either does nothing or only writes black pixels into my texture (that is, sampling the texture always returns 0.0 for the red channel, and looking at the texture in gDEBugger always shows it as either completely black or transparent with one vertical line of black, depending on the bitmap format). I'm trying to pass a run-time generated texture to my shader to use as a kind of array. It doesn't matter whether I lock the texture or not; the outcome is always the same. Here is some example code:

// Texture generation code (called once at program start):

al_set_new_bitmap_flags(ALLEGRO_VIDEO_BITMAP | ALLEGRO_NO_PRESERVE_TEXTURE);
al_set_new_bitmap_format(ALLEGRO_PIXEL_FORMAT_SINGLE_CHANNEL_8); // Also tried ALLEGRO_PIXEL_FORMAT_ARGB_8888 here, no difference - still just black

unsigned int numAnimations = 1;
if (maxNumTiles() > 1)
    numAnimations = maxNumTiles();
m_pAnimationData = al_create_bitmap(1, numAnimations);

// [...]


// Filling texture data (called once per frame):

al_set_target_bitmap(m_pAnimationData);
ALLEGRO_LOCKED_REGION* pLockedRegion = al_lock_bitmap(m_pAnimationData, al_get_bitmap_format(m_pAnimationData), ALLEGRO_LOCK_WRITEONLY);

// [...]

for (unsigned int i = 0; i < maxNumTiles(); i++)
{
    if (pLockedRegion)
        al_put_pixel(0, i, al_map_rgba(std::min<unsigned int>(someNumber /* The number I want to read in the shader */, 255), 255, 255, 255));
    // I can put a breakpoint here and actually step into al_put_pixel, which means it actually gets called, implying that locking
    // the texture worked as intended. Stepping further into the function, I see that _AL_INLINE_GET_PIXEL gets called, so it
    // should definitely write SOMETHING to the texture.
}

// [...]

if (pLockedRegion != nullptr)
    al_unlock_bitmap(m_pAnimationData);
al_set_target_backbuffer(al_get_current_display());

And the outcome is always something like this (for ALLEGRO_PIXEL_FORMAT_ARGB_8888):

[screenshot: b68aa89a0c.png]

For ALLEGRO_PIXEL_FORMAT_SINGLE_CHANNEL_8, it always shows the whole texture as black. Note that the width of the texture actually gets expanded to 16 on the GPU, but that shouldn't matter; there should still be at least a few non-black pixels in the texture. To be more precise, there should be a gradient from black to white (from black to red for single channel), with only a few pixels in between having different values (for most pixels in the loop, someNumber equals i). I also tried writing to pLockedRegion->data directly; no difference, still only black.

Does anyone have an idea what I'm doing wrong? (Allegro version is 5.1.10, by the way.)

EDIT:
Alright, by now I have made some progress on this, but I still need some help. I think I've actually run into one or more bugs in Allegro related to al_put_pixel, but it might just be me making a few mistakes. Some input on this would be nice.

First of all, I rewrote my locking code like this:

ALLEGRO_LOCKED_REGION* pLockedRegion = al_lock_bitmap_region(m_pAnimationData, 0, 0, 1, maxNumTilesTileset, ALLEGRO_PIXEL_FORMAT_ANY, ALLEGRO_LOCK_WRITEONLY);
al_set_target_bitmap(m_pAnimationData);

So now I only lock the region I'm actually writing to (in case the texture turns out bigger than I requested), I pick ALLEGRO_PIXEL_FORMAT_ANY as the pixel format, and I set the target bitmap after locking the image (to prevent FBO creation in OpenGL).

Anyways, here are the observations I made.

First of all, the minimum texture size on my GPU seems to be 16x16. This shouldn't be important, but as it turns out it actually is. If I set the texture format to ALLEGRO_PIXEL_FORMAT_ARGB_8888, the line of pixels I want to modify usually turns out grey (R205/G205/B205/A205). However, when I make the width and height of my bitmap at least 16 pixels each, al_put_pixel actually works as intended: it creates a nice gradient from turquoise (R0/G255/B255/A255) to white (R255/G255/B255/A255) on the texture. So here is my first observation: al_put_pixel doesn't seem to work on bitmaps that are smaller than the minimum texture size supported by the GPU. Or maybe it doesn't work on any bitmap whose texture had to be resized because of GPU restrictions (I can imagine it also failing on systems where the GPU only supports power-of-two textures and you create an NPOT texture). My guess is that this is a bug in Allegro that was overlooked because it's a rather rare scenario (especially on more modern GPUs).

My second observation: when using ALLEGRO_PIXEL_FORMAT_SINGLE_CHANNEL_8, I can never get it to write the pixels I want into the texture. It always puts either full black (R0) or almost red (R205) into the column of the texture I want to fill. My guess is that this is another bug in Allegro, probably because, once again, this is a rare scenario. While modifying pixels in RGBA textures isn't too uncommon, modifying (or even just using) an R8 texture in Allegro is quite rare.

I'm not entirely sure how locking and al_put_pixel work in Allegro, but from what I understand, each bitmap has an internal buffer that is returned when the bitmap is locked, and when the bitmap is unlocked, the data in that buffer is copied into the locked region of the texture. If that's true, my guess is that either the pixels are copied from the wrong section of the internal buffer into the texture, or the locking call returns a pointer to the wrong place in the buffer. In any case, it looks like some uninitialized memory ends up in the texture. If I'm not wrong, 205 (0xCD in hex) is the value Visual Studio uses to mark uninitialized memory in RAM.

So can anyone look into this and confirm whether these are actually bugs in Allegro?

SiegeLord

Interesting observations. What version of Allegro is this? Nvm, saw it. I'm going to have to investigate this. FWIW the minimum texture size is a known thing, as Allegro doesn't actually create textures that are less than 16x16 pixels in size. There's probably a locking bug somewhere that gets confused by it, though.

RPG Hacker

Yeah, my guess is that in some places it confuses the texture size with the (requested) bitmap size, so that it accidentally uploads some untouched memory to the texture in certain cases. It seems to always lock the correct pixels on the texture, though, at least in the case of ARGB8. That is, the one column of pixels I actually want to modify always does end up modified; only the pixel values themselves are wrong for some texture sizes and formats. I can provide some screenshots once I'm back home.

Bruce Pascoe
SiegeLord said:

FWIW the minimum texture size is a known thing, as Allegro doesn't actually create textures that are less than 16x16 pixels in size.

Is there a reason for this? This tripped me up a few times, e.g. when trying to perform hardware tiling using al_draw_prim(). Textures smaller than 16x16 end up with gaps in them when tiled because Allegro forces the texture size to 16x16. I have to tile in software to work around it.

RPG Hacker

Bruce Pascoe said:

I have to tile in software to work around it.

How about creating a sub-bitmap of the bitmap you want to tile, would that work? I'm not very familiar with sub-bitmaps, but if the shader sees them as standalone textures, it could work.

Bruce Pascoe

Nope, tried that. I actually use sub-bitmaps extensively for sprite atlasing in my engine, and found out the hard way that you can't use an atlas if you also intend to use hardware tiling--the GPU only sees the underlying texture, not the individual subimages. So you end up with garbage for output. :P

Thomas Fjellstrom

By hardware tiling do you mean texture wrapping?

RPG Hacker

On the actual problem of this topic: I worked around it by doing some extra stuff.

1) Make the texture size at least 32x32 (since, according to the Allegro documentation, every GPU should support textures of at least 32x32).

2) For the width I just pick 32 and go with it; for the height I pick the smallest power of two >= 32 such that the texture can contain all the data I need. I do this to avoid any potential problems with GPUs that don't support NPOT textures.

3) To avoid wasting too much space, instead of only writing columns to my texture, I also write rows, with x = index % 32 and y = index / 32. Currently none of my textures are bigger than 32x32, which, I suppose, is okay.

4) I avoid ALLEGRO_PIXEL_FORMAT_SINGLE_CHANNEL_8, since I still can't get it to work (it always produces black for all pixels I lock) and 8 bits aren't enough for my data anyway. Normally I'd go with a 16-bit or 24-bit texture here, but I can't find one that is supported by both D3D and OpenGL. Therefore I went with ALLEGRO_PIXEL_FORMAT_ANY_32_NO_ALPHA, again wasting some space. But that's okay, I guess. Even with that format and 32x32, that's only 4096 bytes per tileset, not too dramatic (except for a few cases I still have to optimize).

With that, I get the following output for my texture (this time taken from Pix instead of gDEBugger):

[screenshot: 3bfb22c3c9.png]

Now this all works fine, even though it's awfully slow right now, which surprises me. I'm already using "no preserve texture" and "write only", I only lock around ten 32x32 textures per frame, and only small sections of them (just the few colored pixel rows you can see in the screenshot; even fewer for most textures). I really have no idea why this is so slow. Maybe my PC is just too busy right now (I don't know from what, but my PC sucks, so who knows). Then again, the code itself could be quite slow. I'll have to investigate some more.

EDIT:
Looks like BlueStacks was keeping my computer quite busy. After closing it, the game mostly runs at a solid 60 FPS again and only occasionally drops below that, which is about the same behavior I got before doing all the locking stuff.

Bruce Pascoe

@Thomas: Yes. Texture wrapping doesn't work with sub-bitmaps (this is expected), but also breaks with images < 16x16 because Allegro pads them. Writing a game engine meant for general use, I tend to pick up on little things like this. ;)

Thomas Fjellstrom

I thought you could do wrapping with sub-areas of textures as long as you had the UVs set up properly, but maybe that's something else.

SiegeLord

The only way I know of tiling sub-bitmaps is to use shaders.

RPG Hacker, could you test this program and report what happens? It works fine for me on Linux GL:

#include <allegro5/allegro.h>

int main()
{
    al_init();

    ALLEGRO_DISPLAY* d = al_create_display(800, 600);

    al_set_new_bitmap_format(ALLEGRO_PIXEL_FORMAT_ARGB_8888);
    ALLEGRO_BITMAP* b = al_create_bitmap(8, 8);

    al_set_target_bitmap(b);
    al_clear_to_color(al_map_rgb_f(0.5, 0.5, 0.5));

    al_lock_bitmap(b, al_get_bitmap_format(b), ALLEGRO_LOCK_WRITEONLY);
    for (int y = 0; y < al_get_bitmap_height(b); y++)
    {
        for (int x = 0; x < al_get_bitmap_width(b); x++)
        {
            al_put_pixel(x, y, al_map_rgb_f((float)x / al_get_bitmap_width(b), 0, (float)y / al_get_bitmap_height(b)));
        }
    }
    al_unlock_bitmap(b);

    al_set_target_bitmap(al_get_backbuffer(d));
    al_clear_to_color(al_map_rgb_f(0, 0, 0));
    al_draw_scaled_bitmap(b, 0, 0, al_get_bitmap_width(b), al_get_bitmap_height(b), 0, 0, 256, 256, 0);
    al_flip_display();
    al_rest(5.0);

    return 0;
}

I am still investigating the SINGLE_CHANNEL_8 business.

RPG Hacker

Sure, will test this in Direct3D and OpenGL as soon as I'm back home from work and then update this post with the results.

EDIT:
Alright, I just gave this a try on my own computer. First of all, to reproduce my own situation from this thread, I adjusted your example code a bit. I set the texture height to 200 and tried different texture widths smaller than 16 and bigger than 16. I also adjusted the bitmap locking to instead use a lock_region and only lock the one column on the left. Then I put the locking, pixel putting and flipping code into a loop with a slightly shorter wait so that I could attach a GPU debugger to the application (most GPU debuggers can only halt the application during a flip, so this was necessary).

Now here are my results: in the game window itself, the texture always showed up perfectly, no matter which width I tried. When using your original code, I got a texture with gradients between black/blue/red/violet. When using my modified code, I also got the correct results in the game: a texture with a gradient from black to blue in the left-most column and grey for all other pixels. Next I attached gDEBugger to the application (forcing the display to OpenGL), and inside it I actually got the same results as described in this thread. That is: textures with a width < 16 appeared as black in the debugger, and textures >= 16 appeared correctly. Then I forced my display to Direct3D and attached PIX to it. Here the texture always showed up correctly, no matter the width.

All of this leads me to believe that the code in Allegro itself is actually fine and that the bug I experienced is in gDEBugger: it displays the textures incorrectly in a few situations. Back when I was trying to get my code to work, I was attempting something a bit complicated, so I didn't even try to use the texture inside Allegro until it displayed correctly inside gDEBugger. That's probably why I didn't notice this and thought it was a bug in Allegro itself. I still consider it weird that it sometimes showed the one column in the texture as modified, but with the wrong color values. Still, I guess the bug lies within gDEBugger and not within Allegro.

Now this goes for ALLEGRO_PIXEL_FORMAT_ARGB_8888. When trying all of this with ALLEGRO_PIXEL_FORMAT_SINGLE_CHANNEL_8, I still only get a black texture as a result, even inside Allegro. So I guess at least in this case it really is an Allegro bug.

SiegeLord

That's interesting, thanks for testing. Another way to truly verify the contents of your bitmap would be to use a shader. Same goes for ALLEGRO_PIXEL_FORMAT_SINGLE_CHANNEL_8, as I suspect it just doesn't draw without a shader on.

Bruce Pascoe

For what it's worth, I think I've encountered this bug myself--al_put_pixel() writing black pixels. I've verified this by saving the bitmap out to disk after writing a bunch of red pixels to it--it's all black.

Thread #615470. Printed from Allegro.cc