"bitmap_color_depth" error with Datafile BITMAPs
LordHolNapul

Hi,
I can't understand why the color depth returned for my datafile bitmaps is always 32 bits instead of, for example, 24 or 8 bits.

My datafile contains many types of images, each with different characteristics.
Every image reports 32 bits...

Here is some code:

DATAFILE *temp_obj = load_datafile_object("mydatfile.dat", "flower");
BITMAP *temp_bmp = static_cast<BITMAP *>(temp_obj->dat);
int debug_int = bitmap_color_depth(temp_bmp);  // always 32 bit

What am I doing wrong? :o

PS: my context is AllegroGL!

Matthew Leverton

I expect that they are getting color converted. See set_color_conversion().
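
For example, something like this before the load should keep the stored depth (an untested sketch; COLORCONV_TOTAL is the default mode you'd restore afterwards):

// Disable color conversion so datafile bitmaps keep their stored depth.
set_color_conversion(COLORCONV_NONE);
DATAFILE *temp_obj = load_datafile_object("mydatfile.dat", "flower");

// Restore the default behaviour for any later loads.
set_color_conversion(COLORCONV_TOTAL);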

LordHolNapul

No, that doesn't matter...
I've just discovered that the depth returned by bitmap_color_depth() is whatever depth is set by this call:

allegro_gl_set(AGL_COLOR_DEPTH, 32);    // color depth of the frame buffer. 

If I set it to 32, all my bitmaps are 32; if I set it to 24, all my bitmaps are 24.

This is not correct! allegro_gl_set() is supposed to configure the OpenGL framebuffer.
At most it should affect the color depth of the textures, not force the color depth of my Allegro BITMAPs before I've even created the corresponding texture!
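
For reference, this is roughly how I set up the display (just a sketch from memory; the exact suggest flags may differ between AllegroGL versions):

allegro_init();
install_allegro_gl();

allegro_gl_set(AGL_COLOR_DEPTH, 32);           // framebuffer depth only (or so I thought)
allegro_gl_set(AGL_SUGGEST, AGL_COLOR_DEPTH);
set_gfx_mode(GFX_OPENGL_WINDOWED, 640, 480, 0, 0);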

This is surely a bug to fix.
:'(

Thomas Fjellstrom

Of course it matters. Allegro will color convert on load to the framebuffer depth. Tell Allegro not to convert on load, and it won't.

Kris Asick

When you load a datafile or image, every object in it is converted to the colour format of the screen by default. To disable this, you must make a call to set_color_conversion().

However, if you disable this functionality, you will get the correct colour depth information, but the image won't be usable with many of Allegro's functions. (Granted, you're using AllegroGL, and I'm not certain how OpenGL handles colour depth differences.) If you need to know the original bit depth but still want to use the image in the native screen format, load the image with colour conversion off, create another bitmap at the current screen depth, turn colour conversion back on, and do the conversion onto it with a simple call to blit().
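
Roughly like this (an untested sketch; the datafile and object names are just the placeholders from the first post):

set_color_conversion(COLORCONV_NONE);               // keep the stored depth while loading
DATAFILE *obj = load_datafile_object("mydatfile.dat", "flower");
BITMAP *original = (BITMAP *)obj->dat;
int original_depth = bitmap_color_depth(original);  // the real depth from the datafile

set_color_conversion(COLORCONV_TOTAL);              // back to normal for later loads
BITMAP *converted = create_bitmap(original->w, original->h);       // current (screen) depth
blit(original, converted, 0, 0, 0, 0, original->w, original->h);   // blit() converts between depths
                                                                    // (an 8-bit source uses the current palette)

// ... use 'converted' with the rest of Allegro, keep 'original_depth' around ...
destroy_bitmap(converted);
unload_datafile_object(obj);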

--- Kris Asick (Gemini)
--- http://www.pixelships.com

LordHolNapul

Yes, I've tried it and it works:

set_color_conversion(COLORCONV_NONE);  // debug

// ... load from datafile

So the bitmap can't be used with many Allegro functions? No problem, I use textures for output.
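
I upload the loaded bitmap as a texture with the AllegroGL helper, roughly like this (assuming allegro_gl_make_texture(); check your AllegroGL docs for the exact call):

GLuint tex = allegro_gl_make_texture(temp_bmp);  // upload the BITMAP as an OpenGL texture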

Just one little problem: I've loaded an 8-bit bitmap (used as alpha), and it is still reported with a 32-bit color depth. The problem doesn't occur with 24-bit bitmaps. Of course, reading the alpha channel of every pixel of this bitmap returns nothing (always zero). I suppose the alpha got shifted into one of the color channels of the 32 bits (not tested!).
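
In case it helps anyone else, this is how I read the mask value now, based on my guess that the 8-bit value ends up in the color channels after conversion (not verified; mask_value() is just my own helper):

int mask_value(BITMAP *mask, int x, int y)
{
    int c = getpixel(mask, x, y);
    if (bitmap_color_depth(mask) == 8)
        return c;                                     // palette index is the intensity
    return getr_depth(bitmap_color_depth(mask), c);   // converted: the value lives in the color channels, not in alpha
}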

Is this something that should be fixed in Allegro?

Thanks anyway; for my purposes this isn't important, so the question is settled.
Ciao! 8-)
