Allegro.cc - Online Community

Allegro.cc Forums » Programming Questions » No red, only green and blue!

This thread is locked; no one can reply to it.
No red, only green and blue!
boulifb
Member #7,909
October 2006

Hello Allegro gurus,

I am currently working on my game project and I am seeing some strange behaviour with colors. When I use 15-bit colors (values taken from the Game Boy palettes) I get the correct display colors with getr15, getg15 and getb15 under Windows. When I recompile the project on Mac OS X (Tiger, Intel Core 2 Duo and Xeon) I get the wrong colors: I have only the green and blue components and no red. I have declared a 32-bit color depth on my bitmaps. When I check the values in the debugger I have the correct color values on both Windows and Tiger, but the displayed result differs. Is there a problem with colors on Mac OS X? Did I miss something somewhere? (Yes, I called set_color_depth before calling set_gfx_mode.)

Thanks in advance for the help.

Best regards.

Fred.

OK I solved the problem:
On Mac OS X the colors are reversed: it is BGR, not RGB. The colors are now applied correctly.

Thomas Harte
Member #33
April 2000
avatar

Are you composing the 32-bit colours yourself with fixed code? If so, you may face compatibility issues: not all Windows machines use RGB order, and I dare guess that not all OS X machines use BGR. I definitely wouldn't be surprised if the order of colours within a word varies between Intel and PowerPC, for obvious reasons. makecol32 will sort things out for you, though...

boulifb
Member #7,909
October 2006

What is really surprising is that I have Intel Macs: a MacBook Pro (Core 2 Duo) and a Mac Pro (Xeon x64).

I should get the same byte order, since they are Intel-based processors.

I suspect a Mac OS legacy.

Other people will have the information now ;)

Best regards.

Fred.

Thomas Harte
Member #33
April 2000
avatar

Quote:

I should get the same bit order as it is intel based processors.

Only if they're using the same video cards in the same video modes.

Windows (pre-Vista) makes applications write straight to the framebuffer. OS X keeps a buffer of the current state of each application which, starting with 10.2, is in VRAM (so the graphics card can do window composition).

So, in Windows, RGB/BGR/whatever ordering is determined by the order used by the framebuffer.

In OS X, BGR/RGB/whatever ordering is determined by the order OS X selects for its texture format. After that it is purely hardware driven - OS X will not and cannot impose any legacy order if the hardware doesn't support it.

Neither OS guarantees either order. Allegro figures out the order at runtime, and provides makecol32 so that you can build your 32bit words correctly.

If you assume a byte order based solely on OS then your program will not work on all hardware.

boulifb
Member #7,909
October 2006

As I said, I use 15-bit colors from the Game Boy that I transform into 32-bit colors for Allegro (on Mac OS X and Windows).

When I apply the 15-bit color I use the following code:

on Mac OS X:
colDest=makecol(getb15(palette[colorIndex]), getg15(palette[colorIndex]), getr15(palette[colorIndex]));

on Windows:
colDest=makecol(getr15(palette[colorIndex]), getg15(palette[colorIndex]), getb15(palette[colorIndex]));

The RGB values are simply swapped. This code gives me the correct colors, but is there a more standard, universal way to do this, no matter the OS or platform?

Ron Novy
Member #6,982
March 2006
avatar

If there is a way to detect which order OS X is using then it should be used instead. In either case, I believe this issue has been discussed many times for OS X, but I've never seen a solution that resulted in a good working patch to the lib... Maybe someone with better OS X skills can create a patch for set_gfx_mode that detects the RGB/BGR order and sets things up appropriately.

----
Oh... Bieber! I thought everyone was chanting Beaver... Now it doesn't make any sense at all. :-/

X-G
Member #856
December 2000
avatar

Quote:

colDest=makecol(getb15(palette[colorIndex]), getg15(palette[colorIndex]), getr15(palette[colorIndex]));

Uh... makecol always takes its arguments in the order R, G, B, regardless of platform.

--
Since 2008-Jun-18, democracy in Sweden is dead. | Begone, evil spirits! Begone, evil spirits! Vengeful ghosts and apparitions, when in trouble: Doman! Seman! Doman! Seman! Call the onmyouji right away, let's go!

Evert
Member #794
November 2000
avatar

Quote:

In either case I believe this issue has been discussed many times for OSX but I've never seen a solution that caused a good working patch to the lib..

Eh?

@OP: Make sure you're loading/creating your graphics after you call set_gfx_mode. Allegro doesn't know the RGB ordering until you call set_gfx_mode(). You may get lucky on one system, but it will fail on others.
If that doesn't help, try reproducing the problem with the Allegro examples.

boulifb
Member #7,909
October 2006

Actually, I think it is because of the encoding of the palettes.
I use a WORD to encode the colors; they are coded on 15 bits.

I'm not sure yet that it comes from the makecol function; maybe it is the encoding of a WORD.

As I use the universal binary compiler, that may be the cause. I'll try tonight with the Intel compiler; maybe it will give the correctly ordered values (RGB, not BGR).

On Windows, whether virtualized under Parallels Desktop for Mac or native (a standard install on Mac or PC), the components are RGB-ordered; only on Mac OS X (Tiger 10.4.8, Intel) are the values BGR.

Fred.

Evert
Member #794
November 2000
avatar

Quote:

Actually, I think it is because of the coding of the palettes.
I use a word to code the colors. They are coded on 15 bits.

This makes no sense to me... what do you mean? Can you post some sample code to show what you're doing?

Aside, can a moderator move this topic to Programming Questions instead of Allegro Development?

boulifb
Member #7,909
October 2006

I do the following:

void ApplyPalette(BITMAP* tiles, WORD* palette, int index, int transparency)
{
    int x=-1, y=-1, colSrc=-1, colDest=-1, colorIndex=-1;

    // apply the palette to the background image
    for (y=0; y<TILE_HEIGHT; y++)
    {
        for (x=0; x<TILE_WIDTH; x++)
        {
            colSrc=getpixel(tiles, (index*TILE_WIDTH)+x, y);

            if (colSrc == makecol(255, 255, 255))
            {
                colorIndex=0;
            }
            else if (colSrc == makecol(215, 215, 215))
            {
                colorIndex=1;
            }
            else if (colSrc == makecol(178, 178, 178))
            {
                colorIndex=2;
            }
            else if (colSrc == makecol(0, 0, 0))
            {
                colorIndex=3;
            }

#ifdef WIN32
            colDest=makecol(getr15(palette[colorIndex]), getg15(palette[colorIndex]), getb15(palette[colorIndex]));

            if (transparency != -1 && colDest == makecol(getr15(palette[transparency]), getg15(palette[transparency]), getb15(palette[transparency])))
#else
            colDest=makecol(getb15(palette[colorIndex]), getg15(palette[colorIndex]), getr15(palette[colorIndex]));

            if (transparency != -1 && colDest == makecol(getb15(palette[transparency]), getg15(palette[transparency]), getr15(palette[transparency])))
#endif // WIN32
            {
                colDest=makecol(255, 0, 255);
            }

            putpixel(tiles, (index*TILE_WIDTH)+x, y, colDest);
        }
    }
}

I call the function that way:

void LoadAnimatedTiles(PENVIRONMENTDESC pEnvironment)
{
    int i=-1;
    WORD palette[4][4];

    palette[0][0]=0x0060;
    palette[0][1]=0x0100;
    palette[0][2]=0x01e0;
    palette[0][3]=0x0ee3;

    palette[1][0]=0x00e0;
    palette[1][1]=0x0136;
    palette[1][2]=0x008f;
    palette[1][3]=0x0008;

    palette[2][0]=0x5400;
    palette[2][1]=0x58c2;
    palette[2][2]=0x7249;
    palette[2][3]=0x7f8e;

    palette[3][0]=0x00a0;
    palette[3][1]=0x02bf;
    palette[3][2]=0x001f;
    palette[3][3]=0x7f20;

    switch (pEnvironment->pLevel->world)
    {
        case WORLD_FOREST_FALLS:
            pEnvironment->foregroundTiles=load_bitmap("ffts.bmp", NULL);

            for (i=0; i<8; i++)
            {
                ApplyPalette(pEnvironment->foregroundTiles, palette[1], i, 0);
            }
            ...

I'm obliged to reverse the colors to get the correct display on both Windows and Mac OS.

Fred.

Evert
Member #794
November 2000
avatar

Right.
As I said, this has nothing to do with Windows or Mac OS X. The problem is that you hard-code the colour numbers in 16 bits with a specific component ordering (say RGB). Allegro will use the system's native colour ordering, which may be RGB or BGR (or something else entirely). If you're lucky, the ordering you used to encode your colour constants and the one used by the system are the same. If not, you'll end up with the wrong colours, as you have seen.

Solution: extract the components from your encoded numbers manually (it only takes a shift and a logical and), or store them as RGB triplets.

boulifb
Member #7,909
October 2006

You mean that I should not use the getr15, getg15 and getb15 functions, and that I have to rewrite them???

Evert
Member #794
November 2000
avatar

I meant what I wrote.
Phrased differently, no, you should not be using get[r,g,b]15 on the colour values you hard-coded because you can't predict the order of RGB components on any system.

boulifb
Member #7,909
October 2006

Damn, these functions are so convenient.
How can I replace them?
I'm hopeless at manipulating bits in C :P

Thomas Harte
Member #33
April 2000
avatar

I don't know the GBA colour encoding, but if it's 15 bit then something like:

... somewhere in your source before you use them ...
#define gba_getb(v) ((v)&31)
#define gba_getg(v) (((v) >> 5)&31)
#define gba_getr(v) ((v) >> 10)

... to use them ...
r = gba_getr(col);
g = gba_getg(col);
b = gba_getb(col);

Though I may have r, g and b the wrong way round. "&31" effectively says "throw away all but the lowest five bits" and ">> 5" says "move all the bits five positions to the right and throw away any that fall off the end".

boulifb
Member #7,909
October 2006

The macros don't seem to work properly: everything comes out almost black.

The GB's colors work like this (bit 15 unused, then R, G, B, five bits each):

  R     G     B
0 11111 00000 00000
15---------------->0

As it is 5 bits per color, the MSB is ignored.

The goal is to transform the 15-bit colors into 32-bit colors using the hard-coded palettes I showed.

As shown, the palette values are coded as a WORD (unsigned short), which is 16 bits wide.

I don't really know how to do that by hand; with Allegro I just use the getr15, getg15 and getb15 functions.

X-G
Member #856
December 2000
avatar

Harte's macros will do exactly that. ::)


Evert
Member #794
November 2000
avatar

The colour values you get out are in the range 0-31, while Allegro's makecol function takes values in the range 0-255, so you'll need to scale them (by a factor of 8 or so, though using 255./31. would give you a better result).

boulifb
Member #7,909
October 2006

Indeed, I get better results by multiplying the components by 8.
Also, I had to use the makecol24 function instead of makecol, and things seem to be OK on Windows.

I'll tell you tonight how it goes on the Macintosh. If I understood correctly, with this method I should get the same display results no matter the platform and OS, right?

X-G
Member #856
December 2000
avatar

Quote:

also, I have to use the makecol24 function instead of makecol and things seems to be ok on Windows.

What? Why? Where are you calling it?


boulifb
Member #7,909
October 2006

I call makecol24 after extracting the 15-bit RGB values, just before applying the color to the pixel. Why?

X-G
Member #856
December 2000
avatar

Because there's no reason makecol() should fail unless you're trying to use it before calling allegro_init() and set_gfx_mode() (assuming you still have to do that).


boulifb
Member #7,909
October 2006

Anyway, I have just tried on Mac OS X what we did today on Windows, that is, manually extracting the RGB components and converting them to 32-bit colors.

While it works on Windows, the colors are still reversed on Mac OS.
I have tried on my Mac Pro with an nVidia video card and on my MacBook Pro with an ATI video card. Both give the same result on Mac OS X: the colors are BGR, even when they are decoded by hand.

That's really odd!!!

My goal is to have a single source code base on Allegro, no matter which OS it runs on.

For better results, I wish to keep the hard-coded 15-bit colors in palettes that I apply to the tile sets when loading them.

I have supplied the ApplyPalette function to show how I apply the colors.

If you have any idea how to solve this reversal problem, let me know; I'm open to anything.

I'm stuck!

Evert
Member #794
November 2000
avatar

Post a small example program that reproduces the error. Not a section of code, but something we can compile and run.
