[A5] Getting al_convert_mask_to_alpha to work with OpenGL
Desmond Taylor
Member #11,943
May 2010

Now I'm sounding stupid. I'm making a 2D game using OpenGL and I can't get al_convert_mask_to_alpha to work :/ Here's a screenshot of what happens:

{"name":"603583","src":"\/\/djungxnpq2nug.cloudfront.net\/image\/cache\/3\/9\/3969bfbec22d0845c20646a8fbf1769f.png","w":646,"h":508,"tn":"\/\/djungxnpq2nug.cloudfront.net\/image\/cache\/3\/9\/3969bfbec22d0845c20646a8fbf1769f"}603583

Edit: Here's the code.

#include <allegro5/allegro.h>
#include <allegro5/allegro_image.h>
#include <allegro5/allegro_opengl.h>

ALLEGRO_BITMAP* bitmap;
GLuint texture;

// This will take an OpenGL texture parameter soon.
void draw_square( float x, float y, float z )
{
    int width = al_get_bitmap_width( bitmap );
    int height = al_get_bitmap_height( bitmap );

    glLoadIdentity();

    glBegin( GL_QUADS );
        glTexCoord2f( 0, 0 ); glVertex3f( x, y + height, z );
        glTexCoord2f( 1, 0 ); glVertex3f( x + width, y + height, z );
        glTexCoord2f( 1, 1 ); glVertex3f( x + width, y, z );
        glTexCoord2f( 0, 1 ); glVertex3f( x, y, z );
    glEnd();
}

int main( int argc, char** argv )
{
    al_init();
    al_init_image_addon();

    al_set_new_display_flags( ALLEGRO_OPENGL );
    al_set_new_display_option( ALLEGRO_DEPTH_SIZE, 24, ALLEGRO_SUGGEST );

    ALLEGRO_DISPLAY* display = al_create_display( 640, 480 );
    if ( !display )
        return -1;

    bitmap = al_load_bitmap( "gfx/1.png" );
    if ( !bitmap )
        return -2;
    else
        al_convert_mask_to_alpha( bitmap, al_map_rgb( 0, 0, 0 ) );

    glMatrixMode( GL_PROJECTION );
    glLoadIdentity();
    glOrtho( 0, 640, 480, 0, 0, 100 );
    glMatrixMode( GL_MODELVIEW );

    glEnable( GL_DEPTH_TEST );

    //glEnable( GL_BLEND );
    //glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );

    glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );

    texture = al_get_opengl_texture( bitmap );
    glBindTexture( GL_TEXTURE_2D, texture );

    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );

    glGenTextures( 1, &texture );
    glEnable( GL_TEXTURE_2D );

    draw_square( 10, 10, 0 );
    draw_square( 140, 10, 0 );

    al_flip_display();

    al_rest( 3 ); // Rest for 3 seconds so that we can see the result.

    al_destroy_bitmap( bitmap );

    return 0;
}
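For what it's worth, with raw OpenGL like this the texture's alpha channel only affects the picture once blending is switched on, which appears to be what the two commented-out lines above would do. A minimal sketch of that setup, placed somewhere before the draw_square() calls:

/* Sketch: enable alpha blending so the texture's alpha channel is
   actually used when the quads are drawn (this mirrors the two
   commented-out lines in the listing above). */
glEnable( GL_BLEND );
glBlendFunc( GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA );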

Matthew Leverton
Supreme Loser
January 1999

I assume you would need a 32-bit display.

Arthur Kalliokoski
Second in Command
February 2005

Matthew Leverton said:
I assume you would need a 32-bit display.

He needs a 32-bit buffer, or is that the "display"?

They all watch too much MSNBC... they get ideas.

Matthew Leverton
Supreme Loser
January 1999

Well, my comment is basically irrelevant, since the depth he is setting is the z-buffer depth, not the color depth. He could still check what color depth is actually being set, though (it should be 32-bit).
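For reference, one way to check that is to query the display options after the display has been created; this is only a minimal sketch (the helper name is made up, and it assumes the ALLEGRO_DISPLAY* from the code above):

#include <stdio.h>
#include <allegro5/allegro.h>

/* Sketch: after al_create_display(), print the color and z-buffer
   depths the driver actually picked for the display. */
static void print_display_depths( ALLEGRO_DISPLAY* display )
{
    printf( "color bits: %d, depth bits: %d\n",
            al_get_display_option( display, ALLEGRO_COLOR_SIZE ),
            al_get_display_option( display, ALLEGRO_DEPTH_SIZE ) );
}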

Desmond Taylor
Member #11,943
May 2010

Arthur Kalliokoski
Second in Command
February 2005

The actual window (possibly fullscreen) has a fixed color depth; it might be 16, 24 or 32 bits. 16 is lower quality, 24 shows as many colors as possible, and 32 is usually helpful for alignment purposes (?). OpenGL lets you specify how many bits of depth you want when you upload an image, and if you want to use alpha you have to specify 32 there (to make room for the alpha channel). Even if the display is 16 bits, the alpha blending has already been taken care of by the time the result is sent to the 16-bit display. I don't see anything in A5's bitmap loading that lets you do this. OTOH, you could probably do it the OpenGL way (much more code to deal with) and load the images into a texture yourself.
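For reference, a rough sketch of what "the OpenGL way" could look like here: load the PNG as a memory bitmap, lock it in an RGBA layout and hand the pixels to glTexImage2D. This is only an untested outline under those assumptions (the helper name and the minimal error handling are made up for illustration), not something the addon provides:

#include <allegro5/allegro.h>
#include <allegro5/allegro_image.h>
#include <allegro5/allegro_opengl.h>

/* Sketch: load an image as a memory bitmap and upload it by hand as a
   32-bit RGBA OpenGL texture, so the alpha channel is definitely kept. */
static GLuint upload_rgba_texture( const char* filename )
{
    al_set_new_bitmap_flags( ALLEGRO_MEMORY_BITMAP );
    ALLEGRO_BITMAP* bmp = al_load_bitmap( filename );
    if ( !bmp )
        return 0;

    al_convert_mask_to_alpha( bmp, al_map_rgb( 0, 0, 0 ) );

    int w = al_get_bitmap_width( bmp );
    int h = al_get_bitmap_height( bmp );

    /* Lock read-only in a format whose memory layout is R,G,B,A. */
    ALLEGRO_LOCKED_REGION* lr = al_lock_bitmap( bmp,
        ALLEGRO_PIXEL_FORMAT_ABGR_8888_LE, ALLEGRO_LOCK_READONLY );
    if ( !lr )
    {
        al_destroy_bitmap( bmp );
        return 0;
    }

    GLuint tex = 0;
    glGenTextures( 1, &tex );
    glBindTexture( GL_TEXTURE_2D, tex );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR );
    glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );

    /* Rows may be padded, so tell GL the real row length in pixels
       (assumes a positive pitch, the usual case for memory bitmaps). */
    glPixelStorei( GL_UNPACK_ROW_LENGTH, lr->pitch / lr->pixel_size );
    glTexImage2D( GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                  GL_RGBA, GL_UNSIGNED_BYTE, lr->data );
    glPixelStorei( GL_UNPACK_ROW_LENGTH, 0 );

    al_unlock_bitmap( bmp );
    al_destroy_bitmap( bmp );
    return tex;
}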

They all watch too much MSNBC... they get ideas.

Thomas Fjellstrom
Member #476
June 2000

Arthur Kalliokoski said:
24 shows as many colors as possible, and 32 is usually helpful for alignment purposes (?).

In the olden days, yes. These days, even if you ask for a 24-bit framebuffer you actually get a 32-bit framebuffer, and the fourth component is ignored. At least that's how X sets it up on all the machines I know of.

--
Thomas Fjellstrom - [website] - [email] - [Allegro Wiki] - [Allegro TODO]
"If you can't think of a better solution, don't try to make a better solution." -- weapon_S
"The less evidence we have for what we believe is certain, the more violently we defend beliefs against those who don't agree" -- https://twitter.com/neiltyson/status/592870205409353730

Edgar Reynaldo
Major Reynaldo
May 2007

Do a little test: try reading the top-left pixel of your bitmap after it's been loaded (but before the conversion) and print out what color it says is there.

Also, make sure the background is really full black and not nearly black instead.
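For reference, a minimal sketch of that test (the helper name is made up; call it on the bitmap right after al_load_bitmap() and before al_convert_mask_to_alpha()):

#include <stdio.h>
#include <allegro5/allegro.h>

/* Sketch: print the color of the bitmap's top-left pixel so you can
   see whether the background really is pure black (0, 0, 0). */
static void print_top_left_pixel( ALLEGRO_BITMAP* bmp )
{
    unsigned char r, g, b, a;
    al_unmap_rgba( al_get_pixel( bmp, 0, 0 ), &r, &g, &b, &a );
    printf( "top-left pixel: r=%d g=%d b=%d a=%d\n", r, g, b, a );
}

If it prints anything other than 0, 0, 0, al_convert_mask_to_alpha() will leave those pixels fully opaque, since it only touches pixels that match the mask color exactly.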

Desmond Taylor
Member #11,943
May 2010

I'm just going to convert all the graphics to PNG with transparency.
