Allegro.cc - Online Community


Credits go to Bob, Kitty Cat, Krzysztof Kluczek, Milan Mimica, and Richard Phipps for helping out!
This thread is locked; no one can reply to it.
AllegroGL - how to getpixel32 from a Texture ?
LordHolNapul
Member #3,619
June 2003

Hi everyone,
I'm coding something using the latest AllegroGL version. I know there is OpenLayer, but I would like to try this way to have more freedom of movement...

The question is simple, but I don't know the answer...
I've got a lot of OpenGL textures that are images converted from common Allegro BITMAPs; I've destroyed the BITMAPs and kept the textures.

I would like to know how I can read a single pixel in 32-bit format (I'm interested in the alpha channel), given the X and Y coordinates within the texture.

The Allegro code I would like to translate into an OpenGL routine is the following:

int _getpixel32(BITMAP *bmp, int x, int y);
// this should become:
GLint myOpenGLgetpixel(GLuint my_texture, int x, int y); // <--- pseudocode!!

Can someone help me ? ::)
thanks in advance

Krzysztof Kluczek
Member #4,191
January 2004

I can't recall any OpenGL function for reading from a texture, so probably the only way is to draw the pixel and read it back from the screen, which is as slow as it sounds. Basically, reading anything back from the GPU is usually a bad idea.

Why not just keep the bitmap yourself and read from that? If you are worried about memory usage, you can keep only the alpha channel in a separate array. :)
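For instance, a minimal sketch of that idea (untested; assumes a 32-bit BITMAP named bmp, plus allegro.h and stdlib.h):

int x, y;
unsigned char *alpha = malloc(bmp->w * bmp->h);

/* copy just the alpha channel out of the 32-bit bitmap */
for (y = 0; y < bmp->h; y++)
    for (x = 0; x < bmp->w; x++)
        alpha[y * bmp->w + x] = geta32(_getpixel32(bmp, x, y));

/* the BITMAP can be destroyed now; keep its width and height around,
   and alpha[y * width + x] gives the alpha value at (x, y) */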

Richard Phipps
Member #1,632
November 2001

LordHolNapul
Member #3,619
June 2003

Yes, I know there is a way using glGetPixelMap, but I don't know how to do it...

Somehow I must choose the buffer where to do the alpha test, from among a certain number of buffers...

I would like to get the alpha of the SPRITES and not the alpha of the BACKGROUND (it's always full opacity).

Do you have some example code?

???

Milan Mimica
Member #3,877
September 2003

If EXT_framebuffer_object is supported you could bind a texture to a framebuffer and then read from the framebuffer with glReadPixels()... it's a bit clumsy. That's why AllegroGL provides video bitmaps, which are backed by textures, so you can use normal Allegro functions like getpixel on them.
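Roughly, that route would look something like this (untested sketch; assumes the extension is present and texture is a complete 2D texture):

GLuint fbo;
GLubyte pixel[4];

glGenFramebuffersEXT(1, &fbo);
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fbo);
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
                          GL_TEXTURE_2D, texture, 0);

/* the texture is now the framebuffer's colour attachment, so a normal
   read-back gets its texels; pixel[3] is the alpha at (x, y) */
glReadPixels(x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, pixel);

glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
glDeleteFramebuffersEXT(1, &fbo);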

Krzysztof Kluczek
Member #4,191
January 2004

Quote:

I would like to get the alpha of the SPRITES and not the alpha of the BACKGROUND (it's always full opacity).

Then make sure you create the backbuffer with an alpha channel and write the sprite alpha values to it. :)
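Reading it back then boils down to something like this (sketch only; remember GL's origin is the bottom-left corner, so the y coordinate is flipped):

GLubyte a;

glReadBuffer(GL_BACK);   /* read from the buffer the sprites were drawn into */
glReadPixels(x, SCREEN_H - 1 - y, 1, 1, GL_ALPHA, GL_UNSIGNED_BYTE, &a);
/* a is now the alpha value at screen position (x, y) */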

Kitty Cat
Member #2,815
October 2002

Won't:

int color = 0;

glBindTexture(GL_TEXTURE_2D, texture);
glGetTexSubImage2D(GL_TEXTURE_2D, 0, x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, &color);

work? Just be careful of byte order.

--
"Do not meddle in the affairs of cats, for they are subtle and will pee on your computer." -- Bruce Graham

LordHolNapul
Member #3,619
June 2003

Quote:

Then make sure you create the backbuffer with an alpha channel and write the sprite alpha values to it. :)

Yes, this is what I must do!
I'm still trying to understand how the function below works... (?)

void glReadBuffer( GLenum mode )

AllegroGL uses the following code:

allegro_gl_set(AGL_DOUBLEBUFFER, 1);
//[....]
allegro_gl_flip();   // Flips the front and back framebuffers.

So we are in DOUBLE BUFFER mode, and this is a swap between the GL_FRONT and GL_BACK
buffers... :P

Now, how can I define a new buffer, isolated from these two, that can be used exclusively to test the alpha channel? It could probably also be used for drawing operations, but the main purpose is the alpha check. Does anyone know the OpenGL sequence of commands to define this new buffer?

So, my workspace should be like this...

- BACKBUFFER:
1) Background - BackBuffer
2) Sprites - BackBuffer
3) Alpha Sprites - (....?)
4) Lighting/post effects - backbuffer

- Front BUFFER
1) Screen (with the previous image)

Lots of thanks :-/

*************************************************
EDIT for KITTY CAT

Your function is very interesting, but I cannot find any documentation about it in my references... The only one I've found that is similar to yours is the following... very interesting... how does it work? 8-)

void glTexImage2D( GLenum target, GLint level, GLint components, GLsizei width, 
GLsizei height, GLint border, GLenum format, GLenum type, const GLvoid *pixels )

Bob
Free Market Evangelist
September 2000

Quote:

int color = 0;

glBindTexture(GL_TEXTURE_2D, texture);
glGetTexSubImage2D(GL_TEXTURE_2D, 0, x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, &color);
work? Just be careful of byte order.

Or use this type-safe, endian-safe code instead:

GLubyte color[4];

glBindTexture(GL_TEXTURE_2D, texture);
glGetTexSubImage2D(GL_TEXTURE_2D, 0, x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, &color[0]);

color[0] is the R channel, color[1] is G, etc. If you want the reverse order, just use GL_ABGR_EXT instead of GL_RGBA.

Quote:

how can I define a new buffer, isolated from these two, that can be used exclusively to test the alpha channel? It could probably also be used for drawing operations, but the main purpose is the alpha check.

Can you describe, exactly, the operation you want to do to your bitmaps? It's not clear at all what you are trying to accomplish, since you are mixing intent with implementation.

--
- Bob
[ -- All my signature links are 404 -- ]

LordHolNapul
Member #3,619
June 2003

Where can I find useful documentation about this function?
It could be the quick solution for the alpha check of a texture...

glGetTexSubImage2D(...);

Kitty Cat
Member #2,815
October 2002

Hmm, apparently I got it mixed up with glTexSubImage2D. The Get version doesn't exist. You can, however, use glGetTexImage to get the whole texture, then read whichever pixel(s) you want.

But as Bob asked, what exactly are you trying to accomplish with this? There may be a better way to do it. You don't generally need to use a getpixel-like function in accelerated rendering contexts, since it negates a big chunk of their advantages.

--
"Do not meddle in the affairs of cats, for they are subtle and will pee on your computer." -- Bruce Graham

LordHolNapul
Member #3,619
June 2003

I just want to access the 32 bits of a TEXTURE: 8-bit red, 8-bit green, 8-bit blue and 8-bit alpha...
I don't know how. I don't know the sequence of OpenGL commands...

When I've got all this kind of information I can add some features to my games..

My textures are 32-bit textures (with alpha channel) and my screen buffer is 32-bit.
thanks.

Bob
Free Market Evangelist
September 2000

Quote:

When I've got all this kind of information I can add some features to my games..

Are you going to tell us what that feature is?

glGetTexImage() is how you get the information you want.
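For example, a sketch of that (untested; assumes an RGBA texture whose width and height you already know, plus stdlib.h for malloc):

GLubyte *pixels = malloc(width * height * 4);
GLubyte alpha;

glBindTexture(GL_TEXTURE_2D, texture);
glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, pixels);

/* the whole texture is now a plain pixel array in client memory */
alpha = pixels[(y * width + x) * 4 + 3];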

--
- Bob
[ -- All my signature links are 404 -- ]

LordHolNapul
Member #3,619
June 2003

Well, OK, this is not a real feature...
I'm trying to convert my game (download it below) to an OpenGL version...
Then I should move on to further programming and new features for my game...

If you write me some code I will surely try it :)
The problem is that I'm not familiar with OpenGL yet. I've got my sprites displaying, but advanced tests are very difficult for me.

??? :( :-/

Richard Phipps
Member #1,632
November 2001

I think what you mean is that you want to convert your memory sprites to a texture format that OpenGL can use. This is different from reading from a texture in video memory.

LordHolNapul
Member #3,619
June 2003

Quote:

I think what you mean is that you want to convert your memory sprites to a texture format that OpenGL can use. This is different from reading from a texture in video memory.

Plz, do not think.
I want to read all the BITS of an OpenGL texture, or at least some specific BITS inside the texture, given the coordinates. The Allegro BITMAPs have been deleted to free the space, and they cannot be used. So I need to replace all the common BITMAP operations with operations on the TEXTURES.

(Video memory, in this case, is ambiguous, because we could have Allegro video bitmaps and OpenGL video bitmaps, so forget about that. Call them textures...)

Does someone know how?
Thanks :-X

Richard Phipps
Member #1,632
November 2001

I'll try again. Reading from a texture is EXTREMELY SLOW. Since most PCs have more main memory than video card memory, you might as well keep the original images in memory and use getpixel or something similar on them. You would also need the original images in memory to restore them to the gfx card if the user tabs away and back or you change screen resolution.

Krzysztof Kluczek
Member #4,191
January 2004

Quote:

The Allegro BITMAPs have been deleted to free the space, and they cannot be used.

Then don't delete them. They don't take that much space, and if they take less than 100MB then it's fine to leave them in memory and use them. :)

Quote:

You would also need the original images in memory to restore them to the gfx card if the user tabs away and back or you change screen resolution.

If the user tabs away, OpenGL restores everything for you. :)

Richard Phipps
Member #1,632
November 2001

How, if the video card memory is changed?

Krzysztof Kluczek
Member #4,191
January 2004

Quote:

How, if the video card memory is changed?

Probably the driver keeps a copy in system memory. The OpenGL standard doesn't say anywhere that you may lose a texture after it has been properly created, so the driver somehow has to restore it for you. :)

Kitty Cat
Member #2,815
October 2002

Quote:

I want to read all the BITS of an OpenGL texture, or at least some specific BITS inside the texture, given the coordinates.

You're still not saying what for. :P If you need to read pixels off a texture/bitmap, then the program is possibly not designed to be accelerated well at all.

If you say what you need to use the pixels for, perhaps we can help figure out a way to do it that takes advantage of the provided acceleration.

--
"Do not meddle in the affairs of cats, for they are subtle and will pee on your computer." -- Bruce Graham

Richard Phipps
Member #1,632
November 2001

OpenLayer has a setting to keep a memory copy or not, so I didn't think it was automatic in the OpenGL driver.. But I wouldn't know for sure! :)

Krzysztof Kluczek
Member #4,191
January 2004

Quote:

OpenLayer has a setting to keep a memory copy or not, so I didn't think it was automatic in the OpenGL driver.

That is probably for reading pixels from an Allegro BITMAP, like we are suggesting in this thread. :)

Richard Phipps
Member #1,632
November 2001

Well, if it is not enabled then alt-tabbing away and then back results in a black screen, as all the textures are lost (IIRC). :)

LordHolNapul
Member #3,619
June 2003

OK, I'll keep the Allegro bitmaps, but in my opinion it's not the best way, because:

If I enlarge the texture, its alpha channel enlarges too... so if I must get a point from the BITMAP, I must do more computing (scaled coordinates) to get the right point.
If I ROTATE the sprite, its alpha channel rotates too... same problem.

Anyway, all of this sounds quite difficult.
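A sketch of the bookkeeping involved (hypothetical helper, angle in radians): undo the scale and rotation, then look the point up in the alpha array kept from the original BITMAP:

#include <math.h>

/* dx, dy: offset of the test point from the sprite's on-screen centre.
   Undo the rotation and the scaling to find the matching texel in the
   original (unscaled, unrotated) w x h alpha array. */
int sprite_alpha_at(const unsigned char *alpha, int w, int h,
                    double dx, double dy, double scale, double angle)
{
    double c = cos(-angle), s = sin(-angle);
    double ux = (dx * c - dy * s) / scale;   /* back into unrotated, unscaled space */
    double uy = (dx * s + dy * c) / scale;
    int x = (int)(ux + w / 2.0);             /* back to a top-left origin */
    int y = (int)(uy + h / 2.0);

    if (x < 0 || x >= w || y < 0 || y >= h)
        return 0;                            /* outside the sprite: transparent */
    return alpha[y * w + x];
}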

Until today I've rotated my sprites by drawing pre-rotated images, not by rotating the quad they are drawn on...

Thanks everyone, guys. Simple is perfect; a bit expensive in memory, but perfect (for 2D games, I mean).

;)
