OpenLayer screenshot system: glReadPixels appropriate?
Indeterminatus

Hello, guys. My research into possible screenshot system implementations with OpenGL all pointed to glReadPixels.

iirc, it is safe to call OpenGL functions alongside OpenLayer, as long as it's guaranteed that those functions don't meddle with OpenGL's state.

I couldn't find any info on whether glReadPixels is state-changing or not (I'd imagine it is not, but you never know ...), so I'll go with that approach, unless someone convinces me there's a better method ;)

P.S.: In case it isn't clear what I meant by "screenshot system": I want the current screen saved to a .png file (or whatever) on a keypress. None of that is much of a problem, except for the screen-grabbing part.

adhrymes

I haven't used OpenGL in forever, but if you're using a screen buffer, couldn't you just write the last buffer out to a file?

Thomas Harte

glReadPixels is pretty much the only way to implement such a thing, but be warned that the format of the data returned depends on the GL state set by earlier calls to glPixelStore, glPixelTransfer and glPixelMap. What you should do is call glPushClientAttrib(GL_CLIENT_PIXEL_STORE_BIT) before using glPixelStore/Transfer/Map to set up the way you want the data returned, use glReadPixels to get the data for your screenshot, and then call glPopClientAttrib so that you haven't affected the GL state in any way.
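A minimal sketch of that pattern (the RGBA/unsigned-byte format, the pack alignment of 1 and the grab_screen_rgba name are my own choices for illustration, not anything OpenLayer provides):

#include <GL/gl.h>

/* Reads the current framebuffer into out, which must hold 4*w*h bytes. */
void grab_screen_rgba(int w, int h, unsigned char *out)
{
    /* Save the client-side pixel-store state so our changes don't leak out. */
    glPushClientAttrib(GL_CLIENT_PIXEL_STORE_BIT);
    glPixelStorei(GL_PACK_ALIGNMENT, 1);               /* tightly packed rows */

    /* Rows come back bottom-up, origin at the lower-left corner. */
    glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, out);

    glPopClientAttrib();                               /* restore the previous pixel-store state */
}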

Fladimir da Gorf
Quote:

iirc, it is safe to call OpenGL functions alongside OpenLayer, as long as it's guaranteed that those functions don't meddle with OpenGL's state.

It's the other way around: call any OpenGL functions you want, but don't assume that the OpenGL state stays the same after you call any of OpenLayer's functions.

In OL 2.0, you can compile OL with OL_NO_STATE_CHANGE, which will make OpenLayer not change the OpenGL state at all (but performance will be slightly worse).

Murat AYIK

I implemented screen-grabbing right after I switched to AllegroGL (while still kind of a newbie). I tried "blit()" from the screen into a temporary bitmap with success, and it still works the same way without any problems. I guess AllegroGL already converts it into something safe. I was planning to ask about it but forgot; this thread looks appropriate. Could it cause any problems in the future?
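For reference, a rough sketch of that approach with plain Allegro calls (the function name is mine, and I'm assuming that blitting from screen under AllegroGL reads back the GL framebuffer, as described above):

#include <allegro.h>

void save_screenshot_blit(const char *path)
{
    BITMAP *bmp = create_bitmap(SCREEN_W, SCREEN_H);
    if(!bmp) return;

    /* Blitting from the screen bitmap into a memory bitmap; under AllegroGL
       this should read back the GL framebuffer (slow, but it works). */
    blit(screen, bmp, 0, 0, 0, 0, SCREEN_W, SCREEN_H);

    save_bitmap(path, bmp, NULL);
    destroy_bitmap(bmp);
}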

Indeterminatus

Thanks, Thomas Harte, that really hit the nail on the head. glReadPixels it is, then.

Fladimir da Gorf, thank you for clearing things up; already looking forward to OpenLayer 2.0! :)

Krzysztof Kluczek

Simple screenshot function. Works fine for me so far. :) (EDIT: assuming you are using 32-bit Allegro color depth)

void save_screenshot(const char *path)
{
    BITMAP *bmp = create_bitmap(SCREEN_W, SCREEN_H);
    if(!bmp) return;
    byte *buff = new byte[4*SCREEN_W*SCREEN_H];   /* byte == unsigned char */
    if(!buff)
    {
        destroy_bitmap(bmp);
        return;
    }

    /* grab the whole framebuffer as RGBA */
    glReadPixels(0, 0, SCREEN_W, SCREEN_H, GL_RGBA, GL_UNSIGNED_BYTE, buff);

    /* GL rows are bottom-up, Allegro's are top-down, so flip while copying */
    int y;
    for(y = 0; y < SCREEN_H; y++)
        memcpy(bmp->line[y], buff + 4*SCREEN_W*(SCREEN_H-1-y), 4*SCREEN_W);

    save_bitmap(path, bmp, NULL);

    destroy_bitmap(bmp);
    delete[] buff;
}

Quote:

In OL 2.0, you can compile OL with OL_NO_STATE_CHANGE, which will make OpenLayer not change the OpenGL state at all (but performance will be slightly worse).

I think adding something like ReturnAffectedStateToOpenGLDefaults could be cool. :)

imaxcs

Krzysztof Kluczek: I tried your function and needed to rename "byte" to "bool". But the screenshot it takes looks like... I don't know (see attached). ::)
I am using OpenLayer and don't know what color depth it uses, so that might be the problem.
Can you help me?

Krzysztof Kluczek
Quote:

Krzysztof Kluczek: I tried your function and needed to rename "byte" to "bool".

No. You needed to rename "byte" to "(unsigned char)" or place "typedef unsigned char byte;" somewhere. :)

Quote:

But the screenshot it takes looks like... I don't know (see attached). ::)

It looks exactly like you were setting Allegro to 16-bit (or 15-bit) color depth. Change your "set_color_depth(16)" to "set_color_depth(32)" and it will be fine. :)
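For context, a sketch of where that call sits in a plain Allegro/AllegroGL setup (OpenLayer normally does this setup for you, so the exact place in your code may differ; the window size here is arbitrary):

#include <allegro.h>
#include <alleggl.h>

int main()
{
    allegro_init();
    install_allegro_gl();
    set_color_depth(32);    /* must come before set_gfx_mode */
    if(set_gfx_mode(GFX_OPENGL_WINDOWED, 640, 480, 0, 0) != 0)
        return 1;

    /* ... draw, take screenshots, etc. ... */
    return 0;
}
END_OF_MAIN()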

imaxcs

That did the trick! :)

Kitty Cat

FYI, changing the code to use create_bitmap_ex(32, ...); would've worked, too (in case you can't set a 32-bit mode for some reason).
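That is, roughly, a one-line change in the function above:

    BITMAP *bmp = create_bitmap_ex(32, SCREEN_W, SCREEN_H);   /* force a 32-bit memory bitmap regardless of the screen's depth */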

Fladimir da Gorf

OpenLayer should choose the same color depth for Allegro and AllegroGL. I'll have to investigate this...

Also, in OL 2.0 there are screen capture and Save functions in Bitmap ;) (and also BITMAP *Bitmap::GetMemoryBitmap()).

Thread #552203. Printed from Allegro.cc