OpenGL: EXT_framebuffer_object
Fred Alert

Hi, I'm trying to use the EXT_framebuffer_object extension to render a 1600x1600 OpenGL image to hard disk. But so far, my code only saves black bitmaps.
Can somebody tell me what's wrong with my code?

void render_to_hd()
{
    GLuint fb, texture, depth_rb;
    if(!allegro_gl_is_extension_supported("GL_EXT_framebuffer_object"))
        allegro_message("framebuffers not supported");

    glGenFramebuffersEXT(1, &fb);
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, fb);
    glGenRenderbuffersEXT(1, &depth_rb);
    glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depth_rb);
    glGenTextures(1, &texture);
    glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT32, 1600, 1600);
    glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depth_rb);
    glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT, GL_TEXTURE_2D, texture, 0);

    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, 1600, 1600, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glBindTexture(GL_TEXTURE_2D, texture);

    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);

    glDrawBuffer(GL_FALSE);
    glReadBuffer(GL_FALSE);

    GLenum status = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
    if(status == GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT)
        allegro_message("error");

    glViewport(0, 0, 1600, 1600);

    draw(); // draws everything as if it were for the normal buffers

    unsigned char buff[4*1600*1600];
    BITMAP *bmp = create_bitmap(1600, 1600);
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_UNSIGNED_BYTE, buff); // RGBA: 4 bytes per pixel, matching the copy below
    int y;
    for(y = 0; y < 1600; y++) // GL images are bottom-up, bitmaps top-down
        memcpy(bmp->line[y], buff + 4*1600*(1600-1-y), 4*1600);
    save_bitmap("bitmap.bmp", bmp, NULL);
}

edit: Of course that's pseudo code; Allegro and OpenGL are initialized correctly.
edit2: updated the code

GullRaDriel

allegro_init();
allegro_gl_init();
set_color_depth( your_color_depth );
set_gfx_mode( GFX_OPENGL );

I am just trying to help, and I'm not sure this will solve the problem, but both Allegro and AllegroGL must be initialized before any use of their functions.

Kitty Cat

glFramebufferTexture1DEXT
??? I think you mean
glFramebufferTexture2DEXT
As well, your renderbuffer storage should be GL_DEPTH??_EXT, since you're using it for the depth buffer (where ?? is the bit depth you want for the buffer) and not the color buffer. Also, you really shouldn't put such a large array on the stack like that. :)

Fred Alert

@Kitty Cat: :) Thanks for the hints. I changed it, but my saved bitmap remains black :(

My compiler can't find GL_DEPTH32_EXT...

Unfortunately the array has to be that large. But I don't think that's the problem: I tried to render a 100x100 texture this way, and the result is the same.

I added a call to glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT), and it returns the error enum GL_FRAMEBUFFER_INCOMPLETE_ATTACHMENT_EXT.
But I don't understand it...

Kitty Cat

GL_DEPTH I'd imagine is invalid, thus the render buffer is incomplete, which causes the framebuffer to be incomplete. Checking my info, it appears it's GL_DEPTH_COMPONENT??, where ?? is 16, 24, or 32.

Pre-emptively, if you're going to want a stencil buffer, you'll need to create a packed depth-stencil renderbuffer instead, using GL_DEPTH_STENCIL_EXT or GL_DEPTH24_STENCIL8_EXT (I'd imagine both are the same), then bind that to both the depth and stencil framebuffer attachments. Last I tried, a stand-alone stencil buffer wouldn't work.

As for the array, it'd be better to make it a dynamic buffer. Make char *image_buf; global, allocate it once at program start, then free it before closing. :)

Fred Alert

@Kitty Cat, thanks, that killed the error message! But the problem still remains...

Kitty Cat

Does the code work if you draw to the screen (don't forget to flip for that)?

EDIT:
Oh. And you may need to switch to a power-of-2 texture size.

Ciro Duran
KittyCat said:

EDIT:
Oh. And you may need to switch to a power-of-2 texture size.

That depends on whether Fred Alert has a card that supports non-power-of-two texture dimensions. According to that extension's spec, the support should be automatic. Though I second Kitty Cat: try rendering to the screen first.

EDIT: I'm having the same problem as Fred Alert right now. Textures that have been updated through FBOs render black on screen. I'm tired tonight, so I'll continue debugging tomorrow ::).

Attachment contains a screenshot of my program showing all textures in black, with a test quad above them all.

Fred Alert

Rendering to the screen works. What about allegro_gl_flip()? Surely it has to be left out when rendering to a texture!
What do you mean by power-of-two textures? Two-dimensional?
I don't understand the term "power of...".

GullRaDriel

Textures whose dimensions are powers of two.

Milan Mimica

<math>x = {2^a}</math>
x is a power of two because
<math>log_2{x}=a</math>

Try image sizes like 512, 1024, 2048... I think there is an upper limit, though.

Fred Alert

:D I got the point: 2^n.
I tried it, but it made no difference :(

Kitty Cat

Did you remove

glDrawBuffer(GL_FALSE);
glReadBuffer(GL_FALSE);

? That might be causing problems.

Thread #588424. Printed from Allegro.cc