
This thread is locked; no one can reply to it.
OpenGL Depth Buffer Problems
thebignic
Member #14,419
July 2012

I'm trying to write to an OpenGL depth buffer using a slight modification of Allegro's default fragment shader. The idea is that I can just set a depth uniform per sprite (not sure how efficient this will be, but I'm trying to get it working first).

Shader:

#ifdef GL_ES
precision mediump float;
#endif

uniform sampler2D al_tex;
uniform bool al_use_tex;
varying vec4 varying_color;
varying vec2 varying_texcoord;

uniform float depth;

void main()
{
    if (al_use_tex) {
        gl_FragColor = varying_color * texture2D(al_tex, varying_texcoord);
    } else {
        gl_FragColor = varying_color;
    }

    gl_FragDepth = depth;
}
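
For context, the shader is built and attached through Allegro's shader API, roughly like this (a sketch; depth_frag_src and depth_shader are placeholder names):

  /* Needs <stdio.h> and <allegro5/allegro.h>.  depth_frag_src holds the
     GLSL source above; the stock vertex shader is kept unchanged. */
  ALLEGRO_SHADER *depth_shader = al_create_shader(ALLEGRO_SHADER_AUTO);

  al_attach_shader_source(depth_shader, ALLEGRO_VERTEX_SHADER,
      al_get_default_shader_source(ALLEGRO_SHADER_AUTO, ALLEGRO_VERTEX_SHADER));
  al_attach_shader_source(depth_shader, ALLEGRO_PIXEL_SHADER, depth_frag_src);

  if (!al_build_shader(depth_shader)) {
      fprintf(stderr, "shader build failed: %s\n", al_get_shader_log(depth_shader));
  }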

Setting up the depth buffer (NOT using Allegro's display options to request a depth buffer, but I've done this manually before and it worked fine):

  glGenTextures(1, &depthTexture_GL);
  glBindTexture(GL_TEXTURE_2D, depthTexture_GL);

  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
  glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);

  glTexImage2D(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, W, H, 0, GL_DEPTH_COMPONENT, GL_UNSIGNED_BYTE, 0);

  glBindTexture(GL_TEXTURE_2D, 0);

And binding the depth buffer like this:

  glBindFramebuffer(GL_FRAMEBUFFER, al_get_opengl_fbo(al_get_target_bitmap()));
  glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_TEXTURE_2D, depthTexture_GL, 0);
  GLenum err = glCheckFramebufferStatus(GL_FRAMEBUFFER);
  ...
  glDepthMask(true);

But the fragment shader does not appear to be writing to the buffer when I set gl_FragDepth.

If I pre-load the depth buffer with pixel data, I can display it fine (by binding the buffer as a texture in a later draw call), so I know the buffer is being set up correctly. It just doesn't appear to be bound correctly, even though I don't get any errors when I bind it; gl_FragDepth just does not appear to write to the buffer.

Is Allegro doing something behind the scenes that would unbind the depth buffer when I do a simple al_draw_bitmap() call? (Binding a different FBO?)
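
One way I could check whether Allegro is rebinding anything around the draw call is to query the FBO and the depth attachment right after it (a rough sketch using plain GL queries; sprite, x and y are placeholder names):

  /* After the Allegro draw call: is the expected FBO still bound, and is the
     depth texture still attached to it?  Needs <stdio.h>. */
  GLint fbo = 0, depth_name = 0;

  al_draw_bitmap(sprite, x, y, 0);

  glGetIntegerv(GL_FRAMEBUFFER_BINDING, &fbo);
  glGetFramebufferAttachmentParameteriv(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
      GL_FRAMEBUFFER_ATTACHMENT_OBJECT_NAME, &depth_name);

  printf("bound FBO: %d, depth attachment: %d (expected %u)\n",
         fbo, depth_name, depthTexture_GL);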

HALP!?

Elias
Member #358
May 2000

You could check with an OpenGL debugger (I've found "apitrace" for Linux, which is rather hard to use...). As far as I know, Allegro should not do anything like that. Did you call glEnable(GL_DEPTH_TEST)?

--
"Either help out or stop whining" - Evert

thebignic
Member #14,419
July 2012

I've tried glEnable(GL_DEPTH_TEST), but AFAIK that only discards fragments that fail the depth test, which at this point I don't even care about.

I know it's not writing to it, so enabling the actual depth test won't help.

I'll check out apitrace. I didn't know anything like that even existed...

Edit: apitrace basically just shows a log of the API calls, which isn't all that useful. It doesn't look like Allegro is unbinding anything... and I already know the framebuffer is complete and there are no errors after attaching the depth buffer.

Elias
Member #358
May 2000

thebignic said:

I've tried glEnable(GL_DEPTH_TEST), but AFAIK that only discards fragments that fail the depth test, which at this point I don't even care about.

Quote:

Even if the depth buffer exists and the depth mask is non-zero, the depth buffer is not updated if the depth test is disabled. In order to unconditionally write to the depth buffer, the depth test should be enabled and set to GL_ALWAYS (see glDepthFunc).

From: https://www.opengl.org/sdk/docs/man/html/glDepthMask.xhtml

--
"Either help out or stop whining" - Evert

thebignic
Member #14,419
July 2012

I had the following:

    glDepthFunc(GL_ALWAYS);
    glEnable(GL_DEPTH_TEST);
    glDepthMask(true);

But that just seemed to fill the buffer with 0.5, even though I had never cleared it and had pre-loaded it with randomized pixel values.

Good to know that the depth test has to be enabled for the depth buffer to be written, but it doesn't seem to help, and in fact it made things harder to debug because it was clearing the buffer.

Even if my fragment shader explicitly sets gl_FragDepth = 0.0 or gl_FragDepth = 1.0, the buffer is still a neutral grey when GL_DEPTH_TEST is enabled.
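
To rule out the visualization pass, I can also read the depth texture back directly instead of drawing it (a sketch, desktop GL only; W and H are the texture dimensions from the setup code above):

  /* Read the depth texture back into client memory to inspect what was
     actually written.  Needs <stdio.h> and <stdlib.h>. */
  float *depth_data = (float *)malloc(W * H * sizeof(float));

  glBindTexture(GL_TEXTURE_2D, depthTexture_GL);
  glGetTexImage(GL_TEXTURE_2D, 0, GL_DEPTH_COMPONENT, GL_FLOAT, depth_data);
  glBindTexture(GL_TEXTURE_2D, 0);

  printf("depth[0] = %f\n", depth_data[0]);
  free(depth_data);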

Elias
Member #358
May 2000

How are you setting your value for depth?

--
"Either help out or stop whining" - Evert

thebignic
Member #14,419
July 2012

Guess who was changing the target bitmap AFTER setting the shader?

<--- this guy.
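
For anyone who finds this later: al_use_shader() applies to the current target bitmap, so the shader has to be selected after the target is set, not before. Roughly (depth_shader, backbuffer, sprite, x and y are my own placeholder names, and 0.25f is just an example depth):

  al_set_target_bitmap(backbuffer);          /* target FIRST               */
  al_use_shader(depth_shader);               /* then the shader            */
  al_set_shader_float("depth", 0.25f);       /* then the depth uniform     */
  al_draw_bitmap(sprite, x, y, 0);           /* then draw the sprite       */

  al_use_shader(NULL);                       /* back to the default shader */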
