al_set_shader_float etc. not working in C++ interface to Allegro 5
bitrex

I have a generic GLSLShader class with protected member functions that wrap the Allegro API calls for setting the various shader uniforms:

protected:

...

void set_shader_float(const char* name, float f) const
{
    if (!al_set_shader_float(name, f))
    {
        std::cerr << al_get_shader_log(_shader.get()) << std::endl;
        throw std::runtime_error("Unable to set shader float uniform.\n");
    }
}

void set_shader_float_vector(const char* name, int num_components,
                             const float* f, int num_elems) const
{
    if (!al_set_shader_float_vector(name, num_components, f, num_elems))
    {
        std::cerr << al_get_shader_log(_shader.get()) << std::endl;
        throw std::runtime_error("Unable to set shader float vector uniform.\n");
    }
}

etc...
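For context, the underlying Allegro call sequence these wrappers sit on top of looks roughly like this (a stripped-down sketch rather than my actual setup code, using the _test_pixel_shader_src string shown further down):

#include <allegro5/allegro.h>
#include <cstdio>

extern const char* _test_pixel_shader_src;  // the fragment shader shown further down

// Stripped-down sketch of the underlying Allegro 5 calls: al_set_shader_float
// and friends act on whichever shader is currently selected with al_use_shader
// on the target bitmap, which is why the wrappers only take the uniform name.
int main()
{
    al_init();
    al_set_new_display_flags(ALLEGRO_OPENGL);
    ALLEGRO_DISPLAY* display = al_create_display(640, 480);

    ALLEGRO_SHADER* shader = al_create_shader(ALLEGRO_SHADER_GLSL);
    al_attach_shader_source(shader, ALLEGRO_VERTEX_SHADER,
        al_get_default_shader_source(ALLEGRO_SHADER_GLSL, ALLEGRO_VERTEX_SHADER));
    al_attach_shader_source(shader, ALLEGRO_PIXEL_SHADER, _test_pixel_shader_src);

    if (!al_build_shader(shader))
        std::fprintf(stderr, "%s\n", al_get_shader_log(shader));

    al_use_shader(shader);                   // the shader must be in use first
    if (!al_set_shader_float("test", 1.0f))  // only then can uniforms be set
        std::fprintf(stderr, "setting 'test' failed\n");

    // ... al_draw_bitmap calls ...

    al_use_shader(nullptr);                  // back to the default shader
    al_destroy_shader(shader);
    al_destroy_display(display);
    return 0;
}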

I then call these functions from the constructor and/or render method of a subclass that bundles a specific shader's source code with its compiled shader object:

class TestShader : public Shader::GLSLShader
{
public:
    TestShader(const char* pixel_shader_src) :
        Shader::GLSLShader(pixel_shader_src),
        _shader_timer(std::unique_ptr<Timer::allegro_timer_t>(new Timer::allegro_timer_t{1.0/60}))
    {
        al_start_timer(*_shader_timer);
    }

private:
    std::unique_ptr<Timer::allegro_timer_t> _shader_timer;

    GraphicsTypes::allegro_bitmap_variant_t
    _render(const GraphicsTypes::allegro_bitmap_variant_t& sampler_bitmap,
            const GraphicsTypes::allegro_bitmap_variant_t& buffer_bitmap) const override
    {
        float tints[12] =
        {
            4.0, 0.0, 1.0,
            0.0, 4.0, 1.0,
            1.0, 0.0, 2.0,
            4.0, 4.0, 1.0
        };

        this->set_shader_float("test", 1.0);

        this->set_shader_float_vector("tint", 3, &tints[0], 1);
        al_draw_bitmap(sampler_bitmap, 0, 0, 0);

        this->set_shader_float_vector("tint", 3, &tints[3], 1);
        al_draw_bitmap(sampler_bitmap, 60, 0, 0);

        this->set_shader_float_vector("tint", 3, &tints[6], 1);
        al_draw_bitmap(sampler_bitmap, 0, 60, 0);

        return buffer_bitmap;
    }
};

Shader source:

const char* _test_pixel_shader_src =
{
    "#ifdef GL_ES\n precision mediump float;\n #endif\n \
    uniform float test; \
    uniform sampler2D al_tex; \
    uniform vec3 tint; \
    varying vec4 varying_color; \
    varying vec2 varying_texcoord; \
    void main() \
    { \
        vec4 tmp = varying_color * texture2D(al_tex, varying_texcoord); \
        tmp.r *= tint.r; \
        tmp.g *= tint.g; \
        tmp.b *= tint.b; \
        gl_FragColor = tmp; \
    } "
};

These function wrappers work fine and my bitmap renders correctly when I use the ones where the uniforms are passed to Allegro's shader API calls by pointer (such as set_shader_float_vector). However, all of the functions like set_shader_float, where the uniform is passed by value, fail with the runtime error. I'm having trouble figuring out what specifically is going wrong because the output from the shader's log appears to be empty. Any suggestions as to what could be going wrong here?
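Edit: one extra check I can think of trying (a sketch only, assuming an Allegro new enough to have al_get_current_shader) is to verify inside the wrapper that my shader is really the one selected on the target bitmap at the moment the call fails, since al_set_shader_float only acts on the currently used shader:

#include <allegro5/allegro.h>
#include <iostream>
#include <stdexcept>

// Sketch of an extra check (assumes al_get_current_shader is available):
// al_set_shader_float applies to whichever shader is currently selected with
// al_use_shader on the target bitmap, so report it if that shader isn't the
// one we expect. The function name here is made up for illustration.
void set_shader_float_checked(ALLEGRO_SHADER* expected, const char* name, float f)
{
    if (al_get_current_shader() != expected)
        std::cerr << "expected shader is not in use on the target bitmap\n";

    if (!al_set_shader_float(name, f))
    {
        std::cerr << al_get_shader_log(expected) << std::endl;
        throw std::runtime_error("Unable to set shader float uniform.\n");
    }
}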

SiegeLord

In general, the API seems to work in one of Allegro's examples. Can you verify that you're actually using the correct shader source (e.g. if you change it to output a constant color, is that correctly reflected in the output)?
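For example, something like this (untested) should turn every drawn pixel solid red if that source string is really the one being compiled and selected:

// Untested: a constant-color fragment shader for sanity checking that this is
// really the source being compiled and used. If the bitmaps don't come out
// solid red, the problem is upstream of the uniforms.
const char* debug_pixel_shader_src =
    "#ifdef GL_ES\n"
    "precision mediump float;\n"
    "#endif\n"
    "void main()\n"
    "{\n"
    "    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);\n"
    "}\n";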

bitrex

Hi, sorry for the delay in my reply. I haven't had much time to work on my Allegro project this week, but I think I've traced the bug down to a thread-safety issue: the "render" method shown here is held by an object in one thread but is being called through a reference from another thread, and I think that's why setting the uniforms gives strange behavior; OpenGL calls are clearly not thread-safe. I'll try locking/mutexing this section for now and see if that works. Ideally, calls that alter the pipeline would not be made across threads at all, but that's a TODO... :P
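Something along these lines, probably (an untested sketch; g_render_mutex is just an illustrative name for a mutex both threads can see):

#include <mutex>

// Untested sketch of the interim locking idea: a mutex visible to both threads
// so that the uniform-setting and draw calls never run concurrently. The proper
// fix is to keep all drawing and shader state changes on a single thread.
static std::mutex g_render_mutex;

void render_locked()
{
    std::lock_guard<std::mutex> lock(g_render_mutex);

    // ... the set_shader_float / set_shader_float_vector / al_draw_bitmap
    //     calls from _render() go here, all under the same lock ...
}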
