In my game, I'm seeing nasty single-frame spikes in frame time under some very specific circumstances.
There is a lot of drawing going on each frame, but only one particular operation leads to a slowdown when it happens. I have a texture whose content I occasionally change in response to game events. When that happens, I draw a pre-made image over that texture. The texture itself is rendered to the screen every frame.
Simplified, the code for updating the texture looks like this:
int a, b, c;
al_get_blender( &a, &b, &c );                  // save the current blender
al_set_blender( ALLEGRO_DEST_MINUS_SRC, ALLEGRO_ONE, ALLEGRO_ONE );
ALLEGRO_BITMAP *old = al_get_target_bitmap();  // save the current render target
al_set_target_bitmap( alTrgImg->GetBitmap() );
al_draw_bitmap( alImg->GetBitmap(), x, y, 0 );
al_set_target_bitmap( old );                   // restore target and blender
al_set_blender( a, b, c );
From my own al_get_time()-based profiling, I gather that the slowdown happens inside al_flip_display() whenever I run this code. The amount of slowdown is quite noticeable: where I usually get 16ms per frame (with VSync enabled), here I get up to 65ms or even 100ms!
This only happens if I:
1) Use the DirectX Allegro renderer
2) Enable VSync
3) Run the game in fullscreen
If I change any of these, the game runs just fine without any slowdowns. If I run the game without VSync, I get up to ~1000 FPS with no slowdowns at all, so it's not as if this texture operation is inherently expensive!
So far I have failed to trace the problem to any specific code inside al_flip_display() with CPU profilers (the Visual Studio built-in profiler and AMD CodeXL), and I lack the knowledge to use GPU profilers effectively (I looked at the results from NVIDIA's tool but couldn't interpret them in any useful way).
I was also unable to reproduce the problem in a smaller example, so I guess I must be doing something else wrong somewhere.
I've already spent many evenings debugging this and I'm stumped, so I'm asking for any advice, a suggestion for a tool, or any other help.