I have my framework set up to use transformations and al_set_clipping_rectangle() to create views which can be drawn into. My framework sets up a default view which translates and clips things such that the "virtual display" is centered and black bars appear around the edges to maintain the aspect ratio.
This has been working nicely for me for a long time, but I was doing some testing on Ubuntu 13 and noticed that when using full screen mode, there tended to be some graphical artifacts left in the buffer.
For instance, when the program starts, some of the graphics from the desktop remain in the buffer and show up in my program's window. It seems that Ubuntu is doing some transitional animation which touches the buffers even while my program is using them, so some of the desktop graphics end up rendered into my buffer and displayed in my program.
As a workaround, I had the entire screen cleared each frame before I set up my view transformation and clipping during the intro screen. This worked well, but then I tried Alt+Tabbing away and back to my program. The artifacts were back.
I changed the framework so that it would automatically clear the entire buffer for a couple of frames when the program regains focus. This wasn't enough to eliminate the issue since I can't figure out exactly when Ubuntu is going to decide to put stuff in the buffer.
Thinking about it now, I am wondering if it is ever safe to assume the buffers aren't going to be modified from outside my program. I had always assumed that only things I do within my program can affect the buffers that my program is using. I have never had issues on other platforms.
Does anybody have any thoughts on how I should deal with this? I am wondering if I might be able to use ALLEGRO_EVENT_DISPLAY_EXPOSE to determine when I should be clearing the entire buffer. I hate to waste resources redrawing black pixels over and over unnecessarily.
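In case it helps frame the question, here is roughly what I have in mind: listen for expose and switch-in events and schedule a few full-screen clears (a few rather than one, since the display may be multi-buffered), instead of clearing every frame. This is only a sketch under my own assumptions; the CLEAR_FRAMES count, the window size, and the variable names are mine, and note that expose events are only generated if ALLEGRO_GENERATE_EXPOSE_EVENTS is set before the display is created.

```c
/* Sketch: clear the entire backbuffer for a few frames after an
 * expose or switch-in event, rather than every frame.
 * Assumptions: CLEAR_FRAMES, the 1280x720 size, and variable names
 * are illustrative, not from my actual framework. */
#include <allegro5/allegro.h>

#define CLEAR_FRAMES 3  /* cover multiple buffers in the swap chain */

int main(void)
{
    al_init();
    al_install_keyboard();

    /* Must be set before al_create_display() or no expose events arrive. */
    al_set_new_display_flags(ALLEGRO_FULLSCREEN | ALLEGRO_GENERATE_EXPOSE_EVENTS);
    ALLEGRO_DISPLAY *display = al_create_display(1280, 720);

    ALLEGRO_EVENT_QUEUE *queue = al_create_event_queue();
    al_register_event_source(queue, al_get_display_event_source(display));
    al_register_event_source(queue, al_get_keyboard_event_source());

    ALLEGRO_TIMER *timer = al_create_timer(1.0 / 60.0);
    al_register_event_source(queue, al_get_timer_event_source(timer));
    al_start_timer(timer);

    int clear_frames = CLEAR_FRAMES;  /* also clear on startup */
    bool running = true;
    while (running) {
        ALLEGRO_EVENT ev;
        al_wait_for_event(queue, &ev);
        switch (ev.type) {
        case ALLEGRO_EVENT_DISPLAY_EXPOSE:     /* part of the window was invalidated */
        case ALLEGRO_EVENT_DISPLAY_SWITCH_IN:  /* regained focus, e.g. Alt+Tab back */
            clear_frames = CLEAR_FRAMES;
            break;
        case ALLEGRO_EVENT_KEY_DOWN:
            running = false;
            break;
        case ALLEGRO_EVENT_TIMER:
            if (clear_frames > 0) {
                /* Reset transform and clipping so the clear reaches the
                 * black bars outside the virtual display, too. */
                ALLEGRO_TRANSFORM identity;
                al_identity_transform(&identity);
                al_use_transform(&identity);
                al_set_clipping_rectangle(0, 0,
                    al_get_display_width(display),
                    al_get_display_height(display));
                al_clear_to_color(al_map_rgb(0, 0, 0));
                clear_frames--;
            }
            /* ... restore the view transform/clipping and draw the frame ... */
            al_flip_display();
            break;
        }
    }

    al_destroy_display(display);
    return 0;
}
```

That would limit the redundant black-pixel clears to the handful of frames right after an event, but I don't know whether Ubuntu's compositor reliably sends an expose event every time it scribbles on the buffer, which is really the crux of my question.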