I wanted to benchmark blitting textures of various sizes to the screen, measuring either:
- Draws per second (requiring FPS timing)
- Seconds per X draws (simply run X blits and time the program's execution.)
I ran into an issue with the first method. Allegro uses an event queue but is "essentially" single-threaded, so I could fire off an event that says "one second has passed," but the problem is: how do you TERMINATE a job that's already running? The job (really just a function) would keep blitting for, say, another half second, and only once its for loop finishes can the event actually be processed.
I kept trying to get that to work and kept failing (perhaps I just have a mental block about the simple solution), so I fell back on the second method: simply draw 30000 blits of a specific texture size and time the run with Linux/bash's time command.
But here's the problem, and I'm considering making a Stack Overflow post about it: I couldn't find ANYONE else describing it.
SOMETIMES, when I run:
$ time ./myprogram
It'll be 5-10 seconds. Other times, with the same program, it'll report close to 0.5 seconds, which is FAR lower than reality. It's almost like it's timing a thread other than the main thread, and that other thread is just idling.
Here's the actual output:
And even then, when I used lower draw-call counts and the runs were faster, time still felt like it slightly under-reported every run. I would sit there, watch the command I had triggered, count what felt like 5 seconds, and it would tell me... ~2.3. And I am NOT that bad at counting seconds. It's possible I just miscounted something. What's NOT possible is my program magically going from 20 seconds... to 0.5 seconds.
I have tried both bash's built-in time ($ time ./myprogram) and the (POSIX?) /usr/bin/time. Both report the same values as far as I can tell.
My only thought is that Allegro is running multiple threads and time is reporting the wrong one.
As I understand it, time measures the process, so the number of threads should not give a false result.
Are you saying the prompt does not return for 20 seconds but it still reports 0.5 seconds?
The only thing I can suggest is passing the -v flag to /usr/bin/time, or maybe investigating the perf command - they might give you some insight into why some runs are different.
Pete
al_get_time. I don't know what the implementation is on Linux, but it should have resolution on the order of milliseconds at least.
I just timed it... on video... with my phone. 11 seconds, yet time reports 1.x seconds. Like it's off by a factor of 10.
Try it yourself!
And the build file--though yours should work fine. For some reason I've always had problems doing it the "proper" way with pkg-config on this laptop.
FYI, I imagine I can time it other ways, e.g. with the internal Allegro timer. I'm asking here because I've only ever seen this happen with my Allegro program(s), and I couldn't find an SO answer even remotely describing the problem. Perhaps I'll ask on Stack Overflow after all...