Allegro.cc - Online Community


This thread is locked; no one can reply to it.
high precision time
orz
Member #565
August 2000

How do I get the current time with high precision? Millisecond or sub-millisecond precision is essential, it must work under Windows 95, and portability is preferred.

Alternatively, does anyone have any suggestions for profiling? I haven't tried gcc's profiling stuff yet, but VC fails miserably when I tell it to profile... the program is started in the wrong directory and runs at 1% of its normal framerate, rendering most performance data useless.

Jason Heim
Member #484
June 2000

orz,

this may be a dumb question, but why can't you use the normal timer routine? most of the time in performance analysis, you don't really care what the current time really is, you just want to know how many seconds, milliseconds, or nanoseconds have passed.

or am i way off track for what you want?

i did some perf analysis with my game just using the normal timer, although the more precise you make the timer the further off you throw your game perf :)

best of luck,

jase

orz
Member #565
August 2000

Which normal timer routines?
The Allegro timer routines aren't precise enough, though I'm using them now.
The only other timer routines I have documentation for are either DOS-specific and don't work w/ Allegro or are VERY low precision.

epiwerks
Member #489
June 2000

Windows takes over the timer and sets it to a fixed resolution (I think it's 100 Hz, but I'm not sure), so you might as well forget it.

------
I'm back.

Jason Heim
Member #484
June 2000

just as i thought, a dumb question...

missed the fact that you're running in windows. i run with DJGPP and dos, the allegro timer routines work great for me there (although i haven't gone under millisecond granularity).

sorry!

jase

Bob
Free Market Evangelist
September 2000

In DOS, you can get up to 1.09E-6 seconds of resolution from the timer. But in Windows (and Linux), you are limited to something like 8 or 10 ms per tick. Allegro fakes higher-precision timers by calling your routine multiple times in a row.

If you want high-precision timers, then you'll need to use the 'rdtsc' instruction on Pentiums. This asm instruction returns the number of CPU cycles elapsed since the CPU was reset.

You haven't told us, though, why you need such high-res timers...

--
- Bob
[ -- All my signature links are 404 -- ]

orz
Member #565
August 2000

In general I just like having high-precision time available, but specifically, I'm trying to measure the time spent on different aspects of the screen output, which initial analysis suggests is responsible for a substantial decrease in framerate compared to older versions, even with newer features disabled.

The best I can think of with normal timing functions is trying to measure millisecond events by integrating over larger numbers of them.

Additionally I should say that I don't need callbacks, I just need to be able to measure the time.
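The averaging approach orz describes can be sketched in portable C with clock(): run the operation many times and divide the total by the repetition count. The function name, loop body, and iteration count below are illustrative placeholders, not anything from this thread.

```c
#include <time.h>

/* Estimate the cost of one operation by timing many repetitions
   with clock() and dividing. Useful when a single run is far below
   the timer's resolution. 'sink' is volatile so the compiler can't
   optimize the loop away. */
double seconds_per_op(void)
{
    const long reps = 1000000L;          /* placeholder repetition count */
    volatile long sink = 0;
    long i;
    clock_t start = clock();
    for (i = 0; i < reps; i++)
        sink += i;                       /* placeholder for the code under test */
    return (double)(clock() - start) / CLOCKS_PER_SEC / reps;
}
```

Since clock() itself is coarse (often 10 ms or worse), pick the repetition count large enough that the whole loop spans many ticks.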

Bob
Free Market Evangelist
September 2000

Have you tried to profile your program?
DJGPP 2.03 gives a very good approximation of how much time is spent where.

--
- Bob
[ -- All my signature links are 404 -- ]

Bob
Free Market Evangelist
September 2000

Sorry, I should've re-read your post.
When compiling in VC for profiling, did you include the profiling version of Allegro? What happens if you don't?

--
- Bob
[ -- All my signature links are 404 -- ]

Ahhhh Penis
Member #564
August 2000

Is it really that important for this thing to be under Windows? By the way, what is this 'thing'?
>>Palindrome :)

George Foot
Member #669
September 2000

On Pentiums and better systems you can use the RDTSC instruction to read the number of CPU cycles elapsed since the system started. If your CPU is 600MHz, this increases 600 million times per second. Is that precise enough? :) It should work in all OSes.
I'm going to write a portable routine to do this sort of thing; it won't generally have this degree of accuracy, though. It'll interface to a number of lower level timestamping routines.
1) RDTSC on Pentiums
2) vtd.vxd in Windows
3) gettimeofday() in Unix, if available
4) clock()
(1) gives high precision, as I already said. (2) and (3) give microsecond precision, though I'm not sure of the accuracy of (2); (3) is accurate to milliseconds. (4) is generally worse precision, but it's just a last resort.
I'd like to integrate this with Allegro's timer system, and have been thinking of it for a while, but the problem is that in DOS Allegro screws with the PIC so much that `uclock' doesn't work, so `clock' is the best timer we can use (except on Pentiums). But I think the non-Pentium DOS market is becoming a very small audience, and people probably don't care that much about it.
I'm not sure whether to make a timer routine with a specific rate of increase (CPU MHz for rdtsc, ms for gettimeofday, us for vtd.vxd, CLK_TCK for clock), or to translate all these into approximate millisecond values. The translation is more convenient for the user, but discards some of the accuracy of rdtsc for instance. Then again, rdtsc overflows 32 bits in a second or two (or less), so it's kind of inconvenient.
George

George Foot
Member #669
September 2000

Incidentally, here's the basic (gcc) code to use rdtsc. Only try this on Pentiums.

    unsigned long long rdtsc(void)
    {
        unsigned result[2];
        /* rdtsc leaves the low 32 bits in EAX and the high 32 bits in EDX */
        asm("rdtsc" : "=a" (result[0]), "=d" (result[1]));
        return result[0] + ((unsigned long long)result[1] << 32);
    }

I'm not totally sure that the return calculation will work, you may need to fiddle with it.
George

orz
Member #565
August 2000

This thing is a clone of Star Control 2 SuperMelee. (12 megs download, ftp://orz.res.wpi.net/main/cur.zip)
I tried VC's profiling with allp.lib and alleg.lib. Both were EXTREMELY slow, and both ignored the "working directory" setting. The results from allp.lib were at least internally consistent. The results from alleg.lib didn't make any sense to me, but I was pretty tired by that time.
Two versions (tw05u2.zip and tw05u3.zip) showed a substantial difference in the same rendering mode (non-discrete, non-AA), from 600 fps down to 200 fps. It looks like the rendering code is taking most of the time in both. A number of things changed between versions: Allegro version (3.12->3.9.32), method of measuring framerate, lots more, but none should be able to account for this.

orz
Member #565
August 2000

gfoot:
1. I'll try your RDTSC (I have to go to class now).
2. What I'd like in Allegro is a function that returns the current time in microseconds, and another that tells how accurate the first is. That's all I ever really use the existing timer stuff for anyway.

Jeremias Raziel
Member #581
August 2000

I don't know if this is very accurate, but: in the Allegro docs, wasn't there a function called install_int_ex which lets you pick a higher accuracy than milliseconds? You just have to give it ticks, of which there are 118... or so per ms. If you knew exactly how many there are, you'd only have to divide by ten to get the value for ms/10, or, if you prefer, divide by 1000 and you'd get... nanoseconds? Microseconds? I don't know. Hopefully it helps you.
Ls
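For what it's worth, the tick rate Jeremias is reaching for is the PC timer chip's 1193181 ticks per second, roughly 1193 per millisecond (about 0.838 us per tick); Allegro's timer conversion macros are built on the same constant. A sketch of the arithmetic, with my own helper names:

```c
/* The PC's 8253/8254 timer chip runs at 1193181 Hz, so one tick is
   about 0.838 microseconds. These helpers show the kind of conversion
   install_int_ex() needs (helper names are mine, not Allegro's). */
#define PIT_TICKS_PER_SECOND 1193181L

/* ticks for an interval given in milliseconds (integer, truncating) */
long msec_to_ticks(long msec) { return msec * (PIT_TICKS_PER_SECOND / 1000L); }

/* ticks for a rate given in beats (interrupts) per second */
long bps_to_ticks(long bps)   { return PIT_TICKS_PER_SECOND / bps; }
```

Note that the division truncates, so the resulting rate is only approximate; that is the inherent granularity of the hardware timer.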
