| high precision time |
|
orz
Member #565
August 2000
|
How do I get the current time with high precision? Millisecond or sub-millisecond precision is essential, it must work under Windows 95, and portability is preferred. Alternatively, does anyone have any suggestions for profiling? I haven't tried gcc's profiling stuff yet, but VC fails miserably when I tell it to profile... the program is started in the wrong directory and runs at 1% of its normal framerate, rendering most performance data useless. |
|
Jason Heim
Member #484
June 2000
|
orz, this may be a dumb question, but why can't you use the normal timer routine? Most of the time in performance analysis you don't really care what the current time is, you just want to know how many seconds, milliseconds, or nanoseconds have passed. Or am I way off track for what you want? I did some perf analysis with my game just using the normal timer, although the more precise you make the timer, the further off you throw your game's perf. Best of luck, jase |
|
orz
Member #565
August 2000
|
Which normal timer routines? |
|
epiwerks
Member #489
June 2000
|
Windows takes over the timer and sets it to a fixed resolution (I think it's 100 Hz, but I'm not sure), so you might as well forget it. |
|
Jason Heim
Member #484
June 2000
|
Just as I thought, a dumb question... I missed the fact that you're running in Windows. I run with DJGPP and DOS, and the Allegro timer routines work great for me there (although I haven't gone below millisecond granularity). Sorry! jase |
|
Bob
Free Market Evangelist
September 2000
|
In DOS, you can get timer resolution down to about 8.4E-7 seconds (one tick of the 1.193MHz PIT clock). But in Windows (and Linux), you are limited to something like 8 or 10ms per tick. Allegro fakes higher-precision timers by calling your routine multiple times in a row. If you want high-precision timers, you'll need to use the RDTSC instruction on Pentiums. This asm instruction returns the number of CPU cycles elapsed since the CPU was reset. You haven't told us, though, why you need such high-res timers... |
|
orz
Member #565
August 2000
|
In general I just like having high-precision time available, but specifically, I'm trying to measure the time spent on different aspects of the screen output, which initial analysis suggests is responsible for a substantial decrease in framerate compared to older versions, even with newer features disabled. The best I can think of with normal timing functions is to measure millisecond events by integrating over larger numbers of them. Also, I should say that I don't need callbacks, I just need to be able to measure the time. |
|
Bob
Free Market Evangelist
September 2000
|
Have you tried to profile your program? |
|
Bob
Free Market Evangelist
September 2000
|
Sorry, I should've re-read your post. |
|
Ahhhh Penis
Member #564
August 2000
|
Is it really that important for this thing to run under Windows? By the way, what is this 'thing'? |
|
George Foot
Member #669
September 2000
|
On Pentiums and better systems you can use the RDTSC instruction to read the number of CPU cycles elapsed since the system started. If your CPU is 600MHz, this increases 600 million times per second. Is that precise enough? |
|
George Foot
Member #669
September 2000
|
Incidentally, here's the basic (gcc) code to use rdtsc. Only try this on Pentiums. |
|
orz
Member #565
August 2000
|
This thing is a clone of Star Control 2 SuperMelee. (12 megs download, ftp://orz.res.wpi.net/main/cur.zip) |
|
orz
Member #565
August 2000
|
gfoot: |
|
Jeremias Raziel
Member #581
August 2000
|
I don't know if this is very accurate, but: in the Allegro docs, wasn't there a function called install_int_ex which lets you pick a higher accuracy than milliseconds? You just have to give it a period in ticks, of which there are 1193 or so per ms. If you know how many there are exactly, you'd only have to divide by ten to get the value for ms/10, or if you like it better, divide by 1000 and you'd get... microseconds? Don't know. Hopefully it helps you. |