How can I convert microseconds to milliseconds and still be able to compare timestamps by subtraction? Example:
int t1 = (int)timeGetTime();
int t2 = (int)timeGetTime()+40;
// now t1 - t2 < 0 which means that t1 < t2.
This logic breaks if you divide the time values by 1000 to convert microseconds to milliseconds.
Edit
I guess the only solution is to store all timestamps in microseconds. Conversion to milliseconds can only be done after subtraction, on the delta values.
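A minimal sketch of that approach (the function name and parameters are made up for illustration): subtract the raw microsecond timestamps as signed integers so wrap-around still compares correctly, and only scale the resulting delta.

#include <cstdint>

// Subtract raw microsecond timestamps first, convert the delta afterwards.
int32_t deltaMilliseconds(uint32_t t1_us, uint32_t t2_us)
{
    // Signed interpretation of the unsigned difference: correct across a
    // counter wrap as long as the real gap is under 2^31 microseconds
    // (roughly 35 minutes).
    int32_t delta_us = (int32_t)(t2_us - t1_us);

    // Divide only the delta, never the raw timestamps, so the wrap-safe
    // comparison above is not destroyed by the scaling.
    return delta_us / 1000;
}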
Why not multiply the millisecond values by 1000 so that you are comparing in microseconds? Alternatively, use floating-point numbers instead.
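For example (a rough sketch, assuming t1_ms comes from a millisecond clock such as timeGetTime() and t2_us is already in microseconds; the names are hypothetical):

#include <cstdint>

// Scale the millisecond value up so both operands are in microseconds,
// widening first so the multiplication cannot overflow 32 bits.
int64_t deltaMicroseconds(uint32_t t1_ms, int64_t t2_us)
{
    int64_t t1_us = (int64_t)t1_ms * 1000;
    return t2_us - t1_us;
}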
The resolution of clock timers in Windows is restricted to about 10 milliseconds, so you will never be able to get time values to the precision of microseconds.
If your time values are coming from somewhere else that is capable of that resolution, then take the values as microseconds. Multiplying by 1000 or dividing an int by 1000 will not give you any better resolution; it will just change the scale of your comparison.
You'll need to use a high-resolution timer to get microsecond-granular time.
You can use the Windows API to check how granular you can make it.
QueryPerformanceFrequency(&ticksPerSecond);
QueryPerformanceCounter(&tick);
These two functions will help you along that way. Take a look at the MSDN articles for more. :)
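A minimal usage sketch (error handling omitted; the frequency can be treated as constant for the life of the process):

#include <windows.h>

// Read the performance counter and convert it to microseconds.
__int64 nowMicroseconds()
{
    LARGE_INTEGER ticksPerSecond, tick;
    QueryPerformanceFrequency(&ticksPerSecond); // counter frequency in Hz
    QueryPerformanceCounter(&tick);             // current counter value

    // Split into whole seconds and remainder so the intermediate
    // multiplication cannot overflow after long uptimes.
    __int64 seconds   = tick.QuadPart / ticksPerSecond.QuadPart;
    __int64 remainder = tick.QuadPart % ticksPerSecond.QuadPart;
    return seconds * 1000000 + remainder * 1000000 / ticksPerSecond.QuadPart;
}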