I am using a timer with an interval of 1 second, but when I print the time in the timer's Elapsed event, each tick arrives 1012-1014 ms after the previous one. I don't understand why it's taking roughly 10 ms more than the interval.
Can someone please look into this?
Here is the code I am using:
static int _counter;
System.Timers.Timer _timer = new System.Timers.Timer(1000);

public Form1()
{
    InitializeComponent();
    _timer.Elapsed += new ElapsedEventHandler(_timer_Elapsed);
    _timer.Start();
}

private void _timer_Elapsed(object sender, ElapsedEventArgs e)
{
    Console.WriteLine(DateTime.Now.ToString("{hh:mm:ss.fff}"));
    _counter++;
    if (_counter == 20)
        _timer.Stop();
}
And this is the output:
{01:59:08.381}
{01:59:09.393}
{01:59:10.407}
{01:59:11.421}
{01:59:12.435}
{01:59:13.449}
{01:59:14.463}
{01:59:15.477}
{01:59:16.491}
{01:59:17.505}
{01:59:18.519}
{01:59:19.533}
{01:59:20.547}
{01:59:21.561}
{01:59:22.575}
{01:59:23.589}
{01:59:24.603}
{01:59:25.615}
{01:59:26.629}
{01:59:27.643}
You need to understand that Windows is not a real-time operating system. Real-time operating systems have timer mechanisms that allow the system to make hard guarantees about when timer-initiated events occur and the overhead associated with them, and allow you to specify what behavior should occur when the deadline is missed -- for example if the previous execution took longer than the interval.
I would characterize the Windows timers as "best effort" when it comes to smaller intervals. When the interval is sufficiently long you don't notice that you aren't getting the exact interval that you requested. As you get closer and closer to the resolution of the timer (the frequency at which the timer runs), you start seeing the overhead as a percentage of the interval increase. Real-time systems take special care to minimize the software overhead, relying on more sophisticated and faster hardware solutions. The exact frequency of the Windows timer depends on the timing services that the underlying hardware provides and so may differ from system to system.
If you have real-time needs -- and doing something every 50ms may fall into that category -- then you may need to look at specialized hardware and/or a real-time OS.
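Short of a real-time OS, one common workaround for the cumulative drift visible in the output above (each tick lands ~13 ms later than the last, so tick 20 is roughly a quarter second late) is to schedule against absolute deadlines computed from the start time, rather than restarting a relative interval each tick. A minimal sketch (the class and variable names are mine, not from the question):

using System;
using System.Diagnostics;
using System.Threading;

class AbsoluteDeadlineTimer
{
    static void Main()
    {
        const long intervalMs = 1000;
        var sw = Stopwatch.StartNew();

        for (int tick = 1; tick <= 20; tick++)
        {
            // Each deadline is computed from the original start time,
            // so per-tick lateness does not accumulate.
            long deadline = tick * intervalMs;
            long wait = deadline - sw.ElapsedMilliseconds;
            if (wait > 0)
                Thread.Sleep((int)wait); // still "at least this long", but the error resets each tick

            Console.WriteLine("{0} ms (target {1} ms)", sw.ElapsedMilliseconds, deadline);
        }
    }
}

Each individual tick can still be a few milliseconds late, but the twentieth tick lands near 20,000 ms instead of drifting out to roughly 20,260 ms as in the question's output.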
It's because of the limited resolution of the system clock. The event occurs at the next system tick after the specified time, so you will always get a few extra milliseconds.
If you need a more precise timer, you can hook into the Win32 multimedia timer; it is the most accurate timer available (down to 1 ms). Here's an article on CodeProject showing how to hook into it from C#.
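Even without the full multimedia-timer wrapper from that article, simply raising the system-wide timer resolution with timeBeginPeriod from winmm.dll will tighten up ordinary timers. A sketch (timeBeginPeriod/timeEndPeriod are real Win32 APIs; note the effect is global and increases power usage, so always pair the calls):

using System;
using System.Runtime.InteropServices;

class TimerResolution
{
    // Win32 multimedia-timer APIs exported by winmm.dll.
    [DllImport("winmm.dll")]
    static extern uint timeBeginPeriod(uint uPeriod);

    [DllImport("winmm.dll")]
    static extern uint timeEndPeriod(uint uPeriod);

    static void Main()
    {
        timeBeginPeriod(1); // request 1 ms system timer resolution
        try
        {
            // ... run your System.Timers.Timer here; ticks should now
            // land within a millisecond or two of the requested interval ...
        }
        finally
        {
            timeEndPeriod(1); // restore the previous resolution
        }
    }
}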
First, as other people have noted, you're setting it to 1s, not 50ms.
Secondly, Windows is not a real-time OS. None of the timer classes is exactly precise. All you're doing is saying that you want to wait at least this long. It takes some amount of time for everything to fire, and for you to end up notified that the timer has ticked, once Windows gets around to actually servicing the tick message.
Note that in most languages, sleep calls specify the minimum time after which a process will awaken. After the specified time has passed, the process is put on a queue, and hopefully the scheduler activates it; but this activation may sometimes be delayed. I'm not sure about the Timer class, but I suspect it suffers from a similar problem.
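You can see this minimum-wait behaviour directly: at the default Windows timer resolution (~15.6 ms), even Thread.Sleep(1) usually takes an order of magnitude longer than requested. A quick sketch to check it on your own machine:

using System;
using System.Diagnostics;
using System.Threading;

class SleepGranularity
{
    static void Main()
    {
        for (int i = 0; i < 5; i++)
        {
            var sw = Stopwatch.StartNew();
            Thread.Sleep(1); // ask for 1 ms...
            sw.Stop();
            // ...but typically get the scheduler's tick (often ~15 ms)
            Console.WriteLine("{0:F2} ms", sw.Elapsed.TotalMilliseconds);
        }
    }
}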
You could try increasing the priority of your process to cut down the extra time.
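For example (a sketch only; raising priority can starve other processes, so use it with care):

using System.Diagnostics;
using System.Threading;

// Raise the whole process...
Process.GetCurrentProcess().PriorityClass = ProcessPriorityClass.High;

// ...and/or just the thread servicing the timer callbacks.
Thread.CurrentThread.Priority = ThreadPriority.AboveNormal;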
System.Timers.Timer is not a precise timer; especially when the system is under load, it can have even bigger delays.
Also, to get better accuracy in your example, change the time-measuring code to use the Stopwatch class:
static int _counter;
System.Timers.Timer _timer = new System.Timers.Timer(1000);
Stopwatch sw;

public Form1()
{
    InitializeComponent();
    _timer.Elapsed += new ElapsedEventHandler(_timer_Elapsed);
    _timer.Start();
    sw = Stopwatch.StartNew();
}

void _timer_Elapsed(object sender, ElapsedEventArgs e)
{
    Console.WriteLine(sw.ElapsedMilliseconds);
    _counter++;
    if (_counter == 20)
        _timer.Stop();
    sw.Reset();
    sw.Start();
}
With the system timers, the interval will always be a little longer than the value requested, due to the overhead of the other processes in the system.
On my system it's 14 ms. Having googled, the difference is down to thread context-switching delay. There's an article regarding high-resolution timers here.
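Incidentally, you can query what your machine's high-resolution counter offers. This is the counter that Stopwatch uses, not the timer that drives System.Timers.Timer, but it shows the hardware is capable of far finer resolution than the ~14 ms you're observing:

using System;
using System.Diagnostics;

class CounterInfo
{
    static void Main()
    {
        Console.WriteLine("High resolution: {0}", Stopwatch.IsHighResolution);
        Console.WriteLine("Frequency: {0} ticks/s", Stopwatch.Frequency);
        Console.WriteLine("Resolution: {0:F1} ns/tick", 1e9 / Stopwatch.Frequency);
    }
}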
As other responders have mentioned, Windows is not a real-time OS. If you need real-time behaviour, try Windows CE or Windows Embedded.
The accuracy of the timer may also depend on how many processes are running. If you have the option, I would shut down the processes that consume significant CPU time one by one (especially browsers, virus scanners, and other background programs) and check whether the timings improve.
The deviations are normal, since Windows is not an RTOS (real-time operating system). This is the best solution I've found under the circumstances: Link
Program.MicroTimer microTimer = new Program.MicroTimer();
microTimer.MicroTimerElapsed += new Program.MicroTimer.MicroTimerElapsedEventHandler(OnTimedEvent);
microTimer.Interval = 1000; // Call micro timer every 1000µs (1ms)
// Can choose to ignore event if late by Xµs (by default will try to catch up)
// microTimer.IgnoreEventIfLateBy = 500; // 500µs (0.5ms)
microTimer.Enabled = true; // Start timer
System.Threading.Thread.Sleep(2000);
microTimer.Enabled = false;
Those are the relevant code snippets; you can run them and watch the values in the console.