I wanted to track the performance of my code, so I stored the start and end time using System.DateTime.Now and took the difference between the two as the time my code took to execute.
I noticed, though, that the difference didn't appear to be accurate, so I tried using a Stopwatch object instead. This turned out to be much, much more accurate.
Can anyone tell me why Stopwatch would be more accurate than calculating the difference between a start and end time using System.DateTime.Now?
BTW, I'm not talking about a tenth of a percent; I'm seeing about a 15-20% difference.
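For reference, here is a minimal sketch of the two approaches I'm comparing (DoWork() is just a stand-in for the code being measured):

    using System;
    using System.Diagnostics;

    class TimingComparison
    {
        static void Main()
        {
            // Approach 1: difference between two DateTime.Now readings
            DateTime start = DateTime.Now;
            DoWork();
            DateTime end = DateTime.Now;
            Console.WriteLine($"DateTime.Now: {(end - start).TotalMilliseconds} ms");

            // Approach 2: Stopwatch
            Stopwatch sw = Stopwatch.StartNew();
            DoWork();
            sw.Stop();
            Console.WriteLine($"Stopwatch:    {sw.Elapsed.TotalMilliseconds} ms");
        }

        // Placeholder for the code whose performance is being tracked
        static void DoWork()
        {
            System.Threading.Thread.Sleep(200);
        }
    }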
As per MSDN:
The Stopwatch measures elapsed time by counting timer ticks in the underlying timer mechanism. If the installed hardware and operating system support a high-resolution performance counter, then the Stopwatch class uses that counter to measure elapsed time. Otherwise, the Stopwatch class uses the system timer to measure elapsed time. Use the Frequency and IsHighResolution fields to determine the precision and resolution of the Stopwatch timing implementation.
It uses a higher resolution / precision than DateTime.Now.
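As a quick check, a small sketch like the following prints whether the high-resolution performance counter backs Stopwatch on a given machine, and what resolution that works out to:

    using System;
    using System.Diagnostics;

    class ResolutionCheck
    {
        static void Main()
        {
            // Whether the high-resolution performance counter is in use
            Console.WriteLine($"IsHighResolution: {Stopwatch.IsHighResolution}");

            // Ticks per second of the underlying timer
            Console.WriteLine($"Frequency: {Stopwatch.Frequency} ticks/s");

            // Smallest measurable interval, in nanoseconds
            double nsPerTick = 1000000000.0 / Stopwatch.Frequency;
            Console.WriteLine($"Resolution: {nsPerTick} ns per tick");
        }
    }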
You can also check out these related links:
Environment.TickCount vs DateTime.Now
Is DateTime.Now the best way to measure a function's performance?
DateTime is probably good enough if you only need precision to the second, but for anything beyond that I would recommend Stopwatch.
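To see why, you can watch how coarsely DateTime.Now actually advances. This is just an illustrative sketch; the exact step size depends on the OS and runtime (older Windows / .NET Framework setups typically advance in roughly 10-16 ms steps, newer runtimes may report finer steps):

    using System;

    class DateTimeGranularity
    {
        static void Main()
        {
            // Sample DateTime.Now in a tight loop and record when its value changes.
            // The step between changes is the effective resolution of the clock.
            DateTime last = DateTime.Now;
            int changes = 0;
            while (changes < 5)
            {
                DateTime now = DateTime.Now;
                if (now != last)
                {
                    Console.WriteLine($"Clock advanced by {(now - last).TotalMilliseconds} ms");
                    last = now;
                    changes++;
                }
            }
        }
    }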