How does the conversion to UTC from the standard DateTime format work?

More specifically, if I create a DateTime object in one time zone, then switch to another time zone and run ToUniversalTime() on it, how do I know the conversion was done correctly and that the time is still accurately represented?
There is no implicit time zone attached to a DateTime object. When you call ToUniversalTime() on it, the conversion uses the local time zone of the machine the code is currently running on.
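To make that machine dependence concrete, here's a minimal C# sketch (the date and names are just for illustration):

```csharp
using System;

class Demo
{
    static void Main()
    {
        // A DateTime is just a tick count plus a Kind flag; no time zone is stored in it.
        var someTime = new DateTime(2009, 1, 1, 12, 0, 0);

        // ToUniversalTime() applies the offset of the machine's current time zone
        // (TimeZoneInfo.Local), so the result differs from machine to machine.
        DateTime utc = someTime.ToUniversalTime();

        Console.WriteLine(TimeZoneInfo.Local.Id);
        Console.WriteLine(utc);
    }
}
```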
For example, if I create a DateTime from the epoch of 1/1/1970, I get the same DateTime value no matter where in the world I am. If I run ToUniversalTime() on it while the code is running in Greenwich, I get the same time back. If I do it while I'm in Vancouver, which is at UTC-8, the result is shifted by 8 hours, because the value is assumed to be Vancouver local time.
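You can reproduce both cases on a single machine by passing the zone explicitly with TimeZoneInfo.ConvertTimeToUtc instead of relying on the machine's own setting. A sketch (the Windows zone IDs below may need to be IANA names such as "Europe/London" and "America/Vancouver" depending on your OS):

```csharp
using System;

class EpochDemo
{
    static void Main()
    {
        // The same DateTime value, regardless of where the code runs.
        var epoch = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Unspecified);

        var greenwich = TimeZoneInfo.FindSystemTimeZoneById("GMT Standard Time");
        var vancouver = TimeZoneInfo.FindSystemTimeZoneById("Pacific Standard Time");

        // Interpreted as Greenwich local time: 1970-01-01 00:00 UTC, no shift.
        Console.WriteLine(TimeZoneInfo.ConvertTimeToUtc(epoch, greenwich));

        // Interpreted as Vancouver local time (UTC-8): 1970-01-01 08:00 UTC.
        Console.WriteLine(TimeZoneInfo.ConvertTimeToUtc(epoch, vancouver));
    }
}
```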
This is why it's important to store time-related information in your database as UTC when you need to do any kind of date conversion or localization. Consider what happens if your codebase gets moved to a server facility in another time zone ;)
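A sketch of that pattern (no real database here; the save step is just a comment):

```csharp
using System;

class StorageDemo
{
    static void Main()
    {
        // Write path: normalize to UTC before persisting.
        DateTime toStore = DateTime.UtcNow;
        // ... save toStore to the database ...

        // Read path: mark the round-tripped value as UTC, then localize only for display.
        DateTime fromDb = DateTime.SpecifyKind(toStore, DateTimeKind.Utc);
        Console.WriteLine(fromDb.ToLocalTime()); // correct wherever the server happens to run
    }
}
```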
Edit: note from Joel's answer - a DateTime object defaults to DateTimeKind.Unspecified, which ToUniversalTime() treats as local time. If you parse a date and mark it as DateTimeKind.Utc, then ToUniversalTime() performs no conversion.
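A quick sketch of that Kind behavior (the date string is arbitrary):

```csharp
using System;

class KindDemo
{
    static void Main()
    {
        var parsed = DateTime.Parse("2009-01-01T12:00:00");

        // Parse() yields DateTimeKind.Unspecified here, which ToUniversalTime()
        // treats as local time, so an offset is applied.
        Console.WriteLine(parsed.Kind);              // Unspecified
        Console.WriteLine(parsed.ToUniversalTime());

        // Once the value is marked Utc, ToUniversalTime() is a no-op.
        var utc = DateTime.SpecifyKind(parsed, DateTimeKind.Utc);
        Console.WriteLine(utc.ToUniversalTime() == utc); // True
    }
}
```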
And here's an article on "Best Practices Coding with Date Times", and an article on "Converting DateTimes with .NET".