Is there any reason behind using the date January 1st, 1970 as the default standard for time manipulation? I have seen this standard in Java as well as in Python; these are the two languages I am aware of. Are there other popular languages that follow the same standard?
Please describe.
"using date(January 1st, 1970) as default standard"
The Question makes two false assumptions: that time in computing is always tracked as a count from the beginning of 1970, and that such tracking is a single universal standard.
Time in computing is not always tracked from the beginning of 1970 UTC. While that epoch reference is popular, computing environments over the decades have used nearly two dozen different epochs. Some date from other centuries. They range from year 0 (zero) to 2001.
Here are a few.
January 0, 1 BC (MATLAB)
January 1, AD 1 (Microsoft .NET, Go, REXX)
October 15, 1582 (UUID version 1)
January 1, 1601 (Windows NT FILETIME, COBOL integer dates)
December 31, 1840 (MUMPS)
November 17, 1858 (VMS, Modified Julian Date)
December 30, 1899 (Microsoft COM DATE, spreadsheet serial dates)
December 31, 1899 (Dyalog APL)
January 1, 1900 (Network Time Protocol, Common Lisp universal time)
January 1, 1904 (classic Mac OS, LabVIEW, Excel's optional 1904 date system)
December 31, 1967 (Pick OS)
January 1, 1980 (IBM PC BIOS, DOS, FAT file systems)
January 6, 1980 (GPS)
January 1, 2000 (PostgreSQL, AppleSingle)
January 1, 2001 (Apple Cocoa, NSDate)
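Each of these epochs is just a fixed point on the timeline, so converting a count from one to another means applying a constant offset. As a minimal illustrative sketch in Java's java.time framework (the class name and the particular subset of epochs are my own choices for the demo, not anything standard), here is how to compute each epoch's offset from the Unix epoch:

    import java.time.Duration;
    import java.time.Instant;

    public class EpochOffsets {
        public static void main(String[] args) {
            // A few of the epochs listed above, written as ISO 8601 instants in UTC.
            String[] epochs = {
                    "1601-01-01T00:00:00Z",  // Windows NT FILETIME
                    "1900-01-01T00:00:00Z",  // Network Time Protocol
                    "1980-01-06T00:00:00Z",  // GPS
                    "2001-01-01T00:00:00Z"   // Apple Cocoa (NSDate)
            };
            for (String iso : epochs) {
                // Instant.EPOCH is 1970-01-01T00:00:00Z, the Unix epoch.
                // Negative offsets mark epochs earlier than 1970.
                Duration offset = Duration.between(Instant.EPOCH, Instant.parse(iso));
                System.out.println(iso + "  offset from Unix epoch: "
                        + offset.getSeconds() + " seconds");
            }
        }
    }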
The beginning of 1970 is popular, probably because of its use by Unix. But by no means is that epoch dominant. For example:
Countless millions of Microsoft Excel and Lotus 1-2-3 spreadsheets use January 0, 1900 (December 31, 1899).
Cocoa (NSDate) on macOS and iOS uses 1 January 2001, GMT.
GPS satellite navigation uses January 6, 1980, while the European alternative Galileo uses 22 August 1999.
Assuming that any given count-since-epoch uses the Unix epoch opens a big vulnerability for bugs. Such a count is impossible for a human to instantly decipher, so errors and issues won't be easily flagged when debugging and logging. Another problem is the ambiguity of granularity, explained below.
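To make the risk concrete, here is a small Java sketch (the count of 434,000,000 is an arbitrary value invented for the demo): the very same number decodes to 1983 under the Unix epoch but to 2014 under the Cocoa epoch.

    import java.time.Instant;

    public class EpochMixup {
        public static void main(String[] args) {
            // An opaque count of seconds received from some other system,
            // with no indication of which epoch it is counted from.
            long count = 434_000_000L;

            // Interpreted against the Unix epoch (1970): lands in 1983.
            System.out.println("If Unix epoch:  " + Instant.ofEpochSecond(count));

            // Interpreted against the Cocoa/NSDate epoch (2001): lands in 2014.
            Instant cocoaEpoch = Instant.parse("2001-01-01T00:00:00Z");
            System.out.println("If Cocoa epoch: " + cocoaEpoch.plusSeconds(count));
        }
    }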
I strongly suggest serializing date-time values for data interchange as unambiguous ISO 8601 strings rather than as an integer count-since-epoch: YYYY-MM-DDTHH:MM:SS.SSSZ, such as 2014-10-14T16:32:41.018Z.
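In Java, java.time.Instant already uses this ISO 8601 format as its canonical textual form, so serializing and parsing need no explicit formatter. A minimal round-trip sketch:

    import java.time.Instant;

    public class IsoRoundTrip {
        public static void main(String[] args) {
            Instant now = Instant.now();

            // toString() emits an ISO 8601 string such as 2014-10-14T16:32:41.018Z.
            String serialized = now.toString();

            // parse() accepts the same format, so the value round-trips exactly.
            Instant restored = Instant.parse(serialized);

            System.out.println(serialized + "  round-trips: " + restored.equals(now));
        }
    }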
Another issue with count-since-epoch time tracking is the time unit: at least four levels of granularity are in common use, ranging from whole seconds to milliseconds, microseconds, and nanoseconds.
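For example, here is a sketch of how far apart the same bare number lands at three different resolutions (the value is arbitrary; it reads as October 2014 only when taken as milliseconds):

    import java.time.Instant;

    public class GranularityAmbiguity {
        public static void main(String[] args) {
            // The same bare number, interpreted at three different resolutions.
            long count = 1_413_300_000_000L;

            // As milliseconds: October 2014.
            System.out.println("As milliseconds: " + Instant.ofEpochMilli(count));

            // As microseconds (converted to nanoseconds): mid-January 1970.
            System.out.println("As microseconds: " + Instant.EPOCH.plusNanos(count * 1_000));

            // As whole seconds: tens of millennia in the future.
            System.out.println("As seconds:      " + Instant.ofEpochSecond(count));
        }
    }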