‘“Well, I'd hardly finished the first verse”, said the Hatter, when the Queen bawled out “He's murdering the time! Off with his head!”’ I won't be so bold as to suggest that all programmers should be beheaded (you wouldn't be able to read EXE, after all) but quite a lot of software developers do murder time.
In the past month, news coverage of issues related to the coming millennium has flourished in every publication, on radio and on television. Even in EXE the subject has come up: Peter Collinson looked at the problem in his May column, two letters in this issue focus on time-related problems, and Mark Harman, in his article on program transformation, suggests a way to tackle the infamous Y2K bug. The letters hint at the fact that software implementations of time might generate uncertainties on dates other than January 1st, 2000.
Programmers who used the delta-time routines on VMS (originally limited to handling a difference of 9999 days) saw their software malfunction on May 19, 1997. As with Y2K, this limitation was well known: it was documented by DEC. The funniest thing is that VMS's internal time representation will work until July 31, 31086.
Time obtained from the Global Positioning System (GPS) will roll over at 23:59:47 UTC on August 21, 1999. The week number is stored in a 10-bit value. It is estimated that about 10^6 GPS receivers are in use.
C's time_t (especially on Unix systems) will fail at 22:14:07 EST on Monday January 18, 2038, or later in January 2106, depending on whether you're using signed or unsigned integers.
More recent systems are still vulnerable to time problems. The HTTP caching scheme should fail in 9999, and James Gosling has admitted that Java will ‘run out of dates in the year 292271023’.
‘“If everybody minded their own business”, the Duchess said, in a hoarse growl, “the world would go round a deal faster than it does.” “Which would not be an advantage”, said Alice, who felt very glad to get an opportunity of showing off a little of her knowledge. “Just think what work it would make with the day and night! You see the earth takes twenty-four hours to turn round on its axis.”’
The issue is not limited to how well your particular system will handle time in the future, nor how well it can calculate the difference between two dates, one of which is in the future. It affects synchronisation between all devices (would you call an ATM ‘hole-in-the-wall’ a computer?) that communicate with one another.
When Rev. Charles Lutwidge Dodgson, aka Lewis Carroll, wrote Alice's Adventures in Wonderland, the time zones and the International Date Line had not been invented yet. GMT, the local time on the Greenwich meridian based on the position of a hypothetical mean sun, was only established in 1884. That's long before the invention of computers and of the transistor! Same goes for the change in the definition of the GMT day: only in 1925 was the day made to begin at midnight instead of noon. I won't even mention operating systems which believe that GMT is the same as the British legal time and hence consider that GMT changes to BST in summer!
On the other hand, the computer age had already started in 1967 when the Système International (SI) second was redefined as being ‘the duration of 9192631770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom’. The time scale built on this second is called International Atomic Time, or TAI.
So today we have GMT (also called Universal Time, to differentiate it from the old noon-based GMT), TAI, GPS time... and UTC, which stands for Coordinated Universal Time (the French- and English-speaking attendees of the conference which defined UTC couldn't agree on an acronym, so they chose one which is incorrect in both English and French!). They are all different and yet all related.
UTC uses the SI definition of the second but is adjusted whenever needed so that the difference between UTC and UT is never greater than one second. How do you code that in an algorithm? Easy for past dates, impossible for future ones, as you don't know when the leap seconds will be introduced. The 21st leap second happened on the first of this month. Did you know that? That means that TAI and UTC are now 31 seconds apart. That's not a typo – at the creation of UTC in 1972, TAI and UTC were 10 seconds apart. GPS time has a constant 19-second offset from TAI, hence its difference from UTC increases with every leap second.
Now imagine the effect on the economy if the computers calculating interest on multi-billion inter-bank fund transfers are not exactly synchronised, or worse, use two different time definitions.
To deal correctly with time, you have to implement an accurate leap year algorithm, know which time definition you're dealing with, and possibly make provisions for leap seconds. How many programmers are even aware of all the issues related to time?
More importantly, when delivering software, do you document what assumptions you have made about time? Does your program have a contractual lifetime after which you expect it to be dumped and replaced with a new version? Have you tested all your routines, and all the libraries you use, to check that they handle time correctly for the expected life of your product? Y2K has attracted a lot of attention, but it is just one consequence of possible time-handling limitations, and one which is not particularly difficult to deal with compared to most others.
At least we are now reassured that even if all computers crash at the changeover between 1999 and 2000, it will very probably not happen at the exact same time.
(Thanks to Lewis Carroll, who was concerned all his life with where the day begins, and to the RISKS mailing list, full of fascinating information and with a very high signal-to-noise ratio.)
(C)1997, Centaur Communications Ltd. Reproduced with the kind permission of EXE Magazine.
Updated on 2019-04-09: corrections made to the description of the GPS rollover of 1999. The GPS signal now transmits the week as a 13-bit value, so my error was prescient!