Posted: Fri Aug 22, 2008 3:50 am
Little John wrote:
    Rescator wrote:
        In any case, it is a standard, as it's been published (on my site) and has been so for a few years now.
    "Interesting" interpretation of the word standard.

I never said "official standard". It is a "standard": it has been documented, it is fixed, it won't change, and it hasn't for years. If that is not the definition of a "standard", then I have no clue what is. An "official standard" is something different, though, and has additional criteria (no luck there yet).

Little John wrote:
    idle wrote:
        Though I personally don't see the point in worrying about the date beyond 2038.
    Generally speaking, I think we can learn from the "Year 2K Problem" that it's a good idea to make our programs safe for the future in good time. The routines provided here were requested on the German forum (by a rather young fellow). After writing them, I considered it not a bad idea to share them here, too.

Heh, yeah Y2K. I forgot what the new "YK" is called (when Unix wraps the seconds counter).
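For anyone curious, that wrap is the signed 32-bit Unix seconds counter overflowing at 2^31 - 1 seconds past 1970. A quick C sketch (not PureBasic, just for illustration) of the last moment such a counter can represent:

Code:

#include <stdint.h>
#include <stdio.h>
#include <time.h>

int main(void)
{
    /* Last second representable by a signed 32-bit Unix time counter. */
    time_t last = (time_t)INT32_MAX;    /* 2147483647 seconds after 1970        */
    struct tm *utc = gmtime(&last);
    if (utc)
        printf("%s", asctime(utc));     /* prints "Tue Jan 19 03:14:07 2038" (UTC) */
    return 0;
}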
A standard like the one I propose, or one simply based on the following layout, should cover most needs for precision as well as longevity for millions of years:

Year (4 bytes), Month (1 byte), Day (1 byte), Hour (1 byte), Minute (1 byte), Second (1 byte), Millisecond (2 bytes).

That's what, 11 bytes in total? Possibly less if binary coded, or more if you also want nanoseconds. I prefer it "raw", though: it may waste a few bits that go unused, but the date/time can be read and presented to end users directly.
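To make the proposed layout concrete, here is a rough C sketch of such a "raw" record (the struct and field names are purely illustrative, not any published spec):

Code:

#include <stdint.h>
#include <stdio.h>

/* One possible packed layout of the fields listed above:
   4 + 1 + 1 + 1 + 1 + 1 + 2 = 11 bytes. */
#pragma pack(push, 1)
typedef struct {
    int32_t  year;        /* signed, so dates BCE fit too         */
    uint8_t  month;       /* 1..12                                */
    uint8_t  day;         /* 1..31                                */
    uint8_t  hour;        /* 0..23                                */
    uint8_t  minute;      /* 0..59                                */
    uint8_t  second;      /* 0..59 (60 if you allow leap seconds) */
    uint16_t millisecond; /* 0..999                               */
} RawDateStamp;
#pragma pack(pop)

int main(void)
{
    printf("sizeof(RawDateStamp) = %u bytes\n", (unsigned)sizeof(RawDateStamp));
    return 0;
}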
If you plan to create a new datestamp format, you really need to consider nanosecond precision, as that is what the NT file system actually uses for its file datestamps, in case you did not know. Not sure about Unix.
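For what it's worth, Windows/NTFS timestamps (FILETIME) are stored as a 64-bit count of 100-nanosecond intervals since 1601-01-01 UTC. A small C sketch (illustrative only, the helper name is mine) converting such a tick count to Unix seconds, just to show the resolution involved:

Code:

#include <stdint.h>
#include <stdio.h>

/* NTFS/Windows FILETIME: 64-bit count of 100 ns ticks since 1601-01-01 UTC. */
#define TICKS_PER_SECOND   10000000ULL            /* 10^7 ticks of 100 ns each  */
#define TICKS_1601_TO_1970 116444736000000000ULL  /* 11644473600 s * 10^7 ticks */

static int64_t filetime_to_unix_seconds(uint64_t ticks)
{
    return (int64_t)((ticks - TICKS_1601_TO_1970) / TICKS_PER_SECOND);
}

int main(void)
{
    /* Feeding in the 1601-to-1970 offset itself should give the Unix epoch. */
    printf("%lld\n", (long long)filetime_to_unix_seconds(TICKS_1601_TO_1970)); /* 0 */
    return 0;
}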