Date() beyond 2038-01-19, 03:14:07

Share your advanced PureBasic knowledge/code with the community.
Rescator
Addict
Posts: 1769
Joined: Sat Feb 19, 2005 5:05 pm
Location: Norway

Post by Rescator »

Little John wrote:
Rescator wrote:In any case, it is a standard as it's been published (on my site) and has been so for a few years now.
"Interesting" interpretation of the word standard.
I never said "official standard". It is a "standard": it has been documented, it is fixed, and it hasn't changed for years. If that is not the definition of a "standard", then I have no clue what is. An "official standard" is something different, though, and has additional criteria. (No luck there yet :) )
Little John wrote:
idle wrote:Though I personally don't see the point in worrying about the date beyond 2038
Generally speaking, I think we can learn from the "Year 2K Problem", that it's a good idea to make our programs safe for the future in good time. The routines provided here have been requested on the German forum ( by a rather young fellow :) ). After writing them, I considered it not a bad idea to share them here, too.
Heh, yeah, Y2K. I forgot what the new one is called (the problem when Unix wraps its seconds counter).

A standard like the one I propose, or one simply based on:
Year (4 bytes), Month (1 byte), Day (1 byte), Hours (1 byte), Minutes (1 byte), Seconds (1 byte), milliseconds (2 bytes) should cover most needs for precision as well as longevity for millions of years.

In like what, 11 bytes? Possibly less if binary coded, or more if you also want nanoseconds. I prefer it "raw", though: it may waste a few unused bits, but the date/time can be read and presented to end users directly.

If you plan to create a new datestamp format you really need to consider nanosecond precision though, as the NT file system actually stores its file datestamps in 100-nanosecond intervals, in case you did not know. Not sure about Unix.
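
For illustration, here is a minimal PureBasic sketch of such a raw layout. This is not the format published on my site, just a sketch for this thread; the structure and procedure names are my own invention here.

Structure RawDateStamp   ; 11 bytes of raw payload, no epoch counter involved
  Year.l                 ; 4 bytes, signed, covers millions of years either way
  Month.a                ; 1 byte, 1-12
  Day.a                  ; 1 byte, 1-31
  Hour.a                 ; 1 byte, 0-23
  Minute.a               ; 1 byte, 0-59
  Second.a               ; 1 byte, 0-59 (60 for a leap second)
  Milli.u                ; 2 bytes, 0-999; add another field here if you want nanoseconds
EndStructure

; The fields can be shown to the end user directly, no conversion from a seconds counter
Procedure.s FormatRawDateStamp(*d.RawDateStamp)
  Protected out.s
  out = RSet(Str(*d\Year), 4, "0") + "-" + RSet(Str(*d\Month), 2, "0") + "-" + RSet(Str(*d\Day), 2, "0")
  out = out + " " + RSet(Str(*d\Hour), 2, "0") + ":" + RSet(Str(*d\Minute), 2, "0") + ":" + RSet(Str(*d\Second), 2, "0")
  out = out + "." + RSet(Str(*d\Milli), 3, "0")
  ProcedureReturn out
EndProcedure

; Quick test: one second past the 32-bit Unix rollover
Define stamp.RawDateStamp
stamp\Year = 2038 : stamp\Month = 1  : stamp\Day = 19
stamp\Hour = 3    : stamp\Minute = 14 : stamp\Second = 8
Debug FormatRawDateStamp(@stamp)   ; "2038-01-19 03:14:08.000"

With the fields stored raw like that, formatting for the end user is plain string work, no epoch arithmetic and no 2038 rollover.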
Kaeru Gaman
Addict
Posts: 4826
Joined: Sun Mar 19, 2006 1:57 pm
Location: Germany

Post by Kaeru Gaman »

just a little off-topic history for those who don't know/remember...

the Y2K problem was caused by the practice of storing only two digits for the year.
data was stored as alphanumeric text, and storage space was expensive, so they saved two bytes on each year.
in the 70s they thought "this is almost 30 years away from becoming a problem".


speaking of alphanumeric storage, you can save half the space though and still have it "raw":
each byte can hold two digits, one per nibble.
that is what a byte does in hexadecimal notation anyway; just use only the ten decimal digits and leave the six letter values unused.
older CPUs also had a decimal flag (BCD mode) to skip the upper six values of a nibble when incrementing/decrementing.
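
a quick PureBasic sketch of that nibble packing, just to illustrate (the procedure names are made up for this post):

; pack a value 0-99 into one byte, one decimal digit per nibble (classic BCD)
Procedure.a PackBCD(value.a)
  ProcedureReturn ((value / 10) << 4) | (value % 10)
EndProcedure

; unpack one BCD byte back into a value 0-99
Procedure.a UnpackBCD(bcd.a)
  ProcedureReturn ((bcd >> 4) * 10) + (bcd & $0F)
EndProcedure

Debug PackBCD(38)      ; 56, i.e. $38 - the two digits sit in the two nibbles
Debug UnpackBCD($38)   ; 38 again

a four-digit year then fits in two bytes, at the price of a little packing and unpacking work.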
Rescator wrote:Year (4 bytes), Month (1 byte), Day (1 byte), Hours (1 byte), Minutes (1 byte), Seconds (1 byte), milliseconds (2 bytes) should cover most needs for precision as well as longevity for millions of years.
interesting approach.
the question is also whether it's worth keeping the ancient notation rooted in SIX (60 seconds, 60 minutes, 24 hours).
it might take only one full generation to get used to a system based completely on decimal divisions.
additionally, what is the use of a seven-day week nowadays?
how about ten or five days?
oh... and have a nice day.