
digitalmars.D - wut: std.datetime.systime.Clock.currStdTime is offset from Jan 1st, 1 A.D.

Nathan S. <no.public.email example.com> writes:
https://dlang.org/phobos/std_datetime_systime.html#.Clock.currStdTime
"""
@property @trusted long currStdTime(ClockType clockType = 
ClockType.normal)();
Returns the number of hnsecs since midnight, January 1st, 1 A.D. 
for the current time.
"""

This choice of offset seems Esperanto-like: deliberately chosen 
to equally inconvenience every user. Is there any advantage to 
this at all on any platform, or is it just pure badness?
Jan 23 2018
Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Tuesday, January 23, 2018 23:27:27 Nathan S. via Digitalmars-d wrote:
 https://dlang.org/phobos/std_datetime_systime.html#.Clock.currStdTime
 """
 @property @trusted long currStdTime(ClockType clockType =
 ClockType.normal)();
 Returns the number of hnsecs since midnight, January 1st, 1 A.D.
 for the current time.
 """

 This choice of offset seems Esperanto-like: deliberately chosen
 to equally inconvenience every user. Is there any advantage to
 this at all on any platform, or is it just pure badness?
Your typical user would use Clock.currTime and get a SysTime. The badly 
named "std time" is the internal representation used by SysTime. Being able 
to get at it to convert to other time representations can be useful, but 
most code doesn't need to do anything with it.

"std time" is from January 1st, 1 A.D. because that's the perfect 
representation for implementing ISO 8601, which is the standard that 
std.datetime follows, implementing the proleptic Gregorian calendar (i.e. 
it assumes that the calendar was always the Gregorian calendar and doesn't 
do anything with the Julian calendar).

https://en.wikipedia.org/wiki/ISO_8601
https://en.wikipedia.org/wiki/Proleptic_Gregorian_calendar

The math is greatly simplified by using January 1st, 1 A.D. as the start 
date and by assuming Gregorian the whole way. IIRC, C#'s date/time stuff 
uses the same epoch and measures time in hecto-nanoseconds exactly like we 
do. hnsecs gives you the optimal balance between precision and range that 
can be gotten with 64 bits (it covers from about 22,000 B.C. to about 
22,000 A.D., whereas IIRC, going one decimal place more precise would 
reduce it to about 200 years in either direction).
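For illustration, a minimal sketch of the two entry points (assuming a 
current Phobos; the variable names are mine):

    import std.datetime.systime : Clock, SysTime;
    import std.stdio : writeln;

    void main()
    {
        // The typical path: a timezone-aware SysTime.
        auto now = Clock.currTime();
        writeln(now.toISOExtString());

        // The raw internal representation: hnsecs since midnight,
        // January 1st, 1 A.D.
        long hnsecs = Clock.currStdTime;

        // SysTime can be constructed straight from a "std time".
        assert(SysTime(hnsecs).stdTime == hnsecs);
    }

- Jonathan M Davis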
Jan 23 2018
drug <drug2004 bk.ru> writes:
On 24.01.2018 03:15, Jonathan M Davis wrote:
 [...]
 "std time" is from January 1st, 1 A.D. because that's the perfect
 representation for implementing ISO 8601, which is the standard that
 std.datetime follows [...]
I guess he meant that it's inconvenient when working with C/C++, for 
example having to add or subtract the difference between the C/C++ epoch 
and D's.
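That difference is at least a fixed constant; a minimal D sketch of the 
by-hand conversion (the constant name is invented here):

    import std.datetime.systime : Clock, unixTimeToStdTime;
    import std.stdio : writeln;

    void main()
    {
        // hnsecs between 0001-01-01 and the unix epoch, 1970-01-01
        // (719_162 days' worth).
        enum long unixEpochAsStdTime = 621_355_968_000_000_000L;
        assert(unixTimeToStdTime(0) == unixEpochAsStdTime);

        // By-hand conversion of "now" to unix-time seconds:
        long unixNow = (Clock.currStdTime - unixEpochAsStdTime) / 10_000_000;
        writeln(unixNow);
    }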
Jan 23 2018
Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, January 24, 2018 10:05:12 drug via Digitalmars-d wrote:
 [...]
 I guess he meant that it's inconvenient when working with C/C++, for
 example having to add or subtract the difference between the C/C++ epoch
 and D's.
If you need to interact with time_t, there's SysTime.toUnixTime, 
SysTime.fromUnixTime, stdTimeToUnixTime, and unixTimeToStdTime - assuming 
of course that time_t is unix time. But if it's not, you're kind of screwed 
in general with regards to interacting with anything else, since time_t is 
technically opaque. It's just _usually_ unix time, and most stuff is going 
to assume that it is. There's also SysTime.toTM, though tm isn't exactly a 
fun data type to deal with if you're looking to convert anything.

But if you care about calendar stuff, using January 1st, 1 A.D. as your 
epoch is far cleaner than an arbitrary date like January 1st, 1970. My 
guess is that that epoch was originally selected to try to keep the values 
small in a time when every bit mattered. It's not a particularly good 
choice otherwise, but we've been stuck dealing with it ever since, because 
that's what C and C++ continue to use and what OS APIs typically use.
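For instance, a quick sketch of the round trip (again assuming that time_t 
actually holds unix time):

    import core.stdc.time : time, time_t;
    import std.datetime.systime : SysTime, stdTimeToUnixTime,
        unixTimeToStdTime;
    import std.stdio : writeln;

    void main()
    {
        // From C's time_t to a SysTime in the local time zone:
        time_t t = time(null);
        auto st = SysTime.fromUnixTime(t);
        writeln(st.toISOExtString());

        // And back, plus the raw std time <-> unix time helpers:
        long unix = st.toUnixTime();
        assert(stdTimeToUnixTime(unixTimeToStdTime(unix)) == unix);
    }

- Jonathan M Davis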
Jan 23 2018
drug <drug2004 bk.ru> writes:
On 24.01.2018 10:25, Jonathan M Davis wrote:
 
 [...]
 But if you care about calendar stuff, using January 1st, 1 A.D. as your
 epoch is far cleaner than an arbitrary date like January 1st, 1970. [...]
I agree with you that 1 A.D. is a better epoch than 1970. IIRC, C++11 by 
default uses 1 nsec precision, so even 64 bits are not enough to represent 
dates from January 1st, 1 A.D. to the present day.

And by the way, I'd like to thank you for your great work - in comparison 
to the very inconsistent (at least for me) means C/C++ provide to handle 
date and time, std.datetime is a great pleasure to work with.
Jan 23 2018
Jonathan M Davis <newsgroup.d jmdavisprog.com> writes:
On Wednesday, January 24, 2018 10:50:55 drug via Digitalmars-d wrote:
 [...]
 I agree with you that 1 A.D. is a better epoch than 1970. IIRC, C++11 by
 default uses 1 nsec precision, so even 64 bits are not enough to represent
 dates from January 1st, 1 A.D. to the present day.
Yeah. Hecto-nanoseconds is essentially optimal. Any less precise, and 
you're losing range for nothing, and any more precise, and the range of 
allowed values becomes too small. I'd love to be more precise, and I'd love 
to have a greater range of values so that Duration could cover SysTime.max 
- SysTime.min, but unfortunately, that would mean taking Duration up to 
cent (assuming that it were implemented), which would be overkill. Even one 
more bit would make a big difference, but 65 isn't a power of two.

The part of C++11's date/time stuff which horrified me was the fact that 
they templatized their duration type. It does make it more flexible, but it 
makes it a _lot_ less user-friendly to pass them around to different APIs. 
For most purposes, hecto-nanoseconds is plenty accurate with a range of 
values that is more than enough, resulting in a type that can be used in 
most circumstances while still being user-friendly.
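A quick back-of-the-envelope check of that tradeoff (one direction from the 
epoch, using 365.25-day years):

    import std.stdio : writefln;

    void main()
    {
        // Years covered in one direction by a signed 64-bit tick
        // count, for a given tick size in seconds.
        static double years(double secondsPerTick)
        {
            return long.max * secondsPerTick / (365.25 * 24 * 3600);
        }

        writefln("hnsecs (100ns): ~%.0f years", years(1e-7)); // ~29,000
        writefln("nsecs (1ns):    ~%.0f years", years(1e-9)); // ~292
    }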
 And by the way, I'd like to thank you for your great work - in comparison
 to the very inconsistent (at least for me) means C/C++ provide to handle
 date and time, std.datetime is a great pleasure to work with.
Thanks. I did it because I was sick of time-related bugs at work, and I 
wanted D to get it right. By no means do I claim that std.datetime is 
perfect, but IMHO, it's way better than what most languages typically have.

LOL. I just ran into a time-related bug the other day when I tried out the 
NHL apps for Android and Roku. I was one timezone to the east of the live 
game I was watching, and in both applications, the bar representing the 
timeline for the game claimed that the game was an hour longer than it was, 
with the game being an hour farther along than it was. Presumably, they did 
something with local time when they should have been using UTC. Time is one 
of those things that seems like it should be easy to get right but which is 
surprisingly easy to get wrong.

- Jonathan M Davis
Jan 24 2018
Kagamin <spam here.lot> writes:
On Wednesday, 24 January 2018 at 08:12:55 UTC, Jonathan M Davis 
wrote:
 Thanks. I did it because I was sick of time-related bugs at 
 work, and I wanted D to get it right. By no means do I claim 
 that std.datetime is perfect, but IMHO, it's way better than 
 what most languages typically have.
Rust's SystemTime has some questionable implementation details too, but at least it's UTC-only and doesn't concern itself with timezones.
Jan 24 2018