
Timestamps do not have timezones. That's the whole point of a timestamp.



Timestamps must have an implicit timezone; otherwise they are meaningless, because they cannot identify a particular moment in time.

Only time differences do not have a timezone.

If Python continues to use the word "timestamp" for a time value with an unknown time zone, that is Python's fault, because the name is completely inappropriate.

I assume that whoever made "None" a possible value for the timezone did not really expect anyone to use a timestamp whose timezone is truly unknown, but rather expected that in such cases the timezone is known to the programmer, just not stored with the timestamp, and that it is the programmer's responsibility to restore it whenever necessary.


In this case we're talking about Unix timestamps though, which count the number of seconds since 1970 UTC. Adding a timezone to it is nonsensical. The number is always the same, no matter which timezone you're in. It's in the conversion to year-month-day, etc. that timezone becomes relevant.
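For instance, a minimal sketch of that conversion step (the timestamp value here is arbitrary): the integer stays the same, and only the calendar rendering changes with the zone.

    from datetime import datetime, timezone, timedelta

    ts = 1_700_000_000  # an arbitrary Unix timestamp

    utc = datetime.fromtimestamp(ts, tz=timezone.utc)
    est = datetime.fromtimestamp(ts, tz=timezone(timedelta(hours=-5)))

    print(utc.isoformat())  # 2023-11-14T22:13:20+00:00
    print(est.isoformat())  # 2023-11-14T17:13:20-05:00
    print(utc == est)       # True: same instant, two renderings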


GP’s argument is that that means it has an implicit timezone of UTC.


That's not exactly right. I don't live in UTC; when I use timestamps, they don't automatically (whether implicitly or explicitly) refer to UTC.

A Unix timestamp is a difference in time, not a time point. It's a number of seconds. It's a duration, not an instant, which is why it doesn't in and of itself need a time zone attached.

That being said, people do use timestamps to represent time points sometimes, and that does require picking a particular timezone. That's fine too, but it's not the only use or meaning of a timestamp. It doesn't make timestamps without time zones meaningless, and it doesn't mean they implicitly refer to the UTC epoch.

Durations and time differences are timezoneless numbers. When you add a duration to a time point (like the Unix epoch in your preferred timezone), you get another time point, in the same timezone.
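A quick sketch of that in Python (the offset and values are just for illustration):

    from datetime import datetime, timedelta, timezone

    tz = timezone(timedelta(hours=-5))             # some fixed offset
    start = datetime(2024, 1, 1, 9, 0, tzinfo=tz)  # a time point
    elapsed = timedelta(hours=3)                   # a zoneless duration

    end = start + elapsed                 # another time point
    print(end.isoformat())                # 2024-01-01T12:00:00-05:00
    print(end.tzinfo is start.tzinfo)     # True: the zone carries over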


> A Unix timestamp is a difference in time, not a time point. It's a number of seconds. It's a duration, not an instant, which is why it doesn't in and of itself need a time zone attached.

The number of seconds since what, exactly?

It appears to be the number of seconds since Jan 1, 1970, at midnight... in some timezone.


The point is that the question 'since what?' is a misleading intuition; it's not at all essential or implied.

You want to know how long something takes. You take a timestamp before and one after.

A timestamp since what? Well, by definition, the number of seconds since the unix epoch.

Which unix epoch, exactly? In what timezone?

Unspecified. You could have picked any timezone, and the result would not change, as it is not essential. Even if there is no timezone, not even one that you picked in secret, it still works.

Asking the question for durations is a subtle error. You only need a timezone for a time point. Timestamps are often immediately converted to time points, but that is neither necessary nor inherent.
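Here's a sketch of that before/after use; note that no timezone enters at any point:

    import time

    before = time.time()   # a Unix timestamp, as a float
    time.sleep(0.25)       # stand-in for the thing being measured
    after = time.time()

    print(f"took {after - before:.3f}s")  # ~0.250; no timezone anywhere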


This seems just like kicking the can down the road, pretending that time zones don't matter because you don't want them to.

If I'm comparing system time one year ago to system time now, that's a duration, and the timezones don't matter. But if I'm not comparing, just recording, then the point of comparison is the unix epoch, which is a point in time.

Time zones don't matter to durations, only to points in time, okay. The unix epoch is a point in time. Saying it isn't a point in time, or that its timezone doesn't matter doesn't actually make it so.

"It's the very precise number of seconds since an event that took place 52 years ago, and it has a margin of error of 86,400 seconds," sounds crazy to me.


To be more clear, I hope:

Comparing seconds-since-epoch "now" to the same from a year ago: timezone doesn't matter so long as I'm consistent. Pick UTC, pick CDT, pick nothing, just do the same thing both times. This is the vast majority of usages of seconds-since-epoch, I think.

Comparing seconds-since-epoch "now" to the epoch: in the abstract, timezone still doesn't matter, except, well, how do you use the same timezone as the epoch unless you know which timezone the epoch was in? You could be off by 82,800 seconds! In practice, assuming that the epoch was Jan 1, 1970, midnight UTC seems to work for those rare cases, leading to the widespread (but technically incorrect) belief that the epoch is in UTC.
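To illustrate the first case (the timestamp values here are arbitrary): the zone you pick cancels out of the difference, as long as both conversions use the same one.

    from datetime import datetime, timezone, timedelta

    t0 = 1_640_995_200   # a stored Unix timestamp
    t1 = 1_672_531_200   # another one, a year later

    for tz in (timezone.utc, timezone(timedelta(hours=-5))):  # UTC, then CDT
        a = datetime.fromtimestamp(t0, tz=tz)
        b = datetime.fromtimestamp(t1, tz=tz)
        print(b - a)     # 365 days, 0:00:00 both times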


If you refer to that date, then UTC. But you may just as well say that it's the number of seconds since 1969-12-31 7:00 pm EST. There is nothing special about UTC wrt Unix timestamps.


I would argue that "seconds since epoch in UTC" has an explicit timezone of UTC.


I agree with you and I think the above poster needs to adjust their thinking.

What they're saying is correct in the most technical sense, but our industry has landed on timestamps not needing timezones. If they want to work in this industry, they need to accept that.

It's ok to be surprised by something like this, it's less ok to argue about it.


A point on the globe has a timezone. "Distance from London in feet" does not.


That is true, but a timestamp is intended to specify a particular time, and a simple temporal interval fails to do that just as a simple spatial interval fails to specify a location; both are incomplete (the former on account of a missing starting point, and your counter-example on account of a missing azimuth).

On the other hand, Unix (POSIX) time does not have leap seconds, while UTC does, so it is no longer "in" UTC.


POSIX time isn’t “in UTC” regardless of that. UTC is just one of many timezones you can use when specifying which point in time timestamp 0 refers to. You could just as well use EST or anything else.


You are right - through this discussion, I have come to see that unless a value expresses a specific time (as opposed to interval) in a form that includes hours and minutes, then timezone is not an applicable concept. POSIX time does not do this.


So the analogy is "Distance from London in feet" to "Time since epoch in seconds"? Ok, but "epoch" has a timezone, which is UTC, so measurements relative to that have the same timezone.


This should be filed under "falsehoods programmers believe about timezones".


It doesn't. An equivalent definition of Unix timestamps would be "seconds since 1969-12-31 7:00 pm EST". That doesn't mean Unix timestamps have an implicit timezone of EST.
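A sketch of that in Python: both definitions of the epoch pin the same instant, and both give timestamp 0.

    from datetime import datetime, timezone, timedelta

    utc_epoch = datetime(1970, 1, 1, 0, 0, tzinfo=timezone.utc)
    est = timezone(timedelta(hours=-5))
    est_epoch = datetime(1969, 12, 31, 19, 0, tzinfo=est)

    print(utc_epoch == est_epoch)   # True: the same instant
    print(utc_epoch.timestamp())    # 0.0
    print(est_epoch.timestamp())    # 0.0 as well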


This does exist as a concept, though I agree that "timestamp" is not a good name for that concept. The Java standard library calls it LocalDateTime. It is necessary to attach time zone information in order to determine an Instant.

The big problem with the python API, in my opinion, is that they conflate these two very different concepts in a single type.
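A sketch of that conflation: both values below are instances of the same datetime class, one naive and one aware, and they can't even be compared arithmetically.

    from datetime import datetime, timezone

    naive = datetime(2024, 1, 1, 12, 0)                       # tzinfo is None
    aware = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)  # tzinfo set

    print(type(naive) is type(aware))   # True: one class, two meanings
    try:
        naive - aware
    except TypeError as e:
        print(e)   # can't subtract offset-naive and offset-aware datetimes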


utcnow() does not return an integer timestamp; it returns a datetime object with hour and minute fields, in both Python 2 and 3. Regardless of whether you regard Unix (POSIX) time as implicitly timezoned, the return value of utcnow() unambiguously is.
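A minimal check of that: the value carries wall-clock fields (so the timezone question applies to it), even though the object itself is naive.

    from datetime import datetime

    dt = datetime.utcnow()
    print(dt.hour, dt.minute)   # hour and minute fields are present
    print(dt.tzinfo)            # None: the object is naive, despite the name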



