Originally published at: https://boingboing.net/2018/09/27/corner-cases-everywhere.html
…
Oh, I thought this was going to be about those mythical man-months. I suppose those are more of a manager's mistaken belief than a programmer's.
I’m being pedantic here, but I think the DST complexity is a bit simpler than it seems.
Just treat DST as a boolean flag that offsets the time by 1 or 0 hours, depending on whether it's set. Of course, that will have to live inside a structure defining individual regions (time zones, countries, etc., but let's call each of them a region) that stores the actual date and hour of day at which the flag is switched. That way, if you have a country with half a dozen time zones, weird offsets here and there, etc. etc., it doesn't matter. You can just break that country down into arbitrary time regions. Of course, leap years add a bit of a hitch, as do varying output formats per region. That gets simpler when you just store every combination as a region, regardless of any national borders. On the other hand, if you have arbitrary regions, you'll then need to work out which one a user falls into. At this point, I think it would require user selection, which is a pretty so-so way to do it these days. You could create an interface that lets them choose the nearest city in their time zone (I think Debian and its derivatives have a cool interface for that).
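A minimal sketch of that "region" idea, in Python. All names and the sample dates here are hypothetical illustrations; a real implementation would need the full IANA tz database, since zones change their rules over the years, not just twice a year:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Region:
    """A hypothetical 'region': a base UTC offset plus one DST on/off rule."""
    name: str
    base_offset: timedelta   # offset from UTC when the DST flag is off
    dst_start: datetime      # local date/time at which DST switches on
    dst_end: datetime        # local date/time at which DST switches off

    def offset(self, local: datetime) -> timedelta:
        # The boolean DST flag: on between dst_start and dst_end, else off.
        dst_on = self.dst_start <= local < self.dst_end
        return self.base_offset + (timedelta(hours=1) if dst_on else timedelta(0))

# An example region: UTC+1 base offset, DST from late March to late October
# (roughly the 2018 European dates, purely for illustration).
berlin_ish = Region("berlin-ish", timedelta(hours=1),
                    datetime(2018, 3, 25, 2), datetime(2018, 10, 28, 3))

print(berlin_ish.offset(datetime(2018, 7, 1)))  # DST on  -> 2:00:00
print(berlin_ish.offset(datetime(2018, 1, 1)))  # DST off -> 1:00:00
```

Even this toy version hints at the hitch the comment mentions: the transition instants themselves have to be stored per region, per year, which is exactly the bookkeeping that balloons in practice.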
~ sigh ~ ok, it’s a little complicated.
The concept of “real world” time is absolutely a bane of my programming existence - concepts that seem natural like “months” are mathematical nonsense. Add leap seconds, daylight savings, and all the regional differences… so much pain!
Yeah, dates… Most of what we see when we talk about the time is a localized interpretation. In the systems I work on there is often a mix of UTC (much preferred for internal storage and calculations) and “locale time”, which is a depiction of UTC, adjusted for local timezone offset and Daylight Saving Time.
If an event (or series of events spanning multiple timezones) occurs where data might be pulled for evidence, having a very solid grip on when that event actually occurred can make the difference for someone's future if the event involves employment, a crime, or a claim about priority.
The system I work on used to try and keep track of the various timezone rules around the world, but then Microsoft Windows started including detailed information for translating between UTC and local time. This was a welcome development, and through Windows Updates the rules automagically change without app developers having to keep it all straight.
And what if the user has set an alarm for 1:30 on the ‘spring forward’ day? That time never exists.
I have a DAB clock radio that will adjust its time automatically to cope with daylight saving. Because it doesn’t do this when it’s on standby (and there’s no way to disable it), it’s guaranteed to turn the alarm on at the wrong time on both the daylight saving switchover days.
This is great! I just learned last week about Japanese temporal hours, where daytime and nighttime are divided into six units each, and the hours expand and contract as a consequence; e.g. the hours are literally longer during summer days. Here's a video of a watch mechanism that's been made to accommodate this splaying and unsplaying of the hours.
Time is a very difficult thing to do on a computer.
At the very least, a programmer needs to use time consistently:
- A time is a number of seconds from some fixed "epoch" (e.g., midnight, January 1, 2000 UTC)
- A time can be expressed to the user as a Date + T24 (a 24 hour time). But you don’t ever want to work with Dates and 24 hour times outside the front end of your application. We use any number of “calendaring systems” to convert from time to Date+T24 and back again. Calendaring systems handle leap intervals, timezone shifts, regional naming conventions, and alternate calendar naming systems (there are countries that don’t use 12 months, arbitrary time intervals, etc. but have their own names for things). The programmer lets the calendaring system take care of these things.
- A timezone is represented as a named offset from UTC: UTC+00:00, UTC-12:00, UTC+06:00.
- Make no assumptions about ANYTHING.
- And programmers that build websites and applications whose content refers to events that occur across multiple timezones or that occur in different timezones than some users must ALWAYS ALWAYS ALWAYS ALWAYS ALWAYS ALWAYS ALWAYS ALWAYS ALWAYS display timezone indicators next to any and all dates and times.
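In Python terms, the internal-time-vs-display split described above might look like this sketch (the 2000-01-01 epoch is the commenter's choice, not the Unix one, and the function names are made up for illustration):

```python
from datetime import datetime, timezone, timedelta

# The commenter's epoch: midnight, January 1, 2000 UTC.
EPOCH = datetime(2000, 1, 1, tzinfo=timezone.utc)

def to_display(seconds_since_epoch: float, utc_offset: timedelta) -> str:
    """Convert an internal time (seconds since EPOCH) to Date + T24 for one zone."""
    tz = timezone(utc_offset)
    return (EPOCH + timedelta(seconds=seconds_since_epoch)).astimezone(tz).isoformat()

def from_display(date_t24: str) -> float:
    """Parse an ISO date + 24-hour time (with its offset!) back to seconds since EPOCH."""
    return (datetime.fromisoformat(date_t24) - EPOCH).total_seconds()

s = from_display("2018-09-27T14:00:00-04:00")
print(s)                                     # internal representation: plain seconds
print(to_display(s, timedelta(hours=-4)))    # round-trips back to the display form
```

Note that `isoformat()` on an aware datetime always appends the offset, which lines up with the ALWAYS-display-a-timezone-indicator rule: the display string carries its zone with it.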
Have we done Jim the Fish yet?
(The worst thing about Microsoft's date-time is that it uses floating point, and doesn't (or didn't) have any protection against rounding errors during calculations, so you'd end up with two times an atomic fraction of a second apart.)
“Don’t I have a 2pm appointment?”
“No, your next one is at 2.00000000000001pm.”
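The complaint is about the OLE Automation date format, which stores a timestamp as a floating-point count of days since an epoch. A quick illustration of the drift (this is generic binary-float behavior, not Microsoft's actual code):

```python
# An OLE-style date is floating-point days since an epoch.
# Adding 0.1 day (2.4 hours) ten times should give exactly one day,
# but 0.1 has no exact binary representation, so the errors pile up.
t = 0.0
for _ in range(10):
    t += 0.1            # ten 2.4-hour increments

print(t == 1.0)         # False: t is 0.9999999999999999
print((1.0 - t) * 24 * 3600)  # the drift, expressed in seconds
```

An "atomic fraction of a second" indeed: tiny, but enough to make two timestamps that should be identical compare unequal, which is exactly the 2.00000000000001pm appointment.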
It’s weird how programmers get accused of ignorance when they’re generally far more clueful about these subjects than the average person, if for no other reason than they are the ones accustomed to dealing with them in a rigorous way.
But I guess “Bad assumptions by people who lack the proper mentality to be programmers” doesn’t get as many clicks.
There’s a list of “Falsehoods Programmers Believe” that covers not just Time, but also Names, Languages, Gender, Families, Addresses, Phone Numbers, and Email Addresses.
Don’t forget that DST start/end are different in different countries!
DST isn’t always +1 hour in the spring, -1 in the fall, it’s the opposite in places in the southern hemisphere which observe it.
In fact, the 1 hour thing isn’t universal either. During WWII the United Kingdom was on British Double Summer Time (UTC+2)
Additionally, it’s not always 1 hour. Look up Lord Howe Island, Australia.
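Python's `zoneinfo` (3.9+) can show the Lord Howe oddity directly, assuming IANA tz data is available on the system:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

lh = ZoneInfo("Australia/Lord_Howe")
summer = datetime(2018, 1, 15, 12, tzinfo=lh)  # austral summer: DST on
winter = datetime(2018, 7, 15, 12, tzinfo=lh)  # austral winter: DST off

print(summer.utcoffset())                       # 11:00:00
print(winter.utcoffset())                       # 10:30:00
print(summer.utcoffset() - winter.utcoffset())  # 0:30:00 -- a half-hour DST shift
```

So both falsehoods at once: DST is "on" in January there, and the shift is thirty minutes, not an hour.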
“rigorous way”
But in a dangerously reductive way, too. (Just like all the other things alluded to, like names.)
I’ve had programmers argue with me that we not only need to get rid of leap seconds, but time zones entirely (I forget if it was just for individual countries or the entire world), because they were so problematic for programmers. I mean, sure, let’s destroy the ability of our time-keeping system to deal with all the issues for which we developed a time-keeping system in the first place just to make things easier for programmers.
Um. No.
I live in the southern hemisphere and can assure you that we (mostly, apart from the exceptions) go +1 in spring and -1 in autumn
…that makes even less sense than dst in the northern hemisphere O.o
Did you understand the context for leap seconds? I suspect they were talking about how leap seconds are implemented, which is a kludge to keep bad software from breaking. It makes me physically angry.
Computers usually count time as the number of seconds or milliseconds since a reference date, such as January 1st, 1970 at midnight UTC (at sea level). People wrote software for decades assuming you could always divide this value by fixed factors to extract the seconds, minutes, hours, days, etc. in calendar time, without accounting for leap seconds. We insert leap seconds on the days they occur, but not directly into this count, so the old, naive software doesn't break. We also can't add them to the calendar time, because that would break software that can't handle a 61-second minute. What some systems do instead (Google's "leap smear", for instance) is smear the leap second across an entire day, making every second on those days a tiny fraction longer than a real second.
Software is broken at both ends - at the low level, where absolute time is counted, and at the high level, where localized calendar time is displayed, and so we work around those bugs by hacking the middle.
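The arithmetic of a 24-hour linear smear (the scheme Google documents; using exact fractions here so nothing rounds) is simple to check:

```python
from fractions import Fraction

DAY = 86_400                       # seconds in a normal day
stretch = Fraction(DAY + 1, DAY)   # each smeared "second" lasts this many real seconds

# Over the whole smeared day, exactly one extra real second elapses:
print(stretch * DAY - DAY)         # 1

# Halfway through the day, smeared clocks trail real time by half a second:
print((stretch - 1) * (DAY // 2))  # 1/2
```

Each smeared second is about 11.6 microseconds too long, which no displayed calendar time and no naive epoch arithmetic ever has to notice; that invisibility is precisely the hack-the-middle workaround being described.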
Lies programmers believe about calendars
Saw the headline, thought it was going to be like this:
- Programmer looks at calendar
- Sees deadline is 10 days away
- “Ah, I still got plenty of time.”
- Navigates to
bbs.boingboing.net