The thought of unusual Unix times brings back memories... or nightmares...
For several years, the company I worked for used a multi-platform GUI library called C++/Views. We had the code for the library itself and compiled it for each platform we had to support. It came with a handy, yet annoyingly buggy, WYSIWYG GUI-creator tool.
Wouldn't you know, the WYSIWYG tool stopped working in September 2001. C++/Views was no longer supported, and we didn't have source code for the WYSIWYG tool. A coworker, Brian, and I experimented with it for a while and discovered we could use the tool only if we set the system clock back to before September 8, 2001. So that became every developer's workaround: temporarily setting the clock back to use the darn tool.
Good thing we use the decimal system or we would never have noticed this. Wake me up when it rolls over the 2^31 limit; that should be an interesting day. Much more interesting than y2k.
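If anyone wants to check the actual dates, here's a minimal sketch in plain C (standard library only; the helper show() is just a name I made up) that prints the UTC calendar moment behind both magic numbers:

/* A minimal sketch: print the calendar dates behind the two
 * "interesting" Unix times in this thread. */
#include <stdio.h>
#include <time.h>

static void show(long long seconds)
{
    time_t t = (time_t)seconds;
    char buf[64];
    /* gmtime() breaks the raw second count into a UTC calendar date */
    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", gmtime(&t));
    printf("%lld -> %s\n", seconds, buf);
}

int main(void)
{
    show(1234567890LL);  /* the decimal curiosity: 2009-02-13 23:31:30 UTC */
    show(2147483647LL);  /* 2^31 - 1: 2038-01-19 03:14:07 UTC */
    return 0;
}

The second one, 2038-01-19 03:14:07 UTC, is the instant a signed 32-bit time_t runs out.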
We would never have noticed what? I mean, 1234567890 only looks special because of a few cultural biases we share (like using the decimal system). If we were used to, say, a ternary system, this quantity would be unremarkable while other values would stand out instead.
And I don't see what bearing noticing this has on anything, besides a few posts about it on the internet and a lot of people feeling that this is "special". My first reaction was that this is cool, but after careful consideration I realize how pointless it is. Let's get excited over actual events. News.
Well, since that isn't until the year 2038, there isn't exactly a rush to fix it (I'd be surprised if 32-bit software/hardware were still in use by then). Something tells me it won't be an issue.
Thankfully a 64-bit integer clock gets us past the year 292 billion.
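The 292 billion figure checks out, roughly: (2^63 - 1) seconds divided by about 31.6 million seconds per year. A quick back-of-the-envelope in C, if anyone cares to verify:

/* Back-of-the-envelope: how many years does a signed 64-bit second
 * counter cover? Uses the average Gregorian year of 365.2425 days. */
#include <stdio.h>

int main(void)
{
    const double max_seconds = 9223372036854775807.0;   /* 2^63 - 1 */
    const double secs_per_year = 365.2425 * 24 * 3600;  /* ~31,556,952 */
    printf("about %.0f years\n", max_seconds / secs_per_year);  /* ~292 billion */
    return 0;
}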
That's the same reasoning everyone used for y2k. There was no way that the same software and hardware being used in 1980 would still be used in 2000... and then it was.
But the difference is we have (hopefully) learned from our y2k experience.
Also, commodity hardware/software is much more common nowadays than before 2000. Back then you were reluctant to get rid of something you paid millions for because it was custom hardware running a custom OS with custom COBOL on it.
Nowadays you have commodity hardware running continuously updated OSes with custom Java on it. Thankfully Sun can update the JVM and MySQL, and hopefully your date problems are solved.
If we'd truly learned something, all time values would be 64 bits. Which isn't hard: every major 32-bit system supports 64-bit values natively in C, in only a few instructions.
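To illustrate the point (a sketch assuming a C99 compiler; my_time64_t is a made-up name for illustration, not anyone's actual API):

/* C99's int64_t exists natively even on 32-bit targets,
 * so a 64-bit time value is cheap. */
#include <stdio.h>
#include <inttypes.h>  /* int64_t, INT64_C, PRId64 */

typedef int64_t my_time64_t;  /* hypothetical 64-bit time type */

int main(void)
{
    my_time64_t t = INT64_C(2147483648);  /* one second past the signed 32-bit limit */
    printf("still representable: %" PRId64 "\n", t);  /* no overflow in 64 bits */
    return 0;
}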
Or do you mean we learned that planes won't fall out of the sky due to date problems, and we shouldn't overreact next time?
I'm quite sure that 32-bit software will be in use for a long time to come, and will still be in use in 2038, primarily because the ever-increasing use of machine virtualization lets you defer updating software to run on new systems. The budget-conscious may ask: why update when you have perfectly working software on a VM that can just be moved from hardware to hardware?
I think I am starting to believe in Numerology. I don't know what this means yet, but I will get to the bottom of this one day (hopefully before 9999999999).
That C++/Views story sounds like a nasty billennium (1,000,000,000-second) bug. Fun, fun.