Why is Wednesday, November 17, 1858 the base time for OpenVMS? (stanford.edu)
188 points by kristianp on Jan 7, 2020 | 67 comments



As found here [1] but variations are all over the net.

In 1998, a programmer who had been working on Y2K fixes started to get anxious because he couldn't believe how pervasive the problem was. He switched from company to company trying to get away from it, but everywhere he went he became regarded as the Y2K expert and immediately became the team lead for that company's Y2K contingencies. He finally had a nervous breakdown, quit his job, and decided he wanted to be knocked unconscious when the Y2K actually came about.

A month before Y2K, he was put into an artificial coma and cooled down to near-cryogenic temperatures on easily sustained long-term life support.

Unfortunately the life support notification system had a Y2K bug, and no one revived him for 8000 years.

Finally he was found and revived. He woke up, and saw himself surrounded by lots of glass, light, stainless steel, and tall beautiful people in white robes. He asked if he was in Heaven.

They replied, "No, this is Chicago. But actually, it's a lot like Heaven to someone like you."

"Someone like me?"

"You are from the 20th century. Many of the problems that existed in your lifetime have been solved for thousands of years. There is no hunger and no disease. There is no scarcity, or strife between races and creeds."

"What year is it now?"

"Yeah, about that - it's the year 9,998. You see, the year 10,000 is coming up, and we understand you know something called COBOL?"

[1] https://www.reddit.com/r/ProgrammerHumor/comments/3aakb8/the...?


The saddest Demolition Man ripoff ever.


Worth a read:

> Note that the OpenVMS time display and manipulation routines allow for only 4 digits in the 'YEAR' field. We expect this to be corrected in a future release of OpenVMS sometime prior to 31-DEC-9999.


One of my favorite touches from LongNow.org is that all the dates on their website render in 5 digits:

> The Long Now Foundation was established in 01996 to foster long-term thinking and responsibility in the framework of the next 10,000 years.
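
The padding itself can be reproduced with ordinary zero-padded number formatting, e.g.:

    print(f"{1996:05d}")   # -> 01996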


This website looks like a cult's homepage


Long Now was founded by Stewart Brand, the environmentalist behind the Whole Earth Catalog: https://en.wikipedia.org/wiki/Whole_Earth_Catalog

While they do have some active projects, like the 10K-Year Clock, it's mostly just a series of monthly-ish talks with a variety of ecologists, engineers, etc., themed around long-term thinking. It's all quite grounded.


It is pretty much cult-like, and the rabbit hole gets deeper the longer you look. Highly recommend going to the related bar if you're ever in SF: https://theinterval.org/


Some might say the best way to preserve knowledge and culture for the long term would be to establish a successful cult. It seemed to work for the Egyptians and the Christians (experiment still ongoing). Just look at how we count years in the Western world.


Can you be more specific about this feedback? What makes it feel cult-like to you?


My guess is because it seems to share people and terminology with some 60s and 70s counterculture and new-agey stuff.

I say that with total respect for said movements, but I see why cynical analysis would say cult-like. And there are a few people and groups in similar categories that are cult-like.


The most prominent feature on that page is labelled with "become a member starting at $8/month".

It's reminiscent of an MLM page, and MLMs are arguably (hell, demonstrably) cult-like.


I read it as more of a PBS tote-bag / Patreon kind of thing. MLMs tend to promise fantastic riches, which LongNow certainly does not. (Not a member, but I've been following their talks for roughly 15 years now.)


Salvation or triple your money back!


Praise "Bob" or kill me!


For those not in the know: MLM = Multi-level marketing, also known as a "pyramid scheme".

https://en.wikipedia.org/wiki/Multi-level_marketing


I think the "antiquities" theme, though appropriate, plays into it.

Mainly, the "graying old man deep in thought surrounded by beams of blue light" really sets it off though.


They have just enough time to schedule meetings to discuss it, code it, review it, then deploy it. Except they'll discover on 15-DEC-9998 that there's a critical bug, and it'll be known as Y10K. All of the original developers will have been long gone, so there's no way of getting them involved. An entire new development environment will be created so they can run a client based language as a server to handle the new date format


Don't forget the 64-bit Unix time rollover, that'll be a big one. 584 billion years from now, the 2^64-second counter will roll over and just cause all sorts of mess.
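
A rough check of that figure (assuming a 365.2425-day year):

    # 2**64 seconds expressed in years: about 5.84e11, i.e. roughly 584 billion.
    print(2**64 / (86400 * 365.2425))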


Which means that 64-bit time_t is a somewhat-ultimate "lossless and future-proof archival format": you could use it to successfully round-trip-encode a reference to any historical or future second between the creation of our universe, and the last time life will be sustainable for carbon-based lifeforms in our universe.

Mind you, presumably, some civilizations in our universe will go on beyond that. Referencing this† chart, silicon-based life (e.g. maybe even us humans, if uploaded into computers) might be able to survive about half-way into "the Degenerate Era", before the material substrate we run on just radiates away through proton decay. Representing the last second life could exist would require a continuum of 4e32 seconds, or 108 bits.

† https://en.wikipedia.org/wiki/Graphical_timeline_from_Big_Ba...

And also keep in mind, computers (including any silicon-based life-forms) think much faster, and so are in want of much more fine-grained measurements than our unit of "seconds" to place events on their own timelines. We already use nanosecond timestamps for our computer-system-internal log events (which would add a 30-bit fractional part to our 108-bit timestamp, to create a combined 138-bit fixed-point representation.)

But let's be "safe." We don't know how fine-grained we'll need to get! We might want to record data about the first events of our universe, for example. I would think that a "really ultimate", future-proof, archival format for time record-keeping, would probably be denominated in Planck time units.

Interestingly, counting in Planck time units instead of nanoseconds, "only" requires 144 bits for the fractional part!

108 + 144 = 252. Or, rounding up, 256. We could represent any instant of time we could ever care about, in fixed-point encoding††, in a 256-bit word.

†† (It wouldn't really be fixed-point encoding. It'd just be an integer number of Planck time units since the Big Bang. But you can right-shift by 144 to get something close to a count of seconds, which is useful.)

Finally: you still want more? Any time between the creation and heat death of the universe, even though nothing will be around to experience it? Well, that'd be 333 bits worth of seconds. 333 + 144 = 477, which would fit a 512-bit machine word.
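
As a quick sanity check of the bit arithmetic above, here is a small sketch using the figures quoted in this comment (they are the comment's assumptions, not authoritative physical constants); the results land within a bit of the counts given:

    import math

    PLANCK_TIME = 5.39e-44            # seconds per Planck time (approximate)

    def bits_needed(ticks):
        """Bits required to distinguish `ticks` distinct values."""
        return math.ceil(math.log2(ticks))

    print(bits_needed(1e9))                 # nanosecond fraction of a second: 30
    print(bits_needed(1 / PLANCK_TIME))     # Planck-time fraction of a second: 144
    print(bits_needed(4e32))                # seconds to the end of the Degenerate Era: 109 (~108 above)
    print(bits_needed(4e32 / PLANCK_TIME))  # whole timestamp in Planck units: 253 (~252 above)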

One could assume, then, that any Matrix-like universe simulation substrate will have a 512-bit Time Stamp Counter register.

...presuming, that is, that it was built to simulate a universe like our own, and not one that has finer-grained Planck units or which goes on for longer. :)

----

Does all of this have any use? Maybe! If we could figure out precisely what time it is in some objective sense—how many Planck time units have passed here in our reference frame since the creation of the universe—that might be a useful "handshake format" to describe "when" something happened to aliens, when building things like Voyager's Golden Record. (After all, we can teach them about our calendar, but how do we describe+transmit when our calendar began in a way they can measure against in their reference frame, in order to know how to do the math to translate dates back and forth?)


    After all, we can teach them about our calendar, but how do we describe+transmit when our calendar began in a way they can measure against in their reference frame, in order to know how to do the math to translate dates back and forth?
Could this be done by sharing the distance between objects in space? No clue how accurately we could measure it, but if the message said "year 0 was when Earth was X light-years from Andromeda", and they can calculate how far apart those are now, plus the rate of expansion, they might be able to calculate when year 0 was.


> One could assume, then, that any Matrix-like universe simulation substrate will have a 512-bit Time Stamp Counter register.

Maybe, though you can bet they don't have to deal with leap years and leap seconds. Let alone having to adjust the clocks because an earthquake changed the speed of the Earth's rotation ever so slightly.

Makes you wonder what they would use as a reference point of time and what units of measure. Pulsars are great, but on the scale of the universe, maybe not that great.


Slow the definitional ticks down to be independent of (slowly but increasingly divergent from) Planck time and we can morph it to create as much time as we "need".


But any aliens' frame of reference is almost certain to be non-inertial with respect to our own :) That kills any notion of simultaneity.


> That kills any notion of simultaneity.

That's not actually true. Assuming there's a distinguished "initial" event [the big bang], and your universe does not contain any closed timelike curves [citation needed], you can define a Longest Proper Time from the initial event to any other event as the (obviously) longest proper time that any object or test particle could experience along a (timelike) path from the initial event to the later event.

Simultaneous events have equal LPTs and are always spacelike-separated (if there were a timelike path from A to B, then B.lpt >= A.lpt + len(path) > A.lpt, so B.lpt != A.lpt and they aren't actually simultaneous).
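
For concreteness, one way to write the definition sketched above (the notation here is mine): the LPT of an event E is the supremum of proper time over future-directed timelike paths from the initial event O to E, and two events count as simultaneous exactly when their LPTs agree.

    \mathrm{LPT}(E) = \sup_{\gamma :\, O \to E} \int_{\gamma} d\tau ,
    \qquad
    A \text{ simultaneous with } B \iff \mathrm{LPT}(A) = \mathrm{LPT}(B)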


You can still line them up with many points of reference.

Would be pretty hard on a human timeline though.


This is why I love HN.


> An entire new development environment will be created so they can run a client based language as a server to handle the new date format

Is this in reference to something? Did some Y2K patches partially consist of putting Node.js-based gateways in front of mainframes to translate dates back and forth?


:%s/Node.js/Java/g

All hail CICS Gateway!!!


I'm pretty sure there was no Node.js in 1999/2000.


Not literally, but Netscape released server-side JavaScript a few months after adding it to Navigator: https://en.wikipedia.org/wiki/JavaScript#Server-side_JavaScr...


By 15-DEC-9998 Earth Corp will long since be run by extra-terrestrial hedge funds which forced their slav^employees on our planet to use their home planet's 2710.7-days-per-year time system...


> OpenVMS should have no trouble with time until: 31-JUL-31086 02:48:05.47.

I'd suggest others working on OpenVMS check their C compilers before forgetting about the 2038 bug.

It's fun when OpenVMS shows up here! Any other younglings fiddling with it?


VMS also "popularized" the microfortnight as a unit of time. In SYSGEN, the TIMEPROMPTWAIT value was specified in microfortnights. That was how long the system would wait at the console during a boot if it didn't have a valid time in its system clock. Presumably your attentive operator would enter one during this interval.
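
For reference, a microfortnight is one millionth of a fortnight, so converting a TIMEPROMPTWAIT value to seconds looks roughly like this (a quick sketch, not actual VMS code):

    MICROFORTNIGHT_SECONDS = 14 * 24 * 60 * 60 / 1_000_000   # 1.2096 seconds

    def timepromptwait_seconds(microfortnights):
        """Approximate console wait at boot for a given TIMEPROMPTWAIT value."""
        return microfortnights * MICROFORTNIGHT_SECONDS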


Insurers and financial institutions had to deal with quite a few of their Y2K problems well ahead of the year 2000. Figuring out rates and percentages on a 30-year loan would have you calculating dates after 2000 as far back as 1970, when Unix was still being created.

I wonder how many of these companies started seeing problems back in 2008?


Lots. It wasn't catastrophic at the time -- in most cases. (There were a few!)

I remember when HP was trying to test their DC in Germany: they rolled the clocks on the computers (and the VMs) forward, and then back, but the DC locks, well, locked, and the staff couldn't get in to access the computers to resolve the RT logging info spewing out. Fun times.

Many of the Y2K problems were handled annually, monthly, weekly, daily, hourly, minutely and secondly well in advance of "the deadline".


I cut my teeth on a VAX/VMS system when I was younger and always had a fondness for it. I have an AlphaServer system at home with OpenVMS on it, and occasionally play around with it.


Anyone who's played the more recent Fallout (non MMO) games is familiar with a few commands, interestingly.


Oh! Nice find!

https://fallout.fandom.com/wiki/Terminal_commands

Have to revisit that, was a quarter my age when that came out.


I googled around and it seems OpenVMS does in fact have a 32-bit time_t. Unfortunate, given that per the article it seems to use 64-bit quantities in the layer below.


But the article says:

> Given this base date, the 100 nanosecond granularity implemented within OpenVMS and the 63-bit absolute time representation (the sign bit must be clear), ...

It's unfortunate that an article about time isn't itself dated or timestamped!

I found a google groups pseudo-newsgroup comp.os.vms discussion dated 2017 which shows that OpenVMS 8.3 still had 32-bit time_t: https://groups.google.com/forum/#!topic/comp.os.vms/QCms9ZzR...

Pretty sure the posted article is wrong about 63-bit because at the intro it clearly says, "All Versions" yet here we see that as of 8.3 it was still 32-bit. OTOH, in that newsgroup article they show OpenVMS producing a time of year 2106, which is beyond the 32-bit range of the unix epoch. So I'm pretty confused.


> I found a google groups pseudo-newsgroup comp.os.vms discussion dated 2017

Saw that too in my googling, fair warning that several commenters on there seem misinformed. (The code sample at the start is incorrect and some comment rebuttals seem to misunderstand or get it partially correct. In particular some people seem to think time_t is a struct with int components or that sizeof(int) has something to do with time_t.)

> Pretty sure the posted article is wrong about 63-bit because at the intro it clearly says, "All Versions" yet here we see that as of 8.3 it was still 32-bit.

The point is that the kernel doesn't keep the time as time_t. It has a different epoch and unit, and the C runtime needs to convert it to time_t. So the fact that 63 bits (64 if you count the sign bit) are used internally says nothing about time_t at the libc layer.

Windows does something similar: there, the epoch is 1/1/1601 and the unit is 100 ns, stored as 64 bits.
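
A minimal sketch of the conversion the C runtime then has to do, using the well-known offsets between those epochs and the Unix epoch (the helper names here are made up for illustration):

    TICKS_PER_SECOND = 10_000_000                   # 100 ns ticks per second
    VMS_EPOCH_TO_UNIX_SECONDS = 3_506_716_800       # 17-NOV-1858 -> 1970-01-01
    WINDOWS_EPOCH_TO_UNIX_SECONDS = 11_644_473_600  # 1601-01-01 -> 1970-01-01

    def vms_ticks_to_unix(ticks):
        """Convert a 100 ns count since 17-NOV-1858 to Unix seconds (time_t)."""
        return ticks // TICKS_PER_SECOND - VMS_EPOCH_TO_UNIX_SECONDS

    def filetime_to_unix(ticks):
        """Convert a Windows FILETIME (100 ns since 1601-01-01) to Unix seconds."""
        return ticks // TICKS_PER_SECOND - WINDOWS_EPOCH_TO_UNIX_SECONDS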


> The point is that the kernel doesn't keep the time as time_t.

Does the kernel (linux or VMS) care about absolute calendar time at all? time_t is a strictly userspace construct is it not?


Short answer is yes it does.

One example where the kernel needs to care is that the filesystem maintains the last write time on files. I am sure there are many others.

And if the kernel doesn't store a time somewhere, where would it come from? Userspace cannot fabricate it from nothing. Even if the kernel used some "relative" thing, where does the base offset come from? However you answer that, keep in mind that all processes have to agree, writing it needs to be privileged, it might need to read and write a hardware clock, and so on: all problems that storing it in the kernel solves.

Also keep in mind that time_t is not a "calendar" construct, it is an integer, and it's not time zone specific. User mode can and does build calendars and time zones on top.


I would hazard a guess that the last POSIX/SuS that VAX/VMS was coded against still specified a 32-bit time_t, which would likely have polluted their proprietary userspace. The OpenVMS guys were shooting for compatibility, so it stuck. Legacy is a PITA.


Windows uses January 1, 1601.

https://en.wikipedia.org/wiki/System_time


This ever-so-slightly simplifies the calculations. The pattern of dates has a period of 400 years, and your first-day-of-year calculation can ignore the divisible-by-400 leap-year rule if you start in the year right after one (1601 follows 1600).


If you never need to represent dates after 2001, sure.


The calculation for 1/1/2001 is the same as the calculation for 1/1/1601 plus the number of days in 400 years. So it doesn't have to take into account the special case of February 2000. (Other than in the number of days in 400 years, but that's a constant.) I've written the code. It's a tiny bit simpler.


The "divisible by 400" rule means that 2000 wasn't a special case, in the way that 1900 and 2100 are.
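
For reference, the rule being discussed, as a neutral sketch (not either commenter's actual code); the Gregorian pattern repeats every 400 years, i.e. 146097 days:

    def is_leap(year):
        """Gregorian rule: every 4th year, except centuries not divisible by 400."""
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    # 400 years of the Gregorian calendar always contain 97 leap days.
    assert 400 * 365 + sum(is_leap(y) for y in range(1601, 2001)) == 146097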


Well, it also has the DATE type, which uses December 30, 1899 as its epoch.

https://docs.microsoft.com/en-us/cpp/atl-mfc-shared/date-typ...


A little off topic, but what kind of commercial use does VMS have at the moment? I know there are a lot of big-iron IBM systems around, but I haven't heard much about VMS (nor have I ever used it). What hardware does it run on? Who is using it? And for what purpose?


IIRC, the more vital & boring an industry is, the more likely that it still has VMS systems hanging around. The German post office & rail system were very large users until at least fairly recently.

There was enough interest in VMS such that HP spun off development to a startup near the old DEC HQ outside Boston, who are also working on the port to modern x86-64 hw.

Ah, apparently the first boot on x86 has occurred:

http://www.vmssoftware.com/updates_port.html


It runs on VAX, Alpha, and Itanium! Woo!

Most people haven’t seen a VAX. I saw them at my university and they were used for various administrative tasks and recordkeeping, but I never used them. I don’t know if they’re still there. The university only had a couple people who knew how to use them.

Generally you will see them hang around, powered on, for long periods of time running some software which has been working for years. You don’t mess with them because they work fine, and messing with a working system will more likely break it than anything else. Every once in a while, you’ll see a job posting for people with VMS experience but it’s tough, because everyone with VMS experience is retired.

See: https://news.ycombinator.com/item?id=20209869

Another company I worked at recently had a VAX, but it was part of the company’s museum!


My university had a VMS cluster (some VAX, some Alpha) available to the students at least until I graduated in 2010. Of course, nobody used it for web browsing or discussion forums any more, but it did provide the most reliable online interface to the class registration system. On registration day, it would be common to see over 200 simultaneous users on even the old VAX systems. We found that the web frontend fell over frequently, but once you were into the VAX, you were set.

At one point, I personally acquired both a MicroVAX and an Alphastation and installed VMS on both. I found it a very boring system, in that it seemed like something you'd configure for a task and it would just do that task forever--I doubt the sysadmins at my school spent a lot of time maintaining the VMS cluster, but it kept on ticking for years!


In the spirit of repurposing, there is the Vaxbar:

http://toyvax.glendale.ca.us/~vance/vaxbar.html

A VAX 11/780, where the Boot switch operates a blender. And the guy doesn't even drink!


Every once in a while, I go on eBay, search for VAXstations, and fantasize about buying one. Maybe one day I'll actually pull the trigger.

There's one for $650 now that says it's fully working and comes with a licensed copy of VAX/VMS. There are also a few in the $150-200 range but I don't know if they have an OS on them or not.


It was 20 years ago that I bought my first VAXstation 3100 on eBay for under $100. Later got a VAXstation 4000 and an AlphaServer 2100. Got rid of them all during a period of downsizing. Along with some nice terminals and other DEC equipment. Kinda regret that now.


I believe you can get a license from HP for non-commercial use for free.

https://www.openvmshobbyist.com/faq.php


I used to have a VAXstation 3100 on my desk. I was mostly porting software from VMS to Windows NT 3.5 but worked on optimizing the VMS version too.


They are still used in a lot of “mission critical” industries that would have been computerized in the 1970s/1980s. In the past 5 years I’ve seen them in finance, some government sectors, and in quite a few in hospitals.


I've heard first-hand accounts of them being used in manufacturing automation up until fairly recently, particularly in pharmaceutical manufacturing. But now a lot of that has moved to emulators running on standard x86 hardware.

You can imagine the resistance to shutting down a manufacturing line that's been running fine for decades simply because the hardware is old. A lot of expense goes into getting everything cleared with the FDA, so it's not a trivial matter to replace hardware and software.


> But now a lot of that has moved to emulators running on standard x86 hardware.

Not exactly:

https://logical-co.com/

Modern PCs cannot even drive a parallel port with the correct timing. PDP/VAX CPU, RAM, and storage are emulated, but Q-Bus and Unibus interfacing needs to be done with custom hardware.


A little off topic, but if you haven't read this story, it's kind of amusing (relevance is date functions in Excel)

https://www.joelonsoftware.com/2006/06/16/my-first-billg-rev...


    > The three cycles are 15, 19, and 28 years long.
Where did these numbers come from?


28 (solar cycle) × 19 (lunar cycle) × 15 (indiction cycle) = 7980 years

https://en.wikipedia.org/wiki/Julian_day

The indiction cycle seems to have been a Roman Egypt property tax assessment cycle, later used to date medieval documents throughout Europe. I guess Joseph Scaliger must have been looking for notes about astronomical events in archival documents, and wanted to get them all on a consistent calendar.


https://www.britannica.com/science/calendar/Time-determinati...

Looks like 28 years is how often the calendar repeats, using their count of leap years. The other two seem to be tax and census related?

What an odd way to run a calendar. Every 7,980 years the tax man and the census man will both arrive on the same Monday? Gosh I hope I still remember the next time that happens. No actually I don't. Being alive for 8000 years sounds like torture.


I used a base year of 2000 in my compact date formats, partly because it allows hundreds of thousands of years, and partly because years closer to zero can be stored in fewer bits:

https://github.com/kstenerud/smalltime

https://github.com/kstenerud/compact-time
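
The general trick (a sketch of the idea only, not the actual smalltime/compact-time wire format): store the year as a signed offset from the base and zigzag-encode it, so years near 2000 become small integers that a varint can hold in fewer bits.

    def zigzag(n):
        """Interleave signed values: 0, -1, 1, -2, 2 ... map to 0, 1, 2, 3, 4 ..."""
        return 2 * n if n >= 0 else -2 * n - 1

    def encode_year(year, base=2000):
        # 1999 -> 1, 2001 -> 2, 1858 -> 283: years near the base stay tiny.
        return zigzag(year - base)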



