Hacker News

This is a great illustration of why, once you've got a user base of any size, backwards-compatibility needs to be one of your touchstones.

(eg. if you add {macro}, continue to support !macro as well)
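A minimal sketch of that idea in Python, assuming a hypothetical assembler front end (the names and syntax here are illustrative, not xkas's actual grammar): recognize the new `{macro}` form while continuing to accept the legacy `!macro` form.

```python
import re

def parse_macro_invocation(line):
    """Return the macro name from a source line, or None if the line
    is not a macro call. Accepts both the new {name} syntax and the
    legacy !name syntax, so old sources keep assembling unchanged.
    (Hypothetical syntax for illustration only.)
    """
    stripped = line.strip()
    new_style = re.match(r"\{(\w+)\}", stripped)
    if new_style:
        return new_style.group(1)
    old_style = re.match(r"!(\w+)", stripped)
    if old_style:
        # Legacy form, kept for backward compatibility.
        return old_style.group(1)
    return None
```

With this shape, `parse_macro_invocation("{reset}")` and `parse_macro_invocation("!reset")` both resolve to the same macro, and ordinary instructions fall through to the regular parser.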

Sometimes you do find that you've painted yourself into a corner, though.




Certainly, backward compatibility is critical and has always been a weakness of mine. I've always been an idealist there.

But in xkas' case, this was code from before I had really learned how to program, let alone design a proper parser. This would be like asking for strict compatibility between Linux 0.01 and Linux 3.0.

I understand how SMWC became trapped in a cycle of using xkas v06, but any new projects really have no sane reason to be using it anymore. And I'm not even saying, "use bass instead!"; there are dozens of cross-assemblers that work much better than it ever did (it would seem that writing an assembler is a rite of passage for ROM hackers).


I wasn't trying to comment directly on the specifics of xkas - I just think this is a perfect illustration of how backwards-incompatibilities that mean users can't just upgrade and keep working introduce friction against upgrading, and once you get too much friction, significant portions of your userbase will stop upgrading at all. I think this lesson is really valuable in a wider context than xkas.

Since you mention Linux, this is why Linus has the "no regressions" policy - if something works on kernel N then it should also work on kernel N+1, so that people can upgrade their kernel and not worry about something breaking in userspace. (I think there is actually a very good chance that an a.out binary compiled for Linux 0.01 would still work today.)


> I just think this is a perfect illustration of how backwards-incompatibilities that mean users can't just upgrade and keep working introduce friction against upgrading, and once you get too much friction, significant portions of your userbase will stop upgrading at all.

I definitely understand and respect your point.

In fact, I've lost about 80% of my userbase (about 80,000 users) to older versions (and forks galore) of my emulation software. And that's mostly due to lacking backward-compatibility with older game ROM formats.

I've since made efforts to accommodate using older media types, but the damage is done, so to speak.

> I think there is actually a very good chance that an a.out binary compiled for Linux 0.01 would still work today.

... seriously?? Jesus.

I don't even see how it's possible to produce good software that way. We don't know what we really want until we iterate over the design a few times. At least, that's the way it is with me. My first drafts have never been anything close to my ultimate designs.

You should see how much better my GUI abstraction library (think wxWidgets, but more like Qt, with reference-counted memory management and C++11 features) has gotten since the first version. I couldn't imagine being stuck with that first version in the name of backward compatibility. I'd rather be the only user of it than have that first version be it, and be as popular as wxWidgets is today.

I mean, I'm very much ashamed of xkas v06, and wish my name wasn't attached to it.


> I don't even see how it's possible to produce good software that way. We don't know what we really want until we iterate over the design a few times. At least, that's the way it is with me. My first drafts have never been anything close to my ultimate designs.

After you've been burned a few times, you start to get a feel for building interfaces that can be maintained while the entire design behind them is switched out. If you do need to change the entire interface, sometimes you end up supporting old interfaces with a shim layer that translates them into calls to the new interfaces.
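That shim pattern can be sketched in Python with hypothetical names (nothing here is a real API from the projects under discussion): the old entry point survives as a thin wrapper that translates its arguments into calls to the redesigned interface, so only one implementation has to be maintained.

```python
def load_rom_v2(path, *, region="auto", patches=None):
    """New interface: explicit keyword options. (Hypothetical API.)"""
    return {"path": path, "region": region, "patches": patches or []}

def load_rom(path, region_code=0):
    """Legacy interface, kept as a shim.

    Old callers keep working unchanged; internally everything routes
    through load_rom_v2, so the legacy code path carries no logic of
    its own beyond argument translation.
    """
    region = {0: "auto", 1: "ntsc", 2: "pal"}.get(region_code, "auto")
    return load_rom_v2(path, region=region)
```

The translation layer is small and mechanical, which is what makes this cheaper than keeping two full implementations alive.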

For example, the a.out binary format supported by the original versions of Linux has long since been superseded by the ELF format, because the deficiencies in the old format became too much to bear - but that doesn't stop the latest kernels from being able to support that old format still.


To me, legacy code has cost. Great cost.

You can't just leave old code alone and have the features remain. Any time you go to restructure some code, or add support for some other mode, it ends up affecting this old code, and you have to go and make changes to it as well. Then that old code breaks on a new version of the compiler, so you have more fixes to make. And it increases compilation times, binary sizes, and lines of code. It's another vector for malicious attacks. It's more to keep in your head all at once.

I suppose it's one thing if you have a steward willing to take over, say, the a.out branch of the binary loader. But it's quite another in my case where I'm the sole developer of my software, and no one shows any interest in maintaining the legacy UI, or versions of the emulation core that focus more on performance at the expense of compatibility, so I have to drop that stuff. (Yet it is a bit disappointing when people are willing to fork the entire codebase instead to achieve the same goals, but I digress.)

Again, I agree with what you're saying, and I'd say Linux or Mozilla Firefox or Qt certainly has the manpower for this sort of thing. But as a solo developer, I can see why people would sacrifice this backward-compatibility from time to time.

I'm not sure about your point on getting a feel for interfaces. Certainly, it seems that every year or two, I feel like I've made great strides and am finally on the cusp of stable interfaces. Yet invariably, a year or two later, I look back and think the same thing again. I've come to the conclusion that we never stop evolving and that thinking we've reached perfection is always self-delusion. But maybe I'll get there in the future. I certainly hope so.


> To me, legacy code has cost. Great cost.

Certainly, but it's the kind of cost that comes with having a lot of users, so it falls into the "nice problem to have" basket. If you've got no users, you have a lot more freedom to throw things away.

I understand where you're coming from, because I also do mostly-sole-maintenance on a large codebase, with all the issues that come with a 20-year legacy. One strategy that can work in such a situation, where you can't maintain perfect backwards-compatibility forever, is to boil the frogs slowly - gradually deprecate and remove features over a longish time frame, because a single changed or missing feature is easier for the userbase to adapt to than a whole slew at once (many people probably don't even use the feature and won't notice).
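One common way to stage that slow removal, sketched in Python with hypothetical names: keep the feature working for a release cycle or two, but emit a deprecation warning pointing at the replacement, so users hear about the change before it breaks them.

```python
import warnings

def export_v2(data):
    """The replacement interface. (Hypothetical API.)"""
    return bytes(data)

def export_legacy_format(data):
    """Still functional, but scheduled for removal.

    Callers get a warning naming the replacement for at least one
    release cycle before the function disappears, instead of a
    surprise breakage on upgrade.
    """
    warnings.warn(
        "export_legacy_format is deprecated and will be removed; "
        "use export_v2 instead",
        DeprecationWarning,
        stacklevel=2,  # point the warning at the caller's line
    )
    return export_v2(data)
```

The `stacklevel=2` detail matters in practice: it makes the warning point at the calling code rather than the shim, so users can find the line they need to change.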


The goal is of course to support the legacy interfaces without having to keep using the legacy code behind them.


I remember they even fixed the kernel to make sure some old viruses can still run.

See http://www.computerworld.com/article/2554467/linux/torvalds-...



