
Sounds like any large company - no longer a fast-moving innovation engine, but a customer-preserving, value-calculating, incremental-development shop. Just what you'd expect.



Having dealt with web apps that break my workflow every 2 months and back-end servers that change their API every 3 months, I'm very happy with a company where I can run software (that I still enjoy) 10-20 years later.


The very essence of commercializing web apps works against them. Interoperability, open interfaces, open data structures and, last but not least, reliability seem to be forgotten these days, when the lock-in effect is the only thing keeping people.

I don't want to be held hostage, so web apps and the cloud can kiss my shiny metal behind. I'll keep my data and apps on my own machine until ideas like Sandstorm find a middle way in which I can stay in control of my data.


Don't you think that the fact that webapps have no interoperability, no open interfaces or data structures, nor even a way to run them under a debugger or on your own infrastructure, or even to read the files they create ...

Don't you think that, for companies like Google, Microsoft, etc., this is a feature, not a bug? Microsoft would lose too many customers if they went full-on webapps, but everyone else won't.

And customers like it! Wtf? But it makes sense: no install troubles, no running out of disk space, no losing files (of course, on occasion someone else loses your files, like Amazon did a few months ago), ... But there are many false advantages, too. Webapps are most definitely not virus-proof (just ask any online banking security guy - poor suckers), as a client-side virus can change what's on the screen, or just take over the browser window when you close it, or proxy the webapp, or ...

But if you sell someone a program that works and they install and use it, ... well, that's it. You got paid, but that's the end of it. Get someone to run their business on a webapp, though, and in the next negotiation you have the power to kill the other company's access to their own financial data. Or designs, or ... You've got them by the throat.


You are describing why it is the way it is. Yes, pretty much so. There is a lot of convenience involved. The downside is that it creates mostly invisible dependencies which will break at the worst of times.


20-year-old Linux applications work just fine too, and it's still faster than NT.


> (You'd guess that 40 year old Unix applications could work too without recompilation, though I've never tried it)

40 years takes you to 1976, at which point there's no longer any ISA-level compatibility between the hardware we have now and the hardware Unix ran on then. Basically, nothing we have looks like a PDP-11 to software, and Unix didn't run on much else in the mid-1970s.

(Tidbit: 1976 is when the Lions book came out. Full source code, with apposite and useful comments, for Sixth Edition Unix, passed around as samizdat long afterwards due to Bell Labs enforcing copyright on the Unix source code not long after.)

https://en.wikipedia.org/wiki/Lions%27_Commentary_on_UNIX_6t...


Just the other day I got Bourne's original Bourne shell running on a modern system (as part of Fuzix development). I don't know which version, but it's badged as coming from V7.

I kept finding places which needed adjustment for a modern architecture... and then I looked more closely and saw that they were actually fine. About the only things I needed to do in the end were fix some headers, rename a symbol which was clashing with a modern utility function, and change the directory-reading stuff to use opendir() instead of read().

Everything else just worked on a modern 64-bit system. (It had already been converted from K&R to ANSI C in an earlier pass by someone else.) Good grief, that code is clean. Incomprehensible, but clean.
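(For anyone curious, here's roughly what that last change amounts to - not the actual Fuzix patch, just a sketch. The V7-era code open()ed the directory and read() raw fixed-size entries, which Linux refuses these days, so it has to go through the dirent API instead.)

    /* Sketch of the modern replacement. Old code did fd = open(".", 0)
     * and read() raw directory entries; on a modern system read() on a
     * directory fails, so you use opendir()/readdir() instead. */
    #include <stdio.h>
    #include <dirent.h>

    int main(void) {
        DIR *dp = opendir(".");
        if (dp == NULL) return 1;
        struct dirent *de;
        while ((de = readdir(dp)) != NULL)
            puts(de->d_name);
        closedir(dp);
        return 0;
    }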


With DOS, Win98, and Linux all running in asm.js inside the browser, it shouldn't be long before PDP-11 code can run in there too?

BTW, I'd love to use the original Lotus 1-2-3 on DOS inside asm.js if someone can use their super powers to make that materialize.

I still remember those "/FS" keystrokes after so many years.


Your wish is my command; preloaded with V6 Unix.

http://pdp11.aiju.de/


So you are saying I can install a modern Linux distribution, let's say Ubuntu, and run KDE 2 and Gnome 1.0 applications right away, say by copying over the binary? I don't think so. These won't even run with a recompile, unless you can find the appropriate kdelibs or whatever and even then, I'm guessing it will fail hard.

If you're just referring to, e.g., the GNU tools or whatever, then you'll still need to recompile.


If the binary depends on userland libraries, then duh, of course it'll fail - you're not copying over the entire application. But if it only makes syscalls, then it should still work.
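Roughly like this (my own toy example, nothing from the thread) - statically linked, so at runtime it depends only on the kernel's syscall ABI, which Linux keeps stable:

    /* hello.c - build with: gcc -static -o hello hello.c
     * No shared libraries are needed at runtime; write() goes
     * straight to the (stable) kernel syscall interface, so the
     * binary can be copied to another distro/era and still run. */
    #include <unistd.h>

    int main(void) {
        const char msg[] = "old binary still says hi\n";
        write(1, msg, sizeof msg - 1);
        return 0;
    }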


OC is making a point about one of Windows' strengths: there are a lot of binaries from 20 years ago that will still run just fine on Win 10.


Linux binaries will still run. There's a question of whether or not X has changed its API, but I doubt it, given how old a lot of X applications are. So I guess it should be possible with minimal tinkering.
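For what it's worth, the core Xlib entry points haven't changed in decades; a sketch like this (assuming the Xlib dev headers are installed, built with cc x.c -lX11) would have compiled pretty much the same way in the early 90s:

    /* Open a display, map a bare window, wait for a key press.
     * XOpenDisplay/XCreateSimpleWindow/XMapWindow/XNextEvent are
     * the same calls ancient X applications were written against. */
    #include <X11/Xlib.h>

    int main(void) {
        Display *d = XOpenDisplay(NULL);
        if (!d) return 1;
        Window w = XCreateSimpleWindow(d, DefaultRootWindow(d), 0, 0,
                                       200, 100, 1,
                                       BlackPixel(d, DefaultScreen(d)),
                                       WhitePixel(d, DefaultScreen(d)));
        XSelectInput(d, w, KeyPressMask);
        XMapWindow(d, w);
        XEvent e;
        do {
            XNextEvent(d, &e);
        } while (e.type != KeyPress);
        XCloseDisplay(d);
        return 0;
    }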


Don't get me wrong; I have no problem with that. It's not fair (or very useful) to compare a successful multi-billion-revenue company with a startup.



