Capability-based operating systems must be the future. If they are not, then we are all doomed to continue to exist in a messy world where security problems crop up every minute. Capability-based access controls are one of the best options for getting out of our current mess, but they're also the type of thing that must be implemented very low in the system in order to work.
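To make the idea concrete, here's a minimal sketch (assuming POSIX; the paths are just placeholders) of the difference between ambient authority and capability-style access, using a directory file descriptor as the "capability":

```c
/* Sketch: capability-style file access via a directory fd.
 * Assumes POSIX; compile with e.g. cc -o capdemo capdemo.c */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void) {
    /* Ambient authority: any path the process's user can reach. */
    int f1 = open("/etc/hostname", O_RDONLY);

    /* Capability flavor: first obtain a handle to one directory... */
    int dirfd = open("/etc", O_RDONLY | O_DIRECTORY);
    if (dirfd < 0) { perror("open /etc"); return 1; }

    /* ...then express all further access relative to that handle.
     * A kernel that enforced this at the bottom (FreeBSD's Capsicum
     * does, after cap_enter()) could revoke the ambient open() above
     * entirely, leaving only the descriptors the process holds. */
    int f2 = openat(dirfd, "hostname", O_RDONLY);
    if (f2 < 0) { perror("openat"); return 1; }

    printf("ambient fd=%d, capability-relative fd=%d\n", f1, f2);
    if (f1 >= 0) close(f1);
    close(f2);
    close(dirfd);
    return 0;
}
```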
Hopefully, when we start ripping out *nix before 2038, Fuchsia and other capability-based OSes can take over.
> If they are not, then we are all doomed to continue to exist in a messy world where security problems crop up every minute.
Side-tangent, but security enthusiasts need to calm down on the "world is ending" talk. Those of us who lived through Windows ME, where logging onto IRC basically gave you a 25% chance of having your computer hijacked by some random script kiddie, think it's laughable to say that security is anywhere near as bad as it used to be. Night and day.
> Those of us who lived through Windows ME, where logging onto IRC basically gave you a 25% chance of having your computer hijacked by some random script kiddie, think it's laughable to say that security is anywhere near as bad as it used to be. Night and day.
Those were people just messing around. Annoying and harmful to your local machine, certainly, but the harm was localized.
Security vulnerabilities are now industrialized, with large-scale public attacks and widespread silent compromises used to gather and sell your information for profit. The situation is still bad, it's just bad in a different way.
Individually, computers today are more secure than they've ever been.
But there are zillions more computers, doing more things, with more interconnections than ever before. So the problems are compounded and I think the situation is worse overall.
But there are significantly more devices out there these days that we don't traditionally think of as "computers". Cable boxes/DVRs, webcams, vacuum cleaners, mobile phones, routers; hell, a directory traversal bug was found in a dishwasher.
Many of these devices aren't patched after they're sold, and can be compromised within minutes of being assigned a publicly routable IP - this is what happened with Mirai.
Security is better for desktop computers, but if you include everything with an operating system, it's much, much worse.
But it still feels like the world is ending. Security news should be “someone managed to breach one of the 20 layers that protect you,” not “someone got root, again.”
At the very least you should be able to stay secure if you know what you are doing. But even that’s not enough. You also need to be lucky && uninteresting && not actually using computers.
Security practices may be better now, but applications are a lot more complicated (meaning much more attack surface area) and there's a lot more at stake now.
I sometimes wonder if, in addition to an Ethics class, Computer Science students need to take a Computer Archaeology class. It might not be a bad thing to revisit a lot of the concepts that have been tried but aren't in the mainstream anymore.
Surely then younger generations would learn how Algol and PL/I were used to write OSes, how Xerox PARC and ETHZ managed to write OSes without a single line of C code, and that UNIX wasn't the genesis of operating systems.
I definitely support that. Many tips I give to people for their projects came straight out of 1960s-1980s CompSci or industrial work. It has been a ridiculous amount of effort finding all of it, though. So many silos. It needs to be integrated on a per-topic basis, cleaned up, given an executive summary, with references available for follow-up. Preferably with FOSS tools if it's an analytical method, language, etc.
Then, people might quit reinventing the wheel or missing obvious things as much as they do now.
Looks like it's written in C++. Regardless of how "clean" the code is right now, it's only a matter of time before it becomes roughly as bad as any other OS written in C++ in terms of vulnerabilities found per million lines of code.
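For a sense of why lines of C or C++ translate into vulnerabilities so reliably, here's a contrived use-after-free, the bug class behind a large share of memory-safety CVEs (not taken from any real codebase):

```c
/* Contrived use-after-free sketch; compiles with cc uaf.c */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    char *token = malloc(16);
    if (!token) return 1;
    strcpy(token, "secret");

    free(token);              /* lifetime ends here... */

    char *other = malloc(16); /* ...but the allocator may reuse the block */
    if (!other) return 1;
    strcpy(other, "attacker");

    /* Dangling read: undefined behavior, and in practice often
     * attacker-influenced data. Sanitizers can catch this instance,
     * but not every instance in millions of lines. */
    printf("token = %s\n", token);

    free(other);
    return 0;
}
```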
Android and ChromeOS also have several components written in C++, yet the majority of userspace is only available to Java or Web applications.
On modern OSes that aren't plain UNIX clones, C and C++ are being pushed down the stack just to provide some hardware abstraction layer, with everything else being done in safer, more productive languages.
Unix 'capabilities' (before Capsicum, iirc) were a different thing with the same name. Confusing. Unix actually does have a different construct that is like a classical capability: an open file descriptor that you can send to another process over a Unix domain socket.
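That descriptor passing works via sendmsg() with SCM_RIGHTS ancillary data. A minimal sketch (assuming Linux or another POSIX system; error handling mostly trimmed): the parent opens a file, then delegates that open descriptor to a child over a socketpair, and the child can read it without ever having the authority to open the path itself.

```c
/* Sketch: a file descriptor as a capability, transferred over a
 * Unix domain socket with SCM_RIGHTS. cc fdpass.c */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <sys/uio.h>
#include <sys/wait.h>
#include <unistd.h>

/* Send descriptor fd over the socket sock. */
static int send_fd(int sock, int fd) {
    char byte = 0;
    struct iovec iov = { .iov_base = &byte, .iov_len = 1 };
    char ctrl[CMSG_SPACE(sizeof(int))];
    struct msghdr msg = { .msg_iov = &iov, .msg_iovlen = 1,
                          .msg_control = ctrl, .msg_controllen = sizeof(ctrl) };
    struct cmsghdr *cm = CMSG_FIRSTHDR(&msg);
    cm->cmsg_level = SOL_SOCKET;
    cm->cmsg_type  = SCM_RIGHTS;          /* "here is a descriptor" */
    cm->cmsg_len   = CMSG_LEN(sizeof(int));
    memcpy(CMSG_DATA(cm), &fd, sizeof(int));
    return sendmsg(sock, &msg, 0) < 0 ? -1 : 0;
}

/* Receive a descriptor from the socket sock. */
static int recv_fd(int sock) {
    char byte;
    struct iovec iov = { .iov_base = &byte, .iov_len = 1 };
    char ctrl[CMSG_SPACE(sizeof(int))];
    struct msghdr msg = { .msg_iov = &iov, .msg_iovlen = 1,
                          .msg_control = ctrl, .msg_controllen = sizeof(ctrl) };
    if (recvmsg(sock, &msg, 0) < 0) return -1;
    struct cmsghdr *cm = CMSG_FIRSTHDR(&msg);
    if (cm == NULL || cm->cmsg_type != SCM_RIGHTS) return -1;
    int fd;
    memcpy(&fd, CMSG_DATA(cm), sizeof(int));
    return fd;  /* a new entry in *this* process's descriptor table */
}

int main(void) {
    int sv[2];
    if (socketpair(AF_UNIX, SOCK_STREAM, 0, sv) < 0) return 1;
    if (fork() == 0) {                    /* child: receives the capability */
        close(sv[0]);
        int fd = recv_fd(sv[1]);
        char buf[64];
        ssize_t n = read(fd, buf, sizeof(buf) - 1);
        if (n > 0) { buf[n] = 0; printf("child read: %s", buf); }
        return 0;
    }
    close(sv[1]);
    int fd = open("/etc/hostname", O_RDONLY); /* authority held by parent */
    send_fd(sv[0], fd);                       /* ...delegated to the child */
    wait(NULL);
    return 0;
}
```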
I'm curious; do you have any details on exactly why the existing implementations are too complicated to be usable, and on whether this looks likely to be just some specific design flaws or inherent in the concept?